Monday, September 18, 2017

Tattoos, Architecture, and Copyright

In my IP seminar, I ask students to pick an article to present in class for a critical style and substance review. This year, one of my students picked an article about copyright and tattoos, a very live issue. The article was decent enough, raising many concerns about tattoos: Is human skin fixed? Is it a copy? How do you deposit it at the Library of Congress? (answer: photographs) What rights are there to modify it? To photograph it? Why is it ok for photographers to take pictures, but not ok for video game companies to emulate them? Can they be removed or modified under VARA (which protects against such things for visual art)?

It occurred to me that we ask many of these same questions with architecture, and that the architectural rules have solved the problem. You can take pictures of buildings. You can modify and destroy buildings. You register buildings by depositing plans and photographs. Standard features are not protectible (sorry, no teardrop, RIP, and Mom tattoo protection). But you can't copy building designs. If we view tattoos on the body as a design incorporated into a physical structure (the human body), it all makes sense, and solves many of our definitional and protection problems.

Clever, right? I was going to write an article about it, maybe. Except then I discovered that somebody else had. In That Old Familiar Sting: Tattoos, Publicity and Copyright, Matthew Parker writes:

Tattoos have experienced a significant rise in popularity over the last several decades, and in particular an explosion in popularity in the 2000s and 2010s. Despite this rising popularity and acceptance, the actual mechanics of tattoo ownership and copyright remain very much an issue of first impression before the courts. A series of high-priced lawsuits involving famous athletes and celebrities have come close to the Supreme Court at times, but were ultimately settled before any precedent could be set. This article describes a history of tattoos and how they might be seen to fit in to existing copyright law, and then proposes a scheme by which tattoo copyrights would be bifurcated similar to architecture under the Architectural Works Copyright Protection Act.
It's a whole article, so Parker spends more time developing the theory and dealing with topics such as joint ownership than I do in my glib recap. For those interested in this topic, it's certainly a thought-provoking analogy worth considering.

Barton Beebe: Bleistein and the Problem of Aesthetic Progress in American Copyright Law



Bleistein v. Donaldson Lithographing Co. is a well-known early twentieth-century copyright decision of the U.S. Supreme Court. In his opinion for the majority, Justice Holmes is taken to have articulated two central propositions about the workings of copyright law. The first is the idea that copyright's originality requirement may be satisfied by the notion of "personality," or the "personal reaction of an individual upon nature," a standard met by just about every work of authorship. The second is the principle of aesthetic neutrality, according to which "[it] would be a dangerous undertaking for persons trained only to the law to constitute themselves final judges of the worth of pictorial illustrations, outside of the narrowest and most obvious limits." Both of these propositions are today understood as relating to copyright's relatively toothless originality requirement, which few works ever fail to satisfy.

In a paper recently published in the Columbia Law Review, Barton Beebe (NYU) unravels the intellectual history of Bleistein and concludes that for over a century, American copyright jurisprudence has relied on a misreading (and misunderstanding) of what Holmes was trying to do in his opinion. On the first proposition, he shows that Holmes was deeply influenced by American (rather than British or European) literary romanticism, which constructed the author in a "distinctively democratic—and more particularly, Emersonian—image of everyday, common genius." (p. 370). On the second, Beebe argues that Holmes' comments on neutrality had little to do with the originality requirement, but were instead a response to the dissenting opinion that had sought to deny protection to the work at issue (an advertisement for a circus) because it did not "promote the progress," as mandated by the Constitution. The paper then examines how this misunderstanding (about both propositions) came to influence copyright jurisprudence, and Beebe proceeds to suggest ways in which an accurate understanding of Bleistein may be used to reform crucial aspects of modern copyright law. The paper is well worth a read for anyone interested in copyright.

Beebe's examination of Holmes' views on progress, personality, and literary romanticism did, however, raise a question for me about the unity (or coherence) of Holmes' views, especially given that he was a polymath. Holmes has long been regarded as a Legal Realist who thought about legal doctrine in largely functional and instrumental terms, and Bleistein's commonly (mis)understood insights about originality comport well with that pragmatic worldview. His treatment of originality as a narrow (and normatively empty) concept, for instance, sits well with his anti-conceptualism and critique of formalist thinking. But if Holmes really did not intend for originality to be a banal and content-less standard (as Beebe suggests), how might he have squared its innate indeterminacy with his Realist thinking? Does Beebe's reading of Bleistein suggest that Holmes was not a Legal Realist after all when it came to questions of copyright law and its relationship to aesthetic progress? This of course isn't Beebe's inquiry in the paper (nor should it be, given the other important questions that it addresses), but the possibility of revising our view of Holmes intrigued me.

Wednesday, September 13, 2017

Tribal Sovereign Immunity and Patent Law

Guest post by Professor Greg Ablavsky, Stanford Law School

In Property, I frequently hedge my answers to student questions by cautioning that I am not an expert in intellectual property. I’m writing on an IP blog today because, with Allergan’s deal with the Saint Regis Mohawk Tribe, IP scholars have suddenly become interested in an area of law I do know something about: federal Indian law.

Two principles lie at the core of federal Indian law. First, tribes possess inherent sovereignty, although their authority can be restricted through treaty, federal statute, or when inconsistent with their dependent status. Second, Congress possesses plenary power over tribes, which means it can alter or even abolish tribal sovereignty at will.

Tribal sovereign immunity flows from tribes’ sovereign status. Although the Supreme Court at one point described tribal sovereign immunity as an “accident,” the doctrine’s creation in the late nineteenth century in fact closely paralleled contemporaneous rationales for the development of state, federal, and foreign sovereign immunity. But the Court’s tone is characteristic of its treatment of tribal sovereign immunity: even as the Court has upheld the principle, it has done so reluctantly, even hinting to Congress that it should cabin its scope. This language isn’t surprising. The Court hasn’t been a friendly place for tribes for nearly forty years, with repeated decisions imposing ever-increasing restrictions on tribes’ jurisdiction and authority. What is surprising is that tribal sovereign immunity has avoided this fate. The black-letter law has remained largely unchanged, narrowly surviving a 2014 Court decision that saw four Justices suggest that the doctrine should be curtailed or even abolished.

Monday, September 11, 2017

Reexamining the Private and Social Costs of NPEs

It's good to be returning from a longish hiatus. I've just taken over as the Associate Dean for Faculty Research; needless to say, it's kept me busier than I would like. But I'm back, and hope to resume regular blogging.

My first entry has been sitting on my desk (errrr, my email) for about six months. In 2011, Bessen, Meurer, and Ford published The Private and Social Costs of Patent Trolls, which was received with much fanfare. Its findings of nearly $500 billion in lost market value over a 20-year period, and of $80 billion in losses a year for four years in the late 2000s, garnered significant attention; the paper has been downloaded more than 5,000 times on SSRN.

Enter Emiliano Giudici and Justin Robert Blount, both of Stephen F. Austin Business School. They have attempted to replicate the findings of Bessen, Meurer, and Ford with newer data. The results are pretty stark: they find no significant evidence of loss at all. They also attribute the findings of the prior paper to a few outliers, among other possible explanations. These are really important findings. Their paper has fewer than 50 downloads. The abstract is here: 
An ongoing debate in patent law involves the role that “non-practicing entities,” sometimes called “patent trolls” serve in the patent system. Some argue that they serve as valuable market intermediaries and other argue that they are a drain on innovation and an impediment to a well-functioning patent system. In this article, we add to the data available in this debate by conducting an event study that analyzes the market reaction to patent litigation filed by large, “mass-aggregator” NPE entities against large publicly traded companies. This study advances the literature by attempting to reproduce the results of previous event studies done in this area on newer market data and also by subjecting the event study results to more rigorous statistical analysis. In contrast to a previous event study, in our study we found that the market reacted little, if at all, to the patent litigation filed by large NPEs.
This paper is a useful read beyond the empirics. It does a good job explaining the background, the prior study, and critiques of the prior study. It is also circumspect in its critique, focusing more on the inferences to be drawn from the study than on the methods. This is a key point: I'm not a fan of event studies for a variety of reasons. But that doesn't mean that I think event studies are somehow unsound methodologically. It just means that our takeaways from them have to be tempered by the limitations. And I've always been troubled that the key takeaways from Bessen, Meurer & Ford were outsized (especially in the media) compared to the method.
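For readers who haven't run one, here is a minimal sketch of the market-model event study that both papers build on. To be clear, this is not either paper's code; the returns, windows, and numbers below are invented purely to illustrate the technique.

```python
# Minimal market-model event study sketch (hypothetical data and windows).
# Not the Bessen/Meurer/Ford or Giudici/Blount code -- just the general technique.
import numpy as np

rng = np.random.default_rng(0)

# Daily returns: a defendant firm and a market index, aligned by trading day.
market = rng.normal(0.0005, 0.01, 300)                    # 300 trading days
firm = 0.0002 + 1.1 * market + rng.normal(0, 0.01, 300)   # fake firm returns

est = slice(0, 250)      # estimation window: the 250 days before the lawsuit filing
event = slice(250, 255)  # event window: filing day through day +4

# 1. Fit the market model r_firm = alpha + beta * r_market on the estimation window.
beta, alpha = np.polyfit(market[est], firm[est], 1)
resid_sd = np.std(firm[est] - (alpha + beta * market[est]), ddof=2)

# 2. Abnormal returns in the event window = actual minus predicted returns.
abnormal = firm[event] - (alpha + beta * market[event])

# 3. Cumulative abnormal return (CAR) and a simple t-statistic.
car = abnormal.sum()
t_stat = car / (resid_sd * np.sqrt(len(abnormal)))
print(f"CAR = {car:.4f}, t = {t_stat:.2f}")
```

The headline dollar figures in studies like these generally come from turning firm-level cumulative abnormal returns into changes in market capitalization and summing across defendants, which is one reason a handful of very large firms can dominate the totals (the sort of outlier sensitivity Giudici and Blount point to).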

But Giudici and Blount embrace the event study, weaknesses and all, and do not find the same results. This, I think, is an important finding and worthy of publicity. That said, there are some critiques, which I'll note after the break.

Natalie Ram: Innovating Criminal Justice

Natalie Ram (Baltimore Law) applies the tools of innovation policy to the problem of criminal justice technology in her latest article, Innovating Criminal Justice (forthcoming in the Northwestern University Law Review), which is worth a read by innovation and criminal law scholars alike. Her dive into privately developed criminal justice technologies—"[f]rom secret stingray devices that can pinpoint a suspect’s location to source code secrecy surrounding alcohol breath test machines, advanced forensic DNA analysis tools, and recidivism risk statistic software"—provides both a useful reminder that optimal innovation policy is context specific and a worrying depiction of the problems that over-reliance on trade secrecy has wrought in this field.

She recounts how trade secrecy law has often been used to shield criminal justice technologies from outside scrutiny. For example, criminal defense lawyers have been unable to examine the source code for TrueAllele, a private software program for analyzing difficult DNA mixtures. Similarly, the manufacturer of Intoxilyzer, a breath test, has fought efforts for disclosure of its source code. But access to the algorithms and other technical details used for generating incriminating evidence is important for identifying errors and weaknesses, increasing confidence in their reliability (and in the criminal justice system more broadly), and promoting follow-on innovations. Ram also argues that in some cases, secrecy may raise constitutional concerns under the Fourth Amendment, the Due Process Clause, or the Confrontation Clause.

Drawing on the full innovation policy toolbox, Ram argues that contrary to the claims of developers of these technologies, trade secret protection is not essential for the production of useful innovation in this field: "The government has at its disposal a multitude of alternative policy mechanisms to spur innovation, none of which mandate secrecy and most of which will easily accommodate a robust disclosure requirement." Patent law, for example, has the advantage of increased disclosure compared with trade secrecy. Although some of the key technologies Ram discusses are algorithms that may not be patentable subject matter post-Alice, to the extent patent-like protection is desirable, regulatory exclusivities could be created for approved (and disclosed) technologies. R&D tax incentives for such technologies also could be conditioned on public disclosure.

But one of Ram's most interesting points is that the main advantage of patents and taxes over other innovation policy tools—eliciting information about the value of technologies based on their market demand—is significantly weakened for most criminal justice technologies, for which the government is the only significant purchaser. For example, there is little private demand for recidivism risk statistical packages. Thus, to the extent added incentives are needed, this may be a field in which the most effective tools are government-set innovation rewards—grants, other direct spending, and innovation inducement prizes—that are conditioned on public accessibility of the resulting algorithms and other technologies. In some cases, agencies looking for innovations may even be able to collaborate at no financial cost with academics such as law professors or other social scientists who are looking for opportunities to conduct rigorous field tests.

Criminal justice technologies are not the only field of innovation in which trade secrecy can pose significant social costs, though most prior discussions I have seen have focused purely on medical technologies. For instance, Nicholson Price and Arti Rai have argued that secrecy in biologic manufacturing is a major public policy problem, and a number of scholars (including Bob Cook-Deegan et al., Dan Burk, and Brenda Simon & Ted Sichelman) have discussed the problems with secrecy over clinical data such as genetic testing information. It may be worth thinking more broadly about the competing costs and benefits of trade secrecy and disclosure in certain areas—while keeping in mind that the inability to keep secrets does not mean the end of innovation in a given field.

Tuesday, September 5, 2017

Adam Mossoff: Trademarks As Property

There are two dominant utilitarian frameworks for justifying trademark law. Some view trademark protection as necessary to shield consumers from confusion about the source of market offerings, and to reduce consumers' "search costs" in finding things they want. Others view trademark protection as necessary to secure producers' incentives to invest in "quality". I personally am comfortable with both justifications for this field of law. But I have always been unclear as to how trademarks work as property. With certain caveats, I do not find it difficult to conceive of the patented and copyrighted aspects of inventions and creative writings as "property" on the theory that we generally create property rights in subject matter that we want more of. But surely Congress did not pass the Lanham Act in 1946 and codify common law trademark protection simply because Congress wanted companies to invest in catchy names and fancy logos?

In his new paper, Trademark As A Property Right, Adam Mossoff seeks to clarify this confusion and convince people that trademarks are property rights based on Locke's labor theory. In short, Mossoff's view is that trademarks are not a property right on their own; rather, trademarks are a property right derived from the underlying property right of goodwill. Read more at the jump.

Saturday, September 2, 2017

Petra Moser and Copyright Empirics

I thought this short Twitter thread was such a helpful, concise summary of some of NYU economist Petra Moser's excellent work—and the incentive/access tradeoff of IP laws—that it was worth memorializing in a blog post. You can read more about Moser's work on her website.

Monday, August 28, 2017

Dinwoodie & Dreyfuss on Brexit & IP

In prior work such as A Neofederalist Vision of TRIPS, Graeme Dinwoodie and Rochelle Dreyfuss have critiqued one-size-fits-all IP regimes and stressed the value of member state autonomy. In theory, the UK's exit from the EU could promote these autonomy values by allowing the UK to revise its IP laws in ways that enhance its national interests. But in Brexit and IP: The Great Unraveling?, Dinwoodie and Dreyfuss argue that these gains are mostly illusory: "the UK will, to maintain a robust creative sector, be forced to recreate much of what it previously enjoyed" through the EU, raising the question "whether the transaction costs of the bureaucratic, diplomatic, and private machinations necessary to duplicate EU membership are worth the candle."

The highlight of the piece for me is that Dinwoodie and Dreyfuss give numerous specific examples of how post-Brexit UK might depart from EU IP policy in ways that serve its perceived national policy interests, which nicely illustrate some of the ways in which the EU has harmonized IP law. For example, in the copyright context, it could resist the expansion in copyrightable subject matter suggested by EU Court of Justice cases; re-enact its narrow, compensation-free private copying exception; or reinstate section 52 of its Copyright, Designs and Patents Act, which limited the term of copyright for designs to the maximum term available under registered design law. In the trademark context, Dinwoodie and Dreyfuss describe how UK courts have grudgingly accepted more protectionist EU trademark policies that would not be required post-Brexit, such as limits on comparative advertising. Patent law is the area "where the UK will formally re-acquire the least sovereignty as a result of Brexit," given that it will continue to be part of the European Patent Convention (EPC) and that it still intends to ratify the Unified Patent Court Agreement—though the extent of UK involvement remains unclear.

Of course, whether such changes to copyright or trademark law would in fact further UK interests in an economic sense is highly debatable—but if UK policymakers think they would, why would they nonetheless recreate existing harmonization? I think Dinwoodie and Dreyfuss would respond that these national policy interests are outweighed by the benefits of coordination on IP, which "have been substantial and well recognized for more than a century." Their argument is perhaps grounded more in political economy than economic efficiency, as their examples of the benefits of coordination are all benefits for content producers rather than overall welfare benefits. In any case, they note that coordination became even easier within the institutional structures of the EU, and that after Brexit, "the UK will have to seek the benefits of harmonization through the same international process that has been the subject of sustained resistance as well as scholarly critique, rather than under these more efficient EU mechanisms." While it is plausible that the lack of these efficiency gains will tilt the cost-benefit balance in favor of IP law tailored to national interests, Dinwoodie and Dreyfuss suggest that a desire for continuity and commercial certainty will override autonomy concerns.

With all the uncertainties regarding Brexit (as recently reviewed by John Oliver), intellectual property might seem low on the list of things to worry about. But the companies with significant financial stakes in UK-based IP are anxiously awaiting greater clarity in this area.

Sunday, August 20, 2017

Gugliuzza & Lemley on Rule 36 Patentable-Subject-Matter Decisions

Paul Gugliuzza (BU) and Mark Lemley (Stanford) have posted Can a Court Change the Law by Saying Nothing? on the Federal Circuit's many affirmances without opinion in patentable subject matter cases. They note a remarkable discrepancy: "Although the court has issued over fifty Rule 36 affirmances finding the asserted patent to be invalid, it has not issued a single Rule 36 affirmance when finding in favor of a patentee. Rather, it has written an opinion in every one of those cases. As a result, the Federal Circuit’s precedential opinions provide an inaccurate picture of how disputes over patentable subject matter are actually resolved."

Of course, this finding alone does not prove that the Federal Circuit's Rule 36 practice is changing substantive law. The real question isn't how many cases fall on each side of the line, but where that line is. As the authors note, the skewed use of opinions might simply be responding to the demand from patent applicants, litigants, judges, and patent examiners for examples of inventions that remain eligible post-Alice. And the set of cases reaching a Federal Circuit disposition tells us little about cases that settle or aren't appealed or in which subject-matter issues aren't raised. But their data certainly show that patentees have done worse at the Federal Circuit than it appears from counting opinions.

Perhaps most troublingly, Gugliuzza and Lemley find some suggestive evidence that Federal Circuit judges' substantive preferences on patent eligibility are affecting their choice of whether to use Rule 36: judges who are more likely to find patents valid against § 101 challenges are also more likely to cast their invalidity votes via Rule 36. When both active and senior judges are included, this correlation is significant at the five-percent level. The judges at either extreme are Judge Newman (most likely to favor validity, and most likely to cast invalidity votes via Rule 36) and Chief Judge Prost (among the least likely to favor validity, and least likely to cast invalidity votes via Rule 36), who also happen to be the two judges most likely to preside over the panels on which they sit. Daniel Hemel and Kyle Rozema recently posted an article on the importance of the assignment power across the 13 federal circuits; this may be one concrete example of that power in practice.
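For concreteness, the shape of that judge-level correlation can be sketched in a few lines. The numbers below are made up, and the paper's actual data and tests may well differ; this only illustrates the kind of analysis involved.

```python
# Rough sketch of a judge-level correlation like the one Gugliuzza and Lemley report.
# The values below are hypothetical, not the paper's data.
import numpy as np
from scipy import stats

# For each judge: share of Section 101 votes favoring validity, and share of that
# judge's invalidity votes cast via Rule 36 (both hypothetical).
validity_rate = np.array([0.55, 0.20, 0.35, 0.30, 0.45, 0.25, 0.40, 0.15])
rule36_invalidity_rate = np.array([0.80, 0.35, 0.60, 0.50, 0.70, 0.45, 0.65, 0.30])

r, p = stats.pearsonr(validity_rate, rule36_invalidity_rate)
print(f"correlation = {r:.2f}, p-value = {p:.3f}")  # "significant at 5%" means p < 0.05
```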

Gugliuzza and Lemley do not call for precedential opinions in all cases, but they do argue for more transparency, such as using short, nonprecedential opinions to at least list the arguments raised by the appellant. For lawyers without the time and money to find the dockets and briefs of Rule 36 cases, this practice would certainly provide a richer picture of how the Federal Circuit disposes of subject-matter issues.

Monday, August 14, 2017

Research Handbook on the Economics of IP (Depoorter, Menell & Schwartz)

Many IP professors have posted chapters of the forthcoming Research Handbook on the Economics of Intellectual Property Law. As described in a 2015 conference for the project, it "draws together leading economics, legal, and empirical scholars to codify and synthesize research on the economics of intellectual property law." This should be a terrific starting point for those new to these fields. I'll link to new chapters as they become available, so if you are interested in this project, you might want to bookmark this post. (If I've missed one, let me know!)

Volume I – Theory (Ben Depoorter & Peter Menell eds.)


Volume II – Analytical Methods (Peter Menell & David Schwartz eds.)

Patents

Wednesday, August 2, 2017

Kevin Collins on Patent Law's Authorship Screen

Numerous scholars have examined the various functionality screens that are used to prevent non-utility-patent areas of IP from usurping what is properly the domain of utility patent law (see, e.g., the terrific recent articles by Chris Buccafusco and Mark Lemley and by Mark McKenna and Chris Sprigman). But hardly anyone has asked the inverse question: How should utility patent law screen out things that should be protected by non-patent IP? In Patent Law's Authorship Screen (forthcoming U. Chi. L. Rev.), Kevin Collins focuses on the patent/copyright boundary, and he coins the term "authorship screen" as the mirror image of copyright's functionality screen. As with pretty much everything Collins writes, it is thought provoking and well worth reading.

Wednesday, July 26, 2017

Kuhn & Thompson on Measuring Patent Scope by Word Count

I've seen a number of recent papers that attempt to algorithmically measure patent scope by counting the number of words in the patent's first claim and comparing to other patents in the same technological field (with longer claims → more details → narrower scope). In their new paper, The Ways We've Been Measuring Patent Scope are Wrong: How to Measure and Draw Causal Inferences with Patent Scope, Jeffrey Kuhn (UNC) and Neil Thompson (MIT Sloan) argue that this measure is superior to prior scope measures.
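To make the measure concrete, here is a rough sketch of how a word-count scope score might be computed: count the words in each patent's first claim and normalize against patents in the same technology field. The claims and field codes below are invented, and this is not Kuhn and Thompson's actual procedure or code.

```python
# Hypothetical sketch of a word-count scope measure: count words in each patent's
# first claim and normalize within its technology field. Illustrative data only.
from statistics import mean, stdev

patents = [
    {"id": "A", "field": "G06F", "claim1": "A method comprising receiving data and storing the data"},
    {"id": "B", "field": "G06F", "claim1": "A method comprising receiving encrypted data over a network, "
                                           "validating a signature, and storing the validated data in a cache"},
    {"id": "C", "field": "A61K", "claim1": "A composition comprising compound X and a carrier"},
]

def word_count(claim_text: str) -> int:
    return len(claim_text.split())

# Group first-claim word counts by field so each patent is compared to its peers.
by_field = {}
for p in patents:
    by_field.setdefault(p["field"], []).append(word_count(p["claim1"]))

for p in patents:
    counts = by_field[p["field"]]
    mu = mean(counts)
    sd = stdev(counts) if len(counts) > 1 else 1.0
    # Higher z-score = longer first claim = more limitations = (on this theory) narrower scope.
    p["scope_z"] = (word_count(p["claim1"]) - mu) / sd
    print(p["id"], p["field"], round(p["scope_z"], 2))
```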

They validate the word-count measure by comparing with survey responses from seven patent attorneys (below). In comparison, they find that previous measures of patent scope—the number of classes, the number of citations by future patents, and the number of claims—are uncorrelated or negatively correlated with their attorneys' subjective responses.


Of course, there are lots of reasons that word count is an imperfect measure, and additional validation would be helpful. (It would also be good to confirm that the attorneys in this study were blinded to the study design.) Those planning empirical patent studies should approach this variable with caution (and with good advice from patent law experts), but it is a potential scope measure that patent empiricists should at least have on their radar screens.

Wednesday, July 19, 2017

Liscow & Karpilow on Innovation Snowballing and Climate Law

Patent scholars are often skeptical of the government "picking winners," but in Innovation Snowballing and Climate Law, Zach Liscow and Quentin Karpilow (Yale Law) argue that the government should target specific technologies to address social harms like climate change.

It is well known that green technologies present a double externality problem. Both innovation and environmentally friendly goods have significant positive spillovers (and thus will be undersupplied absent government intervention), and the problem is magnified for environmentally friendly innovations. The standard policy solution is to correct each externality, such as through carbon taxes (or cap and trade) and innovation subsidies (e.g., patents, grants, and R&D tax incentives).

Liscow and Karpilow argue that this approach misses the dynamics of cumulative innovation. We know that innovators stand on the shoulders of giants, but Innovation Snowballing describes recent research on how innovators "prefer to stand on the tallest shoulders in order to get the quickest, largest financial returns." Specific "clean" technologies (like solar) thus need a big push to snowball past "dirty" technologies (like fossil fuels):

Thursday, July 13, 2017

Judge Dyk on the Supreme Court and Patent Law, with Responses

Judge Timothy Dyk of the Federal Circuit has long welcomed the Supreme Court's involvement in patent law—see, e.g., essays in 2008 and 2014. In a new Chicago-Kent symposium essay, he states that he "continue[s] to believe that Supreme Court review of our patent cases has been critical to the development of patent law and likewise beneficial to our court," such as by "reconciling [Federal Circuit] jurisprudence with jurisprudence in other areas."

Four pieces were published in response to Judge Dyk, and while Michael previously noted Greg Reilly's argument that the Supreme Court does understand patent law, the others are also worth a quick read. Tim Holbrook (Emory) argues that some of the Court's interest reflects "suspicion about the Federal Circuit as an institution" but that the result is "a mixed bag" (with some interventions having "gone off the rails"). Don Dunner (Finnegan) is even more critical of the Supreme Court's involvement, arguing that "it has created uncertainty and a lack of predictability in corporate boardrooms, the very conditions that led to the Federal Circuit's creation." And Paul Gugliuzza (BU) argues that "the Supreme Court's effect on patent law has actually been more limited" because its decisions "have rarely involved the fundamental legal doctrines that directly ensure the inventiveness of patents and regulate their scope" and because its "minimalist approach to opinion writing in patent cases frequently enables the Federal Circuit to ignore the Court's changes to governing doctrine."

Monday, July 3, 2017

USPTO Economists on Patent Litigation Predictors

Alan Marco (USPTO Chief Economist) and Richard Miller (USPTO Senior Economist) have recently posted Patent Examination Quality and Litigation: Is There a Link?, which compares the characteristics of litigated patents with various matched controls. The litigation data was from RPX, the patent data was from various USPTO datasets, and the controls were either chosen randomly from the same art unit and grant year or were chosen with propensity score matching based on various observable characteristics. They are interested in whether examination-related variables that can be controlled by the USPTO are related to later litigation, and they conclude that "some examination characteristics predict litigation, but that the bulk of the predictive power in the model comes from filing characteristics."
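As a rough illustration of the kind of analysis this involves (not the authors' code, and with entirely made-up data standing in for the RPX and USPTO variables), one could fit a logistic regression of litigation status on filing and examination characteristics and compare the contributions of the two groups:

```python
# Stylized sketch of a litigation-prediction regression on simulated data.
# All variables and coefficients below are hypothetical stand-ins.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000

# Filing characteristics (hypothetical): small-entity flag, continuation count,
# pendency in years, number of independent claims.
small_entity = rng.integers(0, 2, n)
continuations = rng.poisson(0.5, n)
pendency = rng.gamma(2.0, 1.5, n)
indep_claims = rng.poisson(3, n) + 1

# Examination characteristics (hypothetical): first-action allowance, signatory-authority examiner.
first_action_allowance = rng.integers(0, 2, n)
signatory_examiner = rng.integers(0, 2, n)

# Simulated outcome driven mostly by filing characteristics, echoing the paper's
# conclusion that filing variables carry most of the predictive power.
logit = (-4 + 0.8 * small_entity + 0.3 * continuations + 0.1 * pendency
         + 0.1 * indep_claims - 0.2 * first_action_allowance + 0.2 * signatory_examiner)
litigated = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([small_entity, continuations, pendency,
                                     indep_claims, first_action_allowance, signatory_examiner]))
model = sm.Logit(litigated, X).fit(disp=False)
print(model.summary(xname=["const", "small_entity", "continuations", "pendency",
                           "indep_claims", "first_action_allowance", "signatory_examiner"]))
```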

Marco and Miller report that patents filed by small entities are more than twice as likely to be litigated as those filed by large entities, and patents with longer continuation histories and longer application pendency are also more likely to be litigated. Government-interest patents and foreign-priority patents are much less likely to be litigated than other similar patents. Other characteristics that indicate a higher probability of subsequent litigation include having more independent claims and shorter independent claims (proxies for broader patents), being allowed by examiners with signatory authority, not being allowed on first action, and having more IDS filings or examiner interviews.