Welcome back to What's Next, where we report on the intersection of law and technology. In today's briefing, we have a Q&A with USC law professor Orin Kerr on the Fourth Amendment post Carpenter. Also on deck: A dose of dystopia, drone law, and a new FTC task force filled with lawyers.


➤ Would you like to receive What's Next by email? Sign up here.

Life After Carpenter

It's been nearly a year since the U.S. Supreme Court ruled in Carpenter v. United States, establishing murky rules to govern law enforcement access to the cell-site records that map the movements of anyone who carries a cellphone. To USC law professor Orin Kerr, the decision signals a new era in Fourth Amendment jurisprudence. We caught up with Kerr by email to ask about the pressures that digital technology places on the Fourth Amendment and where things go next.

You write that the Supreme Court's decision in Carpenter v. United States signals a “major break from the traditional understanding” of the Fourth Amendment. Can you explain why it's so impactful? After all, it's not the first case to deal with transformational technology and policing.

Carpenter introduces a new way to think about the reasonable expectation of privacy test. Before Carpenter, Fourth Amendment rights were tied to places or things. The legal question was whether a person had a reasonable expectation of privacy in a place like a home, or in a thing like a car, that was invaded when the government accessed information from inside it. Under Carpenter, people have a reasonable expectation of privacy against certain levels of government power. If technology expands government power, the law adds constitutional protection to restore the level of privacy people had before the technological change. Now you can have Fourth Amendment rights in records you didn't know existed, held by companies thousands of miles away, that contain information about you.

Does Carpenter provide a clear framework for lower courts to apply in evaluating other forms of digital records? What other kinds of digital-age data do you expect will come up in future cases?

No, it doesn't. The law professor cliché is true: Carpenter raises more questions than it answers. First, we don't know what other records about us are protected. For example, what if the government collects IP addresses? Or tracks website visits? Or engages in public camera surveillance? Or obtains numbers dialed? Carpenter says that the Fourth Amendment now protects some kinds of non-content third-party records, but we don't know which ones. Second, we don't know when collection of those protected records is a search. Is only long-term surveillance covered? What if the government obtains only a single record at a single time? We just don't yet know if any surveillance is protected or if some kind of long-term surveillance is needed.

Where does this leave the companies that are collecting user data? If they receive a subpoena, should they comply?

It leaves everyone with a lot of uncertainty. Companies with user data often can't know if the records they have collected are protected under the Fourth Amendment. Many of the records will also be protected by a federal statute called the Stored Communications Act, which has an immunity provision. As a result, companies shouldn't face legal liability for complying with that statute as long as the Fourth Amendment law remains uncertain. But some companies will want to err on the side of caution and demand warrants. At that point law enforcement has three choices. First, if they have probable cause, they can obtain warrants. Second, they can bring the company to court and try to enforce the lesser legal process. Third, they can give up trying to get that information.

We recently reported on “reverse location search warrants,” where a company like Google is asked to identify all mobile devices in the area of a crime scene at a particular time. What Fourth Amendment considerations are raised by this type of investigative tactic, and does Carpenter provide any guidance?

Reverse location warrants raise a lot of legal issues. Carpenter doesn't so much provide guidance as add yet another hard question: Are the location coordinates protected by the Fourth Amendment at all? Before Carpenter, it was likely the case that the location records were not protected. They would have been considered metadata voluntarily communicated to the provider, and therefore outside Fourth Amendment protection. Carpenter leaves unclear whether these sorts of location records are still unprotected. It depends on how you read the Carpenter decision, I think, and in particular what you make of the Court's analysis of the voluntariness of generating records. If you take voluntary steps to generate records, then ordinarily that is on you and the records are not protected. Carpenter says that cell phone users don't make cell-site records voluntarily, however, because they have to use a cell phone to participate in modern life—and all cell phone use generates cell-site records. Creating such records is unavoidable unless you want to be a hermit, the Court reasons, so doing so is not voluntary in a “meaningful” sense. But is that also true for precise location coordinates stored in a Google account?

What is the next digital Fourth Amendment issue that you expect to confront the Supreme Court?

There are two deep circuit splits ready for the Court's attention.

The first issue is how to apply the Fourth Amendment to searching computers that cross the border. The traditional rule is that the government does not need any cause to search and read paper documents crossing the border. The Eleventh Circuit has held that the same rule applies to computer searches. But other lower courts have disagreed. The Ninth Circuit has required reasonable suspicion to conduct a forensic search of a computer crossing the border, and the Fourth Circuit has also said that some cause is needed.

The second circuit split considers how the Fourth Amendment applies when a private person brings someone else's computer to the government and claims to have seen evidence of a crime in a particular file on the machine. The traditional rule is that the government can reenact the private search without a warrant but cannot exceed the private search. Lower courts are divided on how that rule applies. Can the police only look at that one file? Can they look at the entire computer? The question is: What unit should courts use to determine when a government search exceeded a private search?

Stay tuned for Kerr's forthcoming book “A Digital Fourth Amendment.” Read two draft chapters here.

— Vanessa Blum



Dose of Dystopia

Readers of this newsletter are probably familiar with “deepfakes”—doctored video and audio recordings that, because they're generated with sophisticated algorithms, are terrifyingly convincing. Reported on in depth by RadioLab and Motherboard a couple of years back, deepfakes have caught the attention of state and federal lawmakers who want to put legal restrictions on the use of the powerful technology (ring any bells?).

But in an article published this week, The Verge policy editor Russell Brandom argues that the fake-news apocalypse so many people had been expecting is just media hyperventilation. Why would that be the case? Because you can fight the technology with, well, more technology:

“It's a good question why deepfakes haven't taken off as a propaganda technique. Part of the issue is that they're too easy to track. The existing deepfake architectures leave predictable artifacts on doctored video, which are easy for a machine learning algorithm to detect. Some detection algorithms are publicly available, and Facebook has been using its own proprietary system to filter for doctored video since September.”

So, yay? Not so much. Brandom notes that deepfake technology is still being used to superimpose unwitting victims' faces onto porn videos—which is its own dark problem. Full article on The Verge.
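Brandom's point about “predictable artifacts” can be illustrated with a toy sketch (an illustration only, not a real detector—the function name and the synthetic “frames” here are my own invention): naive upsampling of the kind found in many generator decoders suppresses the highest spatial frequencies, so even a simple spectral statistic separates a noise frame from a crudely upsampled one.

```python
import numpy as np

def highfreq_ratio(img):
    """Fraction of spectral energy beyond a quarter of the image radius."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    h, w = spectrum.shape
    yy, xx = np.mgrid[0:h, 0:w]
    radius = np.hypot(yy - h / 2, xx - w / 2)
    return spectrum[radius > min(h, w) / 4].sum() / spectrum.sum()

rng = np.random.default_rng(0)
natural = rng.standard_normal((64, 64))   # stand-in for a camera frame
small = rng.standard_normal((16, 16))
# Naive 4x nearest-neighbor upsampling, a crude stand-in for a generator's decoder:
doctored = np.kron(small, np.ones((4, 4)))

print(f"natural:  {highfreq_ratio(natural):.2f}")   # noise keeps its high frequencies
print(f"doctored: {highfreq_ratio(doctored):.2f}")  # upsampling attenuates them
```

Real detectors are trained classifiers rather than a single statistic, but the underlying idea is the same: the generation pipeline leaves a measurable fingerprint in the frequency domain.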

— Ben Hancock

 



Back to Dealing with Drones

With all of the hubbub about intelligent appliances, autonomous cars, and other futuristic technologies, one of the first pieces of tech to make regulatory waves has fallen a bit by the wayside: drones. But that doesn't mean nobody's paying attention; it's just that drones have become more ingrained in both commercial use and regulatory action.

At the “All About Drones: Legal, Practical and Educational Applications” panel at last week's ABA Tech Show, the panelists encouraged attendees to call drones what they are: a tool. Even though drone usage is fancy and new, it's still a means to an end, be it delivering packages, inspecting infrastructure, or conducting search and rescue missions. As a result, attorneys have a mountain of precedent to examine, from United States v. Causby, the U.S. Supreme Court case governing ownership of airspace over private property, to the court's recent decision in Riley v. California that cellphone searches require warrants.

As an example, 13 states require search warrants for law enforcement drone use, but Russ Cochran, general counsel for the Oklahoma Bureau of Narcotics and Dangerous Drugs Control, expects that number to grow. “You see the trend where case law is going. We've got to do it right. That means that you need to develop your probable cause,” Cochran said.

For counsel, that means working with regulators. In January, State Farm received the first national waiver allowing it to conduct drone operations over people and flights beyond the pilot's visual line of sight through November 2022. But it wasn't easy getting to that point—State Farm first received FAA permission to test drones in March 2015, and receiving the waiver took four years of close cooperation with the federal government.

“It's been a team effort to make drone technology a reality,” said Robert Yi, State Farm senior vice president for property & casualty claims. “The waiver will provide our claims specialists with another way to efficiently help customers.”

— Zach Warren



3 Things to Know With the FTC's Big-Tech Task Force

The U.S. Federal Trade Commission announced last week that its Bureau of Competition is naming a 17-lawyer task force to examine antitrust issues around big tech, which could include reviewing past mergers and acquisitions. Some antitrust lawyers believe that could include Facebook's acquisitions of WhatsApp and Instagram. Here are three things to know about the task force:

The FTC chairman is Joseph Simons, a Republican who was appointed to the role by President Donald Trump last year. He was formerly an antitrust partner and co-chair of the practice group at Paul, Weiss, Rifkind, Wharton & Garrison. He has served at the FTC twice before: as director of the Bureau of Competition from 2001 to 2003, and in three different positions at the bureau from 1987 to 1989.

The announcement came as Facebook is reportedly negotiating with the FTC over a multibillion-dollar fine in connection with the Cambridge Analytica privacy scandal. Facebook shared millions of users' data with an app developer in 2014, who in turn sold it to Cambridge Analytica. The FTC began investigating the incident last March. Facebook also has come under fire from German antitrust regulators for its advertising business model.

Some antitrust lawyers and privacy and consumer advocates believe the task force is being created to appease Congress, pointing out that there are several instances in which the FTC approved big mergers after holding hearings. Henry C. Su, an antitrust trial and appellate litigation partner at Constantine Cannon in Washington, D.C., and San Francisco, who was a trial lawyer at the FTC from 2011 to 2017, part of that time in the competition bureau, said Friday: “There has been a lot of pressure, in particular from Democrats, for the FTC to do more to investigate big tech. One could interpret this as Chairman Simons' way of responding to that and letting them know the agency is listening.” Su pointed out that the Clayton Act and the Sherman Act empower the commission to regulate antitrust, while data privacy is more the purview of the consumer protection bureau.

— MP McQueen

 

On the Radar

Meme Woes: Memes and musical covers could be in jeopardy under the European Union's new copyright directives, according to Google. Kent Walker, senior vice president of global affairs at the search giant, asked European policy makers to amend copyright directives that he said will stifle online creativity. One of those directives will require online platforms to filter unlicensed copyrighted material, which could nix those popular Grumpy Cat memes (or GIFs) everyone loves to share. Read more from Caroline Spiezio here.

Your IPO Has Arrived: Lyft, under general counsel and secretary Kristin Sverchek, has filed to go public and valued itself at more than $2 billion. Lyft will be treated as an “emerging growth company,” which reduces its disclosure obligations. This latest development means it could beat Uber in the ride-hailing companies' race to go public. Read more from Caroline Spiezio here.

California Dreaming: The California Consumer Privacy Act is slated to go into effect Jan. 1, 2020, and it's not too early to get ready. The privacy regulations, which bear a striking resemblance to the European Union's General Data Protection Regulation, are meant to give Californians more control over the way their data is collected, shared and viewed. The implementation has raised key questions for businesses over what the law entails. With so much still pending a year out, Dechert's Kevin Cahill said, people are still trying to figure out what the rules will be. Read more from Frank Ready here.


Thanks for reading. We'll be back with more What's Next, next week.