Hey there everyone!

Ian Lopez back with some interesting techy events in law this week. First up, we look at a big win for privacy rights in a biometrics battle. Also on deck, the government takes a look at drone rules, plus a question as to whether now is the time for the law to dive deeper into deepfakes.

But there's one last thing: this will be my final edition of What's Next (or, as my editor called it, my “swan song”), as I'm moving on to a new opportunity. But fear not, there's still a “next” for What's Next. Stay tuned for more reports from the ever-fascinating intersection of law and technology.

It's been great, everyone. Now, here's What's Next:


Judge Won't Give Finger to Feds in Biometrics Warrant Battle


The government may have ways to get its hands on your phone, but a federal magistrate judge says it can't force your hands onto your phone to unlock it.

In an order denying an application for a search warrant out of the Northern District of California, U.S. Magistrate Judge Kandis Westmore says the government compelling “any individual present” during a search “to press a finger” or take other biometric steps to unlock a digital device is out of step with the Fourth and Fifth Amendments.

In her ruling, Westmore refers to SCOTUS' Carpenter decision (which I wrote about here), which maintains that courts are obligated to ensure constitutional rights aren't “diminished merely due to the advance of technology.” As Ars Technica's Cyrus Farivar reports, Westmore's order “is reminiscent” of a federal case out of Illinois in which a magistrate judge “also denied government efforts to conduct a nearly identical biometric dragnet.”

And while a district court judge could overturn the decision, the case could be viewed as significant in the tension between privacy and policing. EFF's Andrew Crocker told Forbes, which first reported on the order, “it's important that courts are beginning to look at these issues on their own terms.”

Forbes' Thomas Brewster also provides this interesting little nugget:

“Previously, courts had decided biometric features, unlike passcodes, were not 'testimonial.' That was because a suspect would have to willingly and verbally give up a passcode, which is not the case with biometrics. A password was therefore deemed testimony, but body parts were not, and so not granted Fifth Amendment protections against self-incrimination.”

And, Brewster notes, judges had in the past decided that police “were allowed to force unlock devices like Apple's iPhone with biometrics, such as fingerprints, faces or irises. That was despite the fact feds weren't permitted to force a suspect to divulge a passcode.” But now, “all logins are equal.”

Takeaway: Westmore handed a major victory to privacy advocates by essentially granting biometric device locking methods the same constitutional protection as a password. How this changes the equation for law enforcement remains to be seen, though.




On the Radar: Three Things to Know


Flyover Fracas. I don't know if I'd call the United States drone friendly, but the Federal Aviation Administration is apparently weighing whether to loosen restrictions on allowing the unmanned aircraft to fly over crowds. The Verge reports the agency released proposals on Monday that categorize drones by “weight and the amount of damage they could do to a person.” The U.S. isn't alone in having to deal with drones. Just last week, London's Heathrow airport had to temporarily halt departures after a nearby drone sighting, and that's not the first time drones have disrupted an airport across the pond.

Served on Twitter. You may recall my previous mention of Cohen Milstein serving up a complaint to Wikileaks via a tweet. This fun little tactic popped up again in a San Mateo Superior Court lawsuit tweeted into the ether by San Francisco's Baker Curtis & Schwartz. And, as NYU Law's Arthur R. Miller told Legaltech News, “given the mobility of modern life,” we may see this approach used more frequently, though he notes he doesn't think “they will take over the world.”

Oh, EDGAR. Things are getting steamy in the clandestine world of the SEC's Electronic Data Gathering, Analysis and Retrieval System. But seriously—prosecutors in New Jersey unsealed a 16-count federal indictment charging two Ukrainian men with hacking into the EDGAR database to pass private info to a group of traders and illegally generate a profit. (Ross Todd has a write-up for the New Jersey Law Journal that you should check out.) So how'd the hackers breach the gates of the SEC? Turns out malware-laden emails sent to SEC employees were key.


Should the Law Dive Deeper Into Deepfakes?


Fortune's Jeff John Roberts touches on that very topic, focusing on how the artificial intelligence technology behind convincing fake images, video and audio is being used to terrorize women with fake porn that uses their likenesses.

Roberts points out that victims may face a tough road in using the law to get deepfakes knocked off the web—one expert tells him it takes about $50K to punish a troll with a lawsuit. What's more, Section 230 of the Communications Decency Act, which shields sites like Craigslist from liability for what their users post, is a significant roadblock.

Yet Roberts does talk to experts about finding some avenues, such as an expansion of existing state laws banning “revenge porn,” or even drafting a narrow law to help deepfake victims. University of Idaho's Annemarie Bridy told Roberts this could be a worthwhile approach, though she notes that “to get the balance right, we'd also need an immediate, meaningful right of appeal and safeguards against abusive notices intended to censor legitimate content under false pretenses.”

Deepfakes pose other threats as well. Fox Rothschild's Scott Vernick told Legaltech News' Victoria Hutchins there are concerns over “how easy it is to replicate in ways that look quite genuine a business leader or political leader saying or doing something that's not true,” as such content could “move markets.” Deepfake dabbling with a public figure's likeness actually surfaced last week, when a Seattle Fox affiliate TV station broadcast doctored footage of President Donald Trump's border wall address, making his head look bigger, his skin look oranger, and his tongue stick out between sentences. As deepfakes grow more convincing, they're likely to affect more people. Whether the law can keep up remains to be seen.

Takeaway: The law is in a tough spot when it comes to deepfakes, as the little existing framework that might be used to tackle them isn't entirely built for the threats they pose.


It's been a pleasure, all. Here's to What's Next!