
Watch This Space: Who's Policing Our DNA?

Over the next few weeks we'll be checking in with legal futurists, technologists and other experts for perspectives on the thorny issues that arise where new tech meets old laws. Last week, we heard from Santa Clara University law professor Eric Goldman on the ways emojis are creeping into court cases. For this week, we reached out to Marcia Hofmann, founder of Zeitgeist Law in San Francisco and special counsel to the Electronic Frontier Foundation.

Hofmann is currently teaching a course on computer crime at the University of Colorado that covers electronic surveillance law and new interpretations of the Fourth Amendment. When we asked what's cutting edge in her world, Hofmann went straight to the recent headlines involving criminal investigators and consumer genetic services.

We've been hearing recently about police using genetic and ancestry sites like 23andMe or Ancestry.com to solve cases. In what ways is this a legal gray area? 

Hofmann: More and more people choose to share DNA samples and profiles in hopes of learning about their backgrounds and perhaps discovering new family. I think it came as a surprise to many that investigators are using this information to solve crimes.

That raises an interesting question: Should there be limits on whether, how, and when the police can use genetic information that people share with ancestry sites?

At this point, any limits at the federal level would come from the Fourth Amendment, which generally requires investigators to get a warrant to perform a search that intrudes on a reasonable expectation of privacy. Historically, the Supreme Court has said people don't have a reasonable expectation of privacy in things they voluntarily share with third parties, which would suggest the Fourth Amendment doesn't offer much protection here.

And yet the Fourth Amendment landscape has shifted radically over the last few years in a string of cases where the Supreme Court has revisited old rules in light of new technologies. For example, just last year in Carpenter v. United States, the majority said that at least some information is so unique and intimate that disclosing it to others doesn't snuff out expectations of privacy.

So when it comes to genetic information shared with ancestry sites, maybe there's potential for Fourth Amendment theories that wouldn't have seemed all that viable before. Are genetic information and familial connections so inherently sensitive that a warrant should be a default Fourth Amendment requirement, even if a person shares their DNA profile with others through an ancestry site? When police access a person's genetic information shared with a company, could that be considered a form of trespass on personhood or property that requires a warrant? The Supreme Court's recent opinions suggest some interesting directions to take these theories.

It's hard to argue against using every tool available to solve violent crimes. And yet, there's clearly tension here with individual expectations of privacy. Do you think legislation will be required to set an appropriate balance—and what might that look like?

Hofmann: I think legislation is definitely a possibility — plus it could be crafted to include special protections above and beyond what the Fourth Amendment might offer.

Legislation could require law enforcement to get a warrant before collecting DNA information from ancestry sites for criminal investigations, and if the legislature saw fit, it could include requirements beyond a showing of probable cause (as the Wiretap Act does). For example, the law might require searches of genetic information to be done in a way that minimizes the privacy impact on people whose information isn't relevant to an investigation.

Legislation could also limit how ancestry and genetic testing companies can share genetic information with third parties other than law enforcement. And it could require opt-in consent from users before information can be shared for purposes other than what it was originally collected for.

We're nearing a day, according to researchers, when basically everyone's DNA can be matched regardless of whether they've ever used one of these commercial services. It seems like this makes the privacy questions even more urgent but also more complicated. (I'm thinking, for example, that a criminal suspect might assert a reasonable expectation of privacy in his personal cell phone location data, but how does a suspect assert a reasonable expectation of privacy in genetic material provided voluntarily by a distant relative?) Is this likely to be a complicating factor in the Fourth Amendment analysis?

Hofmann: It is a complicating factor, but the Fourth Amendment test evolves over time. It has the flexibility to adjust to advances in technology and shifts in societal norms.

I imagine that if Fourth Amendment protection were to cover genetic material provided voluntarily by a distant relative, the theory would be based on an expectation of privacy in family relationships or maybe personhood. It might be a particularly important factor that the suspect never consented to the relative's decision to offer up the genetic material, since the suspect wouldn't have done anything to diminish their own expectation of privacy.

I could also see a Fourth Amendment theory based on the idea that genetic information is so inherently sensitive that a warrant is required to search it by default. That's the rule we have now for cell phones after Riley v. California.

Outside of the criminal law arena, are there worrisome implications related to the growth of what's essentially a global DNA registry controlled by commercial entities?

Hofmann: Absolutely. It's critically important to make sure commercial entities don't use predictive genetic information to deny individuals—or entire families—insurance, employment, health care, the freedom to make their own life choices, etc. We already have some laws, such as the Genetic Information Nondiscrimination Act and state statutes, that limit genetic discrimination to varying extents. But those laws need to be agile and keep up as genetic technology develops and new use cases arise.

And what happens if genetic testing and ancestry companies partner up with businesses in other industries, or get acquired, or go bankrupt? Could all that genetic information be put to different uses that consumers didn't agree to? The idea of finding new uses for consumer genetic information isn't purely hypothetical—for instance, 23andMe already has collaborations with pharmaceutical companies to help develop new drugs. Regulators will need to keep a close eye on the repurposing of genetic information to make sure people and their blood relatives don't suffer avoidable harms.

— Vanessa Blum



On the Radar: 3 Things to Know

Back to School: Alston & Bird is trying to stay ahead of the tech curve through a partnership with the Legal Analytics Lab at Georgia State University. The law firm is receiving tutoring on data analytics tools like machine learning and text mining. In exchange, the firm's attorneys will guest-lecture in graduate-level classes and participate in analytics programs. Frank Ready has more on the collaboration here.

Different Strokes: It's not enough to adapt to changes in the marketplace; law firms must also learn how to think differently to evolve. That was the takeaway from the opening of the Association of Legal Technologists' second ctrl ALT delete conference. CEO and legal technology writer and speaker Zach Abramowitz highlighted his consulting work with Reed Smith as an example. The firm developed an internal solution called Periscope that helped attorneys analyze their data and that recommended, implemented and optimized legal technology systems. But the firm pushed it a step further in 2018, selling the service on the market. Reed Smith just landed another law firm as a client last month. Zach Warren has more here.

Into the Breach: A new survey counts some 59,000 data breach notifications in the European Union since the General Data Protection Regulation took effect about eight months ago. But the lawyers behind the survey said many EU countries are still digging out from beneath a backlog of breach reports, which likely means more enforcement actions and fines on the horizon. Fines for GDPR violations have been pretty paltry thus far, but that's expected to change. More from Phillip Bantz here.

— Nate Robson



#IRL: The Risk of AI Is Humans

Longtime readers of What's Next know that #IRL is the header we use to report on the events and conversations we're having (you guessed it!) in real life. This past week, Law.com legal technology editor Zach Warren was in Scottsdale, Arizona, for the second annual ctrl ALT delete conference on legal technology and design. Here's Zach's report:

One of my favorite sessions at the ctrl ALT delete conference, put on by the Association of Legal Technologists (ALT, thus the pun), was Sunday's keynote talk from Shawnna Hoffman, one of IBM Watson's legal gurus. She ran through the positives of artificial intelligence, of course—she does have a vested interest in Watson, after all. But what interested me even more were the risks she laid out. None of those risks were of the “robots taking your job/life/soul” variety. Instead, Hoffman posited that today's AI risks are more people-oriented—design or planning flaws that a little education (hopefully) could fix, yet they persist.

I ran down a few of those risks—from not anticipating possible regulations to the need for clean data—in this article from the conference. But there's one I didn't include that has stuck with me days later. It has to do with design decisions and risk. Take autonomous vehicles as an example. Hoffman noted that the average connected car generates an outrageous 70 TB of data, which makes reining in data storage and software costs a natural business goal. So how long before an engineer bent on efficiency overlooks a safety measure?

>>Think Ahead: The future of AI litigation isn't likely to come from someone trying to control Skynet. It will come from trade-offs among risks, and from what happens when one of those risks turns bad. “Because of the way we're building these AI systems… we're working with IT staff that love to push the limit,” Hoffman said in a push for lawyers to become more involved with AI development. And in this case, “It's worth overcollecting to make sure nobody gets hurt.”

— Zach Warren



Protocol: Cryptic Canadian Crypto Case

The mystery surrounding the apparent loss to investors of about 190 million Canadian dollars' worth of cryptocurrency, plus millions more in cash, presents a cautionary tale even as the case attracts the attention of law firms. About 115,000 clients are reported to have lost funds after the founder of Vancouver-based digital exchange QuadrigaCX, 30-year-old CEO Gerald Cotten, reportedly died in India in December without leaving anyone access to the exchange's digital keys, not even his wife, according to Bloomberg News and other accounts.

Skepticism about the circumstances abounds, with exchange customers calling for an official investigation and Canadian law firms including Bennett Jones LLP and McInnes Cooper jockeying for the right to represent clients who say they lost Bitcoin, Ether, and other digital currencies and cash. The platform filed for creditor protection in the Nova Scotia Supreme Court in Halifax earlier this month.

“I would not expect a sophisticated exchange to manage its custody in that way,” said Steve Bunnell, chair of the data security and privacy practice at O'Melveny & Myers in Washington, D.C.

Bunnell, who has no involvement in the matter, said, "One of the typical solutions is a multi-signature wallet, which you can think of as a safe deposit box with more than one key, and any three of the five, say, can open [it]. Here we have a single key, or password, with one individual."
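For the technically curious, here's a toy sketch in Python of the "any m of n" rule Bunnell describes. It isn't his code or any real wallet's API; the key-holder names and the 3-of-5 threshold are hypothetical, and a real multisig wallet enforces the rule with cryptographic signatures verified on-chain rather than a simple set check.

```python
# Toy illustration of a 3-of-5 multisig policy. Hypothetical names,
# not a real wallet implementation: actual multisig wallets verify
# cryptographic signatures rather than comparing sets of labels.

REQUIRED_APPROVALS = 3  # m: how many key holders must sign off
AUTHORIZED_HOLDERS = {"ceo", "cfo", "coo", "counsel", "custodian"}  # n = 5 keys

def can_release_funds(approvals):
    """Allow a withdrawal only if at least m of the n authorized
    key holders approve, so no one person is a single point of failure."""
    valid = set(approvals) & AUTHORIZED_HOLDERS  # discard unknown signers
    return len(valid) >= REQUIRED_APPROVALS

# Any three of the five can open the "safe deposit box"...
assert can_release_funds({"ceo", "cfo", "counsel"})
# ...but a single key holder (Quadriga's situation) cannot.
assert not can_release_funds({"ceo"})
```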

Bunnell said cryptocurrency investors need to closely examine the backgrounds and experience of senior executives and key personnel at any exchange before entrusting funds to it. “It highlights the need of dealing with people who know what they are doing and as the industry matures there will be more widely accepted best practices that will become market standards.”

— MP McQueen



Dose of Dystopia: Bandersnatch Edition

The GDPR is shining some light into “Black Mirror: Bandersnatch.” Motherboard writes that a tech policy researcher in the UK named Michael Veale used the right of access under the EU-wide privacy law to figure out that Netflix saved data on every selection he had made in watching the dystopian choose-your-own-adventure show. How very … fitting?

Veale used this right of access to ask Netflix questions about Bandersnatch and revealed the answers in a Twitter thread. He found that Netflix is tracking the decisions its users make (which makes sense considering how the film works), and that it is keeping those decisions long after a user has finished the film. It also stores aggregated forms of users' choices to “help [Netflix] determine how to improve this model of storytelling in the context of a show or movie,” the company said in its email response to him.

If you put on your tinfoil hat for a moment, you might wonder what else Netflix could derive from this data. Might it, perhaps, enable the company to build a sort of behavior profile of its users based on their selections? (As if it couldn't already do a fair bit of that by knowing everything you watch on the platform.)

Either way, the point Veale makes is that EU citizens ought to exercise their new rights under the GDPR to get this kind of data, and maybe bring about greater transparency in how it's being used. How's that for an alternate reality?

— Ben Hancock
