Welcome back for another week of What's Next, where we report on the intersection of law and technology. Today, we have a look at biometrics law and privacy. Also, tech companies are moving into the financial services arena, which could create some regulatory headaches if legal departments are not looped in early. And does a new copyright law in the European Union mean the end of memes and gifs? All of that, and more, below.

For those tracking litigation over biometric data, a decision from the Illinois Supreme Court in late January was a game changer. In Rosenbach v. Six Flags Entertainment, the court held that a violation of the state's Biometric Information Privacy Act, or BIPA, is enough to confer standing even without a concrete showing of harm. Cue the class actions!

Morrison & Foerster partner Julie O'Neill and associate Max Phillip Zidel recently wrote about the decision, highlighting the emerging conflict on BIPA between state and federal courts. We checked in for their take on the standing question and to ask how companies can collect and use biometric data without becoming an easy target for litigation.

What types of businesses need to be aware of the laws around biometric data? Are more companies collecting this data than one might assume?

All companies that in any way collect or use biometric data (e.g., facial scans, iris/retinal scans, fingerprints, voiceprints or any other identifier derived from biological characteristics), regardless of whether such data is, for example, from consumers, employees, or other individuals, need to consider biometric privacy laws. We are seeing more and more companies come to us to carry out a preliminary assessment under the various biometric privacy statutes. Some of the uses of biometric data we are seeing are more overt, such as companies employing facial recognition technology in their consumer-facing apps. In other instances, there is more gray area, such as where clients ask us about identifying employees through photographs or video footage, or where a medical device company's activities do not fit perfectly within one of the exceptions. On the whole, we tend to see less of the former and more of the latter, but this could change as more and more companies think about building biometric access features into consumer-facing products and workplace processes.

What makes the Illinois Biometric Information Privacy Act (BIPA) so significant in this emerging area?

BIPA stands out because of its private right of action. The act provides not only for actual damages, but also for statutory damages of up to $5,000 per violation, which makes it an attractive target for plaintiffs' attorneys seeking to bring a class action. As a result, we have seen a very large number of BIPA class actions in the past couple years, and this trend continues to accelerate—especially after the recent Illinois Supreme Court ruling discussed below. Given BIPA's status as the oldest and most active biometric privacy statute, it also comes as no surprise that many other states thinking of passing legislation in this area are looking to BIPA as a model. As more substantive case law under BIPA is developed by the courts (thus far it has mostly centered on the issue of standing), this case law could also serve as the foundation for the interpretation of similar laws in other states.

The Supreme Court of Illinois ruled in January that an alleged violation of BIPA alone is sufficient for standing under Illinois law. Does that concern you?

A primary concern is the lack of clarity. The Illinois Supreme Court's ruling contradicts a handful of decisions coming out of the federal courts, which have held that merely alleging a violation of BIPA is not enough to confer standing to sue under the statute. Of course, as a ruling on a strictly Illinois state law issue, the Illinois Supreme Court's decision carries a great deal of weight; however, it is not yet clear whether federal courts sitting in diversity will try to move away from this ruling by characterizing it as one on a purely procedural matter, over which they retain their own authority. Further, the focus on procedural issues to date has meant that there is essentially no guidance or interpretation on what the various prohibitions and requirements under BIPA actually mean. For instance, the statute prohibits use of biometric information for profit, but what exactly does “profit” signify in this context? What if biometric data is simply being used to improve a consumer-facing product, which of course might indirectly lead to increased profits for the relevant company? All of the above make it difficult to assess the risks of liability under BIPA.

In any case, the Illinois Supreme Court's ruling means we are definitely going to see more activity in this area (in fact, we already have), and that, more than ever, organizations will need to be confident they are compliant with the various notice, consent, disclosure and other requirements under BIPA in order to avoid potentially significant liability. With no showing of injury required for standing, it will be much easier for plaintiffs to form, settle and even prevail in large class actions on the basis of the most basic statutory violations—even in the absence of any actual harm whatsoever.

Under current law, what steps should a company take if it plans to collect biometric data from employees?

A company must first determine whether any of the state laws apply to its proposed collection of such data. The scope and coverage vary for each. Assuming that a law applies, a company must then determine how to comply with the applicable notice, consent, use, disclosure, and retention requirements. These are fairly similar across the three laws (Illinois's BIPA and the Texas and Washington statutes), but there are some key differences. For instance, while all three require notice and consent for the collection and use of biometric data, BIPA is much more restrictive than its Washington and Texas counterparts.

Specifically, BIPA requires that notice be given and consent obtained from each employee in writing and that such notice include the specific reasons and intended duration for the collection and use of the data. In contrast, Washington and Texas do not prescribe any particular form of notice and consent. BIPA also requires that a company develop a publicly available written policy that includes a retention schedule and guidelines for the permanent deletion of biometric data, whereas the other two states have no such requirement.

A covered company must also closely review any restrictions on its ability to disclose the data. All three states generally prohibit such disclosure except where the employee has given consent or where the disclosure falls under an exception (such as complying with the law or completing a financial transaction requested by the employee). Unlike the others, however, BIPA also contains a wholesale prohibition on the sale or other disclosure of biometric data for profit, irrespective of whether an employee has consented.

Of course, these are only highlights of the requirements and nuances under the laws. The important takeaway for a company that proposes to collect biometric data is that it will have to carefully consider the applicable law(s) to determine whether any changes to its practices are necessary.

Given the lack of clarity, would companies prefer to have a federal law that sets a national standard?

In our experience, it is often easier for a company to comply with one federal standard rather than with a patchwork of state laws; however, federal consumer protection laws usually do not completely preempt state laws, such that compliance with multiple similar, but not identical, standards is often necessary.

We are actively monitoring biometric privacy developments at both the state and federal levels. Just a couple of weeks ago, on March 14, a bill aimed at regulating the collection and use of data in connection with facial recognition technology was introduced in the U.S. Senate. Much like the existing state laws, the bill, if passed, would impose various notice, consent, use, disclosure, and retention requirements—though only with respect to one category of biometric data and solely to the extent such data is used for identification purposes. The bill has a couple of novel elements, such as requirements aimed at preventing discrimination and other “offensive” processing in connection with the use of facial recognition technologies.

So far, the bill appears to have attracted a substantial amount of bipartisan and industry support, so we will be watching it closely.

What about the GDPR?

The GDPR treats biometric data as a form of “sensitive data,” which means that it is subject to heightened protections. The collection of sensitive data is generally prohibited, unless a company can rely on one of the exceptions provided under the GDPR. For example, it may be possible to collect biometric data with the explicit consent of the individual. Consent may not, however, be a valid option in the employment context, as European data protection authorities have generally taken the position that an employee is, by virtue of her position and the employer's power over her, unable to provide consent in the “freely given” manner required by the law.

—Vanessa Blum



Follow the Money 

In the end, everything always comes back to money—and in the tech industry's case recently, that's literally true. Instagram introduced a “checkout” feature on its app last week, allowing users to pay for products without ever leaving the platform. Then on Monday, Apple dove deeper into the financial services game by unveiling Apple Card, a credit card managed through the Apple Wallet app that promises no fees, lower interest rates and better rewards.

It's a new area of focus for many of the largest tech companies, and one that is sure to include legal hurdles. For tech companies raised on a culture of conducting experiments and taking risks, moving into the highly regulated financial space may require a shift in how the legal department approaches problems.

“The idea of 'move fast and break things' doesn't work as well when you're in a highly regulated [industry],” Rebecca Simmons, a partner in Sullivan & Cromwell's financial services and capital markets groups, told Law.com's Caroline Spiezio. “So we do see people sometimes surprised at the amount of time it takes.”

Apple and Facebook, the owner of Instagram, likely won't face those worries given the size of their respective legal departments. But we wouldn't be surprised if the next wave of tech companies do, especially if they don't know enough to do the nitty-gritty compliance work up front.

Simmons noted it's easier to build a financial services program or partnership that's compliant from the start rather than fix legal issues afterward, so lawyers should be in the room with design teams from day one. She explained: “Do it early enough that you can integrate it into your design and your approach, so that you don't have to undo the work you've already done. You can design it in from the beginning.”

—Zach Warren



Dose of Dystopia – A Future Without Memes/GIFs

Well, there goes the internet.

That's the sentiment of disappointed open internet advocates and tech giants after the European Parliament voted in its plenary session this week to approve a new EU-wide “Copyright Directive” that requires web platforms to proactively remove copyright-infringing content.

Although early language that strictly required the use of automatic content filtering was removed from the bill, some critics say the remaining provisions essentially require filters anyway. “Put it this way: if I pass a law requiring you to produce a large African mammal with four legs, a trunk, and tusks, we definitely have an elephant in the room,” writes Cory Doctorow of the Electronic Frontier Foundation in a blog post.

So what's the big deal? The legislation has been widely panned as leading to uncertainty and over-blocking legitimate content, not to mention killing internet memes. Then, of course, there's the threat of new legal liability for companies.

Kent Walker, Google's lead lawyer, writes: “Platforms making a good-faith effort to help rights holders identify and protect works should not face liability for every piece of content a user uploads, especially when neither the rights-holder nor the platform specifically knows who actually owns that content. The final text includes language that recognizes that principle.”

Not everyone sees the outcome of this long-roiling legislative debate as, well, dystopian, notes The Verge. Xavier Bouckaert, president of the European Magazine Media Association, is quoted as saying in a statement: “Publishers of all sizes and other creators will now have the right to set terms and conditions for others to re-use their content commercially, as is only fair and appropriate.”

EU Directives, unlike capital-R “Regulations”, have to be put into place via national legislation by every EU member state, and the EU legislative process isn't done either. But it seems the battle is largely over. So cherish those memes and GIFs while you can?

—Ben Hancock



Cyber-Heart Attacks?

In a scenario that could have been swiped from a spy series, the U.S. Food and Drug Administration issued a safety alert last week about a hacking vulnerability affecting up to 750,000 implantable heart defibrillators.

The March 21 alert flagged a weakness in the wireless technology of Medtronic implantable cardiac defibrillators that could leave the devices exposed to hacking. The FDA said the Minnesota-based company's telemetry protocol doesn't use encryption, authorization or authentication. That could allow an unauthorized person to change the settings on the defibrillators, the home monitors or clinic programmers, the company said in its security bulletin. So far, the FDA said it has not received any reports of anyone being harmed by the vulnerability.

We sat down with two lawyers to get their thoughts on the situation and what it means for inside and outside counsel.

Jeffrey Rosenthal, a partner at Blank Rome who specializes in privacy and consumer protection class-action defense: “It is clear that Medtronic is working with the FDA and it seems like a very upfront discussion of the issues…Getting out ahead of this as Medtronic appears to have done appears to be a reasonable approach.” Rosenthal pointed out that California legislation taking effect next year that regulates the Internet of Things will require security and privacy protections in such devices, as would legislation under consideration in both chambers of Congress. “It is indicative of our time as this is becoming more and more prevalent an issue as IoT devices become more and more a part of our modern existence.”

John J. Sullivan, a commercial litigation member at Cozen O'Connor, meanwhile, said that while it is premature to speculate about possible lawsuits because no adverse events have been reported, the security of internet-connected medical devices bears watching: “It is hard to see a mass tort in this, but that doesn't mean there won't eventually be limited efforts by attorneys interested in bringing single claims or a few claims. I foresee a lot of viable defenses. While the claims themselves would be cutting edge, the defenses would likely be more tried-and-true: causation, such as whether any hacking or alleged vulnerability of the device caused the injury; was there a legally cognizable defect; were proper warnings given; are the product liability claims preempted; are there cognizable damages? Those are defenses we have seen in the past, and we would expect to see them again here. In the meantime, the companies have already been addressing this internally, and they are determining what kinds of steps need to be taken, which could include addressing warnings or instructions for use, so as to continue to address safety and prepare for any challenges.”

—MP McQueen




On the Radar

Not There Yet: Few companies, executives and managers are ready for the California Consumer Privacy Act, which goes into effect in 2020, according to a new survey. Nearly 71 percent of respondents plan to spend at least $100,000 on compliance efforts. Read more from Frank Ready here.

Staffing Up: O'Melveny & Myers is picking up Lisa Monaco as a partner to co-chair the firm's data security and privacy group. Monaco's career has included stops at the U.S. Justice Department, Federal Bureau of Investigation and the Obama White House. Read more from Ryan Lovelace here.

Funding Opportunities: Litigation funder Therium Capital Management is launching a new $430 million fund, its largest to date. Therium expects to use the money over the next two years on litigation and arbitration matters. The company's pipeline in the U.S. has grown by 26 percent over the last year, according to U.S. CEO Eric Blinderman. Read more from Dan Packel here.