Back in January, if a tech giant asked consumers if they wanted to share their location data and health status with strangers, or an employer asked workers to take daily temperatures, many folks probably would've told them to shove off. But as the U.S. death toll from the coronavirus pandemic passes 80,000, more Americans might prioritize public health and put aside a distrust of Big Tech and everyday privacy intrusions.

Heather Federman, the vice president of privacy and policy for BigID, which uses machine learning to help companies protect their customer and employee data, says it's unclear how long these pandemic-era privacy incursions will remain in place.

"I've stopped using the words 'going back to normal,' because I don't think we're going back to whatever it was before," she said.

Federman, a lawyer by training who began her career at the Future of Privacy Forum as a legal and privacy fellow, and led privacy teams at Macy's and American Express, shared her thoughts on the legal and privacy implications of contact-tracing apps and other data-centric initiatives used to flatten the curve.

Answers have been edited for length and clarity.

What are your thoughts on Apple and Google's proposal for a contact-tracing app?

I have mixed feelings about it. On one hand, these companies feel like they have to do something. On the other hand, there are still a lot of questions as to how this will work. One interesting positive is that this is the first time you've had two major competitors coming together to create a system that is interoperable. That's something we're seeing pop up in various data protection laws: the ability to port your data into another system. This happens to be an interesting example of that actually happening. It does seem that they're trying to do their best to make this as privacy-preserving as possible. It uses Bluetooth technology, and they're trying to collect as little information as possible. It's decentralized, so rather than the data sitting on a central server, it stays on your device. So while there are some governments taking issue with that, it is a step in the right direction.
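The decentralized design Federman describes can be illustrated with a short sketch. This is a hypothetical simplification, not Apple and Google's actual exposure notification code: each phone broadcasts short-lived random identifiers over Bluetooth and keeps the identifiers it overhears on the device itself; only when a user tests positive and consents are their own identifiers published, and each phone checks for matches locally rather than on a central server.

```python
import secrets

class Phone:
    """Hypothetical sketch of decentralized Bluetooth contact tracing.
    Not the actual Apple/Google protocol; names are illustrative."""

    def __init__(self):
        self.my_ids = []     # rotating identifiers this phone has broadcast
        self.heard_ids = []  # identifiers overheard from nearby phones

    def broadcast(self):
        # A fresh random identifier; real systems rotate these frequently
        # so they can't be linked back to a person.
        rolling_id = secrets.token_hex(16)
        self.my_ids.append(rolling_id)
        return rolling_id

    def hear(self, rolling_id):
        # Stored only on this device; no central server sees the contact.
        self.heard_ids.append(rolling_id)

    def check_exposure(self, published_positive_ids):
        # Matching happens locally, against identifiers voluntarily
        # published by users who tested positive.
        return any(i in published_positive_ids for i in self.heard_ids)

# Two phones come near each other.
alice, bob = Phone(), Phone()
bob.hear(alice.broadcast())

# Alice tests positive and consents to publishing her identifiers.
published = set(alice.my_ids)
print(bob.check_exposure(published))  # True: Bob's phone finds a local match
```

The point of the design is visible in the data flow: the server (here, the `published` set) only ever sees identifiers from consenting positive users, never anyone's contact history.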

I think the other question is yes, you're able to consent to this, but how likely is it that you're actually going to have enough users adopt this? And from what I'm seeing, you need at least 60% of the population to adopt it.

What are you most concerned about with these contact-tracing apps from a privacy perspective?

I think my biggest concern is that I've been drawing a lot of comparisons to the Patriot Act after Sept. 11. We had something that was supposed to be a limited provision, because we were all freaked out, and understandably so. But something that was supposed to be sunsetted back in 2005 is still up for renewal. That's my concern for something like this. Apple and Google have said they want the data to be destroyed, but at what point is it actually destroyed? Is it once we're all vaccinated?

One challenge these contact-tracing apps encounter is how to responsibly reuse data from a privacy perspective. What are the major hurdles with this?

I think the reuse of data has always been an issue in the privacy world. This pandemic has exacerbated that issue because it's really front and center when we're dealing with location and health data. And the answer, unfortunately, is that it's not clear, which brings us back to the trust issue. If it's not mandated that we do this, what's going to allow me to trust that you're not going to use this for a secondary purpose? If you want me to be part of that 60% that's opting in, then I'd better know that you're using this for a limited purpose and only for a limited amount of time. And I don't have those assurances yet. That's the part that scares me, and I think scares a lot of people.

Do you anticipate any litigation stemming from these apps?

Probably. We're a litigation-happy country. Hypothetically, it could come into play with false positives, because Bluetooth has certain limitations. So let's say you get a false positive: you're notified that you might have COVID-19 and need to stay at home, but it turns out you didn't have it. You could sue for the pay you lost during that period because you had to stay at home. I'm not quite sure how you prevent the liability issue. They're working with public health officials, so they could potentially tell those officials, "It's up to you to be the face of this, and if someone is suing, they're not suing Apple or Google as the third-party provider. They're suing you, the public health officials."

What are you hearing when it comes to contact tracing in the workplace and other measures that could get employees back to work but potentially infringe privacy?

I'm on group chats with different privacy practitioners, and they're all asking, "How do we do this in a way that allows people safely back in the office without totally going overboard?" I think that's also very unclear at this moment. Temperature checks seem to be popular right now, but the problem is that you can be asymptomatic, so I think that's going to be a concern. The concern is how much is too much. The temperature check is one thing. But what happens if employers start monitoring your web-browsing activity to see if you're googling "Do I have COVID?" I don't know if we're crossing into that territory, but most employers, when you get onboarded, say that they can monitor any of your work devices. We're already seeing stuff around employee monitoring.

I'm also seeing a daily survey that employees have to complete before they go in. And it's not just about the employee, but the people they have close relationships with. That's not just implicating you, but your family. One way to handle that data is to segregate that information in a separate database, versus your regular HR data, and only touch that data when necessary.
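The segregation approach Federman describes can be sketched in a few lines. This is a hypothetical illustration, not any company's actual system; the store names and fields are assumptions. The idea is simply that health-screening answers live in their own store, and routine HR lookups never touch it.

```python
# Hypothetical sketch: health-survey responses kept apart from HR records.
hr_records = {}      # regular HR data: role, department, etc.
health_surveys = {}  # segregated store for daily screening answers

def record_survey(employee_id, answers):
    # Screening answers go only into the segregated store,
    # never into hr_records.
    health_surveys.setdefault(employee_id, []).append(answers)

def hr_profile(employee_id):
    # Routine HR lookups read only the regular store, so health data
    # is touched only when specifically necessary.
    return hr_records.get(employee_id)

hr_records["e123"] = {"role": "analyst"}
record_survey("e123", {"symptoms": False, "household_exposure": False})

print(hr_profile("e123"))  # {'role': 'analyst'} -- no health data mixed in
```

Keeping the two stores separate makes it easier to apply stricter access controls and retention limits to the health data, and eventually to delete it without disturbing the regular HR records.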