Welcome back for another week of What's Next, where we report on the intersection of law and technology. In the past, we have brought you what we called "a dose of dystopia." This past week, we've gotten a whole lot more than a dose. Here's what we have for you today:

>> What would happen if we turned the architecture of surveillance onto police?

>> Will Trump's Section 230 executive order affect efforts to dismantle end-to-end encryption?

>> The privacy community is questioning whether contact-tracing apps are worth the trouble.

Let's chat: Email me at [email protected] and follow me on Twitter at @a_lancaster.



Protesters march from Foley Square over the Brooklyn Bridge to rally over the death of George Floyd. Photo: Ryland West/ALM

Turning Big Data into "Blue Data"

Andrew Guthrie Ferguson, visiting professor of law at American University, is worried America is repeating the same mistakes it made after police shot and killed Michael Brown in Missouri six years ago. Ferguson, who researches predictive policing, said law enforcement seems to have forgotten that greeting demonstrators with overwhelming military-like force escalates protests. Another misstep is addressing the structural problem of police brutality with a policy response, an Obama-era strategy that failed, in part, because there was never a power shift to communities, the law professor said.

So, now, in the aftermath of national protests and police and military response fueled by the death of George Floyd in Minneapolis, Ferguson said those structural issues must be addressed. But once the U.S. begins to tackle those deep-rooted issues, Ferguson says one possible tool could be flipping the focus of America's surveillance mechanisms from the people to the police by creating a repository of "blue data."

Answers have been edited for length and clarity.

➤➤ How can new technologies be used to improve police accountability? We're watching live the New York Police Department's aggressive response to protesters through snippets of cell phone images and media coverage. What we forget is that New York is one of the more surveilled cities we have. The city's Domain Awareness System has more than 9,000 linked cameras all over the city. So, if we really wanted to understand police action and understand police accountability, we have the data and footage available. The problem is it's just in the hands of the NYPD.

➤➤ What do you think will be the biggest challenges in creating a database of "blue data"? I think the biggest challenge is political will. I think what we're seeing today is rage from people who have recognized that their views about the oppression by police in certain communities have been ignored for generations, and it has spilled out onto the streets. And it is a demand for accountability and transparency. And I think there are a whole host of steps that need to be taken in a more immediate way, but some part of that could be the fact that we have new technologies that are observing police in some ways almost better than civilians can. If you step back and think about the data we have on police, we not only have police body cams, we have audio, video analytics capabilities, and GPS. If we had the political will to use that as a deep dive into police accountability, we would have some better sense of how police use their power, where they go and how they direct their resources. The problem is we do not have the political will to demand police accountability and haven't, even though we might have the technology.

➤➤ What logistically would need to happen to create these databases? Who would be in charge of the data? Depending on the city you are in and the technology available, some of the data is quite easy for the police to get if they chose to. Take a city such as Hartford, Connecticut, which has this technology called BriefCam, essentially a video analytics system that can watch the streets. You can type in "find all of the white Subarus," or "find all the people wearing red hats." You literally can bring those cars and people to the fore, find out where they were and the time they were there, and overlay it in minutes or seconds. The same technology could be used to find out every time a police officer drew their gun. The problem of course is the technology is in the hands of police, and police have not been willing to look deep into their professional souls to see how they are interacting with the community. An answer could be to set up some independent entity that would be able to review these kinds of interactions, if you believe there was a particular problem with policing.

In a strange way, we're seeing the power of citizens controlling surveillance with the use of cell phone cameras and videos in the last week. Part of the increased national outrage is that we can see it. The question is whether technology allows you to see police abuse and brutality anew, and I think it can be a helpful step forward to do that. But the power has to lie with the community and people who can hold police accountable.

➤➤ What questions have you seen raised this week about big data and policing? I think we're still in stage one of the conversation. Stage one is still about the rage and outrage that comes from people who have had enough of police abuse. I think stage two is going to be a recognition that a lot of the new forms of social control that police will use after the fires have subsided are going to be surveillance technology. One of the clues is in President Trump's comments Monday. He was talking tough, but he was also talking about tracking and using surveillance to track protesters. I think we're going to see, as we always see, that the first step is this militarized response to citizen outrage. And then the second step, which I think lasts a lot longer than the militarized step, is almost militarized surveillance. The same kinds of technologies available to the military are being used domestically. In this phase, people are going to see police surveillance used as more of a social-control mechanism to police protesters.

The really difficult, scary thing about where we are now is that all of those protesters who went there to record the images with their phones and had their apps running were themselves recorded. You can literally trace back where they came from, where their home address is, where they were standing in the protest—that's how powerful these tracking technologies are. And if police get hold of that data, either through warrants or subpoenas to the tech companies, the individuals are going to be revealed as being a part of these protests. If that is not seen as protected First Amendment speech, but something else, it could be really damaging. You can imagine a government that's angry at protesters putting those names in a database.

➤➤ Have you seen any police departments adopt technologies that track their own interactions with the community? There have been some police departments that have been brave enough and open enough to invite data scientists to examine what's happened, and see if big data can offer insights. Charlotte-Mecklenburg had a problem of police violence, and they invited some data scientists from Chicago and said, "Here's our data. This is what happens. Tell us if you can figure out something that's going on." And the data scientists began with a complicated deep dive into the data to try to find patterns of use of force.

One of the things they found was that officers who responded to a traumatic scene, such as a suicide or a child's death, were much more likely to be involved in a use-of-force incident on their next shift. These are human beings who witnessed trauma, and when they were put into another scary situation, they reacted through the trauma they had seen, like many people do with PTSD, and they overreacted. And there is an easy response: Don't send that officer to the next crisis the next day. They also found that if they sent one or two officers to a domestic violence scene, that tended to create more potential for use of force, usually because the man involved would be outraged by the police presence. But if they sent six or seven officers, an overwhelming presence, they reduced violence. So that's a good example of how tracking and studying these patterns can reveal insights that police didn't know. But it takes a brave chief to ask data scientists to come in and expose what's happening, because a lot of times what's happening isn't pretty.

➤➤ What would you say is the first step to instituting these tools? At this moment of crisis over the legitimacy of policing in America, one step, among many, could be to use technology to increase transparency and accountability. But that has to mean the data is not held by police, and that there are independent, community-based processes that can hold police to account if there is a problem. In some ways, we need to systematize the cell phone justice we're seeing. And it doesn't have to be bad for police. It can improve police training, it can improve accountability, and it can potentially be a step—among many—toward a culture of police accountability.



Trump's Executive Order and the Fight Over End-to-End Encryption

Last week, President Donald Trump issued an executive order targeting what he called the "selective censorship" practiced by tech companies under the protections of Section 230 of the Communications Decency Act.

The order followed Twitter's decision to label one of the president's tweets about the efficacy of mail-in ballots as "potentially misleading." As social media companies struggle to respond to online violence and misinformation, some critics of the protection have portrayed Section 230 as a tool in service of Big Tech's political agenda.

But the provision has also been tangled up in the debate over end-to-end encryption. In January, The Verge reported on a draft bill thought to be from Sen. Lindsey Graham (R-South Carolina) that would allow tech companies to keep their Section 230 protections, but only if they follow a committee's guidelines, which would likely discourage end-to-end encryption.

So how might the executive order impact the debate over encryption backdoors?

Although some hailed the presidential action as a Section 230 death knell, Loeb & Loeb's Jessica Lee said the action does not do a whole lot, at least immediately.

"It reads sort of like a temper tantrum about Twitter blocking the president's post," Lee said.

This order is designed to prevent companies from labeling content as false or misleading and does not get into larger conversations around regulating tech, Section 230 and encryption as a whole, Lee said.

However, the order does direct the Department of Justice to form a working group of state attorneys general to see if there are state laws that could relate to online content moderation, and then directs the U.S. attorney general to draft federal legislation, she said. Earlier this year, in a keynote address, Attorney General William Barr said that encryption allows "criminals to operate with impunity, hiding their activities under an impenetrable cloak of secrecy."

Those are two places the agencies could attach regulations around end-to-end encryption, she said.

Lee does expect to see some challenges if federal agencies act on the order.

One of the order's provisions asks the Federal Communications Commission to issue rulemaking to interpret Section 230. "The FCC has no jurisdiction over Section 230," Lee said. "So, if they issue rulemaking, I imagine that will be subject to immediate challenge."

However, if Barr gets legislation introduced through normal legislative channels, that might be challenged politically but not legally, she said.

The Center for Democracy & Technology filed one of the first challenges to the order on Tuesday, suing Trump for violating the First Amendment protection of online platforms and their users.

The center's incoming general counsel, Avery Gardiner, who filed the complaint alongside a Mayer Brown team led by Andrew Pincus in Washington, D.C., said that the president threatening the ability of social media platforms to moderate speech about elections is "mind blowingly terrifying and needs to stop."

Lee said the policies introduced in the next year or two could be pivotal when it comes to both Section 230 and encryption, and the order could be an opportunity to bring all the players to the table to address the larger issues of content moderation and encryption.

"As we enter a period of potential surveillance, one issue we're dealing with is racism in the country and protests, and the other issue we're dealing with is the pandemic, and I think the pandemic is going to open the door to additional surveillance," she said. "I think this concept of being able to get a backdoor to take away end-to-end encryption and give law enforcement more access to user conversations on these platforms is going to be a pivotal point in which direction this country goes in the surveillance context."



'Have Hope' for Contact Tracing Apps

During a Zoom webinar earlier this week, Jennifer Urban, a UC Berkeley School of Law clinical professor and director of the university's Samuelson Law, Technology & Public Policy Clinic, said the number one question privacy and policy communities should consider when it comes to app-based contact tracing is "Will this work?"

Countries that have deployed contact-tracing apps have demonstrated that the voluntary platforms often see a low participation rate, Urban said, with Singapore's TraceTogether app seeing just 30% penetration. The other issue is that older Americans who are most at risk are less likely to have smartphones than the rest of the country, she said.

So when a participant asked whether she personally thought it was worth sharing private information, given all the limitations, she said:

"Have hope. Don't place it all in a Google-Apple app. We would like things to be easy, cheap and more privacy protective than traditional contact tracing, but maybe there are ways to improve these methods."


NYPD officers face peaceful protesters gathered over the death of George Floyd. Photo: Ryland West/ALM

On the Radar

General Counsel, Corporations React Publicly to George Floyd's Death "We didn't think of this necessarily as a political issue," Minority Corporate Counsel Association chair Stuart Alderoty, general counsel of fintech firm Ripple Labs Inc., said Monday. "We're treating this as a society issue that is rooted in diversity and inclusion, which is the mission of the MCCA." Read more from Phillip Bantz here.

King & Spalding Resists WhatsApp's 'Drastic' Disqualification Bid in Cyber Case "Disqualification is a drastic, disfavored and disruptive remedy that is unwarranted here," San Francisco lawyer Jessica MacGregor of Long & Levit said in the new court filing. MacGregor argued that "depriving NSO [Group Technologies] of highly skilled counsel of its choice will impose significant hardship and prejudice." Read more from Mike Scarcella here.

'Unlawful and Unenforceable': Legal Experts Deride Trump's Attempt to Target Social Media Companies "If I'm reading this correctly, the EO claims tech platforms are doing something they're not, in violation of an incorrect interpretation of law, and tasks agencies it can't task to look into the things that aren't being done that wouldn't be wrong," Tiffany Li, technology attorney and visiting assistant law professor at Boston University, wrote in a tweet. Read more from Jacqueline Thomsen, Nate Robson and Mike Scarcella here.


Thanks for reading. We will be back next week with more What's Next. Please take care of yourself and each other, everyone.