During this pandemic, when we invite the world into our homes with Zoom and WebEx, are we changing what constitutes our "reasonable expectation of privacy" for purposes of Fourth Amendment guarantees against unreasonable searches?

We invite all kinds of people into our living rooms now, many of whom we barely know or don't know at all—they peer at the photos and artwork on our walls, listen to our kids and dogs, examine the mess or admire the cleanliness, and see us in whatever pandemic state of haircut we happen to be exhibiting. But still, they are with us by invitation, and only for the time we allow them to be. And with the click of a button, we "leave" the Zoom or WebEx meeting and part ways. In an instant, our home is private again. (It may not yet be quiet; the dogs may continue to bark and the children to raise the decibel level, but it is our little private world again.)

Zoom and WebEx

The answer to whether massive but time-limited Zoom/WebEx intrusions into our homes upset our legally cognizable expectation of privacy is a clear and unambiguous "no." Nor need we fear a different outcome in the courts: an invitation into our homes is an invitation. Zoom or WebEx meetings no more change our expectation of privacy than does inviting a neighbor over for a cup of coffee.

In today's world, the lines between what in our lives is open to the public and what remains private are blurring in ways important to Fourth Amendment guarantees against unreasonable searches. For purposes of the Fourth Amendment, case law discusses a "search" as an unauthorized entry onto a person or property. The word "unauthorized" is easy—it means "lacking permission." When you allow the world into your living room with Zoom and WebEx, you are giving a temporary, limited permission.

What does "entry" really mean? It certainly has a physical component of literally "going into" or "treading upon"—but the law has evolved to extend that to less obvious ways of "going into" private space. In Kyllo v. United States, 533 U.S. 27 (2001), the Supreme Court extended the concept of entry to include monitoring heat radiating from a premises.

Our assumptions about privacy in our homes lead to a recoil reaction at the suggestion that a stranger could enter our homes and watch everything we do. But our computer usage (apart from Zoom and WebEx) increasingly allows for just that. When most of us use computers to access the Internet, there is no certainty that we remain alone in our homes. Connected computers—that is, the computers we use to access the Internet—have effectively become doors to the outside world, located on a tabletop in what we consider private space.

Cameras

For some it will come as a surprise that those little camera eyes on your computer can be activated, and someone can be watching you through them, without your knowledge. That is, with a computer connected to the Internet, a third party can access that camera and watch you, or look into the room in which that computer sits. Pretty creepy, but very real. This capability, punctuated by stories now sprinkled across the news of instances when this was done and discovered, is why many people now place Post-its over the camera lens, and why college kids often buy plastic covers printed with logos and phrases to serve the same purpose.

It is a little eerie to consider that as we meander around the Internet, not only are the websites our computers access collecting all kinds of information about us—what we search, see, buy, listen to and read, and for how long—but they can be actually watching us as well.

Today, AI-enabled tools are used by data aggregators and brokers to assemble the vast and seemingly disconnected pieces of information about us and create profiles of who we are, how we spend our time, and our likes and dislikes. What we thought were isolated pieces of data come together into a picture, almost as if someone were there in the room watching us…
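
For readers curious what that aggregation looks like mechanically, here is a minimal Python sketch. It simply groups records that share a common identifier into one profile; the identifier, sources and data shown are invented for illustration and are not drawn from any actual broker's system.

```python
from collections import defaultdict

# Hypothetical records from different sources, each keyed to the same
# identifier (for example, a hashed email or advertising ID). All data invented.
records = [
    {"id": "u123", "source": "search",    "data": "queries about hiking boots"},
    {"id": "u123", "source": "shopping",  "data": "bought a trail map"},
    {"id": "u123", "source": "streaming", "data": "listens to outdoor podcasts"},
    {"id": "u456", "source": "search",    "data": "queries about tax software"},
]

def build_profiles(records):
    """Group disparate records by identifier into a single profile per person."""
    profiles = defaultdict(lambda: {"observations": []})
    for rec in records:
        profiles[rec["id"]]["observations"].append((rec["source"], rec["data"]))
    return dict(profiles)

for person_id, profile in build_profiles(records).items():
    print(person_id, profile["observations"])
```

The point is only that once records carry a shared key, stitching them into a composite picture is trivial; the sophistication lies in linking records that do not obviously share one.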

If law enforcement purchases or otherwise acquires a profile of us, based on a composite of all we did using the Internet in the privacy of our own homes, would that constitute an "unreasonable search"? A "seizure"? An invasion of a legally cognizable right to privacy? No. While a violation of the terms of use governing how a website gathers, uses and monetizes information might give rise to a contract claim against the website or an aggregator, an innocent third-party purchaser (including a governmental entity) would not be in privity and likely could easily avoid liability.

Domain Awareness System (DAS)

We have long accepted that there is a difference between our expectation of privacy inside our homes and our expectation of privacy when we enter public space. We generally assume that—unless one is the target of an investigation—our movements outside our homes are not being surveilled in any real sense. That is, we assume that no one is aggregating information about what we do, where we go, and whom we see as we go about our external business. We have something that I think of as approximating an expectation of privacy in the aggregation of our public movements. But there are no Fourth Amendment protections in this sphere.

When we do venture outside (perhaps more often after the quarantine has been lifted), the sophisticated and little-known Domain Awareness System (DAS) deployed by the New York City Police Department has the capability to follow us, watch us, listen to us, and provide a swath of people and agencies with information about who we are and what we are doing. Without our really being aware of it, Big Brother or Sister has the ability to watch us … Whether it does so is up to law enforcement and policymakers. That is, the technology is there. The only question is who uses it, and for what purpose.

DAS is the joint product of the NYPD and Microsoft. (An interesting aside: the NYPD retains the right to significant commissions from licensing the software to third parties.) It is a set of software tools that allows for the aggregation, organization and dissemination of a wide variety of information about the movements and activities of millions of New Yorkers. AI capabilities built into the system allow for predictive use of certain information to assist law enforcement in a variety of ways.

Cameras and license plate readers (LPRs) are deployed at every bridge and tunnel entrance to and exit from New York City; for any and every license plate coming into the city, DAS can immediately process whether that car comes into New York on a regular basis, where it tends to go, and where it might go this time. DAS can tell whether the car is likely that of a commuter, and if so, whether it follows certain patterns or has varied from its normal pattern. A mapping feature enables tracking of particular people's specific movements.
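
To illustrate the kind of pattern analysis described above, the short Python sketch below assumes a simple log of license plate reads (the plate numbers, crossings and dates are invented) and works out how often a plate has been seen, its usual crossing, and whether the most recent read departs from that pattern. It is a toy approximation of the idea, not a description of how DAS itself is built.

```python
from collections import Counter
from datetime import date

# Hypothetical license-plate-reader log: (plate, crossing, date). Invented data.
reads = [
    ("ABC1234", "Lincoln Tunnel", date(2020, 5, 4)),
    ("ABC1234", "Lincoln Tunnel", date(2020, 5, 5)),
    ("ABC1234", "Lincoln Tunnel", date(2020, 5, 6)),
    ("ABC1234", "George Washington Bridge", date(2020, 5, 7)),
]

def summarize_plate(plate, reads):
    """Report how often a plate was seen, its usual crossing,
    and whether the latest read deviates from that usual pattern."""
    history = [(crossing, day) for p, crossing, day in reads if p == plate]
    if not history:
        return None
    usual = Counter(crossing for crossing, _ in history).most_common(1)[0][0]
    latest_crossing, _ = max(history, key=lambda item: item[1])
    return {
        "times_seen": len(history),
        "usual_crossing": usual,
        "deviates_from_pattern": latest_crossing != usual,
    }

print(summarize_plate("ABC1234", reads))
```

Even this toy version shows how a handful of reads is enough to label a car a likely commuter and to flag a departure from its routine.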

Cameras deployed throughout the city, including on light poles, on the outside and inside of public buildings (including public housing) and in other accessible areas, capture still and video images of millions of New Yorkers. Facial recognition technology that is now a part of DAS allows any captured image to be run against databases of driver's license holders, identity card holders, or mug shots. While policies limit the use of facial recognition technology, in 2018 there were almost 10,000 requests by law enforcement to match an image from a video clip with a photo from one of these databases.

DAS can instantly aggregate a variety of discrete pieces of information about a person drawn from different databases—combining data about criminal or arrest history, 911 calls to a certain location, NYPD complaints, or certain data from public schools or a variety of city agencies. DAS can then add social media profiles to the mix.

The primary use of this technology is not surveillance but event contextualization—for instance, enabling an officer called to a premises to understand whether there have been prior 911 calls, a history of domestic violence or other information. There are plain public safety benefits. But DAS also allows for a level of surveillance that few understand exists.
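
As a rough illustration of that event-contextualization use, the following Python sketch pulls prior 911 calls and complaints for a single address from a small in-memory database. The table names, fields and records are invented; the point is only to show how records from separate systems can be combined into one view of a premises.

```python
import sqlite3

# Hypothetical in-memory database standing in for several agency record systems.
# Table and field names, and all records, are invented for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE calls_911 (address TEXT, call_date TEXT, nature TEXT);
    CREATE TABLE complaints (address TEXT, filed TEXT, category TEXT);
    INSERT INTO calls_911 VALUES ('10 Example St', '2019-11-02', 'domestic dispute');
    INSERT INTO calls_911 VALUES ('10 Example St', '2020-01-15', 'noise');
    INSERT INTO complaints VALUES ('10 Example St', '2019-12-01', 'harassment');
""")

def premises_context(address):
    """Combine prior 911 calls and complaints for one address,
    roughly the way a responding officer might see them together."""
    calls = conn.execute(
        "SELECT call_date, nature FROM calls_911 WHERE address = ?", (address,)
    ).fetchall()
    complaints = conn.execute(
        "SELECT filed, category FROM complaints WHERE address = ?", (address,)
    ).fetchall()
    return {"prior_911_calls": calls, "complaints": complaints}

print(premises_context("10 Example St"))
```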

We often think of China as a place where there is "real" surveillance, assuming that we haven't chosen to develop or deploy the same capabilities. That assumption is less accurate than most of us think.

All three examples above—the world's entry into our living rooms with Zoom and WebEx, our computers' ability to watch us, and DAS—touch on expectations of privacy. As the world changes around us, whether our expectations of what should be legally private correspond with what is in fact private is a question for careful consideration.

Katherine B. Forrest is a partner in Cravath, Swaine & Moore's litigation department. She most recently served as a U.S. District Judge for the Southern District of New York and previously served as Deputy Assistant Attorney General in the Antitrust Division of the U.S. Department of Justice.