We have entered an era in which the roads will be shared by pedestrians, fully driver-operated vehicles, semi-autonomous vehicles and fully autonomous vehicles. As a result, a multitude of new liability issues will face our legal system. The reality of these changes is apparent in cities that host autonomous vehicle testing on their roads, such as Pittsburgh, which has become a hotbed for autonomous vehicle development and testing. Pittsburgh is one of Uber's main test locations, but it was in another location, Tempe, Arizona, that hypothesis became reality. The facts of a tragic, fatal accident read like a law exam.

On March 18, 2018, a woman attempted to walk her bicycle across a street, mid-block and at night. An autonomous Uber vehicle traveling down the street at approximately 38 miles per hour struck and killed her. This was the first nondriver death of the autonomous age, and it was captured on dash cam. We will never know how it would have played out in the courts from a liability perspective, as Uber quickly entered into a settlement.

The liability issues in this case would have been very interesting. The pedestrian crossed a busy, high-speed street in the dark, mid-block and outside of a crosswalk. Uber does not operate its vehicles unattended; a technician sits behind the wheel whenever the vehicles operate on city streets. The video from inside the Uber shows that the technician was looking down immediately before the accident. The Tempe police chief said that Uber was likely not at fault: “It's very clear it would have been difficult to avoid the collision in any kind of mode [autonomous or human-driven] based upon how she came from the shadows right into the roadway.”

On May 7, 2016, a Tesla owner in Florida was operating his car in Tesla's Autopilot mode. This option is an advanced driver-assistance system offering lane centering, adaptive cruise control, self-parking, automatic lane changes and other semi-autonomous features. While in this mode, the vehicle collided with a tractor-trailer, and the driver of the Tesla perished. The National Highway Traffic Safety Administration investigated. Preliminary reports indicated that the truck made a left turn in front of the Tesla and the car did not brake. According to the investigation and information from Tesla, neither the Autopilot system nor the driver noticed the white trailer against a bright sky, so the brakes were never applied. Tesla indicated that this was the first fatal accident involving its system in 130 million miles of driving, compared with one fatality every 94 million miles among all types of vehicles in the United States.

Another interesting issue was what the truck driver reported to The Associated Press. He indicated that he could hear a Harry Potter movie playing in the crashed Tesla; the movie was still playing when the car ran into a telephone pole about a quarter of a mile farther down the road. Although it is impossible to play a movie on the Tesla's display while driving, police did find an aftermarket DVD player in the debris.

These accidents raise questions as to what standards should apply and where the liability focus should be. Should the autonomous vehicle and its human technician be held to the standard of a human driver, or should there be a different standard? Assuming the technician's job was primarily to override the autonomous system in an emergency, should that person be held to a higher standard than an ordinary reasonable driver? On the other hand, if a reasonable human driver could not have avoided the accident, how can a technician be expected to react more quickly? Of course, this may not be governed by ordinary vehicular accident liability principles at all. After all, the scrutiny will be on whether the autonomous features malfunctioned or were not properly designed for operation in traffic. Perhaps autonomous vehicles should be designed to stop in time to avoid any “dart out” collision, irrespective of lighting conditions. Is there a different standard of driver negligence when using semi-autonomous features such as those in the Tesla? Are drivers breaking the law when they use advanced cruise control features that allow them to drive without their hands on the wheel or their feet on the gas or brake pedals? As the technology advances and becomes more prevalent, both statutory and common law will have to adapt.

Both local and federal regulators will need to set standards for autonomous vehicles. Another beneficial consequence of the technology in these vehicles is more data for urban planners addressing infrastructure, which in turn should make roads safer. For example, Uber is using its data (from nonautonomous trips) in Pittsburgh to create a program called Uber Movement, whose data the public can access through an interactive map. City planners can use the maps and data for more accurate modeling. By understanding traffic patterns more fully, they can eliminate gridlock, redesign dangerous intersections, make appropriate road repairs and implement improvements that make traffic flow more freely and safely.

Although Uber avoided answering these complex legal questions by entering into a quick settlement with respect to the fatal Arizona accident, the law will have to change to address these advances in technology. Standards of reasonableness will change, and new regulations and traffic laws will develop. Although autonomous vehicle developers are testing their vehicles in a limited number of open-road locales, almost all new vehicles have semi-autonomous features. Systems similar to the Tesla's are available in many different cars. These vehicles are already operating on public roads worldwide, creating a greater possibility of overlap between human negligence and product malfunction.

Ultimately, assuming that autonomous vehicles function as predicted and replace owner-operated vehicles, the number of accidents and fatalities likely will diminish substantially. Until then, drivers and pedestrians will need to adjust to both fully autonomous vehicles and vehicles with semi-autonomous features. Until drivers understand exactly how much they can rely on semi-autonomous features and how much control they must exert over their vehicles, accidents will happen. As in Arizona, the public, whether pedestrians or drivers, will also need to understand a new norm as to what driverless vehicles can and cannot do. Perhaps mothers will no longer have to tell children, “Look both ways before you cross,” but we are not there yet. As a result, creative lawyers will be developing new theories of liability.

David J. Rosenberg is a partner in the Pittsburgh office of Weber Gallagher Simpson Stapleton Fires & Newby. He concentrates his practice on insurance and commercial litigation. He handles matters including employment, civil rights, coverage, bad faith, construction litigation, premises liability, products liability and toxic torts. Contact him at [email protected].