
The autonomous vehicle industry has promoted its cars with a promise of increased safety: fewer drunken drivers, texting drivers and other human errors that can end in crashes. But on Monday, a self-driving car from Uber Technologies Inc. became the first of its kind to fatally strike a pedestrian.

Soon after the crash in Tempe, Arizona, Uber tweeted that “our hearts go out to the victim's family” and suspended vehicle testing in multiple cities. The San Francisco-based company was not immediately available for further comment.

It's not clear yet who's liable, or what the consequences will be, if any, for makers of autonomous vehicles—but in-house lawyers in the space should be watching this case unfold, and developing a disaster management plan of their own, attorneys said.

“It's a little premature to act upon this,” said Neal Walters, the practice leader of Ballard Spahr's products liability and mass tort group. “Before we jump to conclusions, we all need to remember that everyone is working hard to bring autonomous vehicles to the market to save lives.”

Walters said that before other in-house teams at companies that work with self-driving cars push for a course of action, it's important to get accurate, full information on what happened in Monday's crash. According to a report in the San Francisco Chronicle, Tempe's police chief has said that based on early probes Uber doesn't seem to be at fault, and that it's possible no car—autonomous or manual—could have avoided the collision.

“We need to have a disciplined approach about this, in looking at the [event data recorder] data's full reconstructions before drawing any conclusions,” Walters said. “Having been involved in this [industry] a while, there is a tendency on the part of the public to want to criticize or be skeptical of this new technology, but we need to be patient and look at all the factors.”

Bryant Walker Smith, an assistant professor in the School of Law and in the School of Engineering at the University of South Carolina, said responsible autonomous vehicle companies' in-house leaders should “ensure that their companies have earned the trust that regulators and the public necessarily place in them,” before and after an accident.

That means companies should always push for data integrity, he said, so that others can trust the information released after an accident.

“[Companies] should explain to the world what they are doing, why they believe it is reasonably safe, and why we should believe them. They should provide context for the public, acknowledging that automated driving is a work in progress that may never be perfect, but also emphasizing that the status quo—in which over 100 people die on U.S. roads every single day, largely because of human error—is tragically imperfect,” Smith said in an email to Corporate Counsel. “Frankly, I'd be skeptical of anyone who claims that automated driving is perfect—or who expresses shock that it isn't.”

And because this technology isn't yet perfect, Smith said autonomous vehicle companies should have a “break the glass” plan for crises like the crash in Tempe, even if their company wasn't directly involved in the incident.

“If a company is not responsible enough to have already developed these plans, then they have no business testing in the first place and should of course stop immediately,” Smith said.