Truck driving could be the next new desk job. In the near future, you could glance at the tractor-trailer in the next lane and have the disquieting experience of seeing nobody in the driver's seat. Instead, the rig's operator could be controlling it remotely, and not even continuously, from behind a desk many miles away.

There is an almost overwhelming financial incentive to automate trucks. Trucks carry more than 70% of U.S. domestic freight tonnage, and the U.S. is experiencing a severe shortage of truck drivers. The shortage may reach 175,000 drivers by 2026, according to the American Trucking Associations. Trucking is a $700 billion-a-year industry, and about a third of those costs go to drivers. Automation has the potential to address the driver shortage, reduce costs, and perhaps increase safety. But are we really ready to trust the operation of loaded 80,000-pound tractor-trailer rigs to automated systems?

Headline-making crashes of autonomous vehicles (AVs) designed for passengers over the last several years have dimmed much of the initial enthusiasm for passenger AVs. But some contend that the trucking business is different. They argue that trucks are ripe for automation because the technology has now been sufficiently tested in passenger AVs. They also note that big trucks spend most of their time on repetitive, easily navigated highway routes, not at the cramped urban intersections where passenger AVs spend most of their time.

If you think we will be easing into a new era of large AV truck rigs on our highways after an appropriate period of societal and governmental reflection, think again. The era of AV truck rigs is here, and it is here now. More than half a dozen companies are currently road-testing AV trucks on our public roads. But the only way to bless this new era of AV trucks is to ignore all of the lessons that we should have learned from the crashes of passenger AVs over the past few years.

Last Oct. 14, Bloomberg Businessweek's cover screamed "Tesla's Autopilot Could Save Millions of Lives. How Many People Will It Kill First?" The headline captured the risk of the calculated decision by Elon Musk, Tesla's CEO, to put Tesla's Autopilot feature in the hands of as many drivers as possible, as soon as possible. The headline also captured the broader risk of society's rushed embrace of AVs. Tesla's experience certainly is a cautionary tale. Critics maintain that Tesla markets its Autopilot feature as a system that will automatically drive a Tesla with little or no input from the driver, and that this marketing lulls drivers into a dangerous sense of complacency. Although Tesla's manual warns drivers to stay attentive, the warnings have not stopped Tesla drivers from checking text messages, reading books, strumming ukuleles, sleeping, or even having sex while their Teslas traveled along highways in Autopilot mode.

Since Tesla introduced its Autopilot feature in 2015, there has been a series of spectacular, widely publicized crashes involving Teslas in Autopilot mode. One was the March 2018 California crash of Apple engineer Walter Huang's Tesla. Huang's Tesla was traveling in Autopilot mode when neither Huang nor the Autopilot system applied the brakes to keep the Tesla from striking a concrete highway barrier. Huang died in the crash.

In February, the National Transportation Safety Board issued a scathing report about Huang's crash. The NTSB found that the design of Tesla's Autopilot system contributed to the crash because it allowed Huang to avoid paying attention, and that Tesla failed to appropriately limit where the Autopilot system could be used. NTSB Chairman Robert Sumwalt noted that the government had provided "scant oversight" of Autopilot and similar automated systems from other manufacturers. The NTSB criticized its sister agency, the National Highway Traffic Safety Administration, for failing to make sure that automakers put in place safeguards limiting the use of automated systems such as Autopilot to the areas where they are designed to work. Finally, the NTSB pointedly noted that Huang's crash was the third fatal crash it had investigated in which a driver's overreliance on Tesla's Autopilot system was implicated.

Uber's experience is yet another cautionary tale. Uber was the biggest player in the AV trucking sector, testing its AV trucks on public highways under a model in which human drivers handled the tricky urban driving at the beginning and end of a trip while the trucks' AV systems handled the simpler long-haul freeway driving in between. Then, in March 2018, an Uber passenger AV struck and killed pedestrian Elaine Herzberg as she crossed a road in Tempe, Arizona. Although the AV system detected Herzberg in the Uber's path 6 seconds before impact, it could not determine whether she was a vehicle, a bicycle, or a person. The Uber's human backup driver failed to intervene because she was watching a video program on her cellphone at the time. Days before the crash, Uber manager Robbie Miller had sent an email lamenting that Uber AVs "shouldn't be hitting things every 15,000 miles" and that "dangerous behavior" incidents were happening too frequently. Miller recommended using two backup drivers and an 85% smaller fleet. Uber did not adopt his recommendations before the crash. In the wake of Herzberg's death, Uber shut down its AV truck program, leaving the field to others.

In fairness to Tesla and Uber, their safety woes are not unique. They are an almost inevitable byproduct of any imperfect automated system that is capable of lulling a human operator into a dangerous complacency. It is an open question whether the designers of automated driving systems are up to the challenge of keeping their human operators sufficiently vigilant and engaged. And the government's light-touch regulation in this area is demonstrably not working.

Now is not the time to ramp up our AV experimentation to include large tractor-trailer rigs. Let us frankly acknowledge what we are doing: we are beta-testing AV systems on our public roadways. When we beta-test a smartphone with software that still has bugs, the phone may crash. When we beta-test a passenger AV with software that is still "learning," it may crash and kill the driver and others. But when we beta-test a loaded 80,000-pound AV truck rig and it crashes, the potential carnage multiplies many times over.

Admittedly, driving is one of the most dangerous things that we do, and human error is the primary cause of automobile crashes. The promise of AVs is that they will never get drunk, never get tired, never get angry, and never feel the need to check text messages while driving down the road. But AV systems are not yet ready for prime time. Any functioning adult driver can tell the difference between a harmless highway overpass and a tractor-trailer rig pulling across a vehicle's path, but AV systems cannot yet reliably make that crucial determination. AV systems do not need to be literally flawless before they are allowed to pilot tractor-trailer rigs, but at a minimum they need to be safer than the average human driver. Until that basic threshold is cleared, we should resist the urge to rush headlong into AV trucks. In the near future, none of us should see an empty driver's seat on a tractor-trailer rig on the road. Trucking should not be the new desk job any time soon.

Quentin Brogdon is a partner in Crain Brogdon Rogers in Dallas. He is a former president of the Dallas chapter of the American Board of Trial Advocates, and he is a fellow of the invitation-only International Academy of Trial Lawyers, American College of Trial Lawyers, and International Society of Barristers.