Despite two fatal accidents involving semi-autonomous cars occurring within days of each other in March, testing of the technology continues. On April 2, California expanded its testing rules to allow remote monitoring instead of a safety driver inside the vehicle. Waymo and another company have since applied to begin testing vehicles without drivers in the state. While neighboring Arizona and Nevada also allow testing without a safety driver, California is both the most populous state and home to many of the companies' test vehicles. States should learn from regulations that promote innovation and safety at the same time.
Setbacks in autonomous vehicle testing
An Uber vehicle with a safety driver struck and killed a pedestrian in Tempe, Arizona on March 18; Uber quickly suspended all testing of its autonomous fleet while it investigates the cause of the crash. On March 23, the driver of a Tesla in autonomous mode died when the vehicle crashed into a highway median in Mountain View, California. Tesla has not suspended the feature in its vehicles while the company and the National Highway Traffic Safety Administration (NHTSA) investigate the cause of that crash. Because proponents highlight the safety improvements of driverless cars, these fatalities will invite stricter scrutiny of the technology's claims.
As their name implies, safety drivers have played an important role in autonomous vehicle development. They receive special training to assume control when onboard computers encounter a situation that the vehicle cannot navigate by itself. Driving conditions can change quickly, so the safety driver must remain constantly alert. However, advancements in driverless technology promise to eliminate human inputs altogether. With no steering wheel or gas pedal, a computer would control the engine and steering based on inputs from onboard sensors. Passenger shuttles without any of these features have launched in Las Vegas, at the University of Michigan, and in San Ramon, California.
A look at state laws
While the U.S. Department of Transportation and NHTSA periodically update their guidelines for autonomous vehicles, individual states are already passing relevant laws. However, they differ on basics like the definition of “vehicle operator.” Tennessee SB 151 points to the autonomous driving system (ADS), while Texas SB 2205 designates a “natural person” riding in the vehicle. Meanwhile, Georgia SB 219 identifies the operator as the person who causes the ADS to engage, which might happen remotely in a vehicle fleet. These distinctions will affect how states license both human drivers and autonomous vehicles going forward.
The most popular topic, with 11 state laws, is exemptions to following distance rules that allow for truck platooning. Drivers typically maintain an appropriate following distance from other vehicles to account for speed, road conditions, and human reaction times when traffic comes to a stop. With wireless communication, a line of trucks can accelerate and brake over much shorter distances. Closer following distances in a truck platoon lower air resistance on the following vehicles, yielding fuel savings that add up quickly for multiple trucks hauling cargo over long distances. The popularity of platooning laws suggests a wider focus on commercial applications of autonomous vehicle technology at the state level.
Many other state laws call for studies of autonomous driving systems, though no states have yet published their findings. Of all the states, only North Dakota has considered what happens to the data produced by self-driving cars. SB 2012, enacted in 2017, calls for the state Department of Transportation to study “the data or information stored or gathered by the use of those vehicles.” A failed bill from the same year, HB 1394, would have granted ownership of data to the vehicle’s owner and allowed sharing of data with the consent of the customer. Given the number of sensors built into autonomous vehicles and the amount of data they generate, determining privacy protections will be an important aspect of new regulations.
Laboratories of democracy, and self-driving cars
Looking at the database of autonomous vehicle legislation from the National Conference of State Legislatures, we can track the progress of states in passing legislation. Twenty-two states and the District of Columbia have passed laws regarding the operation of autonomous vehicles, and an additional ten governors have issued executive orders. Ten other state legislatures have considered legislation, while the remaining eight have not.
California requires companies that test self-driving cars in the state to report the number of miles driven as well as the number of disengagements, or times a human driver took control from the autonomous system. The number of disengagements per vehicle mile driven must have fallen enough to warrant relaxing the rule requiring a safety driver.
Within two years, 20 companies had collectively logged over 1 million miles of autonomous driving. Growth in miles driven slowed after September 2016 as more states passed laws to attract self-driving car testing. Around the same time, average disengagements per mile leveled off at five per 1,000 vehicle miles, or one disengagement every 200 miles driven. The stability of this figure over 14 months may have prompted the California Department of Motor Vehicles to relax its rules on having a safety driver in the front seat.
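The rate cited above follows directly from the figures companies report. A minimal sketch of the arithmetic, using hypothetical report numbers rather than any company's actual filing:

```python
def disengagement_rate(disengagements: int, miles: float) -> float:
    """Disengagements per 1,000 vehicle miles driven."""
    return disengagements / miles * 1000

# Hypothetical report: 500 disengagements over 100,000 autonomous miles.
rate = disengagement_rate(500, 100_000)
print(rate)         # 5.0 disengagements per 1,000 miles
print(1000 / rate)  # equivalently, one disengagement every 200.0 miles
```

At five disengagements per 1,000 miles, the two expressions are interchangeable: dividing 1,000 miles by the rate gives the average distance between disengagements.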
The two vehicle fatalities in March emphasize the human costs of testing the technology. If fully realized, replacing human drivers with artificial intelligence could drastically reduce motor vehicle deaths, a toll that claimed over 40,000 lives in the U.S. in 2016. If the rewards for getting it right outweigh the associated risks, how can technologists and policymakers minimize those risks? National safety guidelines and state laws should incorporate the lessons learned from real-world testing. Preventing all future accidents may prove impossible, but testing can provide feedback on which policies work best and which do not.
Christian Rome Lansang provided research for this blog post.