A few hundred self-driving cars are undergoing testing on American roads today, using advanced technology to journey down highways, stop at red lights, and avoid pedestrians and cyclists—except when they don’t. More than 60 companies are registered to test in California alone, though just 28 tested on state roads last year.
Exactly how many vehicles are testing, where they’re doing it, and how those cars are performing is mostly anyone’s guess. In many states, companies experimenting with autonomous vehicles don’t have to specify, and the federal government doesn’t keep track either. Yes, the tech is still very much under development, and industry reps and experts say it’s way too early to create some kind of robot self-driver’s license exam. But two meetings in Washington, DC, last week made clear that some believe regulators aren’t properly overseeing the testing of this technology.
In the first, the National Transportation Safety Board met to conclude its investigation into a 2018 collision in which an Uber self-driving vehicle hit and killed an Arizona woman. Board members railed against a system that has left most of the oversight of self-driving vehicles to states, and made the few federal guidelines voluntary.
“In my opinion, they’ve put technology advancement here before saving lives,” said NTSB member Jennifer Homendy, referring to the National Highway Traffic Safety Administration, which regulates motor vehicle safety. “There’s no requirement. There’s no evaluation. There’s no real standards issued.”
NTSB investigator Ensar Becic said the federal government was “perfectly positioned” to demand a more rigorous safety approach from self-driving vehicle companies. NHTSA could, for example, provide feedback on the voluntary safety self-assessments it has politely asked companies to write and hand over. It could also make them mandatory.
The next day, during a meeting of the Senate Commerce Committee, senators said they wanted to hasten the development of autonomous vehicles, which the industry says will drive more carefully than their human counterparts. But some Senate Democrats wondered, sometimes angrily, whether the federal government was doing enough to monitor the testing of self-driving cars right now.
“While I appreciate the potential benefits of autonomous vehicles, I remain concerned that humans will be used as test dummies,” said Senator Tom Udall, the Democrat from New Mexico.
Last December, the Senate’s first bit of legislation that would have governed self-driving cars, a bill called AV START, sputtered and died after an analogous bill passed the House. The stakes are still high: More than 36,000 people died on American roads last year, according to federal statistics, including a growing number of pedestrians and cyclists. Slowing the development of self-driving vehicles, which should one day drive with more precision and way less distraction, might mean more lives lost in the long run. But some have begun to question whether autonomous vehicles, when they arrive, will be unmitigated saviors. And more deaths in the testing phase could decelerate their rollout too.
Technically, regulating the testing of self-driving vehicles falls to the states. The federal government deals with stuff related to vehicle design; for example, it can recall a car if something is wrong with the airbag. At this stage, all the self-driving vehicles testing on public roads are just normal cars with some extra software thrown in. So they don’t need special exemptions from the federal government to operate. (If companies like General Motors follow through with their plans to mass-produce cars without steering wheels, they will.)
The states, however, typically deal with stuff related to how to operate vehicles. For example, they determine if you’re a good enough driver to earn a license. And especially at the testing phase, before these cars are carrying around too many members of the public, state legislatures and agencies are in charge of determining who gets to do what.
Today, state rules for self-driving vary wildly. Arizona, for instance, is a hotbed for self-driving testing in part because its governor in 2015 signed an executive order telling state agencies to support autonomous vehicle testing. Companies testing cars there with a backup driver behind the steering wheel—as Uber did before it was kicked out following the 2018 crash—need only notify the state that they’re there. A company like Waymo, which has reportedly started carrying some paying passengers in driverless vehicles around the Phoenix suburbs, must fill out more forms to get permission to nix the backup driver. In a statement, a spokesperson for the state’s Department of Transportation called Arizona’s rules a “rigorous process,” but said the department was reviewing the process after the release of the NTSB’s report.
Pennsylvania and California, by contrast, took years to develop their tougher self-driving rules. Potential Pennsylvania testers have to complete a more detailed application, which must be approved by the state and which requires companies testing AVs to disclose where they’re testing and to report crashes. California collects and publishes information about testing vehicles’ performance, much to companies’ chagrin.
Acting NHTSA administrator James Owens told senators at the hearing that the federal government, which has more expertise in software, often helps states write their regulations for self-driving. But some, like the watchdog group Consumer Reports, want more from the feds, like validating the safety of self-driving vehicles before they hit the road—even if they’re still just testing, and not being purchased by consumers.
Companies can choose to submit detailed safety information to the federal government, as part of NHTSA’s guidance on automated vehicles, which broadly outlines its approach on autonomous vehicles. But only 16 of the companies working on self-driving vehicles have chosen to do so, and the quality of the information is “kind of all over the place,” said Becic.
In the end, the NTSB recommended that states like Arizona develop applications for testing self-driving vehicles on their roads, and force companies to disclose how they might manage the risk associated with operating the vehicles around real—and sometimes unsuspecting—people. The panel also called on NHTSA to force companies developing the tech to give it information on their approach to safety, and to develop standards for evaluating that information.
NHTSA said in a statement that it “will carefully review” the NTSB report and recommendations. It’s possible we may be about to learn a lot more about what’s rolling out on American roads.