In America, 2018 was supposed to be a very big year for self-driving cars. Uber quietly prepped to launch a robo-taxi service. Waymo said riders would be able to catch a driverless ride by year’s end. General Motors’ Cruise said it would start testing in New York City, the country’s traffic chaos capital. Congress was poised to pass legislation that would set broad outlines for federal regulation of the tech.
Instead, one year ago today, an Uber self-driving SUV testing in Arizona struck and killed a woman named Elaine Herzberg as she was crossing the street. The crash derailed much of the optimism surrounding the advent of autonomy, underscoring its potential to do harm. And it ushered in a year during which the greatest promise of the technology—a drastic drop in road deaths—could feel farther away than ever.
Uber stopped testing on public roads for nine months, recalibrated its program, and now only uses one part of a Pittsburgh neighborhood to experiment with self-driving. (Arizona’s governor, who had declared the state’s roads open for testing in 2015, expelled the company after the crash.) Waymo launched its service in Phoenix, but keeps its human safety drivers behind the wheel. GM got mired in regulations and politics and stopped talking about testing in New York. That autonomous vehicle bill languishes in Congress.
And the American people aren’t waiting for the National Transportation Safety Board’s final report on the crash to make up their minds. A recent AAA survey of US adults found 71 percent are afraid to ride in a self-driving vehicle, compared to 63 percent before the crash. Axios reports even President Trump is among them.
So if you’re keeping an ear out for self-driving predictions and pronouncements, chances are you’re catching more whispers than exclamations. Amidst the hushed tones, though, you will hear more open talk about safety.
While national legislation isn’t going anywhere, the federal Department of Transportation has encouraged companies testing automated vehicles to submit “voluntary safety self-assessments.” In an ideal world, these would include detailed information on how companies structure their testing. They’d provide details on crashworthiness, and on how their vehicles protect occupants and other road users as engineers work toward ever-elusive self-driving perfection. Critics complain that many of the assessments submitted so far are less technical documents than glossy brochures stuffed with marketing-speak. Still, 13 companies have now turned them in, compared to just two this time last year.
Some companies have also made high-profile safety hires. In January, Waymo hired former NTSB chairperson Debbie Hersman as its first chief safety officer. Uber brought on former DOT safety official Nat Beuse in December. Even smaller startups, like the automated trucking company Starsky Robotics, have started to bring on more employees with robust safety engineering training—its own discipline with its own approach to building machines.
And now more than ever, the denizens of this blooming ecosystem are quick to emphasize the difficulty of making their technology work, and to stress that the vehicles have to be safe not just when they’re ready for commercial service, but while they’re testing.
“If you really want to reach a higher level of safety, you have to do a lot more than just building prototypes,” says Burkhard Huhnke, the vice president of automotive strategy at the silicon chip design company Synopsys. “Showcasing fascinating self-driving technology has nothing to do with the full solution for the problem.”
Doing all that takes real time and effort. “I don't see [self-driving technology] happening in the next five years or so—it will really take a longer time than everyone thought,” says Huhnke. “This is not an industry where quick, startup ideas develop a lot of value and can sell it to a company. It takes a while to develop safety, security, and reliability into the systems.” For the foreseeable future, that means the automated vehicle industry’s goal is staying safe while engineering to promote safety. And that means keeping the public safe, too.