“Self-driving cars are here,” Dmitri Dolgov told the audience at MIT Technology Review’s EmTech Digital event this week. “It's not a matter of when or if. It’s a matter of how fast we can grow and how fast we can scale this technology in a responsible manner.”
Waymo’s CTO is right: The outfit that started off as Google’s self-driving car project is running a limited robotaxi service in the Phoenix metro area. (The company still uses safety drivers, so the cars aren’t yet totally driverless. Dolgov also told the audience that the company has tech yet to crack.) And it’s not alone. GM Cruise plans to launch a service this year. Uber is testing in Pittsburgh. Lyft and Aptiv have a limited self-driving service in Las Vegas. Nuro’s delivery bots are hauling groceries around Texas and Arizona. May Mobility is running robo-shuttles in Detroit.
So for the public sharing the roads with these things, a few long lurking questions are now more pressing than ever: How do we know these things are safe? And how can the companies that promise they are prove it to us?
One thing is for sure: The way we certify human drivers ain’t going to cut it. Just because software can pull off a three-point turn once doesn’t mean it will be able to do it every time, in any conditions. Or that the people who built it even know why it pulled it off. Algorithms are black boxes, and even if developers know that a computer is doing something right, they can’t necessarily tell if the computer understands why it’s right. If something goes wrong with self-driving software, though, researchers are going to need to understand how it works—so they can fix it.
A different sort of test, then, is in order. One made not for people, but for machines. That’s why some people in the self-driving space are talking about setting a new kind of standard.
For decades, engineers who build anything, including software, have used standards to verify the quality of their work. Whether they’re voluntary or government-mandated, they’re less rules for what to do than processes for making sure what you are doing works. Underwriters Laboratories, an Illinois-based organization, writes standards and certifies that companies are following them for just about any product you can think of: outdoor furniture; horticultural lighting and grow systems; armored cables; robotic equipment; factory-built fireplaces; tin-clad fire doors. Check your favorite American electronic product or appliance and chances are you’ll find a safety certification stamp from Underwriters or a similar organization. (If you don’t, maybe rethink your choices.) Another group, the International Organization for Standardization, came out with a new standard called ISO 26262 eight years ago, which outlines safety in electrical and electronic car systems.
But no one has made this kind of standard, this variety of test, for a self-driving car. Underwriters Laboratories and a safety software company called Edge Case Research would like to change that, and quickly. They have a plan to bring together all sorts of players in this budding industry to do what others have done for automotive software and those tin-clad fire doors. The groups plan to write a new safety standard for autonomous products called—and this just rolls off the tongue—UL 4600.
Right now, UL 4600 is a draft, written by collaborators with backgrounds in standards writing and aviation and automotive software tech. To make the final version, they need to bring together a “supergroup” panel of advisors. They’d like to do that this spring.
“I have a balance of interests that I look for,” says Deborah Prince, the standards process manager for Underwriters Laboratories. She has put together many advisory panels for the standards that the company has created and oversees. “I’m looking for my producers, I might have software people in there, insurance people, regulators. I want the right cross section.” For a self-driving software standard, that cross section might be made up of big developers like Waymo and Uber, small self-driving startups, independent researchers, car companies, and maybe even a few staffers from the Department of Transportation.
(Waymo could not say whether it has been in conversation with UL about this safety standard, but Uber spokesperson Sarah Abboud said the company was aware of UL’s activity and “are interested in getting involved in any industry standard work that brings self-driving vehicles to market in a safe, responsible way.”)
Together, the groups want that advisory panel to come up with a standard that would force those building self-driving technology to explain how their cars can get through bloopers and accidents, even without a driver at the wheel. Tire blows out? No one’s going to grab the steering wheel. Vehicle catches on fire, and the passenger is asleep in the back? No one’s there to wake them up and get them out. Developers would have to lay out precisely how their software works around those autonomy-specific problems.
“The standard says, ‘This is a list of all the things that it means to do the right thing, and you have to explain to me how you're going to get it right,’” says Philip Koopman, a cofounder of Edge Case Research and an electrical and computer engineering professor at Carnegie Mellon University, who is helping to write the standard. “I don't really care how. But you're not allowed to blow it off.”
The standard has a ways to go, though its makers want the process to move very quickly, by safety-standard standards. If the group comes together this spring, it might update the draft standard by midyear and solicit public feedback. It could even publish a standard by the end of 2019—warp speed for a process that usually takes years.
Then, though, would come the biggest challenge: getting people to agree to use the thing. Because a standard without adherents is like a church without congregants: kind of depressing, and not very useful. One challenge facing this group is that Underwriters isn’t well-known in the automotive space, so getting automakers to follow its standard might take some extra convincing. Another is that many in the industry think it would be better to rejigger ISO’s 26262 standard for autonomous driving instead.
But the upside for this standard—or any safety standard, really—is that following it could mean a safer self-driving car. One that’s far better than a teen who just passed his first driving test—and can prove it.