Today's cars are expressive---to a point. Sure, they've got turn signals, brake lights, high beams, and horns---all of which are effective at conveying specific information, but none of which is much good for subtle communication. That's OK; cars have humans for that. Humans can make eye contact with fellow drivers, wave at pedestrians, glower at cyclists, and otherwise shout and gesticulate to their hearts' content.
Great---until you yank the human from the driver's seat. "These simple lights aren't gonna cut it anymore," says roboticist Carol Reiley, co-founder and president of Drive.ai. Today, the startup emerges from stealth mode as the latest entrant in the rapidly expanding roster of companies testing autonomous cars.
Like most companies competing in this space---from automakers like Audi to rideshare behemoth Uber---Drive.ai is using deep learning to teach its cars to drive, and exploring how they will talk to their passengers.
What sets the Mountain View-based startup apart is its interest in how driverless cars will communicate with outsiders like pedestrians and other drivers. "We need to be able to communicate in all directions, and we need to be able to show intention and have a conversation with the other players on the road," Reiley says.
This is largely unexplored territory. Mercedes' F 015 concept can project a crosswalk onto the ground for wary pedestrians, and last year Google patented a pedestrian notification system for its autonomous car, with a stop sign on its side and a "safe to cross" sign on its front bumper. But that's about it.
Still unknown: How to communicate to non-passengers how a driverless car will behave. Will it let them cross, or run them down? Will it race up the highway on-ramp and merge like a jerk, or roll along until it finds a comfortable spot to join the flow? These are complicated non-verbal communication problems, Reiley says---problems that require reinventing the car.
To that end, Drive.ai is testing driverless vehicles with roof-mounted billboards that flash messages to everyone within eyeshot. "It could show text, it could show pictures," Reiley says. The startup has toyed with emoji-based signaling to skirt language and literacy issues. The cars can even draw from a library of "safety sounds" to express everything from the aggressive "get out of my way!" to the courteous "after you, I insist."
Drive.ai bounced early ideas off employees' friends and families, and crowdsourced some testing with Amazon's Mechanical Turk workforce. But it didn't wait to solve the actual driving problem before taking its design theories to the streets. "These things need to be baked in from day one," Reiley says, not just because they may take as long to sort out as the driving itself, but because their design will influence how the car interacts with the world.
"I think it's an interesting design problem," says John Rousseau, executive director at design firm Artefact, who worked with Hyundai on a self-driving concept. “It’s absolutely worth thinking about."
He agrees autonomous cars should be able to communicate their intent, but isn't sure about Drive.ai's approach. Even with sounds and signs that seem intuitive, you're essentially creating a language---a language not everyone will understand. The fact that dozens of companies are working on autonomous vehicles stands to complicate things further; if one outfit's language doesn't match another's, blue lights could mean "I'm stopping" on one car and "get out of my way" on another.
Rousseau argues for taking the onus of communication off each individual car and working on the system as a whole. Cars, he says, should tie into our infrastructure (or our augmented reality goggles), into our traffic signals and built environment. "Get past the immediate, this is what we can make this device do," he says. "Think about the entire system."
But for now, Drive.ai is forging its own path. "I think robots need to be adorable and loved," Reiley says. And it's hard to love something you can't understand.