Self-driving cars turn out to be fair-weather friends. I remember once taking a test ride in an early fuel-cell car and being asked to avoid puddles because the car couldn’t get wet. It seems the state of the art in autonomous cars is exactly that: they don’t work in heavy rain.
Christopher Urmson, who heads Google’s self-driving car team, told MIT Technology Review that, among other things, the company’s cars haven’t yet driven in snow, can’t be tested in downpours, and are befuddled by large open parking lots and multi-floor parking garages. Sunny days can be bad, too, because if the sun is behind a traffic light the car may not be able to see it change color. Construction zones are an issue, pedestrians (such as a wildly gesticulating traffic cop) are poorly understood, and newly added road features not yet incorporated into Google’s maps would be cheerfully ignored. Big rock or crumpled ball of paper? The car can’t tell, so it will detour around both.
Sam Abuelsamid, research analyst at Navigant Research, points to the kind of winter storm that routinely sweeps through the Midwest. “None of the autonomous cars currently being tested are even remotely capable of dealing with these kinds of conditions, instead throwing full control back in the hands of often-hapless human drivers,” he said.
A Google self-driving car in traffic. Note the good weather, probably in California. (Photo: Hydrogen/flickr)
Brad Templeton, who calls himself a “robocar blogger and developer,” notes on Quora:
In very heavy rain, the Velodyne LIDAR, which is popular, starts getting too many returns from drops and spray, and so operation in these conditions is not viewed as safe. Newer-generation LIDARs that are more robust against heavy rain are in development.
Radar is fine in rain and fog as well, but radar is not enough to drive on at high speed. Radar has trouble spotting stopped objects—it gets returns from them but also gets returns from the terrain and so it is hard to be sure a radar return is from a stopped object on the road. Driving on fresh snow is challenging because the vehicle can't see the lane markers or even road edges. Humans also have trouble with this. Several solutions are in the works.
Currently, the Google vehicles use GPS for general positioning, but the cars require a visual scan of road lines and surface boundaries for more refined positioning. None of this will work on a road covered with snow, ice, heavy rain, or cicada infestations. Or fog. The cars rely heavily on Google map data, and if anything has changed since the last Google drive-through, the consequences may be interesting. Construction? Oops. New ditch? Whoops. Massive pothole causing damage? Geronimo!
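Templeton’s point about radar and stopped objects can be made concrete with a toy sketch. This is purely illustrative, not any real system’s code, and all numbers and object names are made up: automotive radars measure closing speed via Doppler, so anything stationary relative to the road, whether a bridge, a sign, or a stalled car, produces the same signature, and a common way to cut clutter is to discard those returns.

```python
# Toy illustration of why radar alone struggles with stopped objects.
# Hypothetical numbers; no real sensor's API or data.

EGO_SPEED = 25.0  # m/s, our car's speed (assumed)

# Each return: (range in meters, measured closing speed in m/s, actual object)
returns = [
    (120.0, 25.0, "overpass"),     # stationary clutter
    (80.0,  25.0, "road sign"),    # stationary clutter
    (60.0,  25.0, "stalled car"),  # real hazard, with the same signature!
    (40.0,  10.0, "slower car"),   # moving target, easy to keep
]

def moving_targets(radar_returns, ego_speed, tolerance=1.0):
    """Keep only returns whose closing speed differs from our own speed,
    i.e. objects that are themselves moving. Filtering this way discards
    terrain clutter -- and, with it, the stalled car in the road."""
    return [r for r in radar_returns if abs(r[1] - ego_speed) > tolerance]

kept = moving_targets(returns, EGO_SPEED)
print([obj for _rng, _spd, obj in kept])  # the stalled car is gone with the clutter
```

The sketch shows why radar is typically fused with LIDAR or cameras rather than trusted on its own at speed: the naive clutter filter that makes the signal usable is exactly what hides a stopped obstacle.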
In actual traffic, the mind boggles at the number of factors any ordinary trip to get milk would require a car to consider: dogs, people crossing the street, obstacles in the road, drivers pulling in from side streets, traffic lights, street signs. If an autonomous car missed just one of these, the results could be tragic.
Urmson thinks Google can overcome all these obstacles in five years. That seems optimistic to me, but the skunkworks is working overtime. On video, here's Google's view of how self-driving cars work: