Autonomous vehicle algorithm uses ‘watch and learn’ approach

Researchers at Boston University say they have developed a new machine learning algorithm that teaches cars to self-drive by observing other traffic.
By Rich Pell


Currently, self-driving cars are powered by machine learning algorithms that require vast amounts of driving data in order to function safely – data that companies often keep private to maintain a competitive edge. But, say the researchers, if self-driving cars could learn to drive in the same way that babies learn to walk – by watching and mimicking others around them – they would require far less compiled driving data.

This thought led the researchers to develop a new way for autonomous vehicles to learn safe driving techniques – by watching other cars on the road, predicting how they will respond to their environment, and using that information to make their own driving decisions. Such a training paradigm, say the researchers, could also increase data sharing and cooperation among researchers in their field.

“Each company goes through the same process of taking cars, putting sensors on them, paying drivers to drive the vehicles, collecting data, and teaching the cars to drive,” says Eshed Ohn-Bar, a BU College of Engineering assistant professor of electrical and computer engineering and a junior faculty fellow at BU’s Rafik B. Hariri Institute for Computing and Computational Science & Engineering.

Sharing that driving data could help companies create safe autonomous vehicles faster, allowing everyone in society to benefit from the cooperation. Artificially intelligent driving systems require so much data to work well, says Ohn-Bar, that no single company will be able to solve this problem on its own.

“Billions of miles [of data collected on the road] are just a drop in an ocean of real-world events and diversity,” says Ohn-Bar. “Yet, a missing data sample could lead to unsafe behavior and a potential crash.”

The researchers’ proposed machine learning algorithm works by estimating the viewpoints and blind spots of other nearby cars to create a bird’s-eye-view map of the surrounding environment. These maps help self-driving cars detect obstacles – such as other cars or pedestrians – and to understand how other cars turn, negotiate, and yield without crashing into anything.
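The core of such a bird's-eye-view map is a simple idea: project the objects an observed car can see into a grid centered on that car. The sketch below is illustrative only – the function name, grid dimensions, and cell size are assumptions, not the researchers' actual implementation:

```python
import numpy as np

def birds_eye_view_map(ego_xy, objects_xy, grid_size=64, cell_m=0.5):
    """Rasterize nearby objects into a square bird's-eye-view
    occupancy grid centered on one vehicle (hypothetical sketch)."""
    grid = np.zeros((grid_size, grid_size), dtype=np.uint8)
    half = grid_size // 2
    for ox, oy in objects_xy:
        # Offset from the observing vehicle, converted to grid cells
        col = half + int(round((ox - ego_xy[0]) / cell_m))
        row = half - int(round((oy - ego_xy[1]) / cell_m))
        if 0 <= row < grid_size and 0 <= col < grid_size:
            grid[row, col] = 1  # mark the cell as occupied
    return grid

# A watched car at (10, 4) with two obstacles in view:
bev = birds_eye_view_map((10.0, 4.0), [(12.0, 4.0), (10.0, 7.0)])
```

In a full system, each cell would carry richer information (object class, velocity), and maps estimated for many observed cars would be aggregated; the grid above only shows the centering-and-rasterizing step.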

Through this method, self-driving cars learn by translating the actions of surrounding vehicles into their own frame of reference, where their neural networks can treat them as training examples. These other cars may be human-driven vehicles without any sensors, or another company’s auto-piloted vehicles. Since observations from all of the surrounding cars in a scene are central to the algorithm’s training, this “learning by watching” paradigm encourages data sharing, and consequently safer autonomous vehicles, say the researchers.
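Translating an observed maneuver into the learner's own frame of reference amounts to a 2D rigid transform: subtract the ego car's position and rotate by its heading. A minimal sketch, assuming a simple `(x, y, heading)` pose representation (the function name and signature are hypothetical):

```python
import math

def to_ego_frame(point_world, ego_pose):
    """Express a world-frame point in the ego vehicle's local frame.
    ego_pose = (x, y, heading_rad); hypothetical helper for illustration."""
    x, y, theta = ego_pose
    dx, dy = point_world[0] - x, point_world[1] - y
    # Rotate by -theta so the ego car's heading becomes the +x axis
    return (dx * math.cos(theta) + dy * math.sin(theta),
            -dx * math.sin(theta) + dy * math.cos(theta))

# A waypoint another car drove through, re-expressed relative to an
# ego car at (3, 4) heading due "north" (pi/2 radians):
local = to_ego_frame((5.0, 5.0), (3.0, 4.0, math.pi / 2))
# local is roughly (1.0, -2.0): one unit ahead, two to the right
```

Applying this transform to whole trajectories is what lets one car's observed behavior serve as a demonstration for another.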

The researchers tested their “watch and learn” algorithm by having autonomous cars driven by it navigate two virtual towns – one with straightforward turns and obstacles similar to their training environment, and another with unexpected twists, like five-way intersections. In both scenarios, say the researchers, their self-driving neural network got into very few accidents. With just one hour of driving data to train the machine learning algorithm, the autonomous vehicles arrived safely at their destinations 92 percent of the time.

“While previous best methods required hours,” says Ohn-Bar, “we were surprised that our method could learn to drive safely with just 10 minutes of driving data.”

While the results are promising, say the researchers, there are still several open challenges in dealing with intricate urban settings.

“Accounting for drastically varying perspectives across the watched vehicles, noise and occlusion in sensor measurements, and various drivers is very difficult,” says Ohn-Bar.

Looking ahead, the researchers say their method for teaching autonomous vehicles to self-drive could be used in other technologies, as well.

“Delivery robots or even drones could all learn by watching other AI systems in their environment,” says Ohn-Bar.

For more, see “Learning by Watching.”

Related articles:
AI system lets robots teach themselves to see
Autonomous navigation using visual terrain recognition gets AI boost
Self-learning robot a step closer to machine self-awareness
Artificial Intelligence in Autonomous Driving