(This was originally published on June 22, 2021, and has been updated to reflect the release of nuPlan datasets.)
Our expanded open-source planning dataset will allow the industry and researchers to better understand how a driverless vehicle can find its way through a dynamic environment full of obstacles and ever-changing circumstances – like a human driver does every day.
Autonomous vehicles are at a crossroads between perception and planning. When the machine learning team at Motional examined the competitive landscape back in 2018, most of the attention was focused on perception, or the act of teaching AVs how to see cars, bicycles, pedestrians, and other objects in images and laser scans. By teaching autonomous vehicles to identify objects in their vicinity, self-driving systems can paint a detailed picture and get a better understanding of their surroundings.
While others kept their valuable perception datasets proprietary, Motional was the first to make ours publicly available. This effort, called nuScenes, has been downloaded by more than 16,000 students, researchers, and developers and referenced in more than 1,000 academic publications since its release in 2019. More importantly, nuScenes ushered in an era in which almost all autonomous vehicle companies now share their datasets to advance the state of the art as a community, rather than going it alone.
World's First Benchmark for AV Planning
Motional has launched an updated version of nuPlan, the world’s first and largest benchmark for autonomous vehicle planning. We have made this one-of-a-kind dataset available to the public with the goal of improving the way we teach vehicles to find their way through a dynamic environment full of obstacles and ever-changing circumstances – like a human driver does every day.
The previous version featured 1,200 hours of driving data collected across four different cities: Boston, Pittsburgh, Las Vegas, and Singapore. Data annotation was done exclusively by machine learning, with nearly the same quality as human-annotated data.
The newest release includes 120 hours of raw sensor data, or 10% of the full nuPlan dataset. This amounts to approximately 16TB of real-world data collected from eight onboard cameras and five lidars. This is in addition to the larger, annotated dataset, which contains around 500 million images and 100 million lidar scans. We also provide the nuplan-devkit as a starting point for users interested in using the data.
nuPlan allows scaling to all the complexities that drivers encounter in the real world, including the extreme edge-case scenarios that are only experienced once every 1,000 hours of driving.
Scaling to All the Complexities Drivers Encounter
Why is more data needed? Consider the diverse environment in which most people drive. Downtown Boston, residential areas in Pittsburgh, startup hubs in Singapore, and the Las Vegas Strip – all areas in which Motional is testing robotaxis – each have unique driving situations. These range from different traffic-flow patterns (as in Singapore, with its left-hand traffic) to roadway signage and signals to various types of intersections, terrain, and other characteristics. Even within the same city, our cars encounter different driving situations every day because traffic densities, road layouts, topography, and traffic laws vary, and each area has its own unique rules of the road.
nuPlan contains a large-scale machine learning dataset and a toolkit for measuring the performance of planning techniques – essentially a virtual driving test. The dataset contains hundreds of hours of raw and annotated driving data collected across four different cities: Boston, Pittsburgh, Las Vegas, and Singapore. This makes nuPlan the world's largest public dataset for autonomous vehicle prediction and planning.
While ML-based planning has been studied extensively, the lack of published datasets that provide a common framework for closed-loop evaluation has limited progress in this area. Motional believes that nuPlan is filling this gap by providing an ML-based planning dataset, closed-loop evaluation, and planning-related metrics.
Notable Improvements in Perception
Following the release of nuScenes, we have seen notable improvements in the task of perception through the combination of a standardized testing protocol and AV developers sharing their recipes for success. For example, academic research has shown that the leading perception method to identify bicycles under certain circumstances has increased performance fourfold over the last three years.
While perception endows an autonomous vehicle with the ability to see the world, planning helps the vehicle navigate it more safely. Now that AVs can more accurately identify what is around them, Motional is setting its sights on autonomous vehicle planning, or pathfinding.
Closed Loop versus Open Loop
The other significant advantage of nuPlan is its closed-loop testing capability, an improvement over the open-loop testing used in existing benchmarks. In an open-loop system, the input is independent of the system's response: evaluation simply checks that the planned route is similar to the one the human driver took, which is why open-loop evaluation is often associated with imitation learning. In closed-loop evaluation, the planned route is actually used to control the vehicle. The vehicle may deviate from the original route that the driver took, and other drivers then react accordingly.
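The distinction can be sketched in a few lines of Python. This is a deliberately simplified one-dimensional toy, not the nuPlan evaluation code: open-loop scoring only compares a planned trajectory to the logged human one, while a closed-loop rollout feeds each planning decision back into the vehicle state, so later states depend on earlier decisions.

```python
# Toy 1-D illustration of open-loop vs. closed-loop evaluation.
# Purely illustrative -- not the nuPlan evaluation code.

def open_loop_error(planned, logged):
    """Average displacement between planned and logged positions.
    The plan is only *compared* to the human trajectory; it never
    influences the vehicle state (imitation-style scoring)."""
    return sum(abs(p - q) for p, q in zip(planned, logged)) / len(planned)

def closed_loop_rollout(policy, x0, steps, dt=0.1):
    """Roll the plan out: each commanded velocity actually moves the
    vehicle, so the next planning input depends on the last decision."""
    x, traj = x0, [x0]
    for _ in range(steps):
        v = policy(x)      # planner reacts to the *simulated* state
        x = x + v * dt     # the decision is fed back into the loop
        traj.append(x)
    return traj

# A simple planner that steers toward a goal position at 10 m.
policy = lambda x: 2.0 * (10.0 - x)

logged  = [0.0, 1.0, 2.0, 3.0]   # what the human driver did
planned = [0.0, 1.1, 2.3, 3.2]   # what the planner proposed

print(open_loop_error(planned, logged))     # low error: plan imitates the log
print(closed_loop_rollout(policy, 0.0, 3))  # states evolve under the plan itself
```

In the open-loop case the score says nothing about what would have happened had the vehicle actually followed the plan; the closed-loop rollout exposes exactly that.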
Among other things, our closed-loop evaluation tests how much distance a car keeps when overtaking cyclists, whether passengers are likely to get car sick due to high acceleration in turns, and how the vehicle slows down when approaching jaywalking pedestrians. It scores planning metrics for these scenarios, as well as traffic rules, vehicle dynamics, and goal achievement – the same way an experienced and safe human driver would be judged.
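Two of the metrics mentioned above can be made concrete with a short sketch. The specific thresholds here (a 1.5 m cyclist clearance and a 3 m/s² lateral-acceleration comfort limit) are illustrative assumptions for the example, not nuPlan's actual values.

```python
# Illustrative planning-metric checks. Thresholds are assumed for
# this example and are not nuPlan's actual values.
import math

MIN_CYCLIST_CLEARANCE_M = 1.5   # assumed safety/comfort margin
MAX_LATERAL_ACCEL_MPS2 = 3.0    # assumed passenger-comfort limit

def min_clearance(ego_xy, cyclist_xy):
    """Smallest distance kept to a cyclist over matched timesteps."""
    return min(math.dist(e, c) for e, c in zip(ego_xy, cyclist_xy))

def comfortable_turn(speed_mps, turn_radius_m):
    """Lateral acceleration in a steady turn is v^2 / r; high values
    correlate with passenger discomfort (car sickness)."""
    return speed_mps ** 2 / turn_radius_m <= MAX_LATERAL_ACCEL_MPS2

ego     = [(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)]   # ego positions (m)
cyclist = [(0.0, 2.5), (5.0, 2.0), (10.0, 2.5)]   # cyclist positions (m)

print(min_clearance(ego, cyclist) >= MIN_CYCLIST_CLEARANCE_M)  # enough room while overtaking?
print(comfortable_turn(speed_mps=8.0, turn_radius_m=30.0))     # ~2.1 m/s^2: comfortable
```

A real benchmark aggregates many such checks over each scenario; the point of the sketch is only that these qualitative requirements reduce to simple, measurable quantities.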
The Final Frontier in Autonomous Driving
With nuScenes, we moved the needle in perception by helping driverless vehicles better view the world and other road users around them, and shared this knowledge with the world to advance autonomous technology. With nuPlan, we believe that providing a large-scale dataset and a common benchmark will help pave a path toward progress in planning, which is perhaps one of the final frontiers in autonomous driving.
The nuPlan dataset is free of charge for academic use, and licensing is available for commercial purposes. For more information on commercial licensing, please reach out to nuPlan@Motional.com.
We invite you to follow us on social media @motionaldrive to learn more about how we’re making driverless technology safe, reliable, and accessible. You can also search our engineering job openings on Motional.com/careers.