Deep Traffic Benchmark: Aerial Perception and Driven Behavior Dataset
Proceedings of the 15th Asian Conference on Machine Learning, PMLR 222:1670-1682, 2024.
Abstract
Predicting human driving behavior has long been an important topic in autonomous driving research. Existing autonomous-driving data in this area are limited in both perspective and duration. For example, vehicles on the road may occlude one another, even though data from the occluded vehicles behind them are useful for research. In addition, on-road data collection is constrained by the road environment, and the host vehicle cannot observe a designated area for an extended period of time. To investigate the potential relationship between human driving behavior and traffic conditions, we provide a drone-collected video dataset, Deep Traffic, that includes: (1) aerial footage from a vertical (bird's-eye) perspective, (2) images and annotations for training vehicle detection and semantic segmentation models, (3) high-definition map data of the captured areas, and (4) development scripts for various features. Deep Traffic is the largest and most comprehensive dataset of its kind to date, covering both urban and highway areas. We believe that this benchmark will greatly facilitate the development of drone-based traffic-flow monitoring and the study of human driver behavior and traffic-system capacity. All datasets and pre-trained results can be downloaded from the project's GitHub repository.