Abstract 4D radar offers higher point cloud density and more precise vertical resolution than conventional 3D radar, making it promising for environmental perception in adverse scenarios for autonomous driving. However, 4D radar is noisier than LiDAR and requires different filtering strategies, which affect the point cloud density and noise level. Comparative analyses across different point cloud densities and noise levels are still lacking, mainly because each available dataset uses only one type of 4D radar, making it difficult to compare different 4D radars in the same scenario. We introduce a novel large-scale multi-modal dataset that captures both types of 4D radar and consists of 151 sequences, most of which are 20 seconds long, containing 10,007 synchronized and annotated frames in total. Our dataset covers a variety of challenging driving scenarios, including multiple road conditions, weather conditions, lighting intensities, and times of day. It supports 3D object detection and tracking as well as multi-modal tasks. We experimentally validate the dataset, providing valuable insights for studying different types of 4D radar.