StreetAware: A High-Resolution Synchronized Multimodal Urban Scene Dataset

Publication Date: April 3, 2023

Event: Sensors

Reference: Sensors 23 (7): 3710, 1-21, 2023

Authors: Yurii Piadyk, New York University; Joao Rulff, New York University; Ethan Brewer, New York University; Maryam Hosseini, New York University; Kaan Ozbay, New York University; Murugan Sankaradas, NEC Laboratories America, Inc.; Srimat T. Chakradhar, NEC Laboratories America, Inc.; Claudio Silva, New York University

Abstract: Limited access to high-quality data is an important barrier in the digital analysis of urban settings, including applications within computer vision and urban design. Diverse forms of data collected from sensors in high-activity areas of the urban environment, particularly at street intersections, are valuable resources for researchers interpreting the dynamics between vehicles, pedestrians, and the built environment. In this paper, we present a high-resolution audio, video, and LiDAR dataset of three urban intersections in Brooklyn, New York, totaling almost eight hours of unique recordings. The data were collected with custom Reconfigurable Environmental Intelligence Platform (REIP) sensors designed to accurately synchronize multiple video and audio inputs. The resulting data are novel in that they are inclusively multimodal, multi-angular, high-resolution, and synchronized. We demonstrate four ways the data can be utilized: (1) to discover and locate occluded objects using multiple sensors and modalities, (2) to associate audio events with their respective visual representations using both the video and audio modalities, (3) to track the number of objects of each type in a scene over time, and (4) to measure pedestrian speed using multiple synchronized camera views. In addition to these use cases, our data are available for other researchers to carry out machine-learning analyses of the urban environment for which existing datasets may be inadequate, such as pedestrian-vehicle interaction modeling and pedestrian attribute recognition. Such analyses can help inform decisions made in the context of urban sensing and smart cities, including accessibility-aware urban design and Vision Zero initiatives.
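As a concrete illustration of use case (4), the sketch below shows one way pedestrian speed could be estimated from two synchronized, calibrated camera views via standard stereo triangulation. This is not the authors' implementation; the projection matrices, pixel tracks, and frame rate are hypothetical inputs assumed for the example.

```python
# Minimal sketch: pedestrian speed from two synchronized camera views.
# Assumes P1, P2 are 3x4 metric projection matrices for the two cameras,
# and track1, track2 are per-frame (x, y) pixel positions of the same
# pedestrian; synchronization means row i of both tracks is the same instant.
import numpy as np
import cv2

def pedestrian_speed(P1, P2, track1, track2, fps):
    """Triangulate the pedestrian's 3D path and return mean speed (m/s)."""
    pts1 = np.asarray(track1, dtype=np.float64).T   # shape (2, N)
    pts2 = np.asarray(track2, dtype=np.float64).T   # shape (2, N)
    # Homogeneous 3D points, shape (4, N)
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
    X = (X_h[:3] / X_h[3]).T                        # shape (N, 3), metric units
    # Per-frame displacement, then meters/frame * frames/second
    step = np.linalg.norm(np.diff(X, axis=0), axis=1)
    return step.mean() * fps
```

Because the dataset's audio and video streams are tightly synchronized, corresponding frames across views can be paired directly by index, which is what makes this simple per-frame triangulation viable without additional temporal alignment.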

Publication Link: https://www.mdpi.com/1424-8220/23/7/3710