Event-Based Activity
Recognition Dataset
Download the Event-Based Vehicle Activity Dataset

We introduce the first very large vehicle activity dataset for event- and frame-based cameras. The dataset is composed of both synthetic and real-world recordings, with more than 68 hours of automotive data acquired with a 346 x 260 pixel DVS event camera. It covers open roads and very diverse driving scenarios, ranging from urban, highway, suburban, and countryside scenes, as well as different weather and illumination conditions. We also present a class of efficient models, called n-EAR, for on-board vehicle activity recognition in autonomous vehicles. The core decision making in a self-driving car relies heavily on the relative position of the vehicle with respect to its surroundings. We introduce two novel techniques: first, an event-based attention sampling technique that leverages the bio-inspired event data to adaptively sample the frame-based data; second, a two-stream architecture that efficiently trades off latency against accuracy. We bring these two ideas together to build n-EAR. We also present a modified CARLA simulator to generate data for the challenges faced by autonomous driving vehicles.
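To make the two ideas above concrete, here is a minimal sketch in PyTorch, not the released n-EAR implementation. It assumes events arrive as (x, y, t, p) tuples from the 346 x 260 DVS and that frames are synchronized RGB images; the grid size, channel widths, threshold, and the names event_attention_mask and TwoStreamNet are illustrative choices made for this sketch, not values from the paper.

```python
# Sketch only: event-based attention sampling + a two-stream event/frame network.
# All hyperparameters below are assumptions, not the authors' settings.
import numpy as np
import torch
import torch.nn as nn

SENSOR_W, SENSOR_H = 346, 260  # DVS resolution stated above


def event_attention_mask(events, grid=(8, 8), top_k=16):
    """Accumulate event counts on a coarse grid and keep the most active cells.

    events: (N, 4) array of (x, y, t, p). Returns a boolean (grid_h, grid_w)
    mask marking regions with enough motion to justify running the frame stream.
    """
    gh, gw = grid
    counts = np.zeros(grid, dtype=np.int64)
    gx = np.clip(events[:, 0] * gw // SENSOR_W, 0, gw - 1).astype(int)
    gy = np.clip(events[:, 1] * gh // SENSOR_H, 0, gh - 1).astype(int)
    np.add.at(counts, (gy, gx), 1)
    thresh = np.partition(counts.ravel(), -top_k)[-top_k]
    return counts >= max(thresh, 1)


class TwoStreamNet(nn.Module):
    """Lightweight always-on event stream + heavier, sparsely run frame stream."""

    def __init__(self, num_classes=5):
        super().__init__()
        # Event stream: small CNN over a 2-channel (ON/OFF) event histogram.
        self.event_net = nn.Sequential(
            nn.Conv2d(2, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Frame stream: deeper CNN, run only when the event mask fires.
        self.frame_net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32 + 64, num_classes)

    def forward(self, event_hist, frame, run_frame_stream):
        e = self.event_net(event_hist)
        if run_frame_stream:
            f = self.frame_net(frame)
        else:
            # Skip the expensive stream to save latency when the scene is static.
            f = torch.zeros(e.shape[0], 64, device=e.device)
        return self.head(torch.cat([e, f], dim=1))


if __name__ == "__main__":
    # Toy usage with random events and a random frame.
    events = np.stack([
        np.random.randint(0, SENSOR_W, 1000),
        np.random.randint(0, SENSOR_H, 1000),
        np.sort(np.random.rand(1000)),
        np.random.randint(0, 2, 1000),
    ], axis=1)
    mask = event_attention_mask(events)
    net = TwoStreamNet()
    hist = torch.rand(1, 2, SENSOR_H, SENSOR_W)
    frame = torch.rand(1, 3, SENSOR_H, SENSOR_W)
    logits = net(hist, frame, run_frame_stream=bool(mask.any()))
```

The latency/accuracy trade-off in this sketch comes from gating the frame stream on event activity: the cheap event branch always runs, while the heavier frame branch is invoked only when enough motion is detected.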

Frame-Based and Event-Based Data
Citing the n-EAR Activity Dataset

FYI! If the links are broken, please email neuronics@iisc.ac.in

CAUTION: If the dataset is used in your research, don't forget to cite us! Please fill in the form below to receive the Resilio Sync download link.

n-EAR paper: Once uploaded to arXiv, it will be cited here.