
Ford Releases Multi-Agent Seasonal Dataset For Autonomous Car Development


The automotive industry has spent years on one of the most challenging problems in transportation: the fully autonomous self-driving vehicle. Current autonomous systems rely on a combination of 3D lidar scanners, high-resolution cameras and GPS/INS to enable autonomy. However, to operate robustly on roads, handle a wide range of scenarios and maintain safe operating conditions, these systems will have to evolve into multi-agent autonomous systems.

Tech giants such as Apple, Facebook and Microsoft have been developing intelligent machine learning models to reduce collisions of self-driving cars. Ford, however, has been comparatively quiet about its autonomous driving plans, until now.

Recently, researchers from the multinational automaker Ford released a challenging multi-agent seasonal dataset for autonomous cars. The multi-agent autonomous vehicle data captures the seasonal variation in weather, lighting, construction and traffic conditions experienced in dynamic urban environments.



Behind the Dataset

To create the dataset, the researchers used a fleet of 2014 Ford Fusion Hybrids as the base platform. Each vehicle was outfitted with an Applanix POS-LV inertial measurement unit (IMU), four HDL-32 Velodyne 3D lidar scanners, two Point Grey 1.3 MP stereo camera pairs, two Point Grey 1.3 MP side cameras and one Point Grey 5 MP dash camera.

The GPS data provides the latitude, longitude and altitude of the body frame, while the IMU data consists of linear acceleration in m/s² and angular velocity in rad/s. The two front and two rear cameras are 1.3 MP stereo pairs operating at a maximum rate of 15 Hz. Each drive in the dataset is accompanied by two types of global 3D maps: a ground plane reflectivity map and a 3D point cloud of non-road points. The localisation framework uses measurements from the 3D lidar scanners to build a 2D grid local map of reflectivity around the vehicle.
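To make the localisation idea more concrete, the snippet below is a minimal Python sketch of how 3D lidar returns could be binned into a 2D reflectivity grid around the vehicle. The function name, grid size, cell resolution and the randomly generated points are illustrative assumptions, not Ford's actual pipeline.

```python
import numpy as np

def build_reflectivity_grid(points, grid_size_m=100.0, cell_m=0.5):
    """Accumulate lidar returns (x, y, z, intensity) into a 2D mean-reflectivity
    grid centred on the vehicle. Shapes and parameters are illustrative only."""
    n_cells = int(grid_size_m / cell_m)
    half = grid_size_m / 2.0
    grid_sum = np.zeros((n_cells, n_cells), dtype=np.float64)
    grid_cnt = np.zeros((n_cells, n_cells), dtype=np.int64)

    # Keep only returns that fall inside the local window around the vehicle.
    x, y, intensity = points[:, 0], points[:, 1], points[:, 3]
    mask = (np.abs(x) < half) & (np.abs(y) < half)
    ix = ((x[mask] + half) / cell_m).astype(int)
    iy = ((y[mask] + half) / cell_m).astype(int)

    # Accumulate intensity per cell, then average where cells were hit.
    np.add.at(grid_sum, (ix, iy), intensity[mask])
    np.add.at(grid_cnt, (ix, iy), 1)
    return np.divide(grid_sum, grid_cnt, out=np.zeros_like(grid_sum),
                     where=grid_cnt > 0)

# Example: random returns stand in for one accumulated lidar sweep.
points = np.random.rand(100_000, 4) * [80, 80, 5, 1] - [40, 40, 2.5, 0]
local_map = build_reflectivity_grid(points)
print(local_map.shape)  # (200, 200) cells at 0.5 m resolution
```

In a real pipeline this local grid would then be matched against the prior ground plane reflectivity map to correct the vehicle's pose estimate.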

Further, the vehicles traversed an average route of 66 km in Michigan, covering a mix of driving environments that included the Detroit airport, freeways, city centres, a university campus and suburban neighbourhoods.


How This Dataset Will Help

  • According to the researchers, the dataset can help in designing robust algorithms for autonomous vehicles and multi-agent systems
  • This Multi-AV seasonal dataset can provide a basis to enhance state-of-the-art robotics algorithms related to multi-agent autonomous systems and make them more robust to seasonal and urban variations
  • The dataset will provide new research opportunities in collaborative autonomous driving

Wrapping Up

Ford presented multi-agent, time-synchronised perception (camera/lidar) and navigation (GPS/INS/IMU) data from a fleet of autonomous vehicle platforms travelling through a variety of scenarios over the course of a year during 2017-18. According to the researchers, this is a first-of-its-kind dataset containing data from multiple vehicles driving through an environment simultaneously. The researchers also provide ROS-based tools for visualisation and easy data manipulation for scientific research.

Some of the significant contributions by the researchers in this project are mentioned below:

  • A Multi-Agent dataset with seasonal variations in weather, lighting, construction and traffic conditions
  • Full-resolution, time-stamped data from four lidars and seven cameras
  • The dataset includes GPS data, IMU pose trajectory and SLAM-corrected ground truth pose
  • The dataset includes high-resolution 2D ground plane lidar reflectivity and 3D point cloud maps
  • The dataset provides state-of-the-art lidar reflectivity based localisation results with ground truth for benchmarking purposes
  • All data can be visualised, modified and applied using the open-source Robot Operating System (ROS), as illustrated in the sketch below
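As an illustration of how such ROS bag recordings are typically consumed, the sketch below uses the standard Python `rosbag` API to summarise a bag and iterate over its time-stamped messages. The bag filename and topic names are assumptions made for illustration and may not match the dataset's actual layout.

```python
import rosbag

# Hypothetical bag file and topic names; the dataset's actual layout may differ.
BAG_PATH = "sample_drive.bag"
TOPICS = ["/imu", "/gps", "/lidar_pointcloud"]

with rosbag.Bag(BAG_PATH) as bag:
    # Print a quick summary of the topics recorded in the bag.
    info = bag.get_type_and_topic_info()
    for name, topic_info in info.topics.items():
        print("{}: {}, {} messages".format(
            name, topic_info.msg_type, topic_info.message_count))

    # Iterate time-stamped messages on selected topics.
    for topic, msg, t in bag.read_messages(topics=TOPICS):
        print(topic, t.to_sec())
        break  # show only the first message for brevity
```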

The dataset can be downloaded here.
