According to a study done by Deloitte, in the US alone, people could travel up to 25 percent more miles by 2040 than they do today. And while the shift to shared autonomy is likely to happen earlier in densely populated urban centers, the rollout will happen over time at different rates in suburban and rural areas.
This growing demand for full autonomy and shared mobility will redefine the way we travel in the near future. Online tech giants like Google, Apple and Baidu have forayed into the automotive space with heavy investments and are competing with the likes of Tesla, Audi and BMW.
Here are a few of the latest happenings in the world of autonomous driving that hint at what is to come:
In a bid to unlock access to self-driving research, Lyft released a large-scale dataset featuring the raw camera and LiDAR sensor inputs as perceived by a fleet of high-end autonomous vehicles in a bounded geographic area.
The Level 5 Dataset is the largest publicly released dataset of its kind. It includes over 55,000 human-labeled 3D annotated frames, a drivable surface map, and an underlying HD spatial semantic map to contextualize the data.
The Lyft Level 5 Dataset includes:
- Over 55,000 human-labeled 3D annotated frames;
- Data from 7 cameras and up to 3 LiDARs;
- A drivable surface map; and,
- An underlying HD spatial semantic map (including lanes, crosswalks, etc.)
Download the dataset here.
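To give a feel for what the 3D annotations look like, here is a minimal sketch in Python. The dataset follows a nuScenes-style schema, but the field names and records below are illustrative stand-ins rather than an exact copy of the release:

```python
from collections import Counter

# Hypothetical annotation records in a nuScenes-style schema: each 3D
# box has a center (translation), a size (w, l, h in metres), and a
# category label. Values here are made up for illustration.
annotations = [
    {"translation": [10.0, 2.0, 0.5], "size": [1.8, 4.5, 1.6], "category_name": "car"},
    {"translation": [12.5, -1.0, 0.9], "size": [0.6, 0.8, 1.7], "category_name": "pedestrian"},
    {"translation": [30.2, 5.5, 0.7], "size": [1.9, 4.8, 1.5], "category_name": "car"},
]

def count_by_category(anns):
    """Tally annotated objects per class across a set of annotations."""
    return Counter(a["category_name"] for a in anns)

def box_volume(ann):
    """Volume of a 3D bounding box in cubic metres (w * l * h)."""
    w, l, h = ann["size"]
    return w * l * h

counts = count_by_category(annotations)
print(counts["car"], counts["pedestrian"])  # 2 1
```

Lyft also ships an SDK for working with the real data; this snippet only illustrates the kind of per-frame structure the human-labeled annotations carry.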
Waymo, in a research collaboration with DeepMind, has taken inspiration from Darwin’s insights into evolution to make the training of its neural networks more effective and efficient.
Researchers at DeepMind devised a way to automatically determine good hyperparameter schedules based on evolutionary competition (called “Population Based Training” or PBT), which combines the advantages of hand-tuning and random search.
The aim was to investigate whether PBT could improve a neural net’s ability to detect pedestrians along two measures: recall (the fraction of pedestrians in the scene that the neural net identifies) and precision (the fraction of the net’s detections that are actually pedestrians).
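These two measures reduce to simple ratios over detection counts. A quick sketch, with made-up numbers for illustration:

```python
def precision_recall(true_positives, false_positives, false_negatives):
    """Precision: fraction of detections that are real pedestrians.
    Recall: fraction of real pedestrians that were detected."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# e.g. 90 pedestrians correctly detected, 10 false alarms, 10 missed
p, r = precision_recall(90, 10, 10)
print(f"precision={p:.2f}, recall={r:.2f}")  # precision=0.90, recall=0.90
```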
PBT uses half the computational resources used by random parallel search to efficiently discover better hyperparameter schedules.
PBT models achieved higher precision, reducing false positives by 24% compared to their hand-tuned equivalents, while maintaining a high recall rate.
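The mechanics of PBT can be illustrated on a toy problem. The sketch below is not Waymo’s or DeepMind’s implementation; it only demonstrates the exploit/explore loop on a one-dimensional objective, with the learning rate as the evolved hyperparameter:

```python
import random

random.seed(0)

# Toy objective: maximise f(x) = -(x - 3)^2, whose optimum is at x = 3.
def objective(x):
    return -(x - 3.0) ** 2

def gradient_step(x, lr):
    # d/dx of -(x - 3)^2 is -2(x - 3); take one gradient-ascent step.
    return x + lr * (-2.0 * (x - 3.0))

# Each worker carries parameters (x) and a hyperparameter (lr).
population = [{"x": random.uniform(-5, 5), "lr": random.uniform(0.001, 0.5)}
              for _ in range(8)]

for step in range(50):
    for w in population:
        w["x"] = gradient_step(w["x"], w["lr"])
        w["score"] = objective(w["x"])
    if step % 10 == 9:
        # Exploit: the bottom half copies parameters from the top half.
        population.sort(key=lambda w: w["score"], reverse=True)
        half = len(population) // 2
        for loser, winner in zip(population[half:], population[:half]):
            loser["x"] = winner["x"]
            # Explore: perturb the inherited learning rate (capped for stability).
            loser["lr"] = min(0.5, winner["lr"] * random.choice([0.8, 1.2]))

best = max(population, key=lambda w: w["score"])
print(round(best["x"], 3))  # best x lands near the optimum at 3.0
```

The key property, visible even in this toy: no single hyperparameter value is fixed up front; the schedule emerges from the evolutionary competition, which is what lets PBT match or beat hand-tuning with roughly half the compute of random parallel search.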
Apollo 5.0, Baidu’s open autonomous driving platform, is an effort to support volume production for geo-fenced autonomous driving. The car now has 360-degree visibility, along with upgraded perception deep learning models that handle the changing conditions of complex road scenarios, making it more secure and aware. Scenario-based planning has been enhanced to support additional scenarios such as pulling over and crossing bare intersections.
Vehicles with this version can drive autonomously in complex urban road conditions including both residential and downtown areas.
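The idea behind scenario-based planning is straightforward to sketch: the planner classifies the current situation and dispatches to a dedicated handler per scenario. The scenario names and state fields below are illustrative, not Apollo’s actual API (Apollo itself is written in C++):

```python
# Minimal sketch of scenario dispatch. All field names and scenario
# labels are hypothetical, chosen only to illustrate the pattern.
def select_scenario(state):
    """Pick a planning scenario from a snapshot of the world state."""
    if state.get("pull_over_requested"):
        return "PULL_OVER"
    if state.get("approaching_intersection") and not state.get("has_traffic_light"):
        return "BARE_INTERSECTION"
    return "LANE_FOLLOW"  # default cruising behaviour

print(select_scenario({"approaching_intersection": True,
                       "has_traffic_light": False}))  # BARE_INTERSECTION
```

Each scenario then owns its own decision logic, which is what lets new situations like pull-over be added without destabilizing ordinary lane following.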
Audi has partnered its Autonomous Intelligent Driving group (AID) with the Silicon Valley-based company Luminar to accelerate its autonomous vehicle development.
Luminar is known for its high-performance Light Detection and Ranging (LiDAR) systems.
Partnering with startups that work on cutting-edge technology has made Audi one of the top contenders in the autonomous driving world.
Audi has recently debuted its Traffic Jam Pilot (TJP) option on its 2019 A8 flagship sedan.
Drivers can now enjoy complete hands-off, eyes-off automation in slow traffic: TJP is capable of driving an A8, totally free of driver intervention, at speeds up to 37 miles per hour.
Audi has pledged to spend nearly $16 billion on electric mobility and self-driving tech through 2023. As part of this effort, AID is currently assessing a fleet of 12 test vehicles with Luminar sensors on public roads in Munich.
In May 2016, General Motors completed its acquisition of Cruise Automation to boost its autonomous car segment, paying $291 million in cash and nearly $290 million in newly issued common stock.
Now, with the Trump administration’s approval of SoftBank’s whopping $2.25 billion investment, General Motors’ (GM) self-driving unit, Cruise Automation, is poised to claim a larger chunk of the pie.
The team behind Cruise has also open-sourced a data visualization web application named Webviz: users can drag and drop any ROS bag file into it to get immediate visual insight into their robotics data.
As the user base has grown to include AV engineers calibrating LiDAR sensors, machine learning engineers verifying model outputs, and QA engineers debugging test rides, Webviz has become increasingly feature-rich without compromising its flexibility.
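Conceptually, this kind of tooling boils down to reading time-stamped messages grouped by topic. The records below are a simplified, hypothetical stand-in for a ROS bag (a real bag would be parsed with the rosbag library, and Webviz itself is a JavaScript application):

```python
# Hypothetical bag-like records: (topic, timestamp, payload).
# Values are made up for illustration.
messages = [
    ("/lidar/points", 0.00, {"num_points": 120000}),
    ("/camera/front", 0.03, {"width": 1920}),
    ("/lidar/points", 0.10, {"num_points": 118500}),
]

def by_topic(msgs, topic):
    """Select the messages on one topic, the first step any
    bag visualizer performs before rendering a panel."""
    return [m for m in msgs if m[0] == topic]

lidar = by_topic(messages, "/lidar/points")
print(len(lidar))  # 2
```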
Apple has poached several notable engineers from Tesla over the last year. Back in March, it hired Tesla drive systems VP Michael Schwekutsch. Last year, Doug Field returned to Apple after a brief stint at Tesla as its chief vehicle engineer.
Elon Musk once even took a shot at Apple’s poaching strategy saying, “If you don’t make it at Tesla, you go work at Apple.”
Apple, however, doesn’t look likely to change its strategy anytime soon, as it has made another hire this month. As first reported by a leading news portal, Steve MacManus has joined Apple as a “Senior Director” working out of Apple Park in Cupertino.
MacManus served as Tesla’s vice president of engineering. Prior to his time at Tesla, he worked at Aston Martin, Bentley, Jaguar and Land Rover.
Though the future of Apple’s Project Titan is uncertain, these significant hires do put Apple on the radar.
Tesla To Release Autopilot V10
Back in 2016, Tesla announced that it would be shipping all of its cars with the necessary hardware to support future advancements in autonomous tech. And, as promised, Tesla not only managed to live up to its ambitious goal but also ended up manufacturing what it calls the world’s most advanced self-driving chip.
Tesla has been a pioneer in autonomous technology and enjoys the advantage of having started its journey very early. It possesses the world’s largest customer base for semi-autonomous vehicles. Whenever a Tesla driver takes an action, be it steering left or right or pressing a pedal, they are effectively annotating the data, generating ever more refined data about driver behaviour.
As time passes, more data will be generated and every possible driver reaction will be captured. This will give the computers more control over the steering mechanism and perhaps drop the need for a steering wheel altogether in the future.
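This fleet-learning idea can be sketched as treating each driver action as an implicit label on the sensor snapshot that preceded it. The field names and thresholds below are illustrative only; Tesla’s actual pipeline is not public:

```python
# Sketch of turning driver actions into training labels. All fields
# and thresholds are hypothetical, chosen to illustrate the idea.
def label_from_action(snapshot, steering_angle, brake_pressed):
    """Pair a sensor snapshot with the label implied by what the
    human driver did immediately afterwards."""
    if brake_pressed:
        label = "slow_down"
    elif abs(steering_angle) > 5.0:  # degrees; positive = left
        label = "steer_left" if steering_angle > 0 else "steer_right"
    else:
        label = "keep_lane"
    return {"features": snapshot, "label": label}

example = label_from_action({"speed_mph": 40, "lane_offset_m": 0.3},
                            steering_angle=8.0, brake_pressed=False)
print(example["label"])  # steer_left
```

At fleet scale, records like these accumulate into exactly the kind of behaviour dataset described above, with no manual annotation step.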
According to Elon Musk’s social media account, the latest version of Autopilot will include improved self-driving features such as better highway handling, traffic light and stop sign recognition, and “Smart Summon”.
The release of V10 (version 10) of Autopilot will also include YouTube and Netflix, as well as several other games and infotainment goodies.
I have a master's degree in Robotics and I write about machine learning advancements. email:email@example.com