• The state of self-driving vehicles
  • WaveSense unveils a ground-penetrating radar for self-driving vehicles
  • Waymo’s self-driving cars rely on machine learning to navigate in snowy weather
  • MIT’s new sensor could help autonomous vehicles see through fog and dust
  • The Gacha shuttle bus uses AI and smart tech to beat bad weather

Autonomous vehicles have been touted as the future of transportation for quite some time, and rightfully so. One of the main reasons why major companies started working on autonomous vehicles in the first place was to improve road safety, and the best way to do that is to eliminate the human element from the equation.

Road accidents claim thousands of lives each year. While some tend to blame slippery roads, potholes, or bad weather, human error is one of the primary causes of fatal car accidents. In fact, the US National Highway Traffic Safety Administration’s National Center for Statistics and Analysis estimates that human error is the critical factor in 94 per cent of serious traffic crashes in the United States, and it is widely believed that autonomous vehicles would be able to prevent most, if not all, of them.

According to the Insurance Institute for Highway Safety (IIHS), even some of the most basic driver-assistance features, such as lane departure warning, could reduce the fatal crash rate by as much as 86 per cent. Over the years, competition in the self-driving arena has intensified, with a number of new startups, ride-hailing companies, and even some tech giants entering the field, aiming to get a piece of the pie for themselves. At the same time, it seems that almost every major car manufacturer is also looking to build a self-driving car of their own. In fact, according to a recent report published by Allied Market Research, the value of the global autonomous vehicle market is expected to reach $557 billion by 2026. However, despite all the hype, a truly autonomous vehicle capable of handling every driving task entirely on its own is yet to materialise. Why is that?

The state of self-driving vehicles

In a 2018 interview, Elon Musk, the CEO of Tesla, said that Tesla could have a fully self-driving car on the road by 2019. This has turned out to be easier said than done: 2019 is already here, and there’s no indication that Musk’s prediction will come true. In fact, no car manufacturer appears to be close to putting a fully self-driving car on the road.

The IIHS recently conducted a comprehensive test of advanced driver-assistance systems, with features such as adaptive cruise control and active lane-keeping, in cars from some of the leading manufacturers, including BMW, Mercedes, Volvo, and Tesla. The test evaluated the systems’ performance in typical driving situations, such as approaching stationary vehicles and negotiating hills and curves, and the results were a bit underwhelming. While none of the cars actually crashed during the test, they all made a number of mistakes, ranging from overly cautious braking to drifting across lane lines, leading the IIHS to conclude that the technology is still not ready to replace human drivers. “None of these vehicles is capable of driving safely on its own,” says David Zuby, IIHS’ chief research officer. “A production autonomous vehicle that can go anywhere, anytime isn’t available at your local car dealer and won’t be for quite some time. We aren’t there yet.”

It may sound unusual, but one of the biggest obstacles standing in the way of fully autonomous vehicles is the weather. The weather presents a wide variety of issues for self-driving vehicles, which are usually tested in a small number of areas with limited weather conditions, such as the hot and dry desert of Arizona. That makes it difficult to predict how they will behave once those conditions change. Fog, heavy rain, and snow will often render cameras completely useless, while lidar tends to mistake snowflakes and raindrops for solid objects. Even the remaining technologies autonomous vehicles rely on have certain drawbacks of their own. GPS reception is slow and spotty in many areas, while radars have difficulties distinguishing between different types of obstacles. To be safe on the road, a self-driving vehicle will have to be able to operate in severe weather conditions as well, and several companies are already working on technology that would make this possible. The solution is to provide the vehicle with a combination of the best information available from each of its systems while ignoring the rest — something autonomous driving engineers refer to as ‘sensor fusion’.
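
To make the idea of sensor fusion concrete, here is a minimal, purely illustrative sketch. Production systems use far more sophisticated estimators (typically Kalman-filter variants), but the core intuition can be shown with simple inverse-variance weighting: each sensor’s reading counts in proportion to how reliable it currently is. The sensor names and numbers below are hypothetical, not taken from any real vehicle.

```python
def fuse_estimates(estimates):
    """Fuse independent noisy measurements of the same quantity by
    inverse-variance weighting: sensors with lower variance (more
    reliable right now) contribute more to the combined estimate."""
    # Each estimate is a (measurement, variance) pair.
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * x for w, (x, _) in zip(weights, estimates)) / total
    # The fused estimate is more certain than any single sensor.
    fused_variance = 1.0 / total
    return value, fused_variance

# Hypothetical readings of the car's lateral offset in its lane (metres).
# In snowfall the camera is noisy, so its variance is large and the
# fused result leans on the sensors that still work well.
readings = [
    (0.42, 0.50),   # camera, degraded by snow
    (0.30, 0.04),   # radar
    (0.31, 0.01),   # some other weather-robust positioning source
]
position, uncertainty = fuse_estimates(readings)
```

Note how the degraded camera barely moves the result: the fused position stays close to the two trustworthy sensors, which is exactly the “use the best information, ignore the rest” behaviour described above.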

WaveSense unveils a ground-penetrating radar for self-driving vehicles

A Massachusetts-based startup called WaveSense recently announced the launch of the world’s first ground-penetrating radar (GPR) for self-driving vehicles, which could significantly improve their navigation and safety on any road by helping them stay within the lane at all times. WaveSense uses the ground-penetrating radar to send an electromagnetic pulse up to 3 metres below the ground, and then measures reflections from objects like rocks, pipes, and roots, as well as changes in soil properties, to build a highly accurate map of the subsurface strata. As the vehicle drives along the road, WaveSense scans the ground below it approximately 126 times per second and then compares the data to the onboard image database, allowing the vehicle to determine its exact position down to a few centimetres. According to the company, the tests have shown the technology to be accurate and reliable at standard highway speeds (up to 105 kilometres per hour), even during the night and in snowstorm conditions. Compared to surface maps, subsurface maps are more accurate and reliable, because subsurface data is relatively static and unaffected by changes in the landscape above the ground, which means the maps don’t need to be updated as often.
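
The matching step — comparing a live scan against a stored map to recover position — can be sketched in miniature. This is not WaveSense’s actual algorithm, just a toy one-dimensional version of the general idea: slide the live subsurface signature along the stored map and pick the offset where it fits best. All names and data below are invented for illustration.

```python
def locate(map_profile, scan):
    """Toy 1-D localisation: slide the live scan along the stored map
    profile and return the offset with the smallest sum of squared
    differences, i.e. the position where the signatures best match."""
    best_offset, best_score = 0, float("inf")
    for offset in range(len(map_profile) - len(scan) + 1):
        window = map_profile[offset:offset + len(scan)]
        score = sum((a - b) ** 2 for a, b in zip(window, scan))
        if score < best_score:
            best_offset, best_score = offset, score
    return best_offset

# Hypothetical map of subsurface reflectivity along a road, plus a live
# scan taken somewhere along it, with a little sensor noise added.
road_map = [0.1, 0.3, 0.9, 0.2, 0.5, 0.8, 0.4, 0.1, 0.6, 0.2]
live_scan = [0.21, 0.49, 0.81]   # noisy readings of positions 3..5
print(locate(road_map, live_scan))  # → 3
```

Because the match tolerates noise in the live scan, the vehicle can still find its place on the map even when the readings are imperfect — and since the subsurface signature doesn’t change with snow on the road, neither does the match.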

However, this technology isn’t meant to replace other technologies self-driving cars currently rely on, but is designed to work alongside GPS, cameras, lidar, and radars, providing the vehicle with a wealth of information that will allow it to know its exact position at all times, regardless of the conditions outside. “A massive transformation in transportation and mobility is underway around the world as autonomous systems advance,” says Tarik Bolat, WaveSense’s CEO. “But before broad adoption of self-driving vehicles can occur, navigation safety and reliability must improve significantly. WaveSense’s technology radically improves the safety of self-driving vehicles in all conditions and provides the confidence and reliability our sector must demonstrate in order to earn the public’s trust.”

The technology was originally developed for the US military at the MIT Lincoln Laboratory and deployed for the first time in Afghanistan in 2013, where it was used to help military vehicles avoid landmines. WaveSense has since managed to adapt the technology for the private sector by reducing the size of the hardware. It now consists of a box of electronics that measures approximately 150 x 60 x 7 centimetres. However, the company hopes it will soon be able to reduce the size even further and cut the price down to $100 per unit.

Waymo’s self-driving cars rely on machine learning to navigate in snowy weather

Waymo, widely considered the leader in self-driving technology, demonstrated during a recent Google I/O keynote how it plans to use artificial intelligence and machine learning technologies to help its self-driving cars navigate safely in snowy conditions. Waymo’s cars will use machine learning to filter out the snow and see exactly what’s ahead of them, even if there are other vehicles parked by the side of the road. While the video presentation doesn’t make it clear whether the cars will be able to actually see the lanes, it should be more than enough to help them avoid any crashes and get their passengers safely to their destinations.

“We aim to bring self-driving technology to everyone, everywhere… and in all weather,” writes Dmitri Dolgov, Waymo’s CTO and VP of engineering. “Driving in heavy rain or snow can be a tough task for self-driving cars and people alike, in part because visibility is limited. Raindrops and snowflakes can create a lot of noise in sensor data for a self-driving car. Machine learning helps us filter out that noise and correctly identify pedestrians, vehicles and more.”
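
Waymo’s filtering is done with learned models, but the intuition behind separating precipitation from real obstacles can be sketched with a simple hand-written heuristic: solid objects produce dense clusters of sensor returns, while snowflakes and raindrops tend to appear as isolated points. The radius-based outlier filter below is an illustrative stand-in, not Waymo’s method, and all data is invented.

```python
import math

def filter_noise(points, radius=1.0, min_neighbors=2):
    """Drop isolated returns (likely snowflakes or raindrops): keep a
    2-D point only if at least `min_neighbors` other points lie within
    `radius` of it. Dense clusters — actual objects — survive."""
    kept = []
    for i, (x, y) in enumerate(points):
        neighbors = sum(
            1 for j, (px, py) in enumerate(points)
            if j != i and math.hypot(px - x, py - y) <= radius
        )
        if neighbors >= min_neighbors:
            kept.append((x, y))
    return kept

# A tight cluster of returns (say, a parked car) plus two stray
# "snowflake" returns far from anything else.
cloud = [(5.0, 5.0), (5.2, 5.1), (5.1, 4.9), (1.0, 9.0), (8.0, 2.0)]
print(filter_noise(cloud))  # the two isolated points are removed
```

A learned filter generalises this idea: instead of a fixed radius and neighbour count, the model learns from labelled data what pattern of returns precipitation produces versus what a pedestrian or vehicle produces.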

MIT’s new sensor could help autonomous vehicles see through fog and dust

MIT researchers have developed a chip that can use signals at sub-terahertz wavelengths to detect objects and help self-driving cars navigate through fog and dust. While infrared-based lidar systems tend to struggle in such conditions, the sub-terahertz wavelengths, which are between microwave and infrared radiation on the electromagnetic spectrum, can easily be detected in fog and dust clouds. The system works by sending an initial signal through a transmitter and then measuring the absorption and reflection of the rebounding sub-terahertz wavelengths with a receiver. The signal is then sent to a processor, which recreates the image of the object.
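
The chip’s actual signal processing is far more involved, but the ranging principle it shares with lidar and radar — time of flight — reduces to one line of arithmetic: the emitted pulse travels to the object and back at the speed of light, so the round trip covers twice the distance. A minimal sketch:

```python
C = 299_792_458  # speed of light in a vacuum, metres per second

def distance_from_round_trip(round_trip_seconds):
    """A ranging sensor emits a pulse and times the echo; the pulse
    covers twice the sensor-to-object distance, so halve the total
    path length to get the range."""
    return C * round_trip_seconds / 2

# An echo arriving 200 nanoseconds after emission implies an object
# roughly 30 metres ahead.
d = distance_from_round_trip(200e-9)
print(round(d, 1))  # → 30.0
```

What changes between lidar and the sub-terahertz system is not this arithmetic but the wavelength of the pulse — and sub-terahertz wavelengths happen to pass through fog and dust that scatter lidar’s infrared light.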

“A big motivation for this work is having better ‘electric eyes’ for autonomous vehicles and drones,” says co-author Ruonan Han, an associate professor of electrical engineering and computer science. “Our low-cost, on-chip sub-terahertz sensors will play a complementary role to LiDAR for when the environment is rough.” The main reason why nobody has tried to put sub-terahertz sensors in a car before is that the equipment needed to produce a strong enough signal used to be rather large and expensive. However, the MIT researchers have managed to develop a sensor small enough to fit onto a chip, yet sensitive enough to provide meaningful information even if there’s significant signal noise. In addition to calculating the distance of the object, the output signal can also be used to create a high-resolution image of the scene, which will be of crucial importance for autonomous vehicles.

The Gacha shuttle bus uses AI and smart tech to beat bad weather

The Finland-based vehicle automation systems company Sensible 4 recently joined forces with the Japanese retailer MUJI to build the Gacha, the first autonomous shuttle bus in the world capable of driving in all weather conditions. This unique partnership will see Sensible 4 providing the technology needed for navigation, positioning, and obstacle detection, while the design of the vehicle will be handled by MUJI. The Gacha will be equipped with four lidar emitters, eight radar emitters, high-accuracy GPS, and 360-degree camera vision, which will ensure it can drive autonomously throughout the year, regardless of the weather conditions. It has a capacity of ten seats and six standing places, and a maximum speed of 40 kilometres per hour.

Furthermore, it features four-wheel drive, an electric powertrain, fast charging, and a maximum range of 96 kilometres. The Gacha has a somewhat unusual, slightly rounded shape, with no distinguishable front or back. There is also an LED light belt that goes all around the bus, while the seating inside follows the vehicle’s contours. “We are developing these vehicles so that they can become part of the daily transportation service chain. Autonomous vehicles can’t become mainstream until their technology has been ensured to work in all climates,” says Harri Santamala, the CEO of Sensible 4. The test driving of the Gacha is scheduled to begin in the first half of 2019 in three Finnish cities, starting with Helsinki. The first Gacha fleet is expected to roll out in 2020, and the two companies hope that the service will be available to the public by 2021.

Self-driving cars have come a long way in recent years, but none of them have been able to achieve full autonomy yet. One of the biggest remaining obstacles standing in the way of achieving this goal is the weather, which presents a number of issues for the existing technologies self-driving cars rely on. Cameras and lidar tend to struggle in fog, heavy rain, and snow, radars have difficulties distinguishing between different objects, and GPS connections are often slow and spotty. Thankfully, the technology that can address these problems is already being developed, with a number of companies coming up with innovative solutions that will accelerate the arrival of fully autonomous vehicles. At this point, it seems only a matter of when, not if, that dream will become a reality and fully autonomous vehicles start driving down our roads, revolutionising our transportation system.

This article first appeared on the Richard van Hooijdonk blog at https://www.richardvanhooijdonk.com/en/blog/sensor-fusion-next-gen-autonomous-vehicles-could-function-in-all-weather-conditions/

Source: Artificial Intelligence on Medium