Sensor Fusion in Autonomous Vehicles


Depth Perception for Autonomous Vehicles

Along with computer vision and other components like localization, path planning, and control, sensor fusion plays a key role in how a self-driving car can actually see the road.

How an AV really sees the road

Computer Vision and Sensor Fusion work in conjunction. Computer vision lets the vehicle see directly what's there, such as an object in its path. However, a camera alone can't perceive depth, so the car won't know when to slow down or speed up; that's where sensor fusion comes in. Range sensors can detect how close an object is, and once that proximity is known, it is sent to the car's computer, which then knows exactly what to do.

Sensor locations on an AV

Types of Sensors

There are four main types of sensors used in self-driving cars. They all serve different purposes, and all of them connect to the car's main computer, which is normally located in the trunk.

Radar Sensors (Radio Detection and Ranging)

Radar sensors emit radio waves to detect objects in their vicinity. Depending on the unit, that reach spans from tens of metres for short-range radar to roughly 200 metres or more for long-range radar, so they can detect objects from quite a distance. We are currently on the brink of Level 3 autonomy, and radar sensors already warn us of a car in our blind spot or one approaching from behind, which has helped avoid many collisions. They give better results on moving objects than on static ones.
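Part of why radar does better on moving objects is the Doppler effect: the reflected wave comes back frequency-shifted in proportion to the object's speed toward or away from the radar. A minimal sketch of that relation (the carrier frequency and shift below are illustrative values, not from any particular radar):

    # Doppler relation for radar: a reflector at radial speed v shifts
    # the returned frequency by df = 2*v*f0/c (two-way propagation).
    SPEED_OF_LIGHT = 299_792_458.0  # m/s

    def radial_speed(doppler_shift_hz: float, carrier_hz: float) -> float:
        return doppler_shift_hz * SPEED_OF_LIGHT / (2.0 * carrier_hz)

    # A 77 GHz automotive radar seeing a ~5.1 kHz shift: target at ~10 m/s.
    print(radial_speed(5.1e3, 77e9))  # ≈ 9.9 m/s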

LiDAR Sensors (Light Detection and Ranging)

LiDAR sensors use infrared laser light to determine the distance to an object. A rotating unit (yes, those big bulky spinny things) sends out pulses and measures the time each pulse takes to return to the sensor. A LiDAR sensor can generate about 2 million points per second, so the distance between the vehicle and surrounding objects is constantly updated.
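As a rough sketch of that time-of-flight principle (illustrative numbers, not any particular LiDAR's interface):

    # Time of flight: a pulse travels to the object and back, so the
    # one-way distance is half the round trip at the speed of light.
    SPEED_OF_LIGHT = 299_792_458.0  # m/s

    def lidar_distance(round_trip_seconds: float) -> float:
        return SPEED_OF_LIGHT * round_trip_seconds / 2.0

    # A return after ~133 nanoseconds means an object about 20 m away.
    print(lidar_distance(133e-9))  # ≈ 19.9 m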

Ultrasonic Sensors

Ultrasonic sensors are used to estimate the position of stationary obstacles at very close range, such as other vehicles when parking. They are the cheapest of all the sensors, which fits with their range of only about 3 metres.

Odometric Sensors

These sensors make it possible to estimate the speed of the vehicle by measuring the displacement of its wheels.
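As a toy illustration of the idea (the wheel diameter and encoder resolution below are assumed values, not from a real vehicle):

    import math

    WHEEL_DIAMETER_M = 0.65  # assumed wheel diameter
    TICKS_PER_REV = 1024     # assumed encoder ticks per wheel revolution

    def wheel_speed(ticks: int, dt_seconds: float) -> float:
        revolutions = ticks / TICKS_PER_REV
        distance_m = revolutions * math.pi * WHEEL_DIAMETER_M
        return distance_m / dt_seconds  # metres per second

    # 5120 ticks in one second = 5 revolutions, about 10.2 m/s (~37 km/h).
    print(wheel_speed(5120, 1.0))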

Graph summarizing the four sensor types

Equations & Kalman Filter

The sensors collect a whole bunch of data to send to the car's computer, but the computer won't know what this data means. Think of it like reading another language: if you don't know that language, it's nearly impossible to understand what's being said, so you need a translator (where's Google Translate when you need it 😂). Well, a Kalman filter is that translator for sensors.

Kalman Filter

Essentially, there are two equations that are used for almost all sensor fusion algorithms: the predict and update equations. Before you can understand the equations, you need to be able to understand what a Kalman Filter is.

A Kalman filter is an algorithm used to merge data from different sources, such as a camera's measurements and a radar's measurements.

A Kalman filter can be used for data fusion to estimate the state of a moving system (like a car) in the present, the past, or the future (prediction). The measurements produced by an autonomous vehicle's sensors are sometimes incomplete and noisy. This inaccuracy (known as noise) is a prominent problem, and Kalman filters can mitigate it.

A Kalman filter is used to estimate the state of a system, x. This vector is composed of a position p and a velocity v.

At every time step, we associate the state x with an uncertainty P (a covariance) when solving for it.
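In code form, that state and its uncertainty might look like this (a minimal numpy sketch; the numbers are illustrative):

    import numpy as np

    # State x = [position p, velocity v]
    x = np.array([[0.0],   # p, in metres
                  [5.0]])  # v, in metres per second

    # Covariance P: how uncertain we are about each component
    P = np.array([[10.0, 0.0],
                  [0.0,  4.0]])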

Using a Kalman filter gives you a precise estimate of how many metres away a pedestrian is by filtering out the noise of the two sensors. Kalman filters make an autonomous vehicle's driving safer for everyone.

Now the Math…

Prediction


Our prediction consists of estimating a state x’ and an uncertainty P’ at time t from the previous states x and P at time t-1.

Prediction formulas:

  x' = F·x + ν
  P' = F·P·Fᵀ + Q

  • F: Transition matrix from t−1 to t
  • ν: Added noise (assumed zero-mean)
  • Q: Covariance matrix of the process noise
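A minimal numpy sketch of this predict step, assuming a constant-velocity motion model (all values here are illustrative):

    import numpy as np

    dt = 0.1  # time step from t-1 to t, in seconds

    # F: constant-velocity transition matrix (p' = p + v*dt, v' = v)
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])
    # Q: process-noise covariance (assumed small)
    Q = np.eye(2) * 0.01

    def predict(x, P):
        x_pred = F @ x            # x' = F·x (the noise ν has zero mean)
        P_pred = F @ P @ F.T + Q  # P' = F·P·Fᵀ + Q
        return x_pred, P_pred

    x, P = predict(np.array([[0.0], [5.0]]), np.diag([10.0, 4.0]))
    print(x.ravel())  # position advanced by v*dt = 0.5 m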

Update

The update phase consists of using a measurement z from a sensor to correct our prediction, yielding more exact values of x and P once the sensor noise is accounted for.

Update formulas:

  y = z − H·x'
  S = H·P'·Hᵀ + R
  K = P'·Hᵀ·S⁻¹
  x = x' + K·y
  P = (I − K·H)·P'

  • y: Difference between the actual measurement and the prediction, i.e. the error.
  • S: Estimated system error.
  • H: Matrix mapping our state space into the sensor's measurement space.
  • R: Covariance matrix of the sensor noise (given by the sensor manufacturer).
  • K: Kalman gain, a coefficient between 0 and 1 reflecting how much the prediction needs to be corrected.
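And a matching numpy sketch of the update step, assuming a sensor that measures position only (again, all values are illustrative):

    import numpy as np

    H = np.array([[1.0, 0.0]])  # sensor observes position, not velocity
    R = np.array([[1.0]])       # sensor-noise covariance, from the datasheet

    def update(x, P, z):
        y = z - H @ x                         # error between measurement and prediction
        S = H @ P @ H.T + R                   # estimated system error
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
        x_new = x + K @ y                     # corrected state
        P_new = (np.eye(len(x)) - K @ H) @ P  # corrected uncertainty
        return x_new, P_new

    # A range reading of 0.8 m pulls the predicted position toward it.
    x, P = update(np.array([[0.5], [5.0]]), np.diag([10.0, 4.0]), np.array([[0.8]]))
    print(x.ravel())  # ≈ [0.77, 5.0]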

Conclusion

Sensor fusion is a key component in the success of self-driving cars. Without it, a vehicle could see where it is going but couldn't judge depth, tell how fast it is going, or know when to stop for pedestrians, cars, or cyclists.

Key Takeaways

  • Computer vision and sensor fusion work in conjunction: computer vision lets the vehicle see what's there, and sensor fusion adds depth and distance.
  • Radar sensors emit radio waves to detect objects in their vicinity.
  • LiDAR sensors use infrared laser light to determine the distance to an object.
  • Ultrasonic sensors estimate the position of stationary obstacles at close range, for example when parking.
  • Odometric sensors make it possible to estimate the speed of the vehicle from its wheels' displacement.

If you liked the article feel free to clap and message me on LinkedIn!
