Why Heavy Rain Is Not a Problem for LiDAR and Self-Driving Vehicles

17-05-2021 |   |  By Robin Mitchell

A recent study from the University of Warwick demonstrated how heavy rain can affect LiDAR in automotive vehicles, but the claim that this is a serious problem is blown out of proportion. What sensing technologies can be used in vehicles, what did the researchers discover, and why is it less of a concern than it appears?

What sensing technologies exist for automotive vehicles?

As discussed in previous articles, no self-driving system to date provides users with a truly autonomous experience. The best systems currently deployed (such as those in Tesla vehicles) provide a basic form of autonomous driving in which the driver is still required to keep their hands on the wheel and their full attention on the road (this is Level 2 autonomy).

For self-driving vehicles to function, they are required to observe their environment, recognise objects, and make decisions based on what those objects are and what they are doing. For example, a car's object-detection system could be extremely accurate, but such a system is pointless if the car brakes every time a leaf blows past.

Choosing a sensor technology for such an application can be difficult, as each technology has its own advantages and disadvantages. For example, LiDAR is a laser-ranging technology that produces a 2D image map of its surroundings in which each pixel encodes a distance measurement. LiDAR can therefore be used to determine both the distance and the outlines of objects.

RADAR is another technology that can be used to determine the distance of objects. While it cannot determine the outlines of objects accurately (depending on the wavelength used), it has an incredibly long range and penetrates fog and snow well.

Ultrasonic ranging is another technology that can be used. The short range of ultrasonic sensors makes them ideal for accurate positioning between close objects (such as when parking), but they cannot be used over distances of more than a meter.

It is clear that many technologies are capable of mapping their surrounding environment, but which one should an engineer deploy? While some (most notably Elon Musk) believe that technologies such as RADAR are ineffective, the answer is that a self-driving system should deploy all of them simultaneously. Because the technologies do not interfere with one another and each operates best in different conditions, running them together lets the vehicle catch objects that any single sensor misses. Furthermore, the combination allows the vehicle to operate in a wider range of conditions and provides backup systems that can take over should one ranging system fail.
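The redundancy argument above can be sketched in a few lines. The sensor names and readings below are hypothetical, chosen only to illustrate the idea: if the vehicle acts on the most conservative (closest) detection across all sensors, no single blinded sensor can hide an obstacle.

```python
# Hypothetical sketch: fuse independent range readings from several
# sensing technologies. A sensor that detects nothing reports None.

def fuse_ranges(readings):
    """Return the closest detected range in metres, or None if no
    sensor reports a detection.

    readings: dict mapping a sensor name ("lidar", "radar", ...) to
    its range estimate in metres, or None.
    """
    detections = [r for r in readings.values() if r is not None]
    return min(detections) if detections else None

# Heavy rain blinds the LiDAR, but RADAR still sees the object:
print(fuse_ranges({"lidar": None, "radar": 42.0, "ultrasonic": None}))  # 42.0
```

Real systems use far more sophisticated fusion (probabilistic filtering, cross-validation of tracks), but the principle is the same: a detection missed by one technology can still be caught by another.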

The University of Warwick Study Shows Diminished LiDAR Performance in Rain

Recently, the University of Warwick led a study on LiDAR and its use in self-driving systems to see how it performs in poor road conditions. The study, published in the IEEE Sensors Journal, demonstrated that LiDAR's performance drops during heavy rain, producing both false positives and false negatives.

Using their WMG 3xD simulator, the team inserted rain into a simulation using different probabilistic rain models to simulate different rain conditions. The experiment found that raindrops close to the vehicle (within 50 meters) are often detected, while raindrops beyond 50 meters are detected less frequently. However, as rain density increases, rain at greater distances is more readily detected, thereby interfering with the LiDAR output.

The cause of the interference comes from how LiDAR works. A LiDAR system fires a laser pulse towards an object of interest while simultaneously starting a timer. The pulse hits the object, bounces back, and is detected by a receiver; the known speed of light and the measured round-trip time yield the distance. According to the researchers, raindrops interfere with this process: the beam either refracts off a drop in a different direction (producing no result) or reflects off it back towards the sensor (producing a false return).
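The time-of-flight calculation described above reduces to a single line of arithmetic. The pulse timing below is an illustrative value, not a figure from the study:

```python
# Time-of-flight ranging: a laser pulse's round-trip time, multiplied
# by the speed of light and halved, gives the distance to the target.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds):
    """Distance to the target in metres (half the round trip)."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after roughly 334 ns corresponds to a target
# about 50 m away -- the distance threshold noted in the study.
print(round(tof_distance(333.6e-9), 1))  # 50.0
```

A raindrop that reflects the pulse early simply shortens the measured round trip, which is why it appears to the sensor as a nearby object.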

Should LiDAR systems focus on noise mitigation?

Before looking at why rain and LiDAR specifically are of no concern, we should first appreciate that the conclusion of the University of Warwick report is more or less “unneeded”. Engineers have to be careful when reading scientific papers and reports on new discoveries and developments; many such announcements either contribute little to the scientific community, exaggerate their claims, or are outright false. For example, researchers recently developed a wearable TEG for powering devices, yet any engineer worth their salt would look at the numbers and recognise that the device is a laboratory novelty at best.

The paper released by the University of Warwick concerns simulated environments for autonomous vehicles and their sensors. While the development of such simulation environments is applicable to the real world and potentially game-changing, the paper's conclusion that rain affects LiDAR is obvious to anyone involved in sensor design.

Furthermore, the paper states that research is needed to determine how to mitigate rain noise in LiDAR, but this is arguably a dramatic claim. A similar paper with an equally meaningless discovery might be titled “Vehicle Simulator Shows That Object Reflectivity Affects LiDAR Performance”: while few papers may cover the topic of object reflectivity, its effect is blindingly obvious to anyone in the LiDAR industry.

Simply put, future fully autonomous driving systems will deploy a multitude of sensors, and relying heavily on one technology will present major challenges. LiDAR will enable vehicles to range objects approaching from a distance, while stereoscopic cameras will enable real-time distance determination via the parallax effect as well as accurate reading of road signs. In addition, ultrasonic sensors will help such vehicles determine their position relative to neighbouring vehicles, and RADAR will help map distant objects in the worst of conditions.
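As a worked illustration of the parallax effect mentioned above (all numbers are hypothetical, chosen only to show the arithmetic):

```python
# Stereoscopic depth from parallax: for a rectified stereo camera
# pair, depth Z = f * B / d, where f is the focal length in pixels,
# B is the baseline between the cameras in metres, and d is the
# disparity (horizontal pixel shift of the point between images).

def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth in metres of a point seen by both cameras."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: point at infinity or bad match")
    return focal_px * baseline_m / disparity_px

# With a 1000 px focal length and a 0.5 m baseline, a 10 px
# disparity places the object 50 m away; nearer objects produce
# larger disparities and hence smaller depth values.
print(stereo_depth(1000.0, 0.5, 10.0))  # 50.0
```

The inverse relationship between disparity and depth is why stereo cameras complement LiDAR well at close range, where disparities are large and depth estimates are most accurate.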

Overall, engineers need not panic at the “discovery” that LiDAR is affected by rain, nor fear that the systems of the future cannot exist until this problem is solved. Designers who focus on noise mitigation may find themselves wasting effort trying to make LiDAR work in the rain when they could instead be building an autonomous vehicle that utilises multiple sensor technologies. Generally speaking, the simplest solutions are the most effective.

By Robin Mitchell

Robin Mitchell is an electronic engineer who has been involved in electronics since the age of 13. After completing a BEng at the University of Warwick, Robin moved into the field of online content creation developing articles, news pieces, and projects aimed at professionals and makers alike. Currently, Robin runs a small electronics business, MitchElectronics, which produces educational kits and resources.
