Tesla To Remove Ultrasonic Sensors, A Recipe For Disaster?

20-10-2022 | By Robin Mitchell

Recently, Tesla announced that it will be removing ultrasonic sensors on its vehicles and moving towards total reliance on its vision-based AI. Why do cars utilise different sensing technologies, why is Tesla moving towards vision-only, and will this be a disastrous move?

Why do cars utilise numerous sensing technologies?

The introduction of electronics into vehicles has allowed engineers to develop all kinds of brilliant technologies that not only improve the driving experience but also significantly improve safety for vehicle occupants and nearby pedestrians. One company that stands above the rest for its development of safety technologies is Volvo, which was the first car manufacturer to develop the three-point safety belt, booster cushions, side-impact protection, blind spot detectors, and pedestrian detection. Even though some of these developments are purely mechanical, most modern safety technologies require some kind of sensor and associated processing system.

But while humans mostly use vision for operating vehicles, modern vehicles deploy all kinds of different sensors, including RADAR, SONAR, vision, and LiDAR. The reason why engineers use multiple sensing technologies on vehicles primarily comes down to three factors: application, environment, and reliability.

With regards to application, parking sensors that beep when in close proximity to an object use ultrasonic sensors because they can accurately measure small distances. The speed of sound is slow enough that, by the time a sound pulse has reflected from a nearby object and returned, even a low-end microcontroller will have counted many thousands of timer ticks, and this, in turn, results in an accurate distance measurement. By contrast, trying to measure tens of centimetres with LiDAR is extremely challenging because light travels so fast that the time between firing a beam and receiving the reflection gives a microcontroller barely any time to count.
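The timing argument above can be sketched with some simple arithmetic. The short script below (an illustrative sketch, not production sensor code) computes the round-trip echo time for a half-metre obstacle for both sound and light, and how many ticks a hypothetical 16 MHz microcontroller timer would count in each case:

```python
# Sketch: round-trip time-of-flight for a 0.5 m obstacle,
# ultrasonic vs LiDAR. Round trip = 2 * distance / speed.

SPEED_OF_SOUND = 343.0        # m/s in air at ~20 degrees C
SPEED_OF_LIGHT = 299_792_458  # m/s in a vacuum

def round_trip_time(distance_m: float, speed_m_s: float) -> float:
    """Time for a pulse to reach the obstacle and reflect back."""
    return 2 * distance_m / speed_m_s

distance = 0.5  # half a metre, a typical parking clearance

t_sound = round_trip_time(distance, SPEED_OF_SOUND)
t_light = round_trip_time(distance, SPEED_OF_LIGHT)

print(f"Ultrasonic echo: {t_sound * 1e3:.3f} ms")  # ~2.915 ms
print(f"LiDAR echo:      {t_light * 1e9:.3f} ns")  # ~3.336 ns

# A 16 MHz timer ticks every 62.5 ns: it counts tens of thousands
# of ticks waiting for the ultrasonic echo, but the LiDAR echo
# returns in a fraction of a single tick.
TICK_S = 62.5e-9
print(f"Ticks at 16 MHz: sound={t_sound / TICK_S:.0f}, "
      f"light={t_light / TICK_S:.3f}")
```

The ultrasonic echo gives roughly 46,000 ticks to resolve, so a centimetre of distance maps to hundreds of ticks, whereas resolving the LiDAR echo at this range requires picosecond-class timing hardware.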

In the case of the environment, weather conditions can significantly affect the performance of a sensor, which can lead to potentially dangerous situations. For example, a safety mechanism that brakes the vehicle when it detects a potential collision may fail to operate if a LiDAR ranging system cannot penetrate heavy rain. While RADAR is better at penetrating rain, it has a much lower resolution, meaning it is not as good at object detection. Therefore, a system that combines the two can introduce resilience against varying weather conditions and thus improve reliability.

Finally, the simple act of having numerous sensors using different technologies allows a vehicle to be “hyper-aware” of its surroundings. It is possible that vehicles in the future will have a centralised processor that fuses the readings from all sensors and uses this information to build a picture of its surroundings (some vehicles do support centralised controls, but in many cases, sensors are independent of each other). Furthermore, the ability to combine different sensor technologies allows vehicles to make better decisions about which sensors to rely on, and to ignore data from others when conditions change.
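One common way such a centralised processor could weight its sensors is inverse-variance fusion: each sensor reports a distance plus a confidence, and sensors degraded by current conditions are assigned a larger variance so the fused estimate leans on the trustworthy ones. The sketch below is purely illustrative (the sensor names, readings, and variance values are invented for the example) but shows the idea of down-weighting LiDAR and camera readings in heavy rain:

```python
# Toy sensor-fusion sketch (illustrative values only): combine
# distance estimates with inverse-variance weighting, inflating a
# sensor's variance when conditions degrade it.

from dataclasses import dataclass

@dataclass
class SensorReading:
    name: str
    distance_m: float  # reported distance to the obstacle
    variance: float    # trust: higher variance = less trusted

def fuse(readings: list[SensorReading]) -> float:
    """Inverse-variance weighted average of the distance estimates."""
    weights = [1.0 / r.variance for r in readings]
    total = sum(weights)
    return sum(w * r.distance_m for w, r in zip(weights, readings)) / total

# Clear weather: all three sensors trusted roughly equally.
clear = [
    SensorReading("lidar", 10.2, 0.01),
    SensorReading("radar", 10.5, 0.05),
    SensorReading("camera", 10.4, 0.04),
]

# Heavy rain: LiDAR and camera variances inflated, so the fused
# estimate leans on RADAR, which penetrates rain better.
rain = [
    SensorReading("lidar", 14.0, 1.0),   # rain scatters the beam
    SensorReading("radar", 10.5, 0.05),
    SensorReading("camera", 12.0, 0.5),
]

print(f"Clear: {fuse(clear):.2f} m")
print(f"Rainy: {fuse(rain):.2f} m")  # stays close to RADAR's 10.5 m
```

Real automotive fusion stacks are far more sophisticated (Kalman filters, occupancy grids, learned models), but the principle of dynamically re-weighting sensors is the same, and it only works when there are multiple independent sensing technologies to choose between.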

Tesla to scrap all ultrasonic sensors

Recently, Elon Musk announced that Tesla will scrap all ultrasonic sensors and instead move towards vision-only vehicles. It is believed that Tesla will save around $114 per vehicle, and if Tesla’s estimate of 1 million vehicles being manufactured yearly in the future is correct, this could result in over $100m in savings. At the same time, Elon Musk noted that humans drive solely using vision, and, as such, automated vehicles should follow the same concept.

Tesla vehicles currently have 12 ultrasonic sensors placed around the edge of the body to provide localised distance measurements from obstacles. These sensors are not only used for parking assistance but also allow the vehicle to navigate autonomously around objects as well as park itself. This is not the first time that Tesla has moved away from traditional sensing technologies in favour of its vision-based AI. For example, Tesla announced in 2021 that it would phase out the use of RADAR in its products, which not only helps reduce costs but also moves the vehicle towards reliance on a single sensor system.

In place of the sensors, Tesla vehicles will rely on the 8 cameras installed around the vehicle: three forward-facing cameras behind the windscreen, one rear-view camera, two side cameras directed forwards, and two side cameras directed rearwards.

Will this move be disastrous?

While there may be some economic benefits to reducing the number of sensors, removing sensors and relying solely on a vision-based system is undoubtedly a bad engineering move. The biggest evidence for this is that humans rely solely on their eyes when driving, and numerous mistakes are made as a result, whether due to being blinded by the sun, poor weather conditions, or blind spots. It is highly unlikely that an AI will be able to process images better than a human, and it is therefore likely to make unusual decisions (such as when Tesla vehicles mistook a harvest moon for an amber light).

Considering that cameras used in vehicles don’t even come close to the human eye’s capabilities, it is also possible that vision-based vehicles will struggle with close-range distance measurements in poor lighting conditions. For example, a parking space in a multi-storey car park whose back wall is painted in a single colour will be virtually impossible to range if there are no features that stand out.

Overall, scrapping ultrasonic sensors, which have been proven beyond doubt to work well, seems an unusual move. Even if the vehicle doesn’t rely on these sensors, they offer an extremely valuable fallback that could be engaged if the onboard vision system fails. And considering that Tesla vehicles cost upwards of $50,000, saving $114 per vehicle seems a false economy, but of course, Tesla vehicles have a reputation for being poorly built.


By Robin Mitchell

Robin Mitchell is an electronic engineer who has been involved in electronics since the age of 13. After completing a BEng at the University of Warwick, Robin moved into the field of online content creation, developing articles, news pieces, and projects aimed at professionals and makers alike. Currently, Robin runs a small electronics business, MitchElectronics, which produces educational kits and resources.