Emerging tech for autonomous vehicles: Thermal stereo sensing
Over the past five years, there has been a series of high-profile cases of vehicles misinterpreting the driving environment while using Advanced Driver-Assistance Systems (ADAS). Most recently, in July, algorithms inside Tesla's Full Self-Driving (FSD) system initiated slowdowns for a yellowish moon because sensors were mistakenly interpreting the distant orb as a traffic signal. Another blockbuster-yet-slightly-humorous story came in March 2021, when a second Tesla vehicle crashed within a week, this time into a parked police vehicle. And sometimes, as in the cases of Tesla vehicles in Delray Beach (2019) and Williston (2016), the vision system's failure to detect a semi-trailer resulted in two separate, decidedly unfunny crashes that killed both drivers.

In fact, in response to multiple such cases, the National Highway Traffic Safety Administration (NHTSA) reported in March that it was launching a Special Crash Investigation. And, yes, all of these incidents are rare, and Tesla gets more press than other companies involved in similar events (e.g., Uber's 2018 fatal crash in Tempe), but these news stories quietly highlight that the standard fidelity of vision-based systems (e.g., cameras) might warrant additional improvements to meet the Safety of the Intended Functionality (SOTIF) and support the predicted self-driving market of $173B by 2023.