Modern vehicles are equipped with numerous driver assistance systems that rely on sensors such as cameras, lidar and radar to help with parking or lane keeping. These sensors act as the car's eyes, identifying relevant objects in the environment. What cars still lack, however, is a sense of hearing. Yet to monitor the traffic situation closely in all directions, the ability to perceive and categorize external sounds is essential, because many events in road traffic, such as an approaching emergency vehicle with its siren, announce themselves acoustically.
Essential for autonomous driving: Acoustic event detection
Fraunhofer IDMT in Oldenburg develops AI-based technologies for acoustic event detection, as well as methods and concepts for optimal signal capture on the vehicle. Applied in practice, these developments can enable faster reaction times, smarter decision-making support for drivers and predictive maintenance intervals. The »car's ears« become even more important in autonomous vehicles, which are expected not only to see at least as well as humans, but also to hear.
Danilo Hollosi is our automotive expert and heads the »Hearing Car« team. He explains: »Despite their high application potential, no systems yet exist that allow autonomous vehicles to perceive external sounds. Such a system could signal within a fraction of a second that a vehicle is approaching with its siren on. The autonomous vehicle then knows that it must move aside to make way for the emergency vehicle.«
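To make the idea of acoustic event detection concrete, the sketch below shows one very simple heuristic for flagging a siren-like sound: checking whether the spectral energy of short audio frames is concentrated in a typical siren frequency band. This is purely illustrative and is not Fraunhofer IDMT's actual method, which relies on trained AI models; the band limits and thresholds here are assumptions chosen for the example.

```python
import numpy as np

def detect_siren(signal, sr, band=(500.0, 1800.0), energy_ratio=0.6):
    """Illustrative heuristic: flag a signal as siren-like if most short
    frames concentrate their spectral energy inside a typical siren band.

    Real acoustic event detection systems use trained classifiers on
    learned features; this band-energy check is only a toy stand-in.
    """
    frame_len, hop = 2048, 1024
    window = np.hanning(frame_len)
    freqs = np.fft.rfftfreq(frame_len, 1.0 / sr)
    in_band_mask = (freqs >= band[0]) & (freqs <= band[1])

    hits, frames = 0, 0
    for start in range(0, len(signal) - frame_len, hop):
        spectrum = np.abs(np.fft.rfft(signal[start:start + frame_len] * window)) ** 2
        ratio = spectrum[in_band_mask].sum() / (spectrum.sum() + 1e-12)
        frames += 1
        if ratio > energy_ratio:
            hits += 1
    # Declare a siren if the band dominates in the majority of frames.
    return frames > 0 and hits / frames > 0.5

# Simulate a siren: a tone whose frequency sweeps between ~700 and ~1500 Hz.
sr = 16000
t = np.arange(0, 2.0, 1.0 / sr)
inst_freq = 700 + 400 * (1 + np.sin(2 * np.pi * 0.5 * t))
phase = 2 * np.pi * np.cumsum(inst_freq) / sr   # integrate frequency to get phase
sweep = np.sin(phase)
noise = np.random.default_rng(0).normal(scale=0.05, size=t.size)

print(detect_siren(sweep + noise, sr))  # siren present → True
print(detect_siren(noise, sr))          # only background noise → False
```

A production system would of course have to distinguish many overlapping sound sources in real traffic noise, which is precisely why learned models rather than fixed frequency rules are used.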