The Sounds of Safer Driving

Have you listened to your car lately? In-vehicle sound technology designed at Michigan Technological University could keep your eyes on the road and your emotions in check.

The Talking Heads once warned that you might find yourself behind the wheel of a large automobile asking, “How did I get here?” In the not-too-distant future, listening to your car may help you prevent such a predicament.

Research led by Myounghoon “Philart” Jeon, an associate professor of cognitive and learning sciences and computer science at Michigan Tech, focuses on a new generation of sound technology in cars. The results of his research, conducted in the Mind Music Machine (Tri-M) Lab, could help drivers pay better attention, regulate their emotions while driving and drive their cars in a more eco-friendly manner.

Jeon refers to this new generation as “non-traditional” sound technology that builds on and goes beyond the beeps and bells vehicles emit while backing up or when front-seat passengers unbuckle their seatbelts while the car is in motion.

“In a perfect world,” Jeon says, “people would be able to completely focus on driving. Because that’s often not the case, we’re focused on how to make it safer.”

Knight Rider for the 21st Century

Children of the 1980s are no strangers to the concept of a talking car that can calm its driver down – they watched a Pontiac Trans Am do just that on the TV show “Knight Rider.” Thanks to Tri-M research, the Knight Rider scenario is no longer science fiction. 

Jeon and his team are developing in-vehicle speech interaction technology that can engage the driver in what’s known as attention deployment. A sensor-equipped car could detect the driver’s heart rate, facial expressions, breathing patterns and other relevant data to gauge that person’s emotional state. If the driver shifts into a “non-optimal” state, the car would chat with the driver to refocus attention on the road, not on emotion.

For example, if a car detected anger from its driver, it could inform the driver that an intersection is approaching and say how many accidents take place at that intersection. Such speech interactions help shift the driver’s attention away from whatever is causing the anger and deploy it elsewhere – on the road. Using music is another possibility, but Jeon and his team have yet to find consistent data on what type of music reliably alters mood and how often the driver must listen to it.
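To make the idea concrete, here is a minimal sketch of how such a system might behave. The sensor fields, thresholds, and prompt wording are illustrative assumptions, not the Tri-M Lab’s actual models or data.

```python
# A minimal sketch of the attention-deployment idea described above.
# All sensor names, thresholds, and prompts are hypothetical placeholders.

from dataclasses import dataclass


@dataclass
class DriverState:
    heart_rate_bpm: float      # from a hypothetical wearable or seat sensor
    breathing_rate_bpm: float  # breaths per minute
    facial_anger_score: float  # 0.0-1.0 from a hypothetical expression classifier


def is_non_optimal(state: DriverState) -> bool:
    """Crude threshold rule standing in for a real affect-recognition model."""
    return (state.heart_rate_bpm > 100
            or state.breathing_rate_bpm > 22
            or state.facial_anger_score > 0.6)


def attention_prompt(upcoming_hazard: str) -> str:
    """Redirect attention to the road rather than addressing the emotion itself."""
    return (f"Heads up: {upcoming_hazard} ahead. "
            "This stretch sees frequent collisions, so let's stay focused.")


if __name__ == "__main__":
    state = DriverState(heart_rate_bpm=112, breathing_rate_bpm=25, facial_anger_score=0.7)
    if is_non_optimal(state):
        print(attention_prompt("a busy intersection"))
```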

Sonify More, Emit Less

To sonify means to make data audible, and sonification research focuses on using non-speech sounds – often music – to convey information. In the Tri-M Lab, Jeon and his researchers are exploring the ways sonification technology can improve driving. And it goes beyond emotion regulation.

To figure out how sound can be used in a car to alter the physical way in which a person drives, Jeon's team asks research subjects to operate a driving simulator under a variety of circumstances. Jeon then analyzes the data collected during the simulations to see whether he can map driving parameters onto musical parameters. The end result could help people operate their vehicles in a more fuel-efficient, eco-friendly manner.

For instance, when a driver stays within fuel-efficiency guidelines, she would hear harmonious sounds or music. Once the driver moves out of a fuel-efficient state, the car would alter the tone and pace of the sounds or music to make it less harmonious and more unpleasant.
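As a rough illustration of the kind of mapping involved, the sketch below pairs a fuel-efficiency score with a tempo and a musical interval. The score, thresholds, and specific musical choices are assumptions for illustration, not the lab’s actual parameterization.

```python
# A minimal sketch of a driving-to-music mapping: efficient driving sounds
# calm and consonant, inefficient driving sounds faster and more dissonant.


def sonification_params(efficiency: float) -> dict:
    """Map a 0.0-1.0 fuel-efficiency score to illustrative musical parameters."""
    efficiency = max(0.0, min(1.0, efficiency))
    tempo_bpm = 80 + (1.0 - efficiency) * 60  # 80 bpm (calm) up to 140 bpm (urgent)
    # Interval above the root note, in semitones: a perfect fifth (7) is consonant,
    # a tritone (6) is tense, a minor second (1) is harsh.
    interval_semitones = 7 if efficiency > 0.7 else (6 if efficiency > 0.4 else 1)
    return {"tempo_bpm": round(tempo_bpm), "interval_semitones": interval_semitones}


if __name__ == "__main__":
    for score in (0.9, 0.5, 0.1):
        print(score, sonification_params(score))
```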

Video: Listen to Your Drive Demo 1

Sonic Branding

Imagine the sound an Apple Mac makes as it turns on, or the short melodies that play when Microsoft Windows starts up or shuts down. These are sonic brands. Currently, in the automotive industry, sonic brands are limited to the sounds emitted when a vehicle is turned on. Both Porsche and Harley-Davidson, for instance, have sonic brands.

Though Jeon has vast experience in sonic branding – he used to do auditory design for companies such as LG and Samsung – he is looking to expand upon the idea to create branded sounds for electric vehicles.

Electric vehicles, of course, make almost no engine noise, and this has proven to be a problem for pedestrians and people who are blind. State and federal lawmakers have repeatedly called for electric cars to make some kind of noise for the benefit of bystanders, and the National Highway Traffic Safety Administration recently responded by setting sound requirements for all new hybrid and electric light-duty vehicles.

Jeon wants not only to design sounds – or auditory icons – for quiet vehicles; he also wants to determine the best ways to transmit them. He’s currently considering whether the car could send its signal through a bone-conduction phone (to communicate personally with one person instead of startling a large group) and whether vehicles could communicate with each other (so that an approaching train can alert a nearby car).

The Mind Music Machine Lab crosses computer science and cognitive science to better understand how to help people drive better.

His team is also researching the best ways for cars to communicate this type of information to their drivers. For example, if a driver is changing lanes and doesn’t see a car on the left due to a blind spot, what would be the best way for the vehicle to warn its driver? Graduate-level research at Tri-M Lab has shown that spatiality overrides semantics – where the sound is coming from is more important than what the warning means. Thus, a driver would react more quickly to a noise coming from the rear, left-hand side of a car than to a voice saying, “Car on the left.”

“Location-based information is processed more quickly and more strongly,” Jeon explains, adding that he hopes the findings will result in car speakers that simulate the direction from which a vehicle or other danger is approaching.
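A simple way to picture such location-based warnings is stereo panning: the warning sound is weighted toward the speaker nearest the hazard. The sketch below assumes a plain left/right speaker pair and a hypothetical bearing input; a production system would use more channels and real sensor data.

```python
# A minimal sketch of a location-based warning using constant-power stereo panning.

import math


def pan_gains(hazard_bearing_deg: float) -> tuple[float, float]:
    """Return (left_gain, right_gain) for a hazard bearing.

    0 degrees = directly ahead, -90 = hard left, +90 = hard right.
    A car approaching on the rear-left drives mostly the left speaker.
    """
    # Map bearing to a pan position in [-1, 1], then to constant-power gains.
    pan = max(-1.0, min(1.0, hazard_bearing_deg / 90.0))
    angle = (pan + 1.0) * math.pi / 4.0  # 0 (hard left) .. pi/2 (hard right)
    return math.cos(angle), math.sin(angle)


if __name__ == "__main__":
    left, right = pan_gains(-70.0)  # hazard approaching on the rear-left
    print(f"left={left:.2f}, right={right:.2f}")
```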

Jeon and his students presented the current results of their research and provided interactive tutorials late last month at the International Conference on Automotive User Interfaces and Interactive Vehicular Applications, held at the University of Michigan in Ann Arbor.

“It was the start of our evangelism for non-traditional sound technology,” Jeon says with a laugh. If his mission is successful, operating a motor vehicle will soon be a safer, more harmonious experience.

Michigan Technological University is a public research university founded in 1885 in Houghton, Michigan, and is home to more than 7,000 students from 55 countries around the world. Consistently ranked among the best universities in the country for return on investment, Michigan’s flagship technological university offers more than 120 undergraduate and graduate degree programs in science and technology, engineering, computing, forestry, business and economics, health professions, humanities, mathematics, social sciences, and the arts. The rural campus is situated just miles from Lake Superior in Michigan's Upper Peninsula, offering year-round opportunities for outdoor adventure.
