We’ve probably all heard that we should never get behind the wheel when we’re emotionally upset or angry over something that happened at work, at school, or while running errands. As it turns out, there’s a lot of science behind that recommendation, and researchers are hard at work on it right now. The École Polytechnique Fédérale de Lausanne (EPFL) is studying how embedded cameras can identify drivers’ emotions by filming their faces.
The study hasn’t been without its ups and downs. For starters, there’s the issue of being non-invasive: technology exists that can measure facial expressions, but how and where do you put it in a car so that it doesn’t become a distraction? Fortunately, EPFL researchers found a solution. Working with PSA Peugeot Citroën, EPFL’s Signal Processing 5 Laboratory adapted a facial detection device for in-car use: a small infrared camera placed behind the steering wheel.
So much for placement, but how, exactly, does the device work, and which emotions does it recognize? Interestingly, the seven so-called universal emotions (anger, disgust, fear, joy, sadness, surprise, and contempt), already useful in the development of video games, medicine, and marketing, may now hold promise for driver safety as well.
Another challenge EPFL researchers ran into was getting the device to recognize irritation on a driver’s face. As it turns out, each person expresses irritation somewhat differently: some people develop a tic, others utter an epithet of some sort, and still others keep an impassive expression. To simplify the task at hand, the researchers narrowed their focus to just two of the seven universal emotions: anger and disgust.
In two test phases, the system “learned” to identify the two emotions from a series of photos of people expressing them. The same study participants were then filmed in an office setting and in a car made available for the project.
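The learn-then-identify idea can be sketched in a few lines. This is not EPFL’s actual pipeline; it is a minimal nearest-centroid classifier over synthetic feature vectors that stand in for measurements extracted from facial photos. The labels, feature values, and two-dimensional features are all illustrative assumptions.

```python
# Minimal sketch (not EPFL's published method): "learn" two emotion
# classes from labeled feature vectors, then label new samples by
# nearest centroid. All data below is synthetic.
from math import dist

def train(samples):
    """samples: dict mapping label -> list of feature vectors.
    Returns one centroid (mean vector) per emotion label."""
    centroids = {}
    for label, vectors in samples.items():
        n = len(vectors)
        centroids[label] = tuple(
            sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))
        )
    return centroids

def classify(centroids, vector):
    """Assign the label of the closest centroid."""
    return min(centroids, key=lambda label: dist(centroids[label], vector))

# Phase 1: learn from posed photos (here, made-up 2-D features).
training = {
    "anger":   [(0.9, 0.1), (0.8, 0.2), (0.95, 0.15)],
    "disgust": [(0.2, 0.9), (0.1, 0.8), (0.15, 0.95)],
}
centroids = train(training)

# Phase 2: label a new frame captured by the in-car camera.
print(classify(centroids, (0.85, 0.12)))  # anger
```

A real system would extract far richer features (facial landmarks, texture descriptors) and use a stronger classifier, but the two-phase structure is the same: fit on posed examples, then evaluate on new footage.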
Researchers say the project worked well: the device detected irritation accurately most of the time. Where it failed, the cause was usually how much this emotional state varies from one person to the next. Further research is planned to let the system update in real time and complement its static training database, which may one day yield a human-machine interface or a more advanced facial-monitoring algorithm.
While an emotion detector is probably a few years away as an option on a new car, driver emotion detection is just one line of research aimed at improving driver safety. The study also made use of a fatigue detector that measures the percentage of eyelid closure over time. EPFL is also working on detecting driver distraction and on lip reading to support voice recognition.
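The eyelid-closure measure mentioned above can be illustrated with a short sketch. The idea is to report the fraction of camera frames in a window during which the eyes are mostly closed; the 0.8 closure threshold, 30 fps frame rate, and window length below are illustrative assumptions, not the study’s published parameters.

```python
# Hedged sketch of a percentage-of-eyelid-closure fatigue measure:
# the fraction of frames in a window where the eyelids are mostly
# closed. Threshold and window size are illustrative, not EPFL's.

def eyelid_closure_pct(closure_ratios, threshold=0.8):
    """closure_ratios: per-frame eyelid closure in [0, 1]
    (0 = fully open, 1 = fully closed).
    Returns the fraction of frames at or above the threshold."""
    closed = sum(1 for c in closure_ratios if c >= threshold)
    return closed / len(closure_ratios)

# One second of simulated 30 fps camera frames containing a long blink.
frames = [0.1] * 20 + [0.9] * 6 + [0.2] * 4
print(eyelid_closure_pct(frames))  # 0.2
```

A sustained rise in this value over a sliding window is what would flag drowsiness; brief ordinary blinks barely move it.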
For now, though, it’s probably best to take a look in the mirror before getting behind the wheel. If you’re angry, disgusted, or feeling any strong emotion, take a walk, clear your head, and calm yourself before putting the car in gear.