Researchers say in-car computers could one day tell whether a driver is drunk just by looking at their facial features. Constantly “monitoring” drivers for typical signs of intoxication could even reduce the number of drunk-driving accidents.
The project, described in a paper presented at an Institute of Electrical and Electronics Engineers (IEEE) and Computer Vision Foundation (CVF) conference on April 9, allows automotive computing systems to assess a driver's level of intoxication as soon as they get behind the wheel, with up to 75% accuracy.
This goes beyond current computerized methods, which rely on observable behaviors such as driving style, pedal use, and vehicle speed. That data can only be collected and processed once the vehicle has been in motion for an extended period.
Instead, the new design uses a monochrome camera that tracks variables such as gaze direction and head position. The overall system could also include 3D and infrared video recordings of the driver’s face and rear-view video footage showing the driver’s posture, as well as steering wheel interactions, event logs and screen recordings of driving behavior.
“Our system has the ability to detect intoxication levels at the start of a ride, potentially keeping impaired drivers off the road,” Ensiyeh Keshtkaran, a PhD student at Edith Cowan University in Australia who worked on the project, said in a statement.
She added that the software could carry over to devices such as smartphones, and that it fits seamlessly into the digital architecture of smart vehicles alongside existing features such as eye tracking and driver monitoring systems.
The World Health Organization (WHO) estimates that alcohol impairment is a factor in 20% to 30% of fatal car accidents worldwide. In Australia, where the project originated, 30% of fatal accidents involve a driver with a blood alcohol content above the legal limit of 0.05%.
“While efforts to integrate driver alcohol detection systems into next-generation vehicles continue and autonomous cars are on the horizon, the persistent problem of drunk driving remains relevant,” Keshtkaran said.
The study used driving simulators to record video of drivers of varying ages, drinking habits and driving experience at three levels of intoxication: sober, mildly intoxicated and severely intoxicated. The team worked with the Powerfleet software company MiX to collect data on drunk drivers in a controlled but realistic environment.
The algorithm then looked for visible facial signs of intoxication in the footage and correctly predicted the driver's state three-quarters of the time. According to materials published by the Oregon Liquor and Cannabis Commission, common visual signs of intoxication include bloodshot eyes, a flushed face, drooping eyelids and a dazed appearance.
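The paper's actual model is not detailed here, but the general shape of such a system, extracting facial cues from each video frame, classifying each frame, and aggregating the per-frame predictions into a verdict for the whole recording, can be sketched roughly as below. Every feature name, threshold and label in this sketch is an illustrative assumption, not the study's method.

```python
from collections import Counter
from typing import List, Tuple

# Hypothetical per-frame features, each normalized to [0, 1]:
# (eye_openness, gaze_stability, facial_redness)
Frame = Tuple[float, float, float]

def classify_frame(frame: Frame) -> str:
    """Toy per-frame classifier: combine cues into a single score,
    then bucket it into one of three intoxication levels.
    Thresholds are invented for illustration only."""
    eye_openness, gaze_stability, redness = frame
    # Droopy eyes, unstable gaze and a flushed face all raise the score.
    score = (1 - eye_openness) + (1 - gaze_stability) + redness
    if score < 1.0:
        return "sober"
    if score < 2.0:
        return "mildly_intoxicated"
    return "severely_intoxicated"

def classify_video(frames: List[Frame]) -> str:
    """Aggregate per-frame predictions with a simple majority vote,
    so a few noisy frames do not flip the overall verdict."""
    votes = Counter(classify_frame(f) for f in frames)
    return votes.most_common(1)[0][0]
```

For example, a clip whose frames mostly show wide-open eyes and steady gaze would vote "sober", while frames with droopy eyes and a flushed face would push the vote toward "severely_intoxicated". A real system would replace the hand-set thresholds with a trained model and the tuple of cues with features extracted by a face-analysis pipeline.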
Project leader Syed Zulkarnain Gilani, a senior lecturer in Edith Cowan University's School of Science, said the next step is to determine how much image resolution the algorithm needs to make accurate predictions. “If low-resolution video proves sufficient, the technology could be used by roadside surveillance cameras,” Gilani said in a statement.
For now, though, the work is a significant step forward because it can gauge intoxication before the car even moves. It could point toward a future in which smart cars refuse to start when a drunk driver is behind the wheel, or even alert the authorities if the driver is too impaired.