World’s first real-time human emotion recognition technology unveiled
April 9, 2024
Professor Jiyun Kim and his team from the Department of Materials Science and Engineering at Ulsan National Institute of Science and Technology (UNIST) have developed a groundbreaking technology that can identify human emotions in real time. This innovation is expected to transform a range of industries and enable next-generation wearable systems that provide emotion-based services.
Understanding and accurately extracting emotional information has long been a challenge because human affects such as emotions, moods, and feelings are abstract and ambiguous. To address this, the research team developed a multimodal human emotion recognition system that combines verbal and nonverbal cues to make use of the full range of emotional information.
Innovations in wearable technologies
At the heart of this system is the personalized skin-integrated facial interface (PSiFI), which is self-powered, lightweight, stretchable, and transparent. It is equipped with a first-of-its-kind bidirectional triboelectric strain and vibration sensor that allows simultaneous detection and integration of verbal and non-verbal expression data. The system is fully integrated with a data-processing circuit for wireless data transfer, enabling real-time emotion recognition.
Using machine learning algorithms, the technology performs accurate, real-time human emotion recognition even when the wearer is masked. The system has also been successfully applied in a digital concierge application in a virtual reality (VR) environment.
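The article does not detail the team’s actual models, but as a rough illustration of how such multimodal signals could be classified, the sketch below extracts simple statistics from a strain (facial-muscle) window and a vibration (voice) window and feeds them to a standard classifier. The feature set, emotion labels, and classifier choice are assumptions for illustration, not the published pipeline.

```python
# Hypothetical sketch of multimodal emotion classification from two
# triboelectric channels: "strain" (facial muscle deformation) and
# "vibration" (vocal-fold vibration). Features and classifier are
# illustrative assumptions, not the UNIST team's actual pipeline.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def window_features(strain, vibration):
    """Summarize one time window of the two channels as a feature vector."""
    feats = []
    for signal in (strain, vibration):
        feats += [signal.mean(), signal.std(),
                  signal.max() - signal.min(),       # peak-to-peak amplitude
                  np.abs(np.diff(signal)).mean()]    # mean absolute slope
    return np.array(feats)

# Toy training data: (strain_window, vibration_window, label) triples.
rng = np.random.default_rng(0)
labels = ["happy", "sad", "angry"]
windows = [(rng.normal(size=256), rng.normal(size=256), rng.choice(labels))
           for _ in range(60)]

X = np.stack([window_features(s, v) for s, v, _ in windows])
y = [lbl for _, _, lbl in windows]

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

# At run time, each new sensor window would be classified the same way:
print(clf.predict(window_features(rng.normal(size=256),
                                  rng.normal(size=256)).reshape(1, -1)))
```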
Schematic overview of the personalized skin-integrated facial interface (PSiFI) system
The technology is based on triboelectric charging, or “friction charging,” in which surfaces become positively and negatively charged as they rub against each other. Notably, the system is self-powered and requires no external power supply or complex measuring equipment for data acquisition.
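For context, a textbook first-order model for a contact-separation triboelectric sensor (not a formula from this paper) relates the open-circuit voltage to the gap between the charged layers:

V_oc(t) ≈ σ · x(t) / ε₀

where σ is the triboelectric surface charge density, x(t) is the instantaneous separation of the layers, and ε₀ is the vacuum permittivity. Facial muscle motion and vocal-fold vibration modulate x(t), so the sensor generates its own measurable signal without an external supply.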
Real-time customization and recognition
Professor Kim commented: “Based on these technologies, we developed a skin-integrated facial interface (PSiFI) system that can be customized to individuals.” The team used a semi-curing technique to make a transparent conductor for the friction electrodes. In addition, the personalized mask was created using a multi-angle shooting technique, combining flexibility, elasticity, and transparency.
The research team successfully integrated the detection of facial muscle deformation and vocal cord vibration, enabling real-time emotion recognition. The system’s capabilities were demonstrated with the “digital concierge”, a virtual reality application in which personalized services are provided according to users’ emotions.
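As a purely illustrative sketch of what real-time processing of the two channels might look like, the loop below buffers synchronized strain and vibration samples into fixed windows and classifies each completed window. The sampling rate, window length, and the read_samples()/classify() helpers are hypothetical placeholders, not the team’s implementation.

```python
# Illustrative real-time loop: collect synchronized windows of the two
# channels and classify each window as it fills. Sampling rate, window
# length, and the helper functions are hypothetical placeholders.
import numpy as np
from collections import deque

FS = 1000          # assumed sampling rate (Hz)
WINDOW = FS // 2   # 0.5 s analysis window

strain_buf, vib_buf = deque(maxlen=WINDOW), deque(maxlen=WINDOW)

def read_samples():
    """Placeholder for one wireless packet from the sensing circuit."""
    return np.random.normal(size=32), np.random.normal(size=32)

def classify(strain, vibration):
    """Placeholder for a trained multimodal classifier (see sketch above)."""
    return "neutral"

for _ in range(100):                      # would be `while True:` on-device
    s, v = read_samples()
    strain_buf.extend(s)
    vib_buf.extend(v)
    if len(strain_buf) == WINDOW:         # a full window is ready
        emotion = classify(np.array(strain_buf), np.array(vib_buf))
```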
Jin Pyo Lee, the first author of the study, said: “With this developed system, real-time emotion recognition can be achieved with only a few training steps and without complex measurement equipment. This opens up possibilities for wearable emotion recognition devices and next-generation emotion-based digital platforms.”
From left to right, Professor Jiyun Kim and Jin Pyo Lee from the UNIST Department of Materials Science and Engineering
The research team conducted real-time emotion recognition experiments by collecting multimodal data such as facial muscle strain and voice. The system demonstrated high emotion recognition accuracy with minimal training, and its wireless data transmission keeps it wearable and comfortable.
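A minimal sketch of what per-wearer “minimal training” could mean in practice, assuming a short calibration step with a few labeled windows per emotion and a nearest-centroid rule; this is not the team’s method, and all feature choices and sample counts are invented for illustration.

```python
# Hypothetical per-user calibration: a nearest-centroid rule over simple
# window statistics needs only a few labeled examples per emotion.
# Feature choices and sample counts are illustrative assumptions.
import numpy as np
from sklearn.neighbors import NearestCentroid

def simple_features(strain, vibration):
    return np.array([strain.mean(), strain.std(),
                     vibration.mean(), vibration.std()])

rng = np.random.default_rng(1)
emotions = ["happy", "sad", "angry"]
# e.g. five labeled 0.5 s windows per emotion, recorded during a short setup step
calib_X = np.stack([simple_features(rng.normal(size=256), rng.normal(size=256))
                    for _ in range(15)])
calib_y = [e for e in emotions for _ in range(5)]

personal_model = NearestCentroid().fit(calib_X, calib_y)
print(personal_model.predict(simple_features(rng.normal(size=256),
                                             rng.normal(size=256)).reshape(1, -1)))
```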
The team also applied the system in virtual reality, using it as a “digital concierge” in settings such as smart homes, private cinemas, and smart offices. Its ability to identify individual emotions in different situations enables personalized recommendations for music, movies, and books.
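As a toy illustration of the concierge idea (not the team’s VR application), a recognized emotion and the current setting can be mapped to a content suggestion:

```python
# Illustrative "digital concierge" logic: map a recognized emotion and the
# current setting to a suggestion. The categories and suggestions are
# invented for illustration; the actual VR application is not public code.
RECOMMENDATIONS = {
    ("sad", "smart home"):       "play a comforting playlist and dim the lights",
    ("happy", "private cinema"): "suggest an upbeat comedy",
    ("tired", "smart office"):   "propose a short break and calming music",
}

def concierge(emotion: str, setting: str) -> str:
    return RECOMMENDATIONS.get((emotion, setting),
                               "ask the user what they would like")

print(concierge("sad", "smart home"))
```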
Professor Kim emphasized: “For effective human-machine interaction, human-machine interface (HMI) devices must be able to collect various types of data and process complex integrated information. This research exemplifies the potential of utilizing emotions, which are complex forms of human information, in next-generation wearable systems.”