A research team led by Professor Jiyoon Kim of the Department of Materials Science and Engineering at UNIST (President Yong-Hoon Lee) has developed the world's first 'wearable human emotion recognition technology' capable of recognizing emotions in real time.

▲(From left) UNIST Professor Jiyoon Kim, first author Researcher Jinpyo Lee
Various application prospects including emotion-based customized services
A technology that can recognize human emotions in real time has been developed. It is expected to find wide application, for example in next-generation wearable systems that provide emotion-based services.
UNIST (President Yong-Hoon Lee) announced on the 29th that a research team led by Professor Jiyoon Kim in the Department of Materials Science and Engineering has developed the world's first 'wearable human emotion recognition technology' capable of recognizing emotions in real time.
The developed system is based on the triboelectric effect ('friction charging'), in which two surfaces acquire opposite positive and negative charges when they are rubbed together and separated. Because the sensor generates its own electrical signal, no external power supply or complex measuring equipment is needed for data acquisition.
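The self-powered sensing principle can be illustrated with a toy contact-separation model: in standard triboelectric nanogenerator theory, the open-circuit voltage scales with the electrode separation and the short-circuit current with its rate of change. The constants and waveform below are purely illustrative assumptions, not device parameters from the study.

```python
# Toy model of a triboelectric sensor output: periodic contact-separation
# motion (e.g. from facial muscle deformation) directly produces an
# electrical signal, so no external power source is required.
# All values are illustrative, not measured device parameters.
import numpy as np

t = np.linspace(0, 1, 500)                   # time, arbitrary units
x = 0.5 * (1 + np.sin(2 * np.pi * 3 * t))    # electrode separation, a.u.

sigma_over_eps = 1.0                         # illustrative charge-density constant
v_oc = sigma_over_eps * x                    # open-circuit voltage ~ separation
i_sc = np.gradient(x, t)                     # short-circuit current ~ d(separation)/dt

print(v_oc.shape, i_sc.shape)                # both follow the motion signal
```

In this simplified picture, larger or faster skin deformation yields a larger signal, which is what allows the deformation itself to be "read" without powering the sensor.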
Professor Jiyoon Kim said, “Based on this technology, we developed a personalized skin-integrated facial interface (PSiFI) system that can be customized for each individual.”
The research team used a semi-curing technique, which keeps the material in a soft solid state, to fabricate a highly transparent conductor that serves as the electrode of the triboelectric element. They also produced a personalized mask using a multi-angle photography technique. The resulting system is self-powered, flexible, stretchable, and transparent.
The system simultaneously detects facial muscle deformation and vocal cord vibration, and integrates the two signals to recognize emotions in real time. This information can then drive a virtual reality 'digital concierge' that provides services customized to the user's emotional state.
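The idea of combining the two sensing modalities can be sketched as a simple early-fusion step: features extracted from each signal window are concatenated into one vector before classification. The feature choice and array shapes below are illustrative assumptions, not the paper's actual pipeline.

```python
# Minimal sketch of multimodal fusion: facial-strain and vocal-vibration
# windows are summarized per channel and concatenated into one feature
# vector. Channel counts and statistics are invented for illustration.
import numpy as np

def extract_features(window: np.ndarray) -> np.ndarray:
    """Per-channel mean and standard deviation as simple features."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def fuse(strain_window: np.ndarray, vibration_window: np.ndarray) -> np.ndarray:
    """Early fusion: concatenate features from both modalities."""
    return np.concatenate([extract_features(strain_window),
                           extract_features(vibration_window)])

# Example: 100 samples x 4 strain channels, 100 samples x 1 vibration channel
strain = np.random.randn(100, 4)
vibration = np.random.randn(100, 1)
features = fuse(strain, vibration)
print(features.shape)  # (10,) -> 2*4 strain stats + 2*1 vibration stats
```

A downstream classifier then maps each fused vector to an emotion label; fusing before classification lets one model exploit correlations between the two signals.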
First author Jinpyo Lee, a postdoctoral researcher, said, “With the system we developed, real-time emotion recognition can be implemented with only a few training sessions and without complex measuring equipment,” adding, “It demonstrates the potential for application in portable emotion recognition devices and next-generation emotion-based digital platform services.”
The research team conducted real-time emotion recognition experiments with the developed system. It not only collected multimodal data such as facial muscle deformation and voice, but also used the collected data for 'transfer learning'.
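The transfer-learning step described here can be sketched as follows: a classifier pretrained on plentiful data from one user is adapted to a new user with only a handful of labeled samples, by warm-starting from the pretrained weights. The tiny softmax model and all sizes below are illustrative assumptions, not the study's actual model.

```python
# Hypothetical sketch of few-shot transfer learning: pretrain a small
# softmax classifier on source-user data, then fine-tune the same weights
# on just a few target-user samples instead of training from scratch.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train(X, y, W, lr=0.5, epochs=200):
    """Gradient descent on multinomial logistic regression, starting from W."""
    Y = np.eye(W.shape[1])[y]                 # one-hot labels
    for _ in range(epochs):
        P = softmax(X @ W)
        W -= lr * X.T @ (P - Y) / len(X)
    return W

n_features, n_classes = 10, 3
# "Pretraining" on plentiful source-user data
X_src = rng.normal(size=(300, n_features))
y_src = rng.integers(0, n_classes, 300)
W = train(X_src, y_src, np.zeros((n_features, n_classes)))

# "Fine-tuning" on only a few target-user samples (warm start from W)
X_tgt = rng.normal(size=(9, n_features))
y_tgt = rng.integers(0, n_classes, 9)
W = train(X_tgt, y_tgt, W, epochs=50)
pred = softmax(X_tgt @ W).argmax(axis=1)
print(pred.shape)
```

Starting from pretrained weights is what allows personalization with "just a few training sessions" rather than a full data-collection campaign per user.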
The system achieved high emotion-recognition accuracy with only a few training sessions. Because it is personalized and operates wirelessly, it ensures both wearability and convenience.
The research team also applied the system to a VR environment as a 'digital concierge', setting up scenarios such as a smart home, a personal movie theater, and a smart office. The experiments confirmed that the system can identify an individual's emotions in each situation and provide customized services, such as recommending music, movies, and books.
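At its simplest, the concierge behavior amounts to a lookup from a (situation, emotion) pair to suggested content. The mapping, situation names, and emotion labels below are invented for illustration; the study does not specify these categories.

```python
# Illustrative sketch of the "digital concierge": a recognized emotion,
# combined with the current situation, selects customized content.
# All entries are hypothetical examples.
RECOMMENDATIONS = {
    ("smart_home", "sad"):        ["uplifting playlist", "comedy film"],
    ("smart_home", "happy"):      ["party playlist"],
    ("movie_theater", "sad"):     ["feel-good drama"],
    ("smart_office", "stressed"): ["ambient focus music", "short break reminder"],
}

def recommend(situation: str, emotion: str) -> list[str]:
    """Return content suggestions for a (situation, emotion) pair."""
    return RECOMMENDATIONS.get((situation, emotion), ["no specific suggestion"])

print(recommend("smart_home", "sad"))  # ['uplifting playlist', 'comedy film']
```

A deployed system would replace the static table with a learned recommender, but the interface — emotion in, tailored service out — is the same.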
Professor Jiyoon Kim of the Department of Materials Science and Engineering said, “For people and machines to interact at a high level, human-machine interface (HMI) devices must be able to collect diverse types of data and handle complex, integrated information.” He added, “This study shows that next-generation wearable systems can make use of even highly complex forms of human information, such as emotions.”
This research was conducted in collaboration with Professor Pooi See Lee of the School of Materials Science and Engineering at Nanyang Technological University (NTU), Singapore. The results were published online on January 15 in the international journal Nature Communications. The work was supported by the National Research Foundation of Korea (NRF) under the Ministry of Science and ICT, and by the Korea Institute of Materials Science (KIMS).

▲Concept diagram of a wireless facial interface with personalized emotion recognition based on multimodal information