
▲(From left) Lee Yong-seok, PhD, Department of Mechanical Engineering, College of Engineering, Seoul National University (currently at Samsung Research), Researcher Do Won-gyeong (currently at Stanford University), Researcher Yoon Han-byeol (currently at UCLA), Researcher Heo Jin-wook, Researcher Lee Won-ha (currently at Samsung Electronics), and Professor Lee Dong-jun
Improved robustness and accuracy of hand-gesture tracking using inertial sensors and cameras
A technology has been developed that accurately tracks hand movements across diverse environments and tasks, enabling the use of rich three-dimensional hand and finger motion; it is expected to find applications in industries such as virtual and augmented reality, smart factories, and rehabilitation.
Seoul National University College of Engineering (Dean Lee Byung-ho) announced on the 30th that Professor Lee Dong-jun's research team from the Department of Mechanical Engineering developed 'VIST (visual-inertial skeleton tracking),' a robust and accurate hand motion tracking technology.
The VIST technology developed by Professor Lee Dong-jun's team complementarily fuses information from a glove equipped with seven inertial sensors and 37 dyeable markers with information from a head-worn camera. This allows it to track hand and finger movements robustly and accurately despite the frequent image occlusion that occurs when manipulating objects, the geomagnetic disturbances that arise near electronic equipment or steel structures, and the contact that occurs when holding tools such as scissors or an electric drill, or when wearing haptic equipment.
Hand motion is especially difficult to track because many fingers move quickly and in complex patterns over a small palm, so the camera can easily lose sight of a marker and fail to track it. VIST compensates by using the inertial sensors to improve the camera's marker-tracking performance. Its core is a 'multiple-skeleton tracking technology based on tightly-coupled fusion of visual and inertial sensors,' which achieves robustness and accuracy simultaneously: camera information corrects the drift of the inertial sensors, while the inertial information keeps tracking responsive when markers are momentarily missed.
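The drift-correction idea behind this kind of fusion can be illustrated with a minimal sketch. This is not the team's actual algorithm, and all names and parameter values here are hypothetical: a one-dimensional joint angle is propagated by integrating an inertial rate measurement, and whenever a camera marker observation is available, a Kalman-style update pulls the estimate back toward it, bounding the accumulated drift.

```python
def fuse(gyro_rates, camera_obs, dt=0.01, q=1e-4, r=1e-2):
    """1-D Kalman filter fusing an integrated inertial rate signal with
    intermittent camera observations (None = marker occluded)."""
    angle, p = 0.0, 1.0          # state estimate and its variance
    out = []
    for rate, obs in zip(gyro_rates, camera_obs):
        # Predict: integrate the inertial rate; variance grows (drift).
        angle += rate * dt
        p += q
        # Update: only when the camera actually sees the marker.
        if obs is not None:
            k = p / (p + r)              # Kalman gain
            angle += k * (obs - angle)   # pull estimate toward the camera
            p *= (1.0 - k)
        out.append(angle)
    return out
```

With a biased rate sensor, pure integration drifts without bound, but each camera fix resets the accumulated error; this is why the fused estimate stays accurate even when the marker is visible only a fraction of the time.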
Professor Lee Dong-jun, the principal investigator, said, “The developed VIST hand motion tracking technology will allow intuitive and efficient control of robotic hands, collaborative robots, and swarm robots using the hands and fingers, and at the same time it is expected to enable natural interaction in virtual reality, augmented reality, and the metaverse.” He added, “In particular, compared to existing products, it has a high potential for commercialization thanks to its light weight (55 g), low price (about $100 in material cost), high accuracy (tracking error of about 1 cm), and durability (washable).”
The ability to use hands and fingers in a diverse and sophisticated manner is one of the most important characteristics of humans and is a vital element in enriching our interactions with the outside world.
By contrast, the user interfaces currently used in robot control and virtual reality either confine control to the plane of a tablet or drive an avatar through a hand-held controller gripped in a fist, so they cannot exploit the rich three-dimensional movements of hands and fingers.
To address this, several approaches have been developed: vision-based hand tracking using cameras and AI, tracking that measures finger angles with inertial measurement units and geomagnetic sensors, and tracking that estimates hand pose from the deformation of soft wearable sensors. Each, however, suffers from a fundamental problem (image occlusion, magnetic disturbance, or signal corruption from contact with objects and the environment), so none has been applied across diverse environments and industries.
Meanwhile, this study was supported by the Mid-Career Researcher Program and the Leading Research Center program of the National Research Foundation of Korea, funded by the Ministry of Science and ICT, and was published in the international journal Science Robotics on September 29.