Perception Sensors: Key Technology for Advancing and Popularizing Autonomous Driving
Thermal imaging cameras, 4D imaging radars, and FMCW lidar draw attention
As most autonomous vehicle accidents are attributed to performance limitations and errors in perception sensors, experts say it is important to overcome these limits by upgrading sensors to handle perception errors and adverse weather, and by applying sensor fusion techniques.
The Korea Automobile Research Institute (hereinafter Hanjayeon) held the 5th Jasan Eobo event at the COEX Startup Branch on the 18th, sharing the current status and outlook of perception sensors, known as the 'eyes of autonomous driving.'
▲Director Na Seung-sik of Hanjayeon is giving a welcoming speech at the 5th Jasan Eobo event (Photo courtesy of Hanjayeon)
Director Na Seung-sik of Hanjayeon emphasized, “Perception sensors are a key technology for advancing and popularizing autonomous driving. In particular, to ensure driving safety and reliability, it is very important to advance the performance of cameras, lidar, radar, and the like, and to secure the underlying technology.”
Tesla's Model X, released in 2015, carried about 20 sensors, while Waymo's fifth-generation vehicle of 2021 carries about 40 (29 cameras, five lidars, and five radars).
Despite the many sensors installed, autonomous vehicle accidents, including fatal ones, continue to occur; perception sensors such as cameras, radar, and lidar have reached their limits, and advancement is urgently needed.
Radar and lidar are less capable of recognizing objects moving laterally, as evidenced by the accident in March 2018 when an Uber autonomous car struck and killed a jaywalking cyclist.
In addition, radar mounted under the front grille suffers poor reception when its surface ices over heavily during winter driving.
Camera performance degrades in heavy rain, heavy snow, or thick fog, or when there is no light source nearby.
To overcome these limitations, complementary sensors and sensor fusion technologies are gaining importance.
Noh Hyeong-ju, head of the semiconductor and sensor technology division of the Autonomous Driving Technology Research Institute at the Korea Automobile Research Institute, said, “Beyond sensors that recognize visible objects, such as pedestrians and third-party vehicles, development of sensors to recognize invisible objects, such as black ice, is currently underway.” He added, “Thermal imaging cameras, 4D imaging radars, and FMCW lidar are being developed, and technology to fuse these sensors is also an important field.”
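As a rough illustration of the sensor-fusion idea described above (a generic textbook technique, not any specific vendor's method), the sketch below fuses two independent distance estimates of the same object, say one from a camera and one from a radar, by inverse-variance weighting, so the less noisy sensor dominates the result:

```python
def fuse_measurements(z1, var1, z2, var2):
    """Inverse-variance weighted fusion of two independent estimates
    of the same quantity (e.g. distance to an object as seen by a
    camera and by a radar). Returns the fused value and its variance."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)  # always smaller than either input variance
    return fused, fused_var

# Hypothetical numbers: the camera reads 42.0 m but is noisy in fog
# (variance 4.0); the radar reads 40.0 m with variance 1.0.
d, v = fuse_measurements(42.0, 4.0, 40.0, 1.0)
# The fused estimate is pulled toward the more reliable radar reading.
```

This is the simplest static form of fusion; production stacks typically embed the same weighting inside a Kalman filter that also tracks objects over time.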
Hanwha Systems presented thermal imaging cameras as a solution for overcoming the weaknesses of the perception sensors described above and ensuring safety, the most important factor in autonomous driving.
▲Hanwha Systems Quantum Red thermal imaging camera performance (Image source: Quantum Red homepage)
According to market research firm Yole, the automotive thermal imaging camera market is projected to grow from $70 million in 2019 to $160 million in 2026.
Thermal imaging cameras are passive, do not require an external light source, and detect all objects that emit their own heat energy with a temperature above absolute zero (0 Kelvin), regardless of the weather.
Choi Yong-jun, head of Hanwha Systems, said, “Thermal imaging sensors have the advantage of distinguishing between living things and objects, and their recognition rate is superior to that of daytime cameras even in situations such as entering and exiting tunnels, and they can accurately detect pedestrians even when driving at night.”
4D Imaging RADAR is a sensor that is receiving attention in the autonomous driving market as it is evaluated to have scalability that is more than 10 times better than 3D radar.
While existing 3D radars identify distance, direction (azimuth), and relative speed, 4D imaging radar technology adds another dimension of information: height (vertical angle).

▲4D imaging radar technology overview (Image source: LG Innotek)
Therefore, 4D imaging radar can provide richer and more accurate data about the driving environment than before by identifying the height of objects or how high they are located above the road.
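The extra elevation dimension can be made concrete with a small geometric sketch (an illustration of the general principle, not any vendor's API): converting a detection's range, azimuth, and elevation to Cartesian coordinates yields the target's height, which is what lets the radar tell an overpass from an obstacle in the lane.

```python
import math

def radar_to_cartesian(r, azimuth_deg, elevation_deg):
    """Convert a 4D-radar detection (range, azimuth, elevation) to
    Cartesian coordinates relative to the sensor. The fourth dimension,
    Doppler velocity, is reported alongside each detection."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)  # forward distance
    y = r * math.cos(el) * math.sin(az)  # lateral offset
    z = r * math.sin(el)                 # height above the sensor
    return x, y, z

# Hypothetical detection: a return 50 m ahead at 0° azimuth and
# 6° elevation sits roughly 5.2 m up, consistent with an overpass
# rather than a stopped vehicle in the lane.
x, y, z = radar_to_cartesian(50.0, 0.0, 6.0)
```

A 3D radar, lacking the elevation angle, would report only range, azimuth, and speed for the same return and could not make this distinction.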
Bitsensing, which exhibited its 4D imaging radar at CES 2022, explained its advantages: “Whereas existing automotive radars detect only speed and horizontal information, 4D imaging radar goes one step further, detecting height and combining it with speed data to render the surrounding environment in three dimensions. It can provide detailed, precise information on the motion of moving objects, such as speed and angle.”
Kim Yong-jae, Vice President of Smart Radar Systems, said at the Jasan Eobo event hosted by the Korea Automobile Research Institute on the 18th, “The current limitation of radar is that it is difficult to distinguish between the top and bottom of a target, making it hard to identify overpasses, tunnel entrances, and the like.” He added, “4D imaging radar, on the other hand, is a solution that overcomes these difficulties and can be put to good use in level 3 autonomous driving.”
FMCW (Frequency Modulated Continuous Wave) lidar is also emerging as a key component for autonomous driving.
A representative Korean company is Infoworks, which succeeded in developing the first FMCW lidar in Korea in 2019 and also participated in CES to make its technology known to the world.

▲FMCW LiDAR clarity comparison (Image source: Infoworks homepage)
Existing lidar uses the ToF (Time of Flight) method to detect direction and distance by shooting a pulsed laser beam and measuring the time it takes for the reflected light to hit an object and return.
On the other hand, FMCW lidar can determine the moving speed of an object by measuring the frequency change of the beam reflected from the object.
FMCW lidar is also resistant to interference from sunlight, headlights, and other lasers, and supports object recognition and tracking. Even in bad weather such as heavy rain, or in low light, it achieves a higher recognition rate than existing lidar, enabling safer autonomous driving.
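The two measurement principles above can be sketched numerically. This is a generic illustration of the physics, not figures from the article; the 1550 nm wavelength is an assumed value commonly used in FMCW lidar:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s):
    """ToF lidar: distance from the round-trip time of a laser pulse,
    d = c * t / 2 (the pulse travels out and back)."""
    return C * round_trip_s / 2.0

def fmcw_radial_velocity(doppler_shift_hz, wavelength_m=1.55e-6):
    """FMCW lidar: radial speed from the Doppler frequency shift of the
    returned beam, v = f_d * lambda / 2. The 1550 nm default is an
    assumption, not a value from the article."""
    return doppler_shift_hz * wavelength_m / 2.0

# A pulse returning after ~333.6 ns corresponds to a target ~50 m away.
d = tof_distance(333.6e-9)
# A ~12.9 MHz Doppler shift at 1550 nm corresponds to ~10 m/s of
# closing speed, information a single ToF pulse cannot provide.
v = fmcw_radial_velocity(12.9e6)
```

This is why FMCW lidar reports per-point velocity directly, whereas ToF lidar must infer motion by comparing successive scans.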
Meanwhile, according to the Korea Automobile Research Institute, the autonomous vehicle market size is expected to grow by more than 40% annually, reaching $154.9 billion (approximately KRW 209 trillion) in 2025 and $1 trillion (approximately KRW 1,347 trillion) in 2035.
As the autonomous driving market expands, the market for related sensors is also expected to grow significantly.
MarketsandMarkets forecasts that the autonomous driving camera market will grow at a compound annual growth rate of 11.7%, from $8 billion in 2023 to $13.9 billion in 2028.
Research firm ResearchAndMarkets forecasts that the autonomous driving radar market will grow at a compound annual growth rate of 14.72%, from $5.66 billion in 2022 to $12.12 billion in 2028.
According to KOTRA's overseas market news, the global automotive laser lidar market is expected to grow significantly from USD 360 million in 2022 to USD 6 billion in 2025 and USD 11.01 billion in 2027, growing at a compound annual growth rate of 76.6%.