Autonomous driving in irregular road and bad weather environments requires sensor accuracy
Map data and localization assist driving on blurred roads
The Electronics and Telecommunications Research Institute (ETRI) announced that it is conducting an autonomous driving technology development innovation project through government R&D, with the goal of advancing to Level 4 fully autonomous driving. As part of this, it is researching ways to drive without problems even on irregular roads (back roads) and in bad weather environments.
In an interview with our newspaper, Min Kyung-wook, head of the Autonomous Driving Intelligence Lab at ETRI's Intelligent Robotics Research Center, explained ETRI's autonomous driving technology and the autonomous vehicle Autobee.

▲Min Kyung-wook, ETRI Director
When asked about the 2022 goals, Director Min stated that they are currently researching ways to enable the vehicle to drive without problems on irregular roads (back roads), such as country roads with unclear lane lines, and in bad weather conditions.
Sensor accuracy is essential for safe driving on irregular roads and in adverse weather, as well as for defense applications.
To this end, we have developed core AI technologies that recognize and predict driving environments and situations based on camera-lidar fusion, and we provide AI training and inference models that implement these recognition and prediction functions.
By providing data on which models can be trained, the technology can be applied to various forms of autonomous driving software.
By labeling 3D bounding boxes on camera image data and lidar data, distinguishing between vehicle types and pedestrians, a large dataset is built; training on this data allows surrounding dynamic objects to be detected in 3D more accurately.
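The labeling step described above can be sketched as a simple data structure. This is a hypothetical illustration assuming a KITTI-style convention (box center in the lidar frame, dimensions, and a yaw heading); the field names and class list are assumptions, not ETRI's actual schema.

```python
from dataclasses import dataclass

@dataclass
class Box3DLabel:
    """One labeled 3D bounding box for a dynamic object (illustrative schema)."""
    obj_class: str  # e.g. "car", "truck", "bus", "pedestrian"
    x: float        # box center, lidar frame (m)
    y: float
    z: float
    length: float   # box dimensions (m)
    width: float
    height: float
    yaw: float      # heading around the vertical axis (rad)

def count_by_class(labels):
    """Tally labels per class, e.g. to check vehicle/pedestrian balance."""
    counts = {}
    for lb in labels:
        counts[lb.obj_class] = counts.get(lb.obj_class, 0) + 1
    return counts

# Example frame with one vehicle and one pedestrian label
frame = [Box3DLabel("car", 12.0, -1.5, 0.9, 4.5, 1.8, 1.5, 0.02),
         Box3DLabel("pedestrian", 6.3, 2.1, 0.9, 0.6, 0.6, 1.7, 1.57)]
print(count_by_class(frame))  # {'car': 1, 'pedestrian': 1}
```

Tallies like this are how class balance between vehicle types and pedestrians would be monitored while building a large training set.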
The Autobee vehicle has two cameras installed at the front and six lidars covering the front, sides, and rear.
One of its features is that the six lidars eliminate blind spots and increase stability; artificial intelligence uses them to recognize dynamic objects (people, vehicles, etc.) in 3D form.
Camera sensors detect lane lines, other road markings, and traffic lights, and recognize whether the taillights of a stopped vehicle are hazard lights or reverse lights.
In situations where crosswalks and lane lines are blurry and cameras cannot properly recognize them, map data and localization technology assist autonomous driving.
With map data built in advance, the vehicle's position can be determined first, so pedestrian safety can be ensured even when crosswalks are blurred.
Localization technology fuses camera and lidar information: in addition to markings seen in camera images, buildings recognized by lidar are used to identify the vehicle's location more accurately.
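The fusion idea can be sketched with a standard inverse-variance weighting rule, assuming each pipeline (camera road markings, lidar building matching) produces a 2D position estimate with an uncertainty covariance. This is a minimal textbook sketch, not ETRI's actual localizer.

```python
import numpy as np

def fuse(pos_cam, cov_cam, pos_lidar, cov_lidar):
    """Fuse two position estimates by inverse-variance weighting.

    The estimate with the smaller covariance (higher confidence)
    pulls the fused position toward itself.
    """
    w_cam = np.linalg.inv(cov_cam)
    w_lid = np.linalg.inv(cov_lidar)
    cov = np.linalg.inv(w_cam + w_lid)                  # fused covariance
    pos = cov @ (w_cam @ pos_cam + w_lid @ pos_lidar)   # fused position
    return pos, cov

cam = np.array([10.0, 5.0])    # position from camera markings (assumed)
lid = np.array([10.4, 5.2])    # position from lidar building match (assumed)
# Lidar is trusted 4x more here (covariance 0.25 vs 1.0), so the
# fused estimate lands closer to the lidar position.
fused, _ = fuse(cam, np.eye(2) * 1.0, lid, np.eye(2) * 0.25)
print(fused)  # [10.32  5.16]
```

Real localizers typically run this kind of fusion recursively over time (e.g. a Kalman filter), but the weighting principle is the same.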
ETRI has been continuously building data for artificial intelligence learning for the past five years.
Raw data is built by synchronizing the collection vehicle's camera, lidar, and GPS data with the vehicle's position information, and training data is created from it through labeling work.
Over the past five years, we have accumulated more than 100,000 km of driving data, over 200 TB in volume, and over 14 million training data sets, which are being used for model training.
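The synchronization step mentioned above can be sketched as nearest-timestamp matching, assuming each sensor logs timestamped records. This is a simplified illustration; production pipelines also rely on hardware triggering and interpolation, and the sample rates below are assumptions.

```python
import bisect

def nearest(timestamps, t):
    """Index of the entry in a sorted timestamp list closest to t."""
    i = bisect.bisect_left(timestamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    return min(candidates, key=lambda j: abs(timestamps[j] - t))

cam_ts = [0.00, 0.10, 0.20]    # camera frames at 10 Hz (assumed)
lidar_ts = [0.02, 0.12, 0.22]  # lidar sweeps, slightly offset (assumed)

# For each camera frame, pick the lidar sweep nearest in time,
# yielding the synchronized raw-data pairs used for labeling.
pairs = [(t, lidar_ts[nearest(lidar_ts, t)]) for t in cam_ts]
print(pairs)  # [(0.0, 0.02), (0.1, 0.12), (0.2, 0.22)]
```

The same matching would be applied to the GPS stream so that every labeled frame carries a consistent camera image, lidar sweep, and position fix.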
Interviews related to the article can be found in the video above the article and on the e4ds YouTube channel 'Electronic's for Design and Software'.