
[IT 인사이트] On-device AI: Korean small and medium-sized fabless companies must seize the opportunity.

Published 2025.09.15 15:57


▲Hwang Tae-ho, head of the Korea Electronics Technology Institute (KETI), delivers a keynote speech at the '2025 e4ds Tech Day' on the 9th.

AI semiconductors require specialized hardware for high-speed computing and large-scale data processing.
On-device AI is in high demand across industries including autonomous driving and defense.

“On-device AI used in autonomous driving, defense, and various industries presents an opportunity for Korea’s small and medium-sized fabless companies. The government should also support development and enhance industrial competitiveness in this field by strengthening collaboration between companies.”

Hwang Tae-ho, head of the Korea Electronics Technology Institute (KETI), said in his keynote speech at the '2025 e4ds Tech Day' on the 9th that on-device AI is an opportunity for Korean companies, and that they should ride this wave to secure competitiveness.

This technology, which performs AI functions on local devices without cloud connection, can be applied to various industries such as automobiles, home appliances, and defense.

In particular, high demand is expected in prognostics and health management (PHM) systems for failure prediction, autonomous vehicles, and defense.

In line with this, the government is also supporting the development of on-device AI semiconductors and pursuing a strategy to enhance industrial competitiveness through collaboration between anchor companies and fabless companies.

In his keynote speech, Director Hwang examined the present and future of AI semiconductors, emphasizing that AI technology has now penetrated every industry and that there is no going back to the way things were before.

Director Hwang explained that the center of AI technology has recently been shifting to transformer-based natural language processing models.

The field has moved away from the CNN-based image processing of the past; sequence processing centered on language data is now mainstream.

In particular, since the paper 'Attention Is All You Need', the transformer architecture has delivered advanced language understanding and generation through its encoder-decoder structure and self-attention mechanism.
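The self-attention mechanism mentioned above can be sketched in a few lines. This is a minimal, illustrative NumPy version of scaled dot-product self-attention (the weight matrices and dimensions are arbitrary assumptions, not details from the talk):

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence x of shape (T, d)."""
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (T, T) pairwise scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # each position mixes all others

rng = np.random.default_rng(0)
T, d = 4, 8                                         # toy sequence length and width
x = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Because every position attends to every other, this is the sequence-level processing that replaced the local receptive fields of CNNs.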

These technological advancements are not limited to the simple evolution of algorithms.

“AI semiconductors have reached a level where they can no longer be handled by general-purpose chips,” Director Hwang said, emphasizing the need for hardware specialized for high-speed computation and large-scale data processing.

High-bandwidth memory (HBM) has emerged as the core of AI semiconductors.

“During the decode stage, 90% of all operations are memory-bound,” Director Hwang said, emphasizing the importance of HBM.
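A back-of-envelope calculation shows why decode is memory-bound and why HBM matters. The model size and GPU figures below are illustrative assumptions (a hypothetical 7B-parameter fp16 model and approximate H100-class specs), not numbers from the talk:

```python
# During decode, each generated token performs matrix-vector products that
# read every weight once, so arithmetic intensity is only ~2 FLOPs per
# parameter, i.e. ~1 FLOP per byte for fp16 weights.

params = 7e9                  # hypothetical 7B-parameter model
bytes_per_param = 2           # fp16 weights
flops_per_step = 2 * params   # one multiply-add per weight per token
bytes_per_step = params * bytes_per_param

intensity = flops_per_step / bytes_per_step   # FLOPs per byte moved
ridge = 989e12 / 3.35e12      # approx. H100: ~989 TFLOPS fp16 / ~3.35 TB/s HBM3

print(f"arithmetic intensity: {intensity:.1f} FLOP/byte")
print(f"GPU ridge point:      {ridge:.0f} FLOP/byte")
# intensity << ridge  ->  decode throughput is limited by memory bandwidth
```

Since the achieved intensity sits orders of magnitude below the GPU's compute/bandwidth ratio, decode throughput scales with HBM bandwidth rather than peak FLOPS.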

NVIDIA is leading the market with its DGX servers, which package GPUs with HBM, and Samsung Electronics and SK Hynix are also accelerating development of HBM4.

Also, semiconductor structures that separate learning and inference are attracting attention.

While training still relies on NVIDIA's CUDA ecosystem, in inference a variety of companies are mounting a challenge with chips of their own.

Representative examples include Broadcom and Marvell in the US, Hailo in Israel, and Cambricon in China.

Optimization technologies for lightweighting and improving efficiency of AI semiconductors are also noteworthy.

Computation is being reduced through methods such as quantization, pruning, and network lightweighting, laying the groundwork for running LLMs on small chips.
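Of the lightweighting methods listed, quantization is the simplest to illustrate. Below is a minimal sketch of symmetric int8 post-training quantization in NumPy (a generic textbook scheme, not any specific toolkit mentioned in the talk):

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor quantization: map max |w| onto the int8 range."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal(1024).astype(np.float32)  # toy weight tensor
q, scale = quantize_int8(w)

# int8 storage is 4x smaller than fp32; round-to-nearest bounds the
# reconstruction error by half a quantization step.
err = np.abs(dequantize(q, scale) - w).max()
print(q.dtype, f"max abs error: {err:.4f}")
```

The 4x storage reduction (and the cheaper int8 arithmetic it enables) is exactly the kind of saving that makes LLM inference feasible on small on-device chips.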

Neuromorphic semiconductors offer high pattern-recognition performance at low power and are attracting attention as an alternative for future on-device AI.

Concluding his speech, Director Hwang emphasized, “In the future, the market will be led by custom AI semiconductors specialized for specific applications, rather than general-purpose chips.”

He also said, “Semiconductors that combine technologies such as high-speed interfaces, advanced packaging, high-bandwidth memory, and multi-die systems are positioning themselves as the core infrastructure of the AI era.”