
▲ Justin Hotard, Senior Vice President and General Manager of the Data Center and AI Group, responding to press questions at the Intel AI Summit
Intel-Centric Open AI Ecosystem Emphasized
Naver: “AI Chip Monopoly Is the Challenge”
Intel-Naver AI Joint Front Emphasized
Long-Term Groundwork for a 6G Communications AI Ecosystem
“AI requires a strong open ecosystem. Our strategic partnership with Naver is also about achieving the same vision and goals.”
At the Intel AI Summit held on the 5th, Intel Senior Vice President Justin Hotard emphasized building an open ecosystem for AI, reaffirming Intel's commitment to expanding AI partnerships centered on the company.
Intel held the 'Intel AI Summit' at the JW Marriott Hotel in Seocho-gu, Seoul on the 5th. Roughly 500 major Intel partners from Korea and abroad attended the event, where Intel laid out its AI market outlook and the progress of its AI-related solutions.
At this year's Intel AI Summit, Justin Hotard, senior vice president and general manager of the Data Center and AI Group, who was appointed in February of this year, delivered the keynote speech.
“We expect 80% of enterprises to use generative AI by 2026,” he said. “Enterprise spending on generative AI is expected to quadruple by 2027.”
Intel's AI Everywhere strategy focuses on providing scalable AI solutions that span from PCs to data centers, and from hardware to software and systems.
Vice President Hotard identified the core of this approach as four layers: the application ecosystem, the software ecosystem, the infrastructure ecosystem, and the computing ecosystem. Intel is focusing on building an open ecosystem across the entire line, from the edge to the data center.
As evidence of this, around 500 partner companies attended the sessions at this Intel AI Summit. Ha Jung-woo, head of Naver's Future AI Center, delivered the second keynote following Vice President Hotard, saying, “The biggest challenge in the current era of generative AI is the monopoly centered on specific AI chips,” and pointing out that “supply is not sufficient to meet demand, so an AI gap problem may arise depending on supply priorities.”
This is why Naver has formed a strategic AI partnership with Intel and is pursuing an open ecosystem. With supply bottlenecks around NVIDIA's latest AI chipsets, a workaround strategy is needed to capture the fast-growing AI market in time.
Intel and Naver are establishing a joint research lab and, together with domestic universities and research institutes such as KAIST, are conducting research, experiments, and evaluations on building LLMs with the Gaudi 2 AI accelerator. The resulting vLLM research will be released as open source by the end of the year.
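For readers unfamiliar with vLLM, the open-source serving library mentioned above, the following is a minimal sketch of the kind of offline inference run used in such evaluations. It assumes a vLLM build with Gaudi support; the model name, prompts, and sampling settings are placeholders, not part of the Intel-Naver research.

```python
# Minimal vLLM offline-inference sketch (placeholder model and prompts).
from vllm import LLM, SamplingParams

prompts = [
    "Explain what an open AI ecosystem means.",
    "Summarize the benefits of alternative AI accelerators.",
]

# Conservative sampling settings; tune per evaluation.
sampling_params = SamplingParams(temperature=0.2, top_p=0.95, max_tokens=128)

# Any Hugging Face-style causal LM identifier can be used here;
# "facebook/opt-125m" is only an illustrative stand-in.
llm = LLM(model="facebook/opt-125m")

outputs = llm.generate(prompts, sampling_params)
for output in outputs:
    print(output.prompt)
    print(output.outputs[0].text)
```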
He added that source code for existing LLMs will also be ported to Intel's next-generation AI accelerator, Gaudi 3, which comes with HBM2e and will be released at the end of the year.
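As background on what "porting existing LLM code to Gaudi" typically involves, here is a rough, generic sketch using the Habana PyTorch bridge, which exposes Gaudi as an "hpu" device. The model, input, and single training step are placeholders and do not represent the Intel-Naver code, which has not been published.

```python
# Generic sketch: pointing standard PyTorch/Hugging Face code at a Gaudi device.
import torch
import habana_frameworks.torch.core as htcore  # Habana PyTorch bridge
from transformers import AutoModelForCausalLM, AutoTokenizer

device = torch.device("hpu")  # Gaudi accelerators appear as "hpu" in PyTorch

tokenizer = AutoTokenizer.from_pretrained("gpt2")                # placeholder model
model = AutoModelForCausalLM.from_pretrained("gpt2").to(device)

batch = tokenizer("Open ecosystems lower the barrier to AI adoption.",
                  return_tensors="pt").to(device)

# One illustrative forward/backward step; real training adds an optimizer,
# data loading, and mixed precision.
outputs = model(**batch, labels=batch["input_ids"])
outputs.loss.backward()
htcore.mark_step()  # flush the lazily accumulated graph to the accelerator

print(f"loss: {outputs.loss.item():.4f}")
```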
Intel is taking an ecosystem approach to increase adoption of Intel’s AI solutions in the future AI market. This is also evident in the long-term strategic collaboration on communication technology. Vice President Hotard said, “We are working with SKT on 6G technology,” adding, “6G is an important technology for the expansion and adoption of AI applications.” He also said that he is looking forward to opening the AI PC era with Samsung, LG, and others.
Vice President Hotard also emphasized the importance of balance when explaining why HBM2e was chosen for Gaudi 3 instead of the latest HBM products. “Across the various use cases, the balance between performance and efficiency is important,” he said. “There has to be a balance across networking, computing, and memory. Since Gaudi 3 is a TPU, not a GPU, the inference side matters for enterprise use, and training time has been improved by 50% compared to the H100.”