
▲NVIDIA's three computer systems / (Image: NVIDIA)
NVIDIA DGX, Omniverse, and Jetson Thor Accelerate Physical AI Development
Boston Dynamics and Fourier Robotics Adopt NVIDIA Development Platform
While ChatGPT's AI innovations are helping with many aspects of digital work, such as business operations, customer service, and content creation, many observers say that physical AI, which implements artificial intelligence in humanoids, factories, and devices across other industrial systems, has yet to reach a breakthrough turning point.
Against this backdrop, three computer systems that combine advanced training, simulation, and inference are emerging, and they are expected to change this picture significantly.
NVIDIA announced on the 28th that it will support the construction of 'physical AI', a next-generation AI technology, through three computer systems, thereby strengthening the robotics ecosystem.
■ The rise of multimodal, physical AI 
▲NVIDIA robot development platform / (Image: NVIDIA)
Large language models (LLMs) are one-dimensional models that predict the next token in a single mode, such as characters or words. Image and video generation models are two-dimensional models that predict the next pixel.
These models cannot understand or interpret the three-dimensional world. This is where the need for physical AI comes in.
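To make the dimensional contrast concrete, here is a minimal PyTorch sketch of that one-dimensional pattern; the TinyLM model and its 256-token vocabulary are purely illustrative assumptions, not any real production model.

```python
# Minimal sketch of one-dimensional autoregressive prediction, the pattern
# the article attributes to LLMs. The model and vocabulary are illustrative.
import torch
import torch.nn as nn

class TinyLM(nn.Module):
    """Toy next-token predictor: embeds a token sequence, scores the vocabulary."""
    def __init__(self, vocab_size: int = 256, dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, vocab_size)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        h, _ = self.rnn(self.embed(tokens))
        return self.head(h[:, -1])  # logits for the next token only

model = TinyLM()
seq = torch.randint(0, 256, (1, 16))            # 16 tokens of context
next_token = model(seq).softmax(-1).argmax(-1)  # greedy pick along one axis
print(next_token)
```

A pixel-level image model follows the same recipe along two axes; a physical AI model must extend it to a full three-dimensional, physically consistent world state.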
Physical AI models can perceive, understand, interact with, and explore the physical world through generative AI. Innovations in multimodal physical AI, built on accelerated computing and large-scale physics-based simulation, are enabling the world to realize the value of physical AI through robots.
Robots are systems that can △perceive △infer △plan △act △learn. Robots are often pictured as autonomous robots, manipulator arms, or humanoids, but they come in many other forms as well.
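As a rough illustration of that five-part definition, the following Python sketch wires the stages into a single control tick. Every class and method here is a hypothetical stand-in, not an NVIDIA API; a real robot would back each stage with trained models and hardware drivers.

```python
# Hedged sketch of the perceive / infer-plan / act / learn cycle.
from dataclasses import dataclass

@dataclass
class Observation:
    camera_frame: bytes
    joint_positions: list[float]

class RobotLoop:
    """Hypothetical robot control loop: one pass per control tick."""

    def perceive(self) -> Observation:
        # Read sensors (camera, joint encoders); stubbed for the sketch.
        return Observation(camera_frame=b"", joint_positions=[0.0] * 7)

    def plan(self, obs: Observation) -> list[float]:
        # Infer world state and plan the next command; a trained policy
        # network would replace this placeholder.
        return [p + 0.01 for p in obs.joint_positions]

    def act(self, command: list[float]) -> None:
        # Send the command to the actuators; stubbed for the sketch.
        pass

    def learn(self, obs: Observation, command: list[float]) -> None:
        # Record the transition for later training (imitation or RL).
        pass

    def step(self) -> None:
        obs = self.perceive()
        command = self.plan(obs)
        self.act(command)
        self.learn(obs, command)

loop = RobotLoop()
for _ in range(10):  # ten control ticks
    loop.step()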
In the near future, anything that moves or monitors a moving object is expected to be an autonomous robotic system. This means that these systems will have the ability to sense and respond to their surroundings.
Everything from operating rooms to data centers, warehouses, factories, traffic control systems, and entire smart cities is expected to change from static, manual operating systems to autonomous, interactive systems implemented with physical AI.
■ Three computer systems for developing physical AI 
▲Humanoid robot concept / (Image: NVIDIA)
Developing humanoid robots requires three accelerated computing systems to handle physical AI and robot training, simulation, and runtime. Two computing advances are accelerating humanoid robot development: multimodal foundation models and scalable, physics-based simulations of the world, including the robots themselves.
NVIDIA has built three computer systems and an accelerated development platform to help developers create physical AI.
First, the model is trained on a supercomputer. Developers can train and fine-tune powerful foundation and generative AI models with NVIDIA NeMo on the NVIDIA DGX platform.
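Conceptually, this first stage is a supervised training loop run at very large scale. The plain-PyTorch sketch below shows only the shape of that idea, mapping observation features to target actions with made-up dimensions; NeMo's actual recipes and interfaces differ, so treat everything here as illustrative.

```python
# Conceptual sketch of the training stage; not NeMo's actual interface.
import torch
import torch.nn as nn

# Hypothetical model: 128 observation features -> 8 action dimensions.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 8))
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

for step in range(100):
    # Hypothetical batch of observation features and target robot actions.
    obs = torch.randn(32, 128)
    target_actions = torch.randn(32, 8)
    loss = loss_fn(model(obs), target_actions)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```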
They can also leverage NVIDIA Project GR00T, an initiative to develop a general-purpose foundation model for humanoid robots that can understand natural language and mimic human movements by observing human behavior.
Next, NVIDIA Omniverse, running on NVIDIA OVX servers, provides the development platform and simulation environment for testing and optimizing physical AI, with application programming interfaces and frameworks such as NVIDIA Isaac Sim.
Developers can use Isaac Sim to simulate and validate robot models, or to generate massive amounts of physics-based synthetic data for training them. Researchers and developers can also use NVIDIA Isaac Lab, an open-source robot learning framework that supports reinforcement learning and imitation learning, to accelerate the training and refinement of robot policies.
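For orientation, the snippet below follows the documented standalone-script pattern for headless Isaac Sim, in which a SimulationApp must be created before any other omni.isaac import. Module paths change between Isaac Sim releases and the script only runs inside Isaac Sim's bundled Python environment, so treat the details as assumptions to verify against the installed version.

```python
# Hedged sketch of a headless Isaac Sim standalone script.
from omni.isaac.kit import SimulationApp

# The app must exist before any other omni.isaac module is imported.
simulation_app = SimulationApp({"headless": True})

from omni.isaac.core import World

world = World()
world.reset()

for _ in range(1000):
    # Each call advances physics one step; sensor captures for
    # synthetic-data generation would be sampled around here.
    world.step(render=False)

simulation_app.close()
```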
Finally, the trained AI model is deployed to the runtime computer. The NVIDIA Jetson Thor robot computer is designed specifically for compact onboard computing requirements. A set of models comprising control policies, vision models, and language models forms the robot's brain and is deployed on an energy-efficient onboard edge computing system.
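The runtime stage then reduces to a fixed-rate inference loop on the onboard computer. In the sketch below, the in-line linear "policy", the sensor and actuator stubs, and the 50 Hz control period are all assumptions made for illustration; a production Jetson deployment would load an exported, optimized model (for example, a TensorRT engine) instead.

```python
# Illustrative onboard inference loop for the runtime computer.
import time
import torch

policy = torch.nn.Linear(128, 8).eval()  # stand-in for a trained robot brain

def read_sensors() -> torch.Tensor:
    # Stand-in for camera / IMU / joint-encoder reads.
    return torch.randn(1, 128)

def send_to_actuators(action: torch.Tensor) -> None:
    # Stand-in for the motor-command bus.
    pass

with torch.no_grad():
    for _ in range(500):  # bounded loop for the sketch
        start = time.monotonic()
        send_to_actuators(policy(read_sensors()))
        # Hold an approximately fixed 20 ms (50 Hz) control period.
        time.sleep(max(0.0, 0.02 - (time.monotonic() - start)))
```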
Robot manufacturers and foundation model developers can use as many accelerated computing platforms and systems as they need, depending on their workflows and challenges.
■ Strengthening the capabilities of the developer ecosystem
NVIDIA is also focused on accelerating the work of a global ecosystem of robotics developers and robot foundation model builders through the three computer systems.
Universal Robots, a Teradyne Robotics subsidiary, used NVIDIA Isaac Manipulator, Isaac acceleration libraries and AI models, and NVIDIA Jetson Orin to build the UR AI Accelerator, a ready-to-use hardware and software toolkit that helps cobot developers build applications, accelerate development, and shorten time to market for AI products.
RGo Robotics is leveraging NVIDIA Isaac Perceptor to enable its wheel.me AMRs to operate anywhere, at any time, making intelligent decisions with human-like perception and visual-spatial intelligence.
△1X Technologies △Agility Robotics △Apptronik △Boston Dynamics △Fourier △Galbot △Mentee △Sanctuary AI △Unitree Robotics △Xiaopeng Robotics and other humanoid robot manufacturers are adopting NVIDIA's robot development platform.