UK-based chip designer Arm Holdings is stepping up its focus on physical AI as demand grows for intelligent systems capable of operating in real-world environments, from autonomous vehicles and robotics to industrial machines and connected infrastructure.
Speaking at embedded world 2026 in Nuremberg this week, Chris Bergey, Executive Vice President of Arm’s Edge AI Business Unit, described physical AI as a key strategic priority for the company as connected devices become more capable and autonomous. “Physical AI is where intelligence meets the real world. Devices don’t just need to compute — they need to perceive, reason, and act safely in real time,” he said.
In January, Arm reorganised its business units to create a dedicated Physical AI division, reflecting growing industry interest in technologies that combine AI with real-world movement and interaction. According to Bergey, organisations developing robotics, autonomous vehicles, and industrial systems increasingly require predictable, low-latency computing platforms capable of running AI workloads at the edge.
“We’ve seen companies increasingly demand predictable, low-latency systems,” he said. “That’s exactly the space Arm is uniquely positioned to serve — from autonomous vehicles to robotics to industrial automation.”

The shift reflects broader changes across the embedded and IoT landscape, where devices are evolving from static, single-purpose endpoints into intelligent, connected systems capable of localised “sensemaking” — interpreting context, processing data, and responding autonomously in real time.
“The pace of innovation is unlike anything I’ve seen in my career,” Bergey said. “What was difficult a year ago is becoming practical now. Edge AI isn’t just enabling smarter devices — it’s redefining what embedded intelligence means across industries.”
He added that the role of edge computing itself is also changing as AI capabilities become more efficient and widely deployed. “The Edge is no longer just an extension of the Cloud. It’s becoming a place where AI is the foundation of the product itself — making decisions locally, collaborating with other devices, and acting instantly.”
The company’s expanded physical AI focus is already bearing fruit in high-profile collaborations. In late February, Tensor and Arm announced a multi-year strategic partnership to deliver the foundational compute architecture behind what Tensor calls the world’s first agentic AI personal Robocar. Each vehicle integrates more than 400 safety-capable, power-efficient Arm-based cores — the highest concentration of Arm technology in a consumer vehicle today — powering a Level 4 autonomous system with a comprehensive sensor suite including 37 cameras, 5 lidars, 11 radars, and triple-channel 5G connectivity.
Bergey said that AI-enabled devices are increasingly able to discover and coordinate with each other, forming distributed systems that share context and adapt in real time without relying on a central gateway.
“Robots, cameras, and controllers are working together as one unified system,” he said. “That’s a game-changer for industrial and autonomous applications, where speed and reliability are critical.”
According to Bergey, improvements in NPU acceleration and energy-efficient processing are making persistent AI capabilities more practical for OEMs.
“Persistent voice recognition used to be limited to high-power systems,” he said. “Now, with NPU acceleration and low-power Arm cores, OEMs can deploy always-on AI without breaking energy or cost constraints. Ambient intelligence becomes practical, durable, and repeatable.”
Other Arm demonstrations at the show highlighted on-device multimodal AI, where cameras and NPUs run vision and language models together to deliver personalised experiences entirely on-device, reducing latency while keeping sensitive data local. The company also demonstrated ways in which pre-trained models can be deployed rapidly on Arm-based embedded hardware using validated runtimes and development toolchains in collaboration with EmbedUR.
On what customers want from Arm, Bergey said: “A lot of them are asking us, ‘what’s possible?’ Everyone is aware of how disruptive these technologies are, and nobody wants to be left behind. There’s a huge desire to move faster and make things possible. I’ve seen people’s careers reinvigorated because they don’t want to miss out on this technology wave. For many, it’s an enabler, not a negative disruption.”