Intelligent robots are bridging the gap from automation to autonomy

This new frontier is what we call "physical intelligence," the intersection point where AI gains a body, and robots gain a brain.
Dec. 22, 2025

What you’ll learn:

  • Why the next step in manufacturing evolution is translating deep digital insights into tangible, physical action, and why that "thinking" must happen at the industrial edge.
  • ARC's unified taxonomy for physical intelligence, structured around the Body, the Nervous System, and the Mind.
  • Why manufacturers should evaluate humanoid robots on function rather than form, treating them as one tool among many.

Editor’s note: This is the third part of a four-part series, the Smart Operations Playbook. In Part 1, Colin Masson made the case for "taming the data beast" by establishing industrial data fabrics. In Part 2, ARC's Craig Resnick and Inderpreet Shoker examined AI's impact on the manufacturing workforce, emphasizing how artificial intelligence should augment industrial roles, not replace human employees. Colin will conclude the series tomorrow, Dec. 23, with a roadmap to physically intelligent operations.


Industry has spent the last decade or so building out the infrastructure for the Industrial Internet of Things (IIoT), cloud platforms, data fabrics and, more recently, AI.

This digital foundation has furnished the industrial world with a collective "brain." But a brain that is purely digital, powerful though it may be, can only take us so far.

The next step in manufacturing evolution is the strategic imperative to translate those deep digital insights into tangible, physical action. This new frontier is what we call physical intelligence, the intersection point where AI gains a body, and robots gain a brain.

The arrival of physical intelligence marks a clear break from industry's past, which relied heavily on rigid machines that executed pre-programmed, human-written instructions and had only a limited understanding of their environment at best.

See also: How smart industry is escaping the 'single cloud of failure'

The speed and precision of these robots make them indispensable for some manufacturing tasks, but the era of static robots is now making way for dynamic and autonomous systems.

The new generation of intelligent machines, such as collaborative robots (cobots) and autonomous mobile robots (AMRs), has been empowered by the data fabric (mentioned in our previous article) and AI groundwork to embody an entirely new operational philosophy.

See also: Inside the rise of manufacturing ‘co-intelligence’ in real factory operations

Instead of merely completing pre-set tasks, these intelligent systems can perceive their surroundings, perform reasoning based on those sensory inputs, and act to achieve a goal in harmony with the systems around them.

This capability allows modern robots to manage unpredictable environments, dynamically adjusting their motion in real time alongside human collaborators.

For this vision for industry to come to fruition, however, there are some non-negotiables.

Why the robotic edge is the foundation for physical intelligence

For a robot navigating a dynamic environment, minimizing latency, the time delay in processing information, is not merely a performance goal but an operational necessity.

When a collaborative robot operates just inches away from a person, a delay of even a few seconds in recognizing an unexpected movement poses an unacceptable safety risk.

See also: Human intelligence plus AI and how supply chains are changing with this collaboration

The core issue lies in the limitations of remote cloud computing. Running AI models at the cloud level introduces connection time and data transfer latency that is prohibitive for the time-sensitive applications robots typically perform.

For intelligent systems to make split-second decisions while maintaining a safe environment, the "thinking" must happen on the robot itself, in real time, at the industrial edge.
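To make this concrete, here is a minimal sketch (in Python, with every function name a hypothetical placeholder rather than any vendor's API) of a perceive-reason-act loop that runs entirely on the robot and degrades to a safe stop if a control cycle exceeds its latency budget.

```python
import time

CYCLE_BUDGET_S = 0.010  # illustrative 10 ms budget per control cycle, not a certified value


def control_loop(read_sensors, run_onboard_model, send_motion_command, trigger_safe_stop):
    """Single-threaded perceive-reason-act loop running entirely on the robot.

    All four callables are hypothetical placeholders for the robot's own
    sensor drivers, on-device AI model, and motion interface.
    """
    while True:
        cycle_start = time.monotonic()

        observation = read_sensors()               # perceive: LiDAR, cameras, force sensors
        command = run_onboard_model(observation)   # reason: inference on the edge device
        send_motion_command(command)               # act: update the motion controller

        elapsed = time.monotonic() - cycle_start
        if elapsed > CYCLE_BUDGET_S:
            # If on-device "thinking" cannot keep up with the budget, degrade to a
            # safe state rather than act on stale perception.
            trigger_safe_stop()
            break
```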

This necessity for real-time processing has created demand for a new class of hardware. We are now seeing the emergence of industrial-grade, AI-accelerated computing platforms such as systems-on-modules (SoMs), microcontrollers, and embedded PCs that are purpose-built to run complex AI models in harsh, vibration-prone environments like the plant floor.

This high-performance computer hardware is essential for processing the constant stream of input data from the robot’s LiDAR, 3D cameras, and force sensors immediately on the device through a critical process known as sensor fusion. This ensures the robot has the real-time perception layer needed for safe and effective operation.
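Sensor fusion itself can range from classical Kalman filters to learned fusion networks. The short sketch below illustrates only its simplest prerequisite, time-aligning readings from multiple sensors and rejecting stale data before fusion; the sensor names and the 50-millisecond staleness threshold are assumptions chosen for illustration, not recommendations.

```python
from dataclasses import dataclass
from typing import Dict, Optional

MAX_SKEW_S = 0.050  # assumed: readings older than 50 ms are treated as stale


@dataclass
class Reading:
    timestamp: float   # seconds, from a shared monotonic clock
    data: object       # raw payload (point cloud, depth image, wrench, ...)


def fuse(readings: Dict[str, Reading], now: float) -> Optional[Dict[str, object]]:
    """Return a time-aligned snapshot of all sensors, or None if any is stale.

    A real system would interpolate, weight by sensor covariance, or use a
    Kalman- or graph-based estimator; this only shows the time-alignment step.
    """
    snapshot = {}
    for name, reading in readings.items():
        if now - reading.timestamp > MAX_SKEW_S:
            return None  # stale input: skip this cycle rather than fuse bad data
        snapshot[name] = reading.data
    return snapshot

# Illustrative usage:
# fuse({"lidar": Reading(t1, cloud), "camera_3d": Reading(t2, depth),
#       "force": Reading(t3, wrench)}, now=current_time)
```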

ARC’s guide to the modern robot platform

To say there are a lot of moving parts to this evolution in robotics would be an understatement. In response, ARC has developed a unified taxonomy for physical intelligence, structured around three integrated components: the Body, the Nervous System, and the Mind. (A brief code sketch after the form-factor list below shows one way to capture the Body layer as a simple data structure.)

The Body: Robot form factors:

  • Articulated robot: This is the most common type of industrial robot, resembling a human arm. It features a series of rotary joints (typically 4 to 6 axes, but sometimes more) that provide exceptional flexibility and a large, spherical work envelope. It is the form factor most frequently adapted for collaborative use.
  • SCARA robot: These robots are designed for speed and precision in a single plane. They typically have four axes and are compliant in the X-Y plane but rigid in the Z-axis, making them ideal for tasks like pick-and-place, assembly, packaging, and sorting.
  • Cartesian/gantry robot: These robots operate along three linear axes (X, Y, Z) using a rigid, overhead structure. They are defined by a rectangular work envelope, are highly scalable, and offer high payload capacity and easy customization.
  • Parallel robot (delta robot): These spider-like robots use multiple arms connected to a single platform. This design makes them exceptionally fast and precise for light-payload applications within a dome-shaped work envelope.
  • Mobile robots: This broad and rapidly evolving category covers all ground-based robots capable of navigation. It includes wheeled robots like traditional AGVs (automated guided vehicles) that follow fixed paths, AMRs (autonomous mobile robots) that navigate dynamically, and legged robots like quadrupeds that offer unparalleled mobility in difficult terrain.
  • Drones/unmanned aerial vehicles (UAVs): These are aerial robots, typically multirotor helicopters, capable of three-dimensional movement. They bring robotic capabilities to applications where ground-based robots cannot go, offering a unique perspective for data collection and inspection.
  • Humanoid robot: While technically a form of mobile robot, we classify humanoids as a distinct category due to their unique strategic importance, technological complexity, and application focus. These robots are designed with a human-like form factor to operate in environments built for people, using human tools.
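For readers who think in code, the Body layer of this taxonomy can be captured as a simple data structure, as sketched below; the attributes chosen (axes, work envelope, typical use) are an illustrative simplification of the list above, not a formal ARC specification.

```python
from dataclasses import dataclass
from enum import Enum, auto


class WorkEnvelope(Enum):
    SPHERICAL = auto()     # articulated arms
    PLANAR = auto()        # SCARA
    RECTANGULAR = auto()   # Cartesian/gantry
    DOME = auto()          # parallel/delta
    UNBOUNDED = auto()     # mobile robots, humanoids, UAVs


@dataclass
class FormFactor:
    name: str
    axes: str
    envelope: WorkEnvelope
    typical_use: str


BODY_CATALOG = [
    FormFactor("articulated", "4-6 rotary joints", WorkEnvelope.SPHERICAL, "flexible arm tasks, cobots"),
    FormFactor("SCARA", "4 axes, rigid in Z", WorkEnvelope.PLANAR, "pick-and-place, assembly, sorting"),
    FormFactor("cartesian/gantry", "3 linear axes (X, Y, Z)", WorkEnvelope.RECTANGULAR, "high payload, scalable cells"),
    FormFactor("parallel/delta", "multiple arms, one platform", WorkEnvelope.DOME, "fast, light-payload picking"),
    FormFactor("mobile (AGV/AMR/legged)", "ground navigation", WorkEnvelope.UNBOUNDED, "intralogistics, inspection"),
    FormFactor("drone/UAV", "3D flight", WorkEnvelope.UNBOUNDED, "aerial data collection and inspection"),
    FormFactor("humanoid", "human-like kinematics", WorkEnvelope.UNBOUNDED, "environments built for people"),
]
```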

The Nervous System: Sensing and perception hardware:

This layer encompasses the hardware components that enable perception, action, and computation.

See also: How digital transformation and AI can redefine supply chains

Understanding this layer is critical for assessing the performance, cost, and supplier ecosystem for any robotics solution.

Crucially, this layer contains the advanced sensors necessary for ensuring human safety:

  • The controller: This is the ruggedized, industrial-grade compute platform that handles real-time motion control and perception software directly on the machine.
  • Sensors and perception hardware: These are the robot's eyes and ears, including high-resolution LiDAR, 3D vision cameras, and other sensors that gather raw data about the physical world.
  • End effectors: The robot’s hands—the specialized tools, grippers, or welding implements that allow it to physically interact with its surroundings.
  • Mechanical structure: The physical frame, joints, and components that make up the robot's chassis or arm and define its reach, payload, and physical constraints.

The Mind: An intelligent software stack for robotics:

This is the nexus point where Industrial AI software connects directly to the physical machine; in other words, the cumulative set of software capabilities that truly defines an intelligent robot (a minimal code sketch follows this list):

  • Foundational middleware (OS): This is the essential software plumbing. Dominated by the Robot Operating System (ROS/ROS 2), it acts as a vendor-neutral integration bus.
  • Core robotic capabilities: These are latency-sensitive functions, such as Perception (understanding sensor data), Navigation (moving through space), and Manipulation (interacting with objects), which must run at the Edge for safe, real-time operation.
  • AI and simulation: The higher-order brain functions, including the AI models trained in the Cloud via methods like simulation and digital twins, with the optimized models deployed to run at the Edge for operational use.
  • Fleet management and orchestration: The enterprise command and control software that coordinates and monitors entire fleets of robots and integrates them with higher-level systems.
  • Human-robot interface (HRI): This is the communication bridge to people. It includes everything from traditional teach pendants to modern low-code programming interfaces and, increasingly, natural language and Generative AI-powered "copilots."
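To ground the stack in something tangible, here is a minimal ROS 2 (rclpy) perception node of the kind the middleware and core-capabilities layers describe. It assumes a standard ROS 2 installation with the sensor_msgs package; the /scan topic name and the 0.5-meter threshold are illustrative assumptions, not certified safety parameters.

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan

STOP_DISTANCE_M = 0.5  # illustrative threshold, not a certified safety limit


class EdgePerceptionNode(Node):
    """Minimal edge perception node: reads LiDAR scans and flags close obstacles."""

    def __init__(self):
        super().__init__('edge_perception')
        # '/scan' is a conventional LiDAR topic name; adjust to the robot's driver.
        self.create_subscription(LaserScan, '/scan', self.on_scan, 10)

    def on_scan(self, msg: LaserScan):
        nearest = min((r for r in msg.ranges if r > 0.0), default=float('inf'))
        if nearest < STOP_DISTANCE_M:
            # A real system would command a certified safety stop here;
            # this sketch only logs the event.
            self.get_logger().warn(f'Obstacle at {nearest:.2f} m')


def main():
    rclpy.init()
    rclpy.spin(EdgePerceptionNode())


if __name__ == '__main__':
    main()
```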

Humanoid hype: ARC recommends the pragmatic approach

In the modern robotics market, the humanoid robot form factor garners intense media attention and dominates discussions of physical intelligence.

While I cannot deny their technological triumph and futuristic appeal, I advise manufacturers seeking practical solutions to consider the function, not the form.

See also: Data quality issues costing manufacturers billions, survey says

We must avoid the strategic error of "paving the cowpaths," that is, simply forcing a human-shaped robot into a process designed for humans rather than fundamentally rethinking the process to utilize the most efficient robotic form for the task. For a modern enterprise, the humanoid form factor is just another tool in the arsenal.

Humanoid designs promise incredible versatility and the potential to use human tools but currently face significant limitations.

Compared to dedicated robots that have been specifically designed for a certain purpose, humanoids carry an order of magnitude greater programming complexity, higher price points, and greater requirements for connectivity and power.

With these factors in mind, ARC does not expect humanoids to replace existing robots, but rather to complement them.

Next-generation robots must be intelligent, secure, and connected

While the subject of physical intelligence of course puts the intelligence of robots at the forefront, that intelligence can only function properly if it is both connected and secure.

The powerful AI models that constitute the modern robot's mind allow it to manage complex environments and react to change.

See also: The hardware problem that is stalling half of all digital transformation projects

However, unlocking the next stage of value comes when the robot's capabilities can be enhanced with contextual knowledge, such as assembly processes, quality reports, and production changes, provided by industrial data fabrics and orchestration systems.
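What that contextual enrichment looks like in practice depends entirely on the data fabric and orchestration products involved, so the sketch below is speculative by design: the fabric client, its query() method, and the task API are invented purely to show the idea of conditioning a robot's next task on assembly and quality context.

```python
def plan_next_task(robot, fabric):
    """Illustrative only: condition the robot's next task on plant context.

    `fabric` stands in for a hypothetical industrial data fabric client;
    `query()` and the record fields below are invented for this sketch.
    """
    # Pull context the robot cannot sense on its own.
    order = fabric.query("active_work_order", station=robot.station_id)
    quality = fabric.query("latest_quality_report", part=order["part_number"])

    # Use context to pick behavior, rather than a fixed, pre-programmed routine.
    if quality["defect_rate"] > 0.02:
        return robot.tasks.inspect(order["part_number"])   # hypothetical task API
    return robot.tasks.assemble(order["part_number"], revision=order["revision"])
```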

As robots continue to bridge the digital and physical domains, they introduce another imperative: the need for security. A compromised robot is not just a data breach risk; it’s a potential safety and operational hazard.

As industry incorporates new technology and workflows for robotics, we must apply the same cybersecurity rigor and standards, like IEC 62443, to a fleet of mobile robots as we do to a plant's distributed control system.

The future of robotics will not be defined solely by technological breakthroughs, but by how well we integrate systems that are intelligent enough to adapt to their environment, connected enough to coordinate at a higher level, and secure enough to maintain operational integrity.

About the Author

Patrick Arnold

Patrick Arnold is a research analyst at ARC Advisory Group. His primary focus at ARC is industrial IoT networking solutions, including topics such as network infrastructure, software, and edge computing.
