As visitors moved through Hannover Messe 2026, one familiar scene unfolded across exhibition booths – robot arms reaching into bins of scattered parts, selecting a target, adjusting in real time and placing components onto the next tray within seconds.
At first glance, the demonstration seemed routine. Robotic picking has long been a staple of industrial trade fairs. Yet what distinguished this year's displays was not the motion itself but the intelligence behind it. With artificial intelligence (AI) more deeply integrated into production, machines are beginning to perceive, decide and act within workflows, rather than simply execute pre-programmed instructions.
Out of theory, into application
At Hannover Messe, that shift is becoming visible in real-world operations. “AI starts in the physical world, and it ends in the physical world,” Cedrik Neike, a member of the managing board of Siemens AG and CEO of Siemens Digital Industries, said. In his view, the industrial AI technology stack begins with data generated in real-world settings, including sensors, control systems and machinery. After being processed through computing power and specialised software models, the data is fed back into real-world industrial operations.
"Without bringing AI into the real world, it is just a brain in a jar," he said. How, then, does AI move out of the "jar" and into real-world use? Neike pointed to Siemens' Innovation Hub at this year's fair, where the company showcased flexible production of shoe soles using additive manufacturing. In this example, AI runs through the entire production chain.
Users submit customisation requests through an AI chat interface, while backend AI coordinates the appropriate design tools. AI agents manage production autonomously, humanoid robots transport the soles through the manufacturing process and AI-controlled robots complete final packaging.
“With industrial AI, factories become more adaptable, more flexible and more resilient,” he said.
Physical AI opens new scenarios
At the Robotics & Assembly Automation zone, visitors could hardly miss the humanoid robots moving through the exhibition space. Some shook hands with attendees, others carried out material-handling tasks, and some even sat down between demonstrations, reinforcing the impression that the factory of the future is coming into view.
For the first time at Hannover Messe, physical AI was presented as a central theme. Organizers define it as AI that interacts directly with the physical world — for example, through machines, plants and robots.
“AI thus becomes a productive force in the factory – especially in industrial and humanoid robots,” said Jochen Koeckler, chairman of the managing board of Deutsche Messe AG, the fair’s organizer. The view is echoed by recent industry research.
In a report released ahead of the fair, Capgemini surveyed 1,678 senior executives across 15 industries and found that 67 percent viewed physical AI as a game changer, saying it “enables robots to interpret context, adapt in real time, and operate in unstructured environments.”
Nearly eight in 10 organisations reported they were already engaging with physical AI. "One advantage of physical AI is that we can tackle use cases that were not possible before," Jochen Lindermayr, intelligent mobile robots project leader at the Fraunhofer Institute for Manufacturing Engineering and Automation, said, pointing in particular to articulated objects such as cables, which are difficult to handle with conventional programming. He added that although physical AI has yet to match the robustness of classical approaches, it may eventually develop capabilities that surpass them, much as digital cameras eventually did.
AI agents drive coordinated action
As capabilities such as perception, decision-making and execution continue to move onto the factory floor, AI agents are emerging as an important step in the evolution of industrial AI. Alexander Zorn, data scientist at the Fraunhofer Institute for Intelligent Analysis and Information Systems, said an agent is “a system designed to perceive its environment, make decisions and take actions to achieve specific goals. Agents operate autonomously, without direct human control.”
In his view, the primary task of an AI agent system is to break down complex problems into subtasks using a main agent. “These subtasks are then executed using subagents or other tools. The system observes execution, detects errors, and independently revises the plan if necessary,” he said.
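The loop Zorn describes can be sketched in a few lines of code. The sketch below is purely illustrative and not drawn from any vendor's system: a main agent decomposes a goal into a fixed plan of subtasks, dispatches them to a subagent, observes each execution, and retries when a subtask fails. All class names, subtask names and the failure behaviour are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Subtask:
    name: str
    attempts: int = 0

class SubAgent:
    """Stand-in executor: simulates a transient failure on the first
    attempt at 'calibrate' so the main agent has something to revise."""
    def execute(self, task: Subtask) -> bool:
        task.attempts += 1
        return not (task.name == "calibrate" and task.attempts == 1)

class MainAgent:
    def __init__(self, worker: SubAgent):
        self.worker = worker

    def decompose(self, goal: str) -> list:
        # A real system would plan with a model; here the plan is fixed.
        return [Subtask("calibrate"), Subtask("pick"), Subtask("place")]

    def run(self, goal: str, max_retries: int = 2) -> list:
        log = []
        plan = self.decompose(goal)
        while plan:
            task = plan[0]
            if self.worker.execute(task):
                log.append(f"done: {task.name}")
                plan.pop(0)  # subtask succeeded, advance the plan
            elif task.attempts <= max_retries:
                log.append(f"retry: {task.name}")  # observed failure, revise
            else:
                log.append(f"abort: {task.name}")  # give up on this subtask
                break
        return log

print(MainAgent(SubAgent()).run("assemble part"))
# → ['retry: calibrate', 'done: calibrate', 'done: pick', 'done: place']
```

The key design point, matching Zorn's description, is that error detection and plan revision live in the main agent's loop, not in the subagents: the worker merely reports success or failure, and the coordinator decides whether to retry, continue or abort.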
In industrial settings, such capability is pushing AI further into operational workflows and cross-functional coordination. Speaking at a panel discussion on AI agents, Norbert Jung, CEO of Bosch Connected Industry, said that expert knowledge in manufacturing remains scarce and fragmented, especially when it is needed most, such as in the early hours of the morning or on weekends, when experts may not be available on site.
“This is where AI can make a real difference,” he said, adding that AI agents can provide guidance when machines or production lines go down, helping non-experts restore operations.
In multi-agent systems, moreover, a machine failure can trigger a maintenance agent to adjust schedules, which may in turn activate a continuous optimization agent to improve plant performance.
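That cascade is naturally modelled as agents reacting to each other's events. The minimal sketch below is a hypothetical illustration, not taken from any product described at the fair: a failure event wakes a maintenance agent, and its schedule change in turn wakes an optimization agent. The event names, agent names and machine ID are all invented for the example.

```python
from collections import defaultdict

class EventBus:
    """Tiny publish/subscribe bus: agents register handlers per topic."""
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.handlers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self.handlers[topic]:
            handler(payload)

trace = []
bus = EventBus()

def maintenance_agent(event):
    trace.append(f"maintenance: reschedule after failure of {event['machine']}")
    # Adjusting the schedule emits a follow-on event for other agents.
    bus.publish("schedule_changed", {"machine": event["machine"]})

def optimization_agent(event):
    trace.append(f"optimization: rebalance line around {event['machine']}")

bus.subscribe("machine_failure", maintenance_agent)
bus.subscribe("schedule_changed", optimization_agent)

# One failure event ripples through both agents.
bus.publish("machine_failure", {"machine": "press-3"})
print(trace)
# → ['maintenance: reschedule after failure of press-3',
#    'optimization: rebalance line around press-3']
```

No agent calls another directly; each only reacts to events on the bus, which is what lets new agents join the cascade without rewiring the existing ones.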
Speaking at the same panel, Manikantan NS, senior vice president and global head of manufacturing at Tech Mahindra, said that deploying AI agents in industry is not purely a technical matter.
“The human element and organizational culture are also important,” he said, emphasizing that the key lies in “coordinated action,” especially with humans in the loop.
- A Tell Media / Xinhua report