At A Glance – Embodied AI

Artificial intelligence gets physical

Embodied AI is one of many terms to have emerged from the relentless development of artificial intelligence. As the name suggests, it involves equipping software with a physical body and exploring how that body fits into real-world environments. Embodied AI is based on embodied cognition – the idea that intelligence is shaped as much by the body as by the brain. By applying this logic to artificially intelligent systems, researchers hope to improve their functionality. Process automation, chatbots, advanced robotics, autonomous drive technology, and personal companions like Buddy and Jibo could all benefit from embodied intelligence.
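At its simplest, an embodied agent runs a continuous sense–think–act loop: read the body's sensors, decide, drive the actuators, repeat. The sketch below illustrates that loop in Python; every class and method name here is invented for illustration and does not come from any particular robotics framework.

```python
# A minimal sketch of the sense-think-act loop that embodiment adds on top of
# pure software intelligence. All names are illustrative, not from a real API.

from dataclasses import dataclass

@dataclass
class Observation:
    distance_to_obstacle_m: float  # reading from a hypothetical range sensor

class EmbodiedAgent:
    """An agent whose decisions are grounded in its body's sensors and actuators."""

    def sense(self) -> Observation:
        # On real hardware this would poll cameras, IMUs, range finders, etc.
        return Observation(distance_to_obstacle_m=1.2)

    def think(self, obs: Observation) -> str:
        # A policy conditioned on the body's situation in the physical world.
        return "brake" if obs.distance_to_obstacle_m < 0.5 else "advance"

    def act(self, command: str) -> None:
        # On real hardware this would drive motors; here we just report it.
        print(f"actuator command: {command}")

agent = EmbodiedAgent()
for _ in range(3):  # on a robot this control loop runs continuously
    agent.act(agent.think(agent.sense()))
```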

The case for Embodied AI was set out in a 2012 paper written by two University of Zurich researchers, Matej Hoffmann and Rolf Pfeifer. ‘Good Old Fashioned AI’, as they call it, relies on external data streams to categorise the physical environment. Unfortunately, these streams can be limited: a control system based on external data alone would struggle to judge distance and size. By also drawing on the body’s own sensory information, the cognitive function of AI is thought to improve. Today, there are already intelligent machines that hand certain functions – like walking – over entirely to the mechanism itself. That may be a low-level task, but it’s only the first step.
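One way to see the benefit is as sensor fusion: a distance estimate derived from the body (for example, odometry – how far the wheels have actually turned) can be combined with a vision-only guess to produce a more certain result. The sketch below uses inverse-variance weighting with made-up numbers; it illustrates the general principle, not the specific method of the 2012 paper.

```python
# A minimal sketch, with invented numbers, of why bodily information helps:
# fusing a noisy vision-only distance estimate with a body-derived (odometry)
# estimate, weighting each by the inverse of its variance.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float) -> tuple[float, float]:
    """Inverse-variance weighted fusion of two independent estimates."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)  # fused variance <= min(var_a, var_b)

vision_only = (2.4, 0.50)  # camera guess of distance (m) and its variance
body_based  = (2.0, 0.10)  # odometry guess: the body "knows" how far it moved

estimate, variance = fuse(*vision_only, *body_based)
print(f"fused distance: {estimate:.2f} m (variance {variance:.3f})")
# Prints ~2.07 m with variance ~0.083 - lower than either input alone,
# i.e. the embodied signal makes the distance estimate strictly more certain.
```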

Embodied AI could help to build AI that can function in society, using physical surroundings and situations to ground deeper data analysis. Autonomous vehicles and intelligent robots, for example, will need highly advanced depth perception to work in real-world environments. Inevitably, this leads to debate about how the technology should develop. Traditionally, AI has been likened to a manufactured brain. Discussing and experimenting with the embodiment of software, however, will take the technology into new, uncharted territory.
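As a concrete taste of what machine depth perception involves, the classic stereo triangulation relation recovers distance from the disparity between two camera views: depth = f × B / d. The parameters below are invented for illustration; production systems calibrate them precisely and fuse this single cue with lidar, radar, motion parallax, and learned priors.

```python
# A minimal sketch of one ingredient of depth perception: recovering distance
# from a calibrated stereo camera pair. Parameter values are invented.

def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo relation: depth = focal length * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (point visible in both views)")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 12 cm between cameras, 20 px disparity.
print(f"{stereo_depth_m(700.0, 0.12, 20.0):.2f} m")  # -> 4.20 m
```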