Alexa’s chief scientist thinks the assistant needs a robot body to understand the world

Amazon’s Rohit Prasad, head scientist of the Alexa division, says the company’s voice assistant would be far smarter if it had a robot body and cameras to move around in the real world. Speaking at MIT Technology Review’s EmTech Digital AI conference in San Francisco yesterday, Prasad said, “The only way to make smart assistants really smart is to give it eyes and let it explore the world.”

Some Alexa-enabled smart devices already have cameras, but a robot body would be new. Prasad’s comments suggest Amazon’s work could one day lead to giving Alexa a body, although he wouldn’t confirm this directly. Because Prasad leads work on natural language processing and other machine learning capabilities for Alexa, he would be one of the few Amazon employees in a position to easily test such features.

Giving smart assistants cameras and physical bodies would grant them more data about the world around them and help them build what we might consider “common sense,” potentially making them more intelligent and more capable of performing complex tasks that require visual context and other cues human brains take for granted. (The theory behind this blend of neurobiology, robotics, and AI is often referred to as embodied cognition.) Prasad’s comments also line up with a Bloomberg report from last year that Amazon is developing a mobile Alexa robot that could follow users around the home.