Researchers at the Massachusetts Institute of Technology (MIT) have taught robots to understand commands spoken in natural language.  The goal is to turn robots into full-fledged assistants that people can talk to the same way they talk to one another.

The researchers explained that in the coming years, robots will help people both at home and at work.  Voice interaction will remain cumbersome, however, if the devices understand only explicit, rigidly phrased instructions.  For people to communicate with robots the way they do with each other, robots must be able to take commands in ordinary spoken language.

To achieve this, the team built a planner from thousands of sample utterances and taught it to recognize the structure of spoken language.  The resulting system combines a deep neural network with a planner trained on that sample set.
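To illustrate that division of labor, here is a minimal sketch (not the actual MIT code): a stand-in "neural" parser maps a free-form utterance to a structured goal, and a simple symbolic planner expands that goal into primitive steps.  All names and heuristics below are hypothetical; the real system uses a trained deep network in place of the toy parser.

```python
from dataclasses import dataclass

@dataclass
class Goal:
    action: str
    target: str

def neural_parse(utterance: str) -> Goal:
    """Stand-in for the deep network: map spoken language to a structured goal."""
    words = utterance.lower().split()
    # Toy keyword heuristic standing in for learned language understanding.
    if "pick" in words or "grab" in words:
        return Goal(action="pick_up", target=words[-1])
    if "bring" in words:
        return Goal(action="deliver", target=words[-1])
    raise ValueError(f"could not interpret: {utterance!r}")

def plan(goal: Goal) -> list[str]:
    """Symbolic planner: expand a structured goal into primitive action steps."""
    steps = {
        "pick_up": ["locate({t})", "move_to({t})", "grasp({t})"],
        "deliver": ["locate({t})", "move_to({t})", "grasp({t})",
                    "move_to(user)", "release({t})"],
    }
    return [s.format(t=goal.target) for s in steps[goal.action]]

if __name__ == "__main__":
    for utterance in ["please grab the mug", "bring me the keys"]:
        print(utterance, "->", plan(neural_parse(utterance)))
```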


The MIT researchers said:

"The main advantage of our planner is that it does not require big data; training a robot in the future should look like training a dog."

The planner can also collect data on commands and interactions the robot has not encountered before.  If the system gets confused, the planner records the utterance and flags the problem for an engineer; once the problem is resolved, the system can interpret similar commands going forward.
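A sketch of that flagging behavior, reusing the hypothetical parser and planner from the example above; the FlaggingPlanner class and its log format are illustrative, not the researchers' implementation:

```python
import json
from pathlib import Path

class FlaggingPlanner:
    """Wraps the planner and records commands it cannot interpret."""

    def __init__(self, log_path: str = "unresolved_commands.jsonl"):
        self.log_path = Path(log_path)

    def execute(self, utterance: str) -> list[str] | None:
        try:
            goal = neural_parse(utterance)   # parser from the sketch above
            return plan(goal)
        except ValueError:
            self._flag(utterance)            # remember the confusion
            return None

    def _flag(self, utterance: str) -> None:
        """Append the unhandled command for later engineer review."""
        with self.log_path.open("a") as f:
            f.write(json.dumps({"utterance": utterance,
                                "status": "unresolved"}) + "\n")

planner = FlaggingPlanner()
planner.execute("do a backflip")  # unknown command: logged, returns None
```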

Many existing machine learning models cannot explain what went wrong when a robot fails to complete a task. With the new method, however, researchers can see which problems prevented successful completion in the past and adjust the architecture accordingly.
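Continuing the same hypothetical setup, an engineer could review the flagged log to see exactly which commands the planner could not handle before changing the system:

```python
import json

# Read back the commands the planner flagged as unresolved.
with open("unresolved_commands.jsonl") as f:
    for line in f:
        record = json.loads(line)
        print("needs attention:", record["utterance"])
```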