Mertz active vision robotic head from MIT

by Mark R

I have seen realistic robot heads like the Mertz active vision head before, and they are usually built to explore what it would be like if robots could display facial expressions.

With that in mind, you might wonder why someone would design this robot head to look like a freaky baby rather than giving it a more handsome face.

Actually, the purpose of this robot isn’t to test how a human will respond to it, but how the robot will react to humans. Check out more on this MIT experiment after the jump.

According to my source, Mertz is “built to recognize and react to faces and expressions, aiming to research socially situated learning which is similar to an infant’s learning process”.

In other words, it is a robot that can "learn" much as my baby son does: recognizing facial features and picking up speech by parroting the people around him.
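If you are curious what the "recognize faces" part looks like in code, here is a minimal sketch using OpenCV's stock Haar-cascade detector and a webcam. This is purely an illustration of the general idea on my part, not the actual Mertz software, which the researchers have not published in this post.

import cv2

# Load the frontal-face detector that ships with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Haar cascades work on grayscale images.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Draw a box around each detected face.
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()

A real social robot would go much further, of course, tracking faces over time and learning from the interaction, but this is the basic building block.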

So how much emotional expression is it capable of? Well, I once heard that the human face can produce thousands of distinct expressions, and I believe Mertz has 13 degrees of freedom spread across its neck, eyes, and face. Who knows how expressive those eyes and that mouth can be.

This robot is the creation of Lijin Aryananda and Jeff Weber at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL). I hope this experiment works, although if it does, it will mean that machines really can learn. Whoa, I'm not sure I'm ready for that one.

