By Sudarshana Banerjee
Honda Motor has unveiled a new ASIMO humanoid robot equipped with what the company says could be the world’s first autonomous behavior control technology. What this means is that the robot will be able to function without being controlled by an operator. It also has ‘significantly improved intelligence’ and the physical ability to adapt to situations, which Honda says brings ASIMO (the name stands for Advanced Step in Innovative Mobility) closer to practical use in real-life situations.
Let’s see what the new ASIMO comes with:
1) High-level postural balancing capability – which enables the robot to maintain its posture by putting out its leg in an instant,
2) External recognition capability – which enables the robot to integrate information, such as the movements of people around it, from multiple sensors and estimate the changes that are taking place, and
3) The capability to generate autonomous behavior – which enables the robot to make predictions from gathered information and autonomously determine its next behavior without being controlled by an operator.
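Honda has not published how ASIMO’s controller actually works, but as a rough mental model, here is a toy Python sketch of capabilities (2) and (3) working together: integrate sensed positions of nearby people, predict where they will be, and pick the next behavior with no operator in the loop. Every name in it is made up for illustration.

```python
# A toy sketch (not Honda's actual code) of the sense-predict-act idea
# behind capabilities (2) and (3). All classes and functions are hypothetical.

from dataclasses import dataclass

@dataclass
class Person:
    position: float   # metres from the robot, along its path
    velocity: float   # metres per second (negative = approaching)

def predict_position(person: Person, horizon: float = 1.0) -> float:
    """Naive constant-velocity prediction: where will this person be in `horizon` seconds?"""
    return person.position + person.velocity * horizon

def choose_behavior(people: list[Person]) -> str:
    """Autonomously pick the next behavior from predicted positions -- no operator input."""
    for person in people:
        if predict_position(person) < 0.5:   # predicted to come within half a metre
            return "step aside"
    return "keep walking"

if __name__ == "__main__":
    # Simulated readings from the external-recognition layer: two people detected.
    sensed = [Person(position=2.0, velocity=-1.8), Person(position=4.0, velocity=0.2)]
    print(choose_behavior(sensed))  # -> "step aside": the first person will be too close
```

The real robot obviously fuses far richer sensor data than two numbers per person, but the shape of the problem – predict first, then act – is the same.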
Why is all this such a big deal?
You may have read your Steven Pinker and know what a Herculean feat this is (in case you are wondering what the heck I am on about, take a look at How the Mind Works). Electronic devices, lacking an actual ghost in the machine, are at the absolute mercy of code and operator. A Lexus factory near Tokyo, for example, has some 66 humans and 310 robots, each programmed to perform just one component of the manufacturing process, and together man and robot build 300 Lexus sedans a day.
Why does it have to be this way? Well, to begin with, robots lack ‘vision’ – the world around them, including the blue sky and the red rose, and all the mortals scattered between heaven and earth, is just numbers to be crunched. A robot has no comprehension of space, and no perception of depth. That is why my Nano Hex bug will nonchalantly walk off the table without the least bit of concern for its safety. Oh yes, robots have no inherent concept of safety or self-preservation either. These are highly sophisticated machines, with advanced algorithms and adaptive learning capabilities, but at the end of the day, still machines. Robotics engineers are yet to find a way to drill an ethical compass into robots, and robots are about as capable of obeying the Three Laws of Robotics (to use a mythical benchmark) as your desk calculator.
The way robots see is this: their cameras reduce the environment around them to a series of variables and numbers in near real time. For ASIMO to interact with its environment, the robot has to not only compute, but react to external stimuli at lightning speed, almost as fast as human beings do. That’s a lot of visual data translated into numbers and translated back into instructions, even as the visual cues keep changing all the time. Now, if the robot were pre-programmed to do just one thing – dodge humans, say – then the instructions could have been hardwired into the robotic circuitry. The beauty of ASIMO is that, like a computer, the robot can do a whole range of things. Which makes the coding bit infinitely more challenging.
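To make that “react at lightning speed” point concrete, here is another toy Python sketch (again, all names are hypothetical, not Honda’s code): the sense-compute-act cycle run against a fixed time budget, so the robot keeps up with a scene that changes faster than any one computation.

```python
# A toy sense-compute-act loop run on a deadline. A real humanoid would
# run something like this hundreds of times per second, on real sensor data.

import time

def read_sensors() -> dict:
    """Stand-in for cameras and balance sensors: the world arrives as nothing but numbers."""
    return {"nearest_obstacle_m": 0.4, "tilt_deg": 2.0}

def compute_action(readings: dict) -> str:
    """Turn those numbers back into an instruction."""
    if readings["nearest_obstacle_m"] < 0.5:
        return "stop"
    if abs(readings["tilt_deg"]) > 5.0:
        return "extend leg"   # the postural-balance reflex from capability (1)
    return "walk"

def control_loop(cycles: int = 3, period_s: float = 0.01) -> None:
    for _ in range(cycles):
        start = time.monotonic()
        print(compute_action(read_sensors()))
        # Sleep off whatever is left of the 10 ms budget before sensing again.
        time.sleep(max(0.0, period_s - (time.monotonic() - start)))

if __name__ == "__main__":
    control_loop()
```

The hard part, as the paragraph above says, is that a general-purpose robot cannot hardwire one `compute_action` for one task – it needs something like this for every behavior it can perform, all sharing the same sensors and the same deadline.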
Have to share this: I remember when I first ‘met’ ASIMO in India in 2003, at a Honda media conference. The robot could walk sideways! And backwards! And shake a leg. You’ve come a long, long way, baby!
When can I have my own robot?
Not anytime soon. Sorry. For one, ASIMO is still too expensive to mass-produce. Plus, the technology is still experimental. Say, for example, you have a client over for coffee, and ASIMO accidentally spills hot coffee on that person – that is not the kind of liability I see Honda taking on right now, and neither would you.
Imagine what will happen if we can marry AI capabilities (like the iPhone’s Siri) with Honda’s robotics prowess… better get ready for those Turing Tests, eh?
(Sudarshana Banerjee is consulting editor with techtaffy.com. She can be reached at [email protected])