Saturday, 23 March 2013

Robotic hands matching human capabilities


As part of the ongoing rise of consumer-level robotics, recent research in artificial intelligence and bio-inspired devices has opened up a new range of possibilities. Modern robots are now able to fill an increasingly broad scope of roles in both home and work environments.* Easily one of the most important (and difficult) abilities for such machines is being able to recognise and interact with a wide variety of physical objects. For simple or repetitive tasks, such as assembly line production, this ability was relatively straightforward to achieve, requiring only basic programming and mechanical systems. However, the growing complexity of the environments that commercial robots must now operate in has driven research into more intricate and capable mechanisms.
As has often been the case, engineers turned to the human body itself to model both the form and function of these new robotic appendages. Since almost all robots must interact with and handle physical objects in some way, the hand is among the most commonly emulated body parts. Together with their associated control programs and visual recognition software, robotic hands of the 2000s and 2010s had already demonstrated some impressive abilities. They could pick up delicate objects,* catch objects thrown to them,* make a range of gestures,* fold towels,* pour drinks and even prepare meals.* Despite this, the sheer dexterity and flexibility of the human hand, combined with the practical limits of mechanical components, prevented scientists from achieving a perfect recreation.
By the second half of the 2020s, however, the techniques involved have become sufficiently advanced to overcome most of the obstacles faced in previous decades. Around this time, some of the first robot hands equalling the capabilities of human hands are appearing in the laboratory.* Advances in nanotechnology,* miniaturisation and micro-electronics have allowed engineers to account for almost all of the subtle movements performed by a living biological hand. Graphene-based actuators that convert electricity into motion, artificial skin, tactile sensors,* flexible electronics and various other features are employed to emulate the real thing. This progress has also resulted from an improved biological understanding of how humans manipulate objects.
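
To give a rough idea of how tactile sensing can feed into motor control, the short Python sketch below implements a simple slip-triggered grip adjustment, loosely modelled on the reflexive way humans tighten their grip when an object begins to slide. The sensor and actuator classes are hypothetical placeholders invented for illustration only; they do not correspond to any real hardware interface mentioned in this article.

# Minimal sketch of a tactile feedback loop for grip-force control.
# The sensor and actuator classes are hypothetical placeholders, not a real API.

class TactileSensor:
    """Returns normal force (N) and a simple slip estimate from a fingertip pad."""
    def read(self):
        # A real device would sample a pressure/shear sensor array here.
        return {"force": 1.2, "slip": 0.0}

class FingerActuator:
    """Wraps a motor or artificial-muscle actuator driving one finger joint."""
    def apply_force(self, force_newtons):
        print(f"commanding {force_newtons:.2f} N of grip force")

def grip_control_step(sensor, actuator, target_force, slip_gain=0.5):
    """One iteration of a human-like grip strategy: hold a light baseline
    force and squeeze harder only when slip is detected."""
    reading = sensor.read()
    # Raise the commanded force in proportion to detected slip, mimicking
    # the reflexive grip adjustment seen in human manipulation studies.
    command = target_force + slip_gain * reading["slip"]
    actuator.apply_force(command)
    return command

if __name__ == "__main__":
    sensor, actuator = TactileSensor(), FingerActuator()
    for _ in range(3):  # run a few control iterations
        grip_control_step(sensor, actuator, target_force=1.0)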
AI programs, using precise visual perception software, are able to recognise countless physical objects and intelligently plan how to manipulate them. The robotic hand is therefore able to function autonomously, adjusting itself to different objects based on their texture, weight and shape. All of this is accomplished with fluid, natural movements that are largely indistinguishable from those of a real hand. Though still at the trial stage, such systems will prove extremely useful in the development of human-like robots and androids. By the middle of this century, the subtle capabilities offered by robotic hands will allow machines to interact with humans and their environment in myriad new ways.*
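
As a sketch of how such a planning stage might map recognised object properties onto a grasp, the Python example below chooses a grip type and force from an object's weight and texture. The object descriptions, thresholds and grasp categories are illustrative assumptions for this sketch, not part of any actual system described here.

# Illustrative only: the object properties and grasp parameters below are
# invented for demonstration and do not describe a specific robotic platform.

from dataclasses import dataclass

@dataclass
class DetectedObject:
    name: str
    shape: str        # e.g. "cylinder", "box", "sphere"
    weight_kg: float
    texture: str      # e.g. "smooth", "rough", "deformable"

@dataclass
class GraspPlan:
    grip_type: str
    grip_force_n: float

def plan_grasp(obj: DetectedObject) -> GraspPlan:
    """Choose a grasp strategy from recognised object properties.
    A fuller planner would also use the shape to pick contact points."""
    # Pinch small, light items with the fingertips; wrap heavier ones in the palm.
    grip_type = "pinch" if obj.weight_kg < 0.2 else "power"
    # Heavier and smoother objects need more force to avoid slipping,
    # while deformable objects get a capped force so they are not crushed.
    force = 2.0 + 8.0 * obj.weight_kg
    if obj.texture == "smooth":
        force *= 1.5
    elif obj.texture == "deformable":
        force = min(force, 3.0)
    return GraspPlan(grip_type, round(force, 2))

if __name__ == "__main__":
    egg = DetectedObject("egg", "sphere", 0.06, "deformable")
    bottle = DetectedObject("bottle", "cylinder", 1.0, "smooth")
    print(plan_grasp(egg))     # gentle pinch grip
    print(plan_grasp(bottle))  # firmer power grip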

[Image: most advanced robot hand]
