Robots pose a threat of taking over our jobs. Will our hands prevent, or at least slow down, that takeover? Along with many other organs, we use our hands to perform countless tasks, and their role in those tasks is indispensable. For example, we use our hands to sense how ripe a fruit is without causing any damage. To be eligible to take over such jobs, robots need hands with a similar capability. It turns out, however, that our hands are incredibly complex to imitate. Although we use many of our hands' innate features intuitively, they are highly complex, if not impossible, to build into robots. Does this mean that robots' hands, as opposed to ours, are going to slow down, perhaps prevent, robots from taking over our jobs?
The development of robot hands started more than 300 years ago. Japan's tea-serving karakuri dolls had human-like hands and fingers and could serve tea, though they could not move their fingers. Since then, there have been numerous attempts to give robot hands human-like capabilities. From one perspective, it is easy to get a robot hand to grasp an object: simple, straightforward grippers with two or three fingers are good enough for specific tasks. But developing hands with four fingers and a thumb that mimic human hands is far more complicated. For simple, manufacturing-like tasks, the first kind of hand is quite adequate; however, it is not suitable for delivering services such as caring for the elderly or performing household chores. The humanoid robot ASIMO also reached the end of its life cycle partly due to this limitation.
Robot hand Dactyl: still too far from letting robots take over many jobs
In 2018, The New York Times published a review of state-of-the-art robot hands, notably the robot hand Dactyl. Researchers at the OpenAI laboratory in San Francisco, a lab founded by Elon Musk and several other big Silicon Valley names, have been developing a robot hand whose fingers bend and straighten like a human hand's. With four autonomous fingers and a thumb, Dactyl can spin, twist, and flip an alphabet block in nimble ways. Although this is a simple task for a 3-year-old, it took OpenAI a multi-year journey, and even for the high-caliber researchers of this flagship Silicon Valley AI laboratory, Dactyl is a notable achievement.
While researchers struggled to master far simpler tasks with far simpler hands in the recent past, Dactyl's nimble handling of a cube appears to be an enormous leap in robotics research. However, it took years of intensive R&D, and some researchers find Dactyl's long training regime, with countless trials and errors and thousands of images fed to a neural network, quite impractical.
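The trial-and-error training behind hands like Dactyl can be illustrated with a deliberately tiny sketch. The real system trains large neural networks in simulation; here, a toy reward function and a random-perturbation search stand in for both, and every name and number below is an illustrative assumption, not OpenAI's method.

```python
import random

def grasp_reward(angle, target=0.6):
    # Toy stand-in for a simulated grasp attempt: reward peaks when
    # the (hypothetical) finger angle matches the target pose.
    return -abs(angle - target)

def train_by_trial_and_error(trials=2000, seed=0):
    rng = random.Random(seed)
    best_angle = 0.0
    best_reward = grasp_reward(best_angle)
    for _ in range(trials):
        # Perturb the current best guess and keep it only if the
        # simulated grasp improves -- the essence of trial and error.
        candidate = best_angle + rng.uniform(-0.1, 0.1)
        reward = grasp_reward(candidate)
        if reward > best_reward:
            best_angle, best_reward = candidate, reward
    return best_angle
```

Even this toy loop needs thousands of attempts to tune a single number, which hints at why training a full hand this way is so expensive.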
Gripper, Picker, Bed Maker, and Pusher
Researchers at a robotics lab at the University of California, Berkeley are developing Gripper. Equipped with a two-fingered gripper, their robots can pick up objects like a screwdriver, but they fail with a restaurant-style ketchup bottle. The lab has also developed Picker, which combines a gripper with a suction cup. More importantly, the researchers have modeled the physics of more than 10,000 objects; these models help identify the best way to pick up each one. But if an object is flexible, with thousands of possible configurations, handling it becomes nearly insurmountable. Another Berkeley lab is developing Pusher, which lets a robot push an object with a gripper and predict where it will go.
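The idea of precomputing physics models and then selecting the best way to pick up each object can be sketched as a lookup over scored grasp candidates. This is a hypothetical illustration, not Berkeley's actual system; the database structure, poses, and scores are invented for the example.

```python
def best_grasp(object_name, grasp_db):
    # Return the candidate grasp with the highest predicted success
    # score, or None if the object was never modeled.
    candidates = grasp_db.get(object_name, [])
    if not candidates:
        return None
    return max(candidates, key=lambda g: g["score"])

# A tiny stand-in for a 10,000-object grasp-model database.
grasp_db = {
    "screwdriver": [
        {"pose": "pinch_handle", "score": 0.92},
        {"pose": "pinch_shaft", "score": 0.61},
    ],
    "ketchup_bottle": [
        {"pose": "side_squeeze", "score": 0.18},  # flexible: low confidence
    ],
}
```

The weakness the article describes shows up immediately: a rigid screwdriver has a high-confidence grasp, while a deformable bottle leaves the robot with only poor options, and any unmodeled object leaves it with none.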
So far, progress has produced multiple types of robot hands for simple tasks, which robots can handle under certain conditions. Despite this progress, they still fail often. Growth in machine learning algorithms is continuing, but the underlying neural-network-based algorithms run the risk of saturating before reaching the needed level of perfection: they impress us at the beginning but fail to scale up to the desired level. Hence, many initial signs of AI progress in robot hands and other devices fail to roll out into taking over jobs from humans.
The lack of general-purpose hands prevents robots from taking over jobs
An occupation consists of multiple tasks, and each task involves handling multiple objects. For example, cooking requires cutting, washing, stirring, cleaning, and wiping, and each of these tasks requires handling multiple objects in varying ways, demanding different abilities of our hands. Hence, to develop a robot for an occupation, we must develop its ability to handle multiple objects across each of the given tasks. One essential attribute is an excellent pair of hands, capable of precise fine movements and autonomous grasping. The challenge of making a single robot perform diverse but related tasks, such as hammering nails and changing batteries, is quite daunting. It involves inventing new designs that combine hard and soft elements, the way human bone gives strength to a grip while skin spreads the pressure, so a glass of water doesn't shatter.
Of course, technological advancement is helping. For example, the miniaturization of cameras and electronic sensors allows us to place pressure sensors and cameras in a robotic hand; a small camera can even sit underneath a fingertip. Feedback from these sensors can drive run-time adjustment of the grip to prevent objects from slipping out of a robot's hand. Such progress would give us a robot hand that can detect changes in the objects it is handling, or manipulate items while holding them. In many tasks, such as cutting fruit, knot-tying, or wire-stripping, this capability is vital.
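The run-time grip adjustment described above amounts to a feedback loop: read the pressure sensors, treat a pressure drop as an impending slip, and tighten slightly. A minimal sketch follows, where the function name, thresholds, and sensor interface are all assumptions for illustration:

```python
def adjust_grip(pressure_readings, grip_force,
                slip_threshold=0.2, step=0.05, max_force=1.0):
    # For each sensor reading, pressure below the threshold is
    # treated as the object starting to slip, so the grip tightens
    # by a small step, capped at a safe maximum force.
    for pressure in pressure_readings:
        if pressure < slip_threshold:
            grip_force = min(max_force, grip_force + step)
    return grip_force
```

The cap matters as much as the tightening: squeezing harder without limit is exactly how a robot shatters the glass the previous paragraph worries about.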
Touch is vital for robots to handle delicate objects
Touch is crucial for picking up objects, hard or soft, light or heavy, warm or cold, without damaging them. In tasks where a robot's hand or gripper must pick up an object, adding the sense of touch would remove uncertainties; it is especially useful for handling soft, fragile, and deformable objects. To mimic our hands' sense of touch, robot hands need to know the exact position and angle of the applied force and how it will interact with the object being manipulated. One of the best touch-sensing capabilities under development, by MIT researchers, collects data from 548 sensors assembled on a knitted fabric containing a piezoresistive film, connected by a network of conductive thread electrodes.
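What a dense tactile array like that 548-sensor fabric buys you is the ability to localize force. Here is a toy version for a small pressure grid, computing the pressure-weighted position of contact; the grid layout and units are assumptions, not MIT's actual data format.

```python
def contact_centroid(grid):
    # grid: rows of pressure readings from a small tactile array.
    # Returns the pressure-weighted (x, y) position of contact,
    # or None when nothing is touching the sensor.
    total = sum(v for row in grid for v in row)
    if total == 0:
        return None
    x = sum(c * v for row in grid for c, v in enumerate(row)) / total
    y = sum(r * v for r, row in enumerate(grid) for v in row) / total
    return (x, y)
```

With hundreds of sensors instead of four, the same weighted average pins down where a fragile object is pressing against the hand, which is the first step toward adjusting the grip around it.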
Let’s look into our hands
It's quite intuitive to pick up a coffee mug, even without looking at it carefully. We clearly feel the curvature of the handle, the width of the cup, and the slipperiness of the ceramic. Our hand glides into place, senses the weight, and gently brings the mug to our lips. All of this happens seamlessly, without conscious effort. Yet over the last 300 years, we have not managed to imitate even a small fraction of this innate ability in robots. Moreover, our fingers and palms continuously send sensory feedback to our brains, which is essential for optimal handling. The fingertip in particular is mysterious: it works like a barcode, a unique signature for every individual.
In comparison to the limited number of sensors researchers are working with, the outer layer of our skin is imbued with sensors that detect pressure, heat, and other stimuli; fingers and palms are particularly rich in touch sensors. An intricate biological mechanism, with a sublayer of skin called the spinosum and its bumpy microscopic terrain of hills and valleys, gives our hands their sophisticated touch-sensing capability.
Primitive robot hands prevent robots from taking over target jobs
In addition to capable hands, robots also need a set of human-like innate abilities to concentrate on getting tasks done with those hands. So far, robot hands are highly primitive. In most cases, they are task-specific: to qualify for another task, the robot needs a different hand. Outside the factory environment, qualifying for a job means a single robot must perform a set of tasks rather than a single one, which requires a general-purpose hand. Current progress indicates that robot hands remain highly primitive and are advancing very slowly.
Therefore, it is fair to say that the primitive hands of robots will prevent them from taking over jobs from humans. This will have profound effects on some of the predictions being made about the future of work, and the unfolding of the Fourth Industrial Revolution will consequently slow down.