Machines and their maker, the human, are in an increasingly interactive relationship. Just as early humans forged hand tools to triumph over harsh nature, we continue to develop smart machines to interface with information-rich real and virtual worlds. Although machines range from a simple hand tool to a cochlear implant to a complex interplanetary spacecraft, they ultimately serve one purpose: to extend our ability to interact with the world around us. Rapid evolution of technology in information processing, communication and robotics is bringing humans and machines ever closer, even occasionally merging the two within our bodies to augment our biological functions. Interactions with real and virtual environments through sophisticated machines can enhance both our sensorimotor and cognitive abilities, thus contributing not only to the extension of our bodies but also to the evolution of our brains.
Human–machine interaction includes two-way transmission of information: sensory information from the environment to the human that affects our perception of the environment, and action commands from the human to the environment to explore or modify the environment. The human–machine interface is thus the gatekeeper through which we convey our intentions to the machines and they, in turn, give us feedback on task performance. In order to fully realize the benefits of the rapid technological progress in the processing and communication of information, we need machines that augment our ability not only to perceive, but also to act on the environment. Current technologies that interface with our haptic system show the path towards removing some of the spatial, temporal and energetic limitations of our bodies in physically acting on the environment. By providing unprecedented stimuli and responses in real and virtual worlds, these machines enable evolution of our brains as well. In this talk, I will illustrate recent progress in and prospects for machines capable of extending our bodies and evolving our brains.
Level 1 – exploration of the real world
How to hear through your hands: Tadoma
The deaf-blind can "hear" speech at regular speed by touching the speaker's mouth and throat. With training, they can use their hands to feel speech.
Level 2 – manipulation of the real world
Manipulation adds another sensorimotor loop: action commands go out through the hands and sensory feedback returns.
Level 3 – tool use for body enhancement
New technologies can enhance both our bodies and our brains.
Haptic robots serve as interfaces to computers: force feedback gives touch and feel to virtual objects.
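The basic idea behind "touch and feel" for a virtual object can be sketched with a penalty-based force model (an assumed textbook-style illustration, not the API of any particular haptic device): a virtual wall pushes back on the probe with a spring-damper force whenever the probe penetrates it.

```python
def wall_force(x, v, k=500.0, b=2.0):
    """Force (N) felt by the user for probe position x (m) and velocity v (m/s).

    The wall surface is at x = 0; negative x means penetration.
    k (stiffness, N/m) and b (damping, N*s/m) are hypothetical values;
    real devices tune them to the hardware's stable range.
    """
    if x >= 0.0:
        # Probe is outside the wall: free space, no force
        return 0.0
    # Inside the wall: spring pushes the probe out, damper resists motion
    return -k * x - b * v
```

A haptic servo loop would typically run this at around 1 kHz: read the probe position, compute the force, and command the device's motors.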
Haptics in teleoperation
The motion of one robot, which serves as the interface (master), controls the slave robot.
The da Vinci system for medicine, which filters out the surgeon's tremor and can make movements more precise.
Telemanipulation translates human-scale input into nanoscale manipulation.
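The human-to-nanoscale translation amounts to bidirectional scaling: hand motion is scaled down to the tool tip, and measured tip forces are scaled up to a level the hand can feel. A minimal sketch, with hypothetical round-number scale factors (the talk does not specify the actual values):

```python
MOTION_SCALE = 1e-6   # assumed: 1 cm of hand motion -> 10 nm at the tool tip
FORCE_SCALE = 1e9     # assumed: nanonewton tip forces -> newton-level feedback

def to_tool(hand_displacement_m):
    """Scale the operator's hand displacement down to the nanoscale tool."""
    return hand_displacement_m * MOTION_SCALE

def to_hand(tip_force_n):
    """Scale the tiny force measured at the tool tip up for the operator."""
    return tip_force_n * FORCE_SCALE
```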
Touch across the Atlantic: transatlantic teleoperation works, but the time delays require predictive algorithms to mitigate them.
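One simple way a predictive algorithm can mask communication delay is to extrapolate the remote robot's delayed state forward in time (an assumed illustration; real teleoperation systems use more sophisticated model-based predictors):

```python
def predict(pos_prev, pos_now, dt, delay):
    """Extrapolate the remote position `delay` seconds ahead.

    pos_prev, pos_now: the two most recent (delayed) position samples,
    taken dt seconds apart. Linear extrapolation assumes roughly
    constant velocity over the delay interval.
    """
    velocity = (pos_now - pos_prev) / dt
    return pos_now + velocity * delay
```

With, say, a 100 ms transatlantic round trip, displaying the predicted rather than the raw delayed position keeps the operator's visual and haptic feedback closer to the robot's actual present state.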
Application 2 – micro/nano telehaptics
Be immersed in the microworld, and then in the nanoworld.
Cell surgery: cells are typically 5–20 microns across.
Application 3 – BEAMING (Being in Augmented Multi-modal Naturally Networked Gatherings)
The hope is to achieve this in the next 3–4 years: a humanoid robot is controlled by you, and others wearing VR glasses see it as you.
Application 4 – controlling matter with your thoughts
The computer learns the brain signals, determining which signals control an arm.
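A hedged sketch of how a computer might "learn" such a mapping: fit a linear decoder from recorded neural firing rates to arm velocity by least squares (the classic linear-filter idea; the talk does not specify which decoding method the actual systems use, and the data below is simulated).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated training data: 200 time steps of firing rates from 10 neurons,
# paired with the 2-D arm velocity recorded at the same moments.
rates = rng.normal(size=(200, 10))
true_W = rng.normal(size=(10, 2))      # the unknown neural-to-arm mapping
velocity = rates @ true_W

# "Learning" step: least-squares fit of the decoding matrix W.
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# The trained decoder turns new brain activity into an arm command.
new_rates = rng.normal(size=(1, 10))
command = new_rates @ W
```

With noiseless simulated data the fit recovers the mapping exactly; real neural recordings are noisy and nonstationary, which is where the brain's own plasticity, discussed next, comes in.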
Will cortical plasticity merge humans and machines? The cortex learns, developing a region dedicated to controlling a robot arm.
Level 5 – the new self
Superhuman reach and superfast reaction times: instead of the 30-millisecond reaction time of controlling a regular arm, a robot arm can be controlled with a delay of 1–2 milliseconds.
Plug-and-play skill acquisition
Real-time modeling and simulation of huge data sets
Living comfortably in abstract, high-dimensional spaces.
Seamless machine integration at multiple levels.