Walk Again Project aims to have a brain-machine interface exoskeleton help a quadriplegic walk by 2014

The Walk Again Project’s central goal is to develop and implement the first BMI (brain-machine interface) capable of restoring full mobility to patients suffering from severe paralysis. This lofty goal will be achieved by building a neuroprosthetic device with a BMI at its core, allowing patients to capture and use their own voluntary brain activity to control the movements of a full-body prosthetic device. This “wearable robot,” also known as an “exoskeleton,” will be designed to sustain and carry the patient’s body according to his or her mental will.

The Walk Again Project’s specific goal is to send a young quadriplegic striding out to midfield on the opening day of the 2014 World Cup soccer tournament in Brazil to open the games, suited up in the “prosthetic exoskeleton” the team aims to build.

Their latest work – Monkeys “Move and Feel” Virtual Objects Using Only Their Brains

In a first-ever demonstration of a two-way interaction between a primate brain and a virtual body, two monkeys trained at the Duke University Center for Neuroengineering learned to employ brain activity alone to move an avatar hand and identify the texture of virtual objects.

“Someday in the near future, quadriplegic patients will take advantage of this technology not only to move their arms and hands and to walk again, but also to sense the texture of objects placed in their hands, or experience the nuances of the terrain on which they stroll with the help of a wearable robotic exoskeleton,” said study leader Miguel Nicolelis, MD, PhD, professor of neurobiology at Duke University Medical Center and co-director of the Duke Center for Neuroengineering.

Without moving any part of their real bodies, the monkeys used their electrical brain activity to direct the virtual hands of an avatar to the surface of virtual objects and, upon contact, were able to differentiate their textures.

The latest experiment of the nonprofit consortium showed that electrical messages conveying sensation could be sent directly to the monkeys’ brains – in enough detail that both animals could distinguish among three visually identical circles by virtually “feeling” their differing textures.

Those sensations did not come from the animals’ fingers, but from specially coded electrical currents delivered straight to each monkey’s sensory cortex by four filaments the breadth of a hair.
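
The “special coding” refers to delivering different temporal patterns of microstimulation pulses for different virtual textures. Here is a rough sketch of that idea; the pulse rates, burst lengths, and texture names below are invented for illustration and are not the study’s actual stimulation parameters:

```python
# Illustrative encoding of virtual textures as distinct temporal patterns
# of intracortical microstimulation (ICMS) pulses. All numbers here are
# made-up placeholders, not values from the Duke experiments.
TEXTURE_PATTERNS = {
    "rewarded_texture":   {"pulse_rate_hz": 200, "burst_ms": 50, "gap_ms": 100},
    "distractor_texture": {"pulse_rate_hz": 400, "burst_ms": 25, "gap_ms": 50},
    "no_texture":         None,  # no stimulation when an object carries no texture
}

def pulse_times_ms(pattern, duration_ms=500):
    """Return ICMS pulse onset times (ms) for one contact of the given duration."""
    if pattern is None:
        return []
    times, t = [], 0.0
    period_ms = 1000.0 / pattern["pulse_rate_hz"]
    while t < duration_ms:
        burst_end = min(t + pattern["burst_ms"], duration_ms)
        while t < burst_end:          # pulses within a burst
            times.append(t)
            t += period_ms
        t = burst_end + pattern["gap_ms"]  # silent gap between bursts
    return times
```

The key point is that the monkey’s brain receives a different rhythm of pulses for each texture, which is what allows the animal to tell visually identical objects apart.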

Although no one really knows (and the monkeys are unlikely to tell us) whether one circle felt like sandpaper and another felt as smooth as glass, Mango and Nectarine quickly learned to discern one circle from another to complete a task and get their reward: a sip of juice.

The group’s latest effort builds upon an earlier accomplishment, in 2003, in which monkeys learned to move a cursor to designated targets on a computer screen using thought alone.
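
That cursor-control result depends on decoding intended movement from the activity of many neurons at once. Below is a minimal sketch of how such a decoder might work, assuming a simple linear mapping from binned firing rates to cursor velocity; the ensemble size, bin counts, and function names are illustrative assumptions, not the lab’s actual implementation:

```python
import numpy as np

# Hypothetical linear decoder: cursor velocity as a weighted sum of
# recent binned firing rates from the recorded neuronal ensemble.
N_NEURONS = 96      # illustrative ensemble size
N_LAGS = 10         # illustrative number of 100 ms history bins per neuron

def fit_decoder(rate_history, cursor_velocity):
    """Least-squares fit of weights mapping rate history to 2-D velocity.

    rate_history:    (T, N_NEURONS * N_LAGS) binned firing rates over time
    cursor_velocity: (T, 2) observed hand/cursor velocity during training
    """
    weights, *_ = np.linalg.lstsq(rate_history, cursor_velocity, rcond=None)
    return weights  # (N_NEURONS * N_LAGS, 2)

def decode_step(latest_rates, weights):
    """Predict the next cursor velocity from the latest rate history."""
    return latest_rates @ weights  # (vx, vy)
```

Once the weights are fit during a calibration period, the monkey can drive the cursor purely from ongoing neural activity, with no arm movement required.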

In another experiment, first described in 2008, Nicolelis’ team at Duke showed that monkeys could learn to initiate movement with their thought patterns and command a robotic device in a Japanese robotics lab, on the other side of the world, to walk in real time.

Nature – Active tactile exploration using a brain–machine–brain interface

Brain–machine interfaces use neuronal activity recorded from the brain to establish direct communication with external actuators, such as prosthetic arms. It is hoped that brain–machine interfaces can be used to restore the normal sensorimotor functions of the limbs, but so far they have lacked tactile sensation. Here we report the operation of a brain–machine–brain interface (BMBI) that both controls the exploratory reaching movements of an actuator and allows signalling of artificial tactile feedback through intracortical microstimulation (ICMS) of the primary somatosensory cortex. Monkeys performed an active exploration task in which an actuator (a computer cursor or a virtual-reality arm) was moved using a BMBI that derived motor commands from neuronal ensemble activity recorded in the primary motor cortex. ICMS feedback occurred whenever the actuator touched virtual objects. Temporal patterns of ICMS encoded the artificial tactile properties of each object. Neuronal recordings and ICMS epochs were temporally multiplexed to avoid interference. Two monkeys operated this BMBI to search for and distinguish one of three visually identical objects, using the virtual-reality arm to identify the unique artificial texture associated with each. These results suggest that clinical motor neuroprostheses might benefit from the addition of ICMS feedback to generate artificial somatic perceptions associated with mechanical, robotic or even virtual prostheses.
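
The abstract packs in the key engineering point: recording and stimulation involve the same brain, so decoding epochs and ICMS epochs must be interleaved in time to keep stimulation artifacts out of the motor recordings. The following is a highly simplified, simulated sketch of one cycle of such a brain-machine-brain loop, assuming the illustrative decoder and texture patterns sketched above; the epoch lengths, data structures, and function signature are assumptions, not the published implementation:

```python
import numpy as np

RECORD_MS = 50   # assumed length of each recording/decoding epoch
STIM_MS   = 50   # assumed length of each interleaved ICMS epoch
N_INPUTS  = 96   # illustrative number of decoder inputs

def bmbi_cycle(binned_rates, decoder_weights, cursor_pos, objects):
    """One simulated cycle of the brain-machine-brain loop.

    binned_rates:    (N_INPUTS,) motor-cortex rates from the recording epoch
    decoder_weights: (N_INPUTS, 2) linear decoder weights
    cursor_pos:      (2,) current actuator (cursor / avatar hand) position
    objects:         list of dicts with 'center', 'radius', 'icms_pattern'
    Returns the new cursor position and the ICMS pattern (or None) to deliver
    during the following stimulation epoch.
    """
    # 1. Decode intended velocity from activity recorded while ICMS was off.
    velocity = binned_rates @ decoder_weights
    new_pos = cursor_pos + velocity * (RECORD_MS / 1000.0)

    # 2. If the actuator now overlaps a virtual object, queue that object's
    #    texture pattern for the stimulation epoch, so stimulation never
    #    overlaps recording.
    for obj in objects:
        if np.linalg.norm(new_pos - obj["center"]) < obj["radius"]:
            return new_pos, obj["icms_pattern"]
    return new_pos, None
```

Alternating these two epochs, on the order of tens of milliseconds each, is what lets the same loop both read out intended reaching movements and write back artificial touch.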
