Monkey with Neuralink Brain Interface

Pager, a nine-year-old macaque, plays MindPong using his Neuralink implant.

Neuralink has designed the first neural implant that will let you control a computer or mobile device anywhere you go.

Micron-scale threads are inserted into areas of the brain that control movement. Each thread contains many electrodes and connects them to an implant, the Link.

The initial goal of the technology will be to help people with paralysis to regain independence through the control of computers and mobile devices. The devices are designed to give people the ability to communicate more easily via text or speech synthesis, to follow their curiosity on the web, or to express their creativity through photography, art, or writing apps.

The device will be a small disc that fits into the skull and enables higher-bandwidth communication with a smartphone-like device.

Over time, Neuralink plans to increase the number of threads, perhaps to millions, and to decrease their thickness.

The system will enable input and output from the brain.

The targets for the brain interface are vision, sound, smell, and movement. Sights, sounds, smells, and touch sensations could be added or augmented via the device, which would enable ultra-high-fidelity augmented and virtual reality.

VISUAL CORTEX
Processes visual information from our eyes.

AUDITORY CORTEX
Assists with the perception and interpretation of sound.

SOMATOSENSORY CORTEX
Helps process sense of touch.

MOTOR CORTEX
Responsible for planning and executing motor movements.

WHAT WILL THE LINK DO?
Neuralink is designing the Link to connect to thousands of neurons in the brain. It will record the activity of these neurons, process those signals in real time, and transmit the result wirelessly. As a first application of this technology, Neuralink plans to help people with severe spinal cord injury by giving them the ability to control computers and mobile devices directly with their brains. It would start by recording neural activity in the brain’s movement areas. As users think about moving their arms or hands, the system would decode those intentions and send them over Bluetooth to the user’s computer. Users would initially learn to control a virtual mouse. Later, as users gain more practice and the adaptive decoding algorithms continue to improve, users should be able to control multiple devices, including a keyboard or a game controller.
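The record-decode-control loop described above can be sketched in code. This is an illustrative toy only, not Neuralink's algorithm: it assumes a simple linear mapping from per-channel firing rates to 2D cursor velocity, with an arbitrary channel count and made-up weights. Real adaptive decoders (often Kalman-filter based) are far more sophisticated.

```python
import numpy as np

N_CHANNELS = 1024  # hypothetical electrode count, chosen for illustration

rng = np.random.default_rng(0)
# Stand-in for a learned weight matrix mapping firing rates -> (vx, vy).
W = rng.normal(scale=0.01, size=(2, N_CHANNELS))
b = np.zeros(2)  # bias term

def decode_velocity(firing_rates: np.ndarray) -> np.ndarray:
    """Map a vector of per-channel firing rates (spikes/s) to a 2D velocity."""
    return W @ firing_rates + b

def update_cursor(pos: np.ndarray, rates: np.ndarray, dt: float = 0.02) -> np.ndarray:
    """Integrate the decoded velocity over one 20 ms update tick."""
    return pos + decode_velocity(rates) * dt

# One simulated update: Poisson-distributed firing rates in, new cursor position out.
pos = np.array([0.0, 0.0])
rates = rng.poisson(lam=10.0, size=N_CHANNELS).astype(float)
pos = update_cursor(pos, rates)
```

In a real closed-loop system, the weights would be fit from calibration sessions and continually re-estimated as the user practices, which is what "adaptive decoding" refers to.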

SOURCES: Neuralink
Written by Brian Wang, Nextbigfuture.com