Neuralink Raising $51 Million for Brain Interface Telepathy and Mind Waves to Speech Systems

Elon Musk's brain interface company, Neuralink, is raising $51 million in funding.

Recently, the Neuralink Twitter account has been highlighting the use of AI and brain interfaces to decode brainwaves into speech.

Nature – Speech synthesis from neural decoding of spoken sentences

Technology that translates neural activity into speech would be transformative for people who are unable to communicate as a result of neurological impairments. Decoding speech from neural activity is challenging because speaking requires very precise and rapid multi-dimensional control of vocal tract articulators. Here we designed a neural decoder that explicitly leverages kinematic and sound representations encoded in human cortical activity to synthesize audible speech. Recurrent neural networks first decoded directly recorded cortical activity into representations of articulatory movement, and then transformed these representations into speech acoustics. In closed vocabulary tests, listeners could readily identify and transcribe speech synthesized from cortical activity. Intermediate articulatory dynamics enhanced performance even with limited data. Decoded articulatory representations were highly conserved across speakers, enabling a component of the decoder to be transferrable across participants. Furthermore, the decoder could synthesize speech when a participant silently mimed sentences. These findings advance the clinical viability of using speech neuroprosthetic technology to restore spoken communication.
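The abstract describes a two-stage decoder: one recurrent network maps cortical activity to an intermediate articulatory (vocal-tract movement) representation, and a second maps that representation to acoustic features for speech synthesis. As a rough illustration of that pipeline shape only, here is a minimal sketch in plain NumPy with untrained random weights and toy dimensions (the real system uses trained bidirectional LSTMs on high-density ECoG features; all sizes and names below are assumptions, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions, not the paper's actual feature sizes).
N_NEURAL, N_KINEMATIC, N_ACOUSTIC, HIDDEN = 16, 8, 6, 12

def rnn_step(x, h, Wx, Wh, b):
    """One step of a simple (Elman-style) recurrent layer."""
    return np.tanh(x @ Wx + h @ Wh + b)

# Stage 1: neural activity -> articulatory kinematics.
Wx1 = rng.normal(scale=0.1, size=(N_NEURAL, HIDDEN))
Wh1 = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))
b1 = np.zeros(HIDDEN)
Wout1 = rng.normal(scale=0.1, size=(HIDDEN, N_KINEMATIC))

# Stage 2: articulatory kinematics -> acoustic features.
Wx2 = rng.normal(scale=0.1, size=(N_KINEMATIC, HIDDEN))
Wh2 = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))
b2 = np.zeros(HIDDEN)
Wout2 = rng.normal(scale=0.1, size=(HIDDEN, N_ACOUSTIC))

def decode(neural_seq):
    """Run the two-stage decode over a (time, N_NEURAL) sequence."""
    h1 = np.zeros(HIDDEN)
    h2 = np.zeros(HIDDEN)
    acoustics = []
    for x in neural_seq:
        h1 = rnn_step(x, h1, Wx1, Wh1, b1)
        kinematics = h1 @ Wout1       # intermediate articulatory representation
        h2 = rnn_step(kinematics, h2, Wx2, Wh2, b2)
        acoustics.append(h2 @ Wout2)  # acoustic features fed to a synthesizer
    return np.array(acoustics)

T = 20  # number of time steps in the toy neural recording
out = decode(rng.normal(size=(T, N_NEURAL)))
print(out.shape)  # one acoustic frame per neural sample: (20, 6)
```

The key design point the paper reports is that forcing the decoder through the articulatory bottleneck, rather than mapping neural data to sound directly, improved performance with limited data and made part of the decoder transferable across participants.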

8 thoughts on “Neuralink Raising $51 Million for Brain Interface Telepathy and Mind Waves to Speech Systems”

  1. That was my fault for not explaining myself better. Universal quantum computers will not be used in the interface itself but only to figure out and separate the brain patterns from the static and perhaps to figure out which genes should be changed to enhance those signals. Using quantum computers we can run billions of simulations and even design new materials to enhance a brain interface.

  2. “The primary problem with organic cells is they have no inherent machine-interfaces.”

    That’s where optogenetics comes in. You use a retrovirus injection to locally modify nerve cells to add an optical interface, so that all you have to do is insert biologically inert optical fibers to communicate with them.

    https://en.wikipedia.org/wiki/Optogenetics

  3. Yeah, non-invasive brain scanning would have a serious advantage for mind-reading technologies.

    For taking data into your brain, some degree of surgery would be necessary. That said, there are experiments using genetically modified neurons that are sensitive to light and can be turned on and off at will.

    Giving humans such modifications via CRISPR or the like would make the I/O interface much less intrusive, only requiring the implant to stimulate the right neurons with light.

    For anyone thinking brain surgery is too cumbersome a way to get something you could also get from external devices, remember there are people who would immediately benefit from this: people with sensory and neural disabilities.

  4. Openwater.cc has a competing noninvasive approach using red light and holography.

  5. Quantum computers? Whatever we use will probably involve electricity and metals, too.

    The primary problem with organic cells is they have no inherent machine-interfaces.

    I play around with the idea of a constant brain scan that enables a mirror copy to be made and maintained in near real time, albeit not as a fully organic brain but more like one on a virtual server. Input could be provided to the mirror copy and its decisions interpreted by a conjoined AI, as all of it would be fully accessible to the AI at all times.

    The copy would be refreshed by new backups constantly to prevent divergence from the original.

    As I understand it, it would seem exactly as though it were us (the original) calling the shots, because this is the way we go through life. Pretty much everything the brain does is done by parts of it, then presented to a committee of the parts (what we think of as us). The committee then rationalizes whatever the parts did as having been done consciously by itself. After assessment, it may try to train the parts to respond somewhat differently, should similar decision patterns present themselves in the future.

    Think of it like a corporation’s executives sitting around a big table. They are the company in one sense and, in another, all they are doing is looking at things that have already happened, assessing impact on the company, and either condoning or endorsing those things, or issuing guidance for what is to be done, or attempted, in the future.

  6. How on earth would quantum computers change this? Electric fields from the individual neurons overlap and make it impossible to separate them from each other. This is a classical measurement. How would applying a quantum computer to a zero signal-to-noise measurement help?

  7. This requires brain surgery. Eventually, with quantum supercomputers leading the way, we will use non-invasive techniques to the same or better effect.

Comments are closed.