Breakthrough brain-computer interface allows man with ALS to speak again
- Researchers introduced a BCI that can translate brain signals into
speech with remarkable precision, achieving up to 97% accuracy. The
breakthrough was published today in the New England Journal of Medicine.
(URL) https://www.psypost.org/breakthrough-brain-computer-interface-allows-... (https://www.psypost.org)
########################################################################
|u/AutoModerator - 1 month
|
|Welcome to r/science! This is a heavily moderated subreddit in order to
|keep the discussion on science. However, we recognize that many people
|want to discuss how they feel the research relates to their own personal
|lives, so to give people a space to do that, **personal anecdotes are
|allowed as responses to this comment**. Any anecdotal comments elsewhere
|in the discussion will be removed and our [normal comment rules](
|https://www.reddit.com/r/science/wiki/rules#wiki_comment_rules) apply to
|all other comments. **Do you have an academic degree?** We can verify
|your credentials in order to assign user flair indicating your area of
|expertise. [Click here to
|apply](https://www.reddit.com/r/science/wiki/flair/).
|---
|User: u/mvea
|Permalink: https://www.psypost.org/breakthrough-brain-computer-interface-allows-man-with-als-to-speak-again/
|---
|*I am a bot, and this action was performed automatically. Please
|[contact the moderators of this subreddit](/message/compose/?to=/r/science)
|if you have any questions or concerns.*
|u/mvea - 1 month
|
|I’ve linked to the press release in the post above. In this comment, for
|those interested, here’s the link to the peer reviewed journal article:
|[https://www.nejm.org/doi/10.1056/NEJMoa2314132](https://www.nejm.org/doi/10.1056/NEJMoa2314132)
|
|Breakthrough brain-computer interface allows man with ALS to speak again
|
|In a groundbreaking development at UC Davis
|Health, researchers have introduced a brain-computer interface (BCI)
|that can translate brain signals into speech with remarkable precision,
|achieving up to 97% accuracy. This cutting-edge technology offers a ray
|of hope for individuals with severe speech impairments caused by
|neurological conditions such as amyotrophic lateral sclerosis (ALS).
|ALS, also known as Lou Gehrig’s disease, is a devastating neurological
|disorder that progressively robs individuals of their ability to move,
|speak, and eventually breathe. As the disease advances, it destroys the
|nerve cells that control voluntary muscle movements, leading to complete
|paralysis and a loss of speech. For those affected, the inability to
|communicate can be one of the most isolating and demoralizing aspects of
|the disease. But now, thanks to this innovative BCI, communication
|barriers are being dismantled. This system works by interpreting the
|brain’s speech-related signals and converting them into text, which is
|then vocalized by a computer. The breakthrough was highlighted in a
|study published today in the [*New England Journal of
|Medicine*](http://dx.doi.org/10.1056/NEJMoa2314132). The BCI system
|used by the study participant, Casey Harrell, showed unprecedented
|results right from the start.
|During the initial speech data training session, it took just 30 minutes
|to achieve 99.6% word accuracy with a 50-word vocabulary. This was a
|significant leap forward compared to previous systems, which often
|struggled with accuracy and required extensive training periods. The
|system was tested in both prompted and spontaneous conversational
|settings, decoding speech in real time. In the second session, the
|vocabulary size was increased dramatically to 125,000 words, and the BCI
|still maintained a 90.2% accuracy rate with only 1.4 additional hours of
|training. With ongoing use, the system has consistently delivered a
|remarkable 97.5% accuracy.
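|
|(Not from the paper, just to make the pipeline concrete: below is a toy
|Python sketch of the general decode-to-text idea, where per-frame
|phoneme probabilities are collapsed into a phoneme string and matched
|against a vocabulary. The real system uses a neural decoder plus a
|language model over a 125,000-word vocabulary; every name and number
|below is made up for illustration.)
|
|    import numpy as np
|
|    PHONEMES = ["_", "HH", "AH", "L", "OW"]       # "_" acts as a blank
|    LEXICON = {("HH", "AH", "L", "OW"): "hello"}  # tiny stand-in lexicon
|
|    def simulate_neural_logits(target, steps_per_phoneme=3, noise=0.5):
|        """Fake 'neural activity': noisy one-hot scores per phoneme."""
|        rng = np.random.default_rng(0)
|        frames = []
|        for ph in target:
|            hot = np.eye(len(PHONEMES))[PHONEMES.index(ph)] * 4
|            for _ in range(steps_per_phoneme):
|                frames.append(hot + rng.normal(0, noise, len(PHONEMES)))
|        return np.array(frames)
|
|    def greedy_decode(logits):
|        """Best phoneme per frame, collapse repeats, drop blanks."""
|        best = [PHONEMES[i] for i in logits.argmax(axis=1)]
|        kept = [p for i, p in enumerate(best)
|                if i == 0 or p != best[i - 1]]
|        return tuple(p for p in kept if p != "_")
|
|    phones = greedy_decode(simulate_neural_logits(["HH", "AH", "L", "OW"]))
|    print(phones, "->", LEXICON.get(phones, "<out of vocabulary>"))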
|u/rayfe - 1 month
|
|Man, the future can be awesome sometimes.
|u/mitchMurdra - 1 month
|
|This very same technology is also opening doors on the bad side of
|things. It’s the same for all technology.
|u/PhaseThreeProfit - 1 month
|
|I'd love to hear some examples of bad uses... The thought occurred
|to me, how crazy would it be if you could read people's "thoughts".
|I'm sure that's probably pure fantasy, but I recall an article or
|post that discussed how people can read the video playing on a
|monitor by analyzing the leaking electromagnetic signal from the
|HDMI cable. I guess I wonder if a helmet or device could be placed
|on people to read their speech. (I almost feel obliged to state
|that I'm not normally this wild-eyed and "magical" in my thinking.
|But damn if a computer that reads brain waves to produce speech
|doesn't sound like magic to me!)
|u/cuyler72 - 1 month
|
|That's exactly what this is doing, though you need an implant to do
|it. None of the external tech is close to being accurate enough,
|except perhaps devices that interface with the jaw nerves, but I
|haven't seen anything about that in a while.
|u/BannedforaJoke - 1 month
|
|Ghost in the Shell and 1984 are the most dystopian examples I
|could think of.
|u/AnonDarkIntel - 1 month
|
|That is because the data rate in HDMI is so high that they have to
|use waveforms instead of just switching 0s and 1s.
|u/CloserToTheStars - 1 month
|
|It exists… people play games with thoughts, use Apple Vision with
|thoughts, speak words with thoughts. Dreams can be recorded,
|somewhat accurately. Sharing thoughts through text is not far off;
|Neuralink patients will be able to do it in a year, just like
|texting over the phone. But sharing thoughts as whole experiences,
|with emotion etc., is something else; that's about 15-30 years
|away, depending on the definition. Those devices currently have no
|protection, so they can be hacked, doing basically what you just
|said.
|u/ffman5446 - 1 month
|
|This guy has no idea what he’s talking about.
|u/CloserToTheStars - 1 month
|
|I just woke up and formulated it badly. I certainly know what
|I'm talking about. But have fun.
|u/cuyler72 - 1 month
|
|Nothing he said is wrong, except that it's 15–30 years away (we
|essentially already have the tech to do all of this) and that the
|devices have no protection (they absolutely have an encrypted
|access key preventing any direct hack). We have copied
|low-'resolution' images of what people are seeing/visualizing with
|external devices, and the Neuralink is more than an order of
|magnitude more sensitive. We already have videos of the first
|Neuralink patient playing complex, fast-paced 3D video games with
|his mind. Reading thoughts is exactly what this study is showing,
|and it's certainly with a device with less bandwidth than
|Neuralink. And Neuralink has the ability to stimulate individual
|neurons to fire, so if they can read thoughts they should be able
|to input them as well.
|u/BadHabitOmni - 1 month
|
|I think you and he might be extrapolating a bit much
|here... Reading thoughts is different from reading specific
|speech impulses related to established neural paths to the
|vocal cords, lips, jaw, etc. Reading inputs that must be
|calibrated significantly in order to tune the system to the
|user's inputs is not exactly a "plug and play" "mind reading"
|capability, and any actual ability to do so is much more than
|15-20 years away, barring some kind of insane breakthrough in
|the study of consciousness itself.
|u/PitytheOnlyFools - 1 month
|
|Ghost in the Shell here we come.
|u/thiiiipppttt - 1 month
|
|I wish auto-correct was that accurate
|u/WatermelonWithAFlute - 1 month
|
|Auto correct isn’t plugged into your brain tbf
|u/Ennocb - 1 month
|
|That's crazy. If the interface is interpreting the signals that are
|supposed to go to the muscles, could it not calculate speech formants
|and produce real-time vocalizations based on that information as well?
|I mean, since it's predicting words based on those action potentials,
|this is probably part of the process, but I'm thinking: Couldn't you
|skip the text and prediction part and just have raw vocalizations? Out
|of curiosity. This is not my field of expertise.
|u/osmiumo - 1 month
|
|Yes, great question. This BCI could likely facilitate that. They
|would just need to use vocalizations instead of prompted sentences for
|the training data. That was done with speaking-capable participants,
|and was referenced in the study (2023):
|https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10510111/
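|
|(And for the "skip the text" route: in principle the decoder could
|output acoustic parameters directly and drive a source-filter
|synthesizer or a neural vocoder. A bare-bones Python illustration of
|that last synthesis stage only, with made-up formant values for a
|steady /a/-like vowel; a real voice prosthesis would stream decoded
|parameters into something like this in real time.)
|
|    import numpy as np
|    from scipy.signal import lfilter
|
|    fs = 16000          # sample rate (Hz)
|    f0 = 120            # pitch of the glottal source (Hz)
|    formants = [(730, 90), (1090, 110), (2440, 160)]  # (Hz, bandwidth)
|
|    # Glottal source: 0.5 s impulse train at f0.
|    n = int(0.5 * fs)
|    source = np.zeros(n)
|    source[::fs // f0] = 1.0
|
|    # Cascade of second-order resonators, one per formant.
|    signal = source
|    for freq, bw in formants:
|        r = np.exp(-np.pi * bw / fs)
|        theta = 2 * np.pi * freq / fs
|        a = [1.0, -2 * r * np.cos(theta), r * r]  # resonator poles
|        b = [1.0 - r]                             # rough gain scaling
|        signal = lfilter(b, a, signal)
|
|    signal /= np.max(np.abs(signal))  # normalise to [-1, 1]
|    # e.g. scipy.io.wavfile.write("a.wav", fs, signal.astype(np.float32))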
|u/Ennocb - 1 month
|
|Oh, wow. That's exciting. Thanks for sharing.
|u/Used_Weight_357 - 1 month
|
|Well, they would have to implant something into the vocal cords to
|make them move, I guess, if you want a real voice. Because with ALS
|the muscle is probably unused and likely atrophied, and the nerve is
|also destroyed, there's no way to send a signal to the muscles. In
|the medical field, despite Elon promoting putting computers in
|humans, we don't really like putting in implants except when we have
|to, because the body doesn't like foreign things being put in; it
|just tries to destroy or contain them. It's a promising tech and I
|hope it can be mass produced.
|u/Ennocb - 1 month
|
|Ah, I was actually thinking of generating digital vocalizations
|based on the calculated formants etc.
|u/StandardReceiver - 1 month
|
|My friends and I endured countless ice bucket challenges for ALS back in
|high school, roughly 2013-2014. Glad to see how far the science is
|coming.
|u/teebone_walker - 1 month
|
|Would this blurt out my inner thoughts too?
|u/AllEndsAreAnds - 1 month
|
|No, it’s pulling data only from signals created from his conscious
|attempt to activate the muscles he would use to speak. Source:
|there’s a news special in the link where they say they’re not reading
|thoughts, just “his intention to speak”.
|u/Nodan_Turtle - 1 month
|
|Doesn't seem too farfetched to use the same tech to read a different
|section of the brain then, but it might be that an internal
|monologue is vastly more difficult to read compared to speech muscle
|movements.
|u/AllEndsAreAnds - 1 month
|
|Could be. I’m not sure how they tracked down the section of his
|brain that sends signals to the muscles he uses to speak, but I
|imagine getting right to the source of whatever’s going on in
|brain regions associated with inner-dialogue is possible at least
|in principle. Would love to see it, but also that would put us on
|the cusp of a potentially dystopian nightmare scenario.
|u/BadHabitOmni - 1 month
|
|The internal manifestation of words, ideas, images, etc. is not
|exclusively present nor localized within the speech areas of the
|brain... We haven't conclusively determined how these things
|manifest in the first place, as this is linked to the greater
|question of consciousness itself. I do think, however, that this
|technology could be quickly adapted towards restoring vision,
|hearing, etc., which rely on more 'hard-wired' neural paths leading
|to and from specific areas of the brain and are generally
|well-established passive senses. It could also theoretically give
|sight to individuals who have never been able to see, as well as
|restore other senses absent from birth.
|u/TheManInTheShack - 1 month
|
|I have a friend with ALS. I’ll forward this to him.
|u/50YrOldNoviceGymMan - 1 month
|
|I hope they also work on something that resolves the bit where the
|poor souls suffering from this disease eventually lose the ability
|to breathe.
|u/Holiday-Pay193 - 1 month
|
|Can anyone explain how this is different from the technology used by
|Stephen Hawking?
|u/MSXzigerzh0 - 1 month
|
|I thought that Stephen Hawking typed out his thoughts? This
|requires no typing or spelling, so if you've lost your ability to
|do those things you can use this.
|u/SpikeMyCoffee - 1 month
|
|I think Hawking used a version of the comms tablet my mom was given.
|She controlled hers by eye movement/blinking, but hated using it
|because it was a male electronic voice.
|u/linuxpriest - 1 month
|
|If your insurance will cover it, of course, because now comes the
|commodification phase.
|u/cuyler72 - 1 month
|
|Not really a breakthrough; the brain-interface part of this has been
|possible for a long time, they're just adding an AI voice.
|u/United-Advisor-5910 - 1 month
|
|I wonder when it will be possible to read these brain signals
|wirelessly. At that point tin foil hats will become popular. Hats
|off to the 5G conspiracy believers out there.
|u/cuyler72 - 1 month
|
|I'm not sure, but it might not be possible: the signal strength
|required to pierce the skull and return enough data is well beyond
|an MRI's, it would likely fry your brain, and you would need
|something the size of an MRI machine at the bare minimum.
|u/Olibaba1987 - 1 month
|
|Would this make it possible to convert your thoughts to any language
|as well?
|u/cuyler72 - 1 month
|
|You would need to implant the device, and I'm pretty sure I speak for
|the vast majority of people when I say that reading a person's
|thoughts no matter the circumstance is horribly immoral and a power no
|government or any power structure should have.
|u/korinthia - 1 month
|
|The first thing I’d do is ask them to end my life. ALS is absolutely
|tragic.
|u/HonestyMash - 1 month
|
|Yeah as someone with ALS the thought does cross your mind but family
|has a way of keeping you around.
|u/5minArgument - 1 month
|
|OCD people hate this one simple trick
|u/thgreatn - 1 month
|
|I am not trying to be argumentative, but I believe what you are
|referring to is CDO.