• lledrtx@lemmy.world

    Basically three things -

    1. BCI - Brain-Computer Interface. This can allow people with disabilities to control prosthetics using their brains. For example, this one from 20+ years ago. They are in clinical trial stages now - there's a lot of data over 20 years showing it's pretty safe. There are some differences, e.g. BrainGate uses rigid "Utah" electrode arrays that sit near the surface of the brain rather than flexible threads that go deeper into the tissue. (A rough sketch of what the decoding side of this looks like is below this list.)

    2. Medical diagnosis - Some patients (with conditions like epilepsy) get their brains recorded like this to find the region of the brain that is malfunctioning. Sometimes that region is then surgically removed, and believe it or not, it actually helps! (A toy version of the "find the misbehaving region" step is also sketched below the list.) Edit: DBS (deep brain stimulation) is another option sometimes, like the other commenter said, but that needs "stimulation" too, not just passive recording.

    3. Understanding the brain - these recordings can help us make sense of the brain. We still don't understand much of how it works, so the data could eventually feed into new treatments as well.
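
    Here is a minimal sketch of the "control a prosthetic/cursor with your brain" decoding step from item 1, purely for illustration: all numbers, channel counts, and data are made up, and real systems like BrainGate use per-session calibration and fancier decoders (e.g. Kalman filters).

    ```python
    # Hypothetical example: map binned spike counts to a 2D cursor velocity
    # with ridge regression. Everything here is simulated toy data.
    import numpy as np

    rng = np.random.default_rng(0)
    n_channels, n_bins = 96, 5000   # e.g. a 96-channel array, one row per 20 ms bin

    # Fake calibration data: spike counts plus the cursor velocities the user intended
    spikes = rng.poisson(3.0, (n_bins, n_channels)).astype(float)
    true_w = rng.normal(0, 0.1, (n_channels, 2))
    velocity = spikes @ true_w + rng.normal(0, 0.5, (n_bins, 2))

    # Fit ridge regression: W = (X^T X + lambda*I)^-1 X^T Y
    lam = 1.0
    W = np.linalg.solve(spikes.T @ spikes + lam * np.eye(n_channels),
                        spikes.T @ velocity)

    # Online use: each new bin of spike counts becomes a velocity command
    new_bin = rng.poisson(3.0, n_channels)
    vx, vy = new_bin @ W
    print(f"decoded cursor velocity: ({vx:.2f}, {vy:.2f})")
    ```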
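
    And a heavily simplified sketch of the epilepsy-localization idea from item 2: rank electrode contacts by "line length", a crude but real measure of seizure-like activity. The channel names, sampling rate, and signals here are all invented for the example; real localization uses much more than one feature.

    ```python
    # Toy example: flag the recording contact with the most seizure-like activity.
    # All data is simulated.
    import numpy as np

    rng = np.random.default_rng(2)
    fs = 1000                                        # Hz, hypothetical sampling rate
    channels = [f"contact_{i}" for i in range(8)]

    # 10 s of background activity on every contact
    eeg = rng.normal(0, 20, (len(channels), 10 * fs))

    # Pretend contact_5 sits in the seizure focus: add a burst of fast activity
    t = np.arange(10 * fs) / fs
    eeg[5] += 300 * np.sin(2 * np.pi * 80 * t)

    # Line length = summed absolute sample-to-sample change, per channel
    line_length = np.sum(np.abs(np.diff(eeg, axis=1)), axis=1)
    suspect = channels[int(np.argmax(line_length))]
    print(f"most active contact: {suspect}")         # -> contact_5 here
    ```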

    For all of these, we currently only have patients (because "healthy" people wouldn't want metal electrodes in their brain). But Neuralink's promise is to make the electrodes so thin and dense (so that you can record more) while keeping the signal-to-noise ratio (SNR) high that it might be possible to put them in healthy people without brain damage. I wouldn't hold my breath for that, though.
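
    To make the "keeping SNR high" part concrete, here's a back-of-the-envelope signal-to-noise calculation on a simulated spike waveform; the microvolt values are placeholders, not real device specs.

    ```python
    # Toy SNR estimate: SNR_dB = 10 * log10(signal_power / noise_power).
    # Amplitudes are rough textbook-scale guesses, not Neuralink numbers.
    import numpy as np

    rng = np.random.default_rng(1)
    spike_amplitude_uv = 80.0        # extracellular spike, on the order of ~100 uV
    noise_std_uv = 10.0              # combined thermal + biological noise floor

    t = np.linspace(0, 2e-3, 60)     # a 2 ms window around one spike
    spike = spike_amplitude_uv * np.exp(-((t - 1e-3) ** 2) / (2 * (0.2e-3) ** 2))
    noise = rng.normal(0, noise_std_uv, t.size)

    snr_db = 10 * np.log10(np.mean(spike ** 2) / np.mean(noise ** 2))
    print(f"SNR ~ {snr_db:.1f} dB")  # smaller electrodes tend to push noise_std up
    ```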

    • SoleInvictus@lemmy.world

      I’m hoping so hard for a brain/computer interface. I have a chronic condition that makes me a walking repetitive stress injury generator. Being able to control a computer with my noggin would be a game changer. I currently use an eye tracker combined with a camera head tracker, plus speech recognition, but it’s not the best. It certainly killed my (non-existent) computer programming career.

    • LarmyOfLone@lemm.ee

      Thanks! So how far away are we from something like this:

      • Create a kind of "virtual sense organ" that lets you learn to "read" text or other information through a BCI
      • Virtual or augmented reality, i.e. being able to close your eyes and see things that the BCI is feeding you
      • lledrtx@lemmy.world

        Both of them can be done, shittily, right now. But doing them with enough quality that even healthy people would voluntarily get the implant? That would need several breakthroughs.

        We can stimulate some neurons now, but being able to stimulate enough neurons to do either of those well will be hard. Cutting-edge work can stimulate ~1,000 neurons (and only in monkeys, not humans yet), while the human optic nerve alone has more than a million fibers. So we probably need about 3 orders of magnitude of improvement, and we somehow have to do it safely in humans.
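
        The "3 orders of magnitude" is just the log of the ratio between those two counts (both of which are rough, round numbers from above, not precise specs):

        ```python
        # Rough gap between ~1,000 stimulated neurons today (in monkeys)
        # and the ~1,000,000 fibers of the human optic nerve.
        import math

        stimulated_now = 1_000
        optic_nerve_fibers = 1_000_000   # often cited as ~1.2 million; rounded down

        gap = math.log10(optic_nerve_fibers / stimulated_now)
        print(f"~{gap:.0f} orders of magnitude to close")   # -> ~3
        ```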

    • Ann Archy@lemmy.world

      Wow, let's wish for that handful of best-case scenarios for sure, and hope it isn't then adapted for mass consumer use: keeping track of your friends and family, and emails, and assets of various sorts. It might even come with emojis!

      If you thought it was hard deleting your Google Photos…