• @doublejay1999@lemmy.world

    I’m wheeling one f dish spout dj djdjyvap BIG TITS h right now. It’s pastry Amazon if BIG ROUND TITTIES but not sure how accurately NICE BIG TITS is really is.

    • JokeDeity

      That’s DEFINITELY where this will go, and not a single politician will fight it, knowing they’ll be exempt from the rules for thee.

    • @modeler@lemmy.world

      That is a very good question and may help trace where the monologue ‘sounds’ in the brain. It would also be interesting if this were done on sign-language speakers.

      The mental pathway from reading to idea to utterance goes through several portions of the brain:

      • Visual processing of the text
      • Text to phonemes, possibly with rehearsal of the muscles that would say the word (which could be the source of the internal monologue, at least while reading)
      • Idea/concept of the individual word
      • Grammatical analysis of the sentence
      • The mental model of the complete thought.

      One interesting detail pointing to how this works: the author stated that (i) it was better for verbs than nouns, and (ii) it would often pick a similar, related word rather than the one actually being read.

      This suggests that (at least part of) what is being detected is the semantic idea rather than the phoneme encoding or the muscle rehearsal portions of the brain.
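
      That distinction is testable in principle: if the decoder is picking up meaning, its near-misses should be semantically close to the target word even when the exact word is wrong. Here is a minimal sketch of that kind of check, scoring predictions by embedding similarity as well as exact match; the tiny embeddings and word pairs are made up purely for illustration, not taken from the paper:

      ```python
      # Illustrative only: toy 3-d "embeddings" invented for this sketch; a real
      # analysis would use pretrained word vectors and the paper's actual outputs.
      import numpy as np

      TOY_EMBEDDINGS = {
          "run":    np.array([0.9, 0.1, 0.0]),
          "sprint": np.array([0.8, 0.2, 0.1]),   # semantically close to "run"
          "table":  np.array([0.0, 0.9, 0.3]),
          "desk":   np.array([0.1, 0.8, 0.4]),   # semantically close to "table"
      }

      def cosine(a, b):
          return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

      def score(pairs):
          """Return exact-match accuracy and mean cosine similarity over (predicted, target) pairs."""
          exact = sum(p == t for p, t in pairs) / len(pairs)
          sim = sum(cosine(TOY_EMBEDDINGS[p], TOY_EMBEDDINGS[t]) for p, t in pairs) / len(pairs)
          return exact, sim

      # (word the decoder produced, word actually being read)
      pairs = [("sprint", "run"), ("desk", "table"), ("run", "run")]
      exact, sim = score(pairs)
      print(f"exact-match accuracy: {exact:.2f}, mean semantic similarity: {sim:.2f}")
      ```

      On data like this the exact-match accuracy is low while the mean similarity stays high, which is the signature described above: wrong word, right idea.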

  • Kool_Newt

    If this can be made to work, it will be yet another tool in the chest for those in power to ensure organized resistance is impossible.

    • @CanadaPlus

      Usually these technologies just straight-up fail on the minority of the population that’s weird in any way. You can breathe a bit easier.

  • Bob Robertson IX

    The technology could aid communication for people who are unable to speak due to illness or injury, including stroke or paralysis. It could also enable seamless communication between humans and machines, such as the operation of a bionic arm or robot.

    This is great, but for the rest of us it also means a way to have a conversation with someone else without needing to look at a screen or speak out loud.

    I wonder how this research compares to research on subvocalization. When ChatGPT was announced last year, one of my first thoughts was that it could drive an enormous leap in subvocalization technology, where sensors on your neck detect your inner voice and output it as text. Subvocalization seems much more precise than reading brain waves for this kind of use case.
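
    Subvocal interfaces along those lines are usually framed as a classification problem over electromyography (EMG) signals from the jaw and throat. Below is a minimal sketch of that framing on entirely synthetic data; the channel count, window size, features and tiny vocabulary are placeholders, not details from any real system:

    ```python
    # Illustrative only: synthetic "EMG" windows and a simple classifier, to show
    # the shape of a subvocalization-to-text pipeline, not a real implementation.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    N_CHANNELS, WINDOW = 4, 250                 # assumed: 4 electrodes, 250 samples per word

    def fake_emg(word_id, n):
        """Synthetic windows whose overall amplitude depends on the word being subvocalized."""
        base = 0.5 + 0.5 * word_id
        return base * rng.standard_normal((n, N_CHANNELS, WINDOW))

    def features(windows):
        """Classic EMG features per channel: mean absolute value and root-mean-square."""
        mav = np.mean(np.abs(windows), axis=2)
        rms = np.sqrt(np.mean(windows ** 2, axis=2))
        return np.concatenate([mav, rms], axis=1)

    words = ["yes", "no", "stop"]               # tiny closed vocabulary for the sketch
    X = np.concatenate([features(fake_emg(i, 200)) for i in range(len(words))])
    y = np.repeat(np.arange(len(words)), 200)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print(f"held-out accuracy on synthetic data: {clf.score(X_te, y_te):.2f}")
    ```

    Muscle activity during subvocalization is generally a much cleaner per-word signal than scalp EEG, which is presumably the intuition behind the point that subvocalization is more precise than reading brain waves.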

  • @NightAuthor@lemmy.world

    I don’t wanna read the whole thing, but can someone confirm they did more than just predict between a small set of words?

    I’ve seen claims like this before, and upon further reading it turned out to be something like 40% accuracy at telling the difference between “yes” and “no” (OK, that might be a slight exaggeration).
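
    That caveat is worth making concrete: a raw accuracy number only means something relative to chance, and chance depends entirely on how many alternatives the decoder is choosing between. A quick sketch (the 40% figure is just the commenter's example, and the vocabulary sizes are hypothetical):

    ```python
    # Illustrative only: compares a reported accuracy against random guessing for
    # different (hypothetical) vocabulary sizes.
    def chance_accuracy(vocab_size: int) -> float:
        """Accuracy of uniform random guessing over a closed vocabulary."""
        return 1.0 / vocab_size

    reported = 0.40  # the commenter's (self-admittedly exaggerated) figure
    for vocab in (2, 50, 10_000):
        chance = chance_accuracy(vocab)
        verdict = "better than chance" if reported > chance else "no better than chance"
        print(f"vocab={vocab:>6}: reported {reported:.0%} vs chance {chance:.2%} -> {verdict}")
    ```

    40% on a yes/no task is worse than a coin flip, while 40% over an open vocabulary would be remarkable, which is why the size of the candidate set matters more than the headline percentage.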

  • JokeDeity

    I wish I could have just lived earlier in time and not seen how shit humanity gets going forward. How long until we’re living in Minority Report?