
Thursday, December 14, 2023

Decoding Brain Waves

Can a computer program read thoughts? An experimental project uses AI as a "brain decoder," in combination with brain scans, to "transcribe 'the gist' of what people are thinking, in what was described as a step toward mind reading":

Scientists Use Brain Scans to "Decode" Thoughts

The article's main example describes how the program interprets what a person thinks while listening to spoken sentences. Although the system doesn't translate the subject's thoughts into the exact same words, it accurately renders the "gist" in coherent language. Moreover, it can even do the same thing when the subject simply thinks about a story or watches a silent movie. In other words, the program is "decoding something that is deeper than language, then converting it into language." Unlike earlier types of brain-computer interfaces, this noninvasive system doesn't require implanting anything in the person's brain.

However, the decoder isn't perfect yet; it has trouble with personal pronouns, for instance. Moreover, a subject can "sabotage" the process with mental tricks. Participating scientists reassure people concerned about "mental privacy" that the system works only after it has been trained on the particular person's brain activity through many hours in an MRI scanner. Nevertheless, David Rodriguez-Arias Vailhen, a bioethics professor at Spain's Granada University, expresses apprehension that more highly developed versions of such programs might lead to "a future in which machines are 'able to read minds and transcribe thought'. . . warning this could possibly take place against people's will, such as when they are sleeping."

Here's another article about this project, explaining that the program functions on a predictive model similar to ChatGPT. As far as I can tell, the system works only with thoughts mentally expressed in words, not pure images:

Brain Activity Decoder Can Read Stories in People's Minds
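
For readers curious how a "predictive" decoder could work even in principle, here is a toy sketch (in Python) of the general idea as I understand it from the articles: a language model proposes candidate words, a second model predicts the brain response each candidate would produce, and the decoder keeps whichever candidate best matches the actual scan. Everything below (the tiny vocabulary standing in for a language model, the random stand-in "encoding model," the fake scan) is invented purely for illustration; it is not the researchers' actual method or code.

    import numpy as np

    rng = np.random.default_rng(0)
    # A tiny fixed vocabulary stands in for the language model's candidate words.
    VOCAB = ["the", "man", "walked", "into", "dark", "house", "quietly"]

    # Stand-in "encoding model": maps a word sequence to a predicted brain response.
    word_vectors = {w: rng.normal(size=16) for w in VOCAB}

    def predict_brain_response(words):
        # Average the word vectors; a real encoding model is far more sophisticated.
        if not words:
            return np.zeros(16)
        return np.mean([word_vectors[w] for w in words], axis=0)

    def decode_gist(measured_response, num_words=5):
        # Greedy word-by-word decoding: at each step keep the candidate whose
        # predicted response lies closest to the measured scan.
        text = []
        for _ in range(num_words):
            best = min(VOCAB, key=lambda w: np.linalg.norm(
                predict_brain_response(text + [w]) - measured_response))
            text.append(best)
        return " ".join(text)

    # Pretend this is the fMRI signal recorded while the subject heard a sentence.
    fake_scan = predict_brain_response(["the", "man", "walked", "into", "house"])
    print(decode_gist(fake_scan))

Even this cartoon version recovers only a gist: it builds a word sequence whose predicted response resembles the scan, not the exact original words, which is roughly the limitation the articles describe.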

Researchers at the University of Texas at Austin suggest, as one positive application, that the system "might help people who are mentally conscious yet unable to physically speak, such as those debilitated by strokes, to communicate intelligibly again."

An article on the Wired site explores in depth the nature of thought and its connection with language from the perspective of cognitive science.

Decoders Won't Just Read Your Mind -- They'll Change It

What if the mind isn't, as traditionally assumed, "a self-contained, self-sufficient, private entity"? If it isn't, is there a realistic risk that "these machines will have the power to characterize and fix a thought’s limits and bounds through the very act of decoding and expressing that thought"?

How credible is the danger foreshadowed in this essay? If AI eventually gains the power to decode anyone's thoughts, not just those of individuals whose brain scans the system has been trained on, will literal mind-reading come into existence? Could a future Big Brother society watch citizens not just through two-way TV monitors but by inspecting the contents of their brains?

Margaret L. Carter

Please explore love among the monsters at Carter's Crypt.

Thursday, June 01, 2023

Brain-Computer Interface

Elon Musk's Neuralink Corporation is developing an implant intended to treat severe brain disorders and enable paralyzed patients to control devices remotely. As a long-term goal, the company envisions "human enhancement, sometimes called transhumanism."

Neuralink

Here's a brief article on the capacities and limitations of brain implants:

Brain Implants

A Wikipedia article on brain-computer interface technology, whose history goes back further than I'd realized:

Brain-Computer Interface

This technology shows promise in fields such as treatment for paraplegics and quadriplegics. It "was first developed to help people paralyzed with spinal injuries or conditions like Locked-in syndrome — when a patient is fully conscious but can't move any part of the body except the eyes — to communicate." A connection between the brain's motor cortex and a computer has enabled a paralyzed patient to type 90 characters per minute. Another kind of implant allowed a man with a robotic hand to feel sensations as if he still had natural skin. A "brain-spine interface" has enabled a man with a spinal cord injury to walk naturally. Deep brain stimulation has been helping people with Parkinson's disease since the 1990s. Most of these applications, however, are still in the experimental stage with human patients or have been tested only on animals. For instance, a monkey fitted with a Neuralink implant learned to control a Pong paddle with its mind.

Will such an implant eventually achieve telepathy, though, as Musk claims? Experts say no, at least not at the current stage of neuroscience, because "we don't really know where or how thoughts are stored in the brain. We can't read thoughts if we don't understand the neuroscience behind them."

What about a paralyzed person controlling a whole robotic body, like the protagonist of AVATAR remotely living in an alien body? Probably not anytime soon, but I was amazed to learn how much closer we are to achieving that phase of "transhumanism" than I'd imagined. If it's ever reached, might the very rich choose to live their later years remotely in beautiful, strong robotic bodies and thereby enjoy a form of eternal youth -- as long as their flesh brains can be kept alive, anyway?

Margaret L. Carter

Carter's Crypt