The Fort Worth Press - Scientists use brain scans and AI to 'decode' thoughts


Scientists use brain scans and AI to 'decode' thoughts / Photo: © AFP/File

Scientists said Monday they have found a way to use brain scans and artificial intelligence modelling to transcribe "the gist" of what people are thinking, in what was described as a step towards mind reading.


While the main goal of the language decoder is to help people who have lost the ability to communicate, the US scientists acknowledged that the technology raised questions about "mental privacy".

Aiming to assuage such fears, they ran tests showing that their decoder could not be used on anyone who had not allowed it to be trained on their brain activity over long hours inside a functional magnetic resonance imaging (fMRI) scanner.

Previous research has shown that a brain implant can enable people who can no longer speak or type to spell out words or even sentences.

These "brain-computer interfaces" focus on the part of the brain that controls the mouth when it tries to form words.

Alexander Huth, a neuroscientist at the University of Texas at Austin and co-author of a new study, said that his team's language decoder "works at a very different level".

"Our system really works at the level of ideas, of semantics, of meaning," Huth told an online press conference.

It is the first system to be able to reconstruct continuous language without an invasive brain implant, according to the study in the journal Nature Neuroscience.

- 'Deeper than language' -

For the study, three people spent a total of 16 hours inside an fMRI machine listening to spoken narrative stories, mostly podcasts such as the New York Times' Modern Love.

This allowed the researchers to map out how words, phrases and meanings prompted responses in the regions of the brain known to process language.

They fed this data into a neural network language model that uses GPT-1, the predecessor of the AI technology later deployed in the hugely popular ChatGPT.

The model was trained to predict how each person's brain would respond to perceived speech, then narrow down the options until it found the closest response.

To test the model's accuracy, each participant then listened to a new story in the fMRI machine.

The study's first author Jerry Tang said the decoder could "recover the gist of what the user was hearing".

For example, when the participant heard the phrase "I don't have my driver's license yet", the model came back with "she has not even started to learn to drive yet".

The decoder struggled with personal pronouns such as "I" or "she," the researchers admitted.

But even when the participants thought up their own stories -- or viewed silent movies -- the decoder was still able to grasp the "gist," they said.

This showed that "we are decoding something that is deeper than language, then converting it into language," Huth said.

Because fMRI scanning is too slow to capture individual words, it collects a "mishmash, an agglomeration of information over a few seconds," Huth said.

"So we can see how the idea evolves, even though the exact words get lost."

- Ethical warning -

David Rodriguez-Arias Vailhen, a bioethics professor at Spain's Granada University not involved in the research, said it went beyond what had been achieved by previous brain-computer interfaces.

This brings us closer to a future in which machines are "able to read minds and transcribe thought," he said, warning this could possibly take place against people's will, such as when they are sleeping.

The researchers anticipated such concerns.

They ran tests showing that the decoder did not work on a person if it had not already been trained on their own particular brain activity.

The three participants were also able to easily foil the decoder.

While listening to one of the podcasts, the users were told to count by sevens, name and imagine animals or tell a different story in their mind. All these tactics "sabotaged" the decoder, the researchers said.

Next, the team hopes to speed up the process so that they can decode the brain scans in real time.

They also called for regulations to protect mental privacy.

"Our mind has so far been the guardian of our privacy," said bioethicist Rodriguez-Arias Vailhen.

"This discovery could be a first step towards compromising that freedom in the future."

T.Gilbert--TFWP