The Fort Worth Press - Scientists use brain scans and AI to 'decode' thoughts

Scientists use brain scans and AI to 'decode' thoughts
Scientists use brain scans and AI to 'decode' thoughts / Photo: © AFP/File

Scientists said Monday they have found a way to use brain scans and artificial intelligence modelling to transcribe "the gist" of what people are thinking, in what was described as a step towards mind reading.

While the main goal of the language decoder is to help people who have lost the ability to communicate, the US scientists acknowledged that the technology raised questions about "mental privacy".

Aiming to assuage such fears, they ran tests showing that their decoder could not be used on anyone who had not allowed it to be trained on their brain activity over long hours inside a functional magnetic resonance imaging (fMRI) scanner.

Previous research has shown that a brain implant can enable people who can no longer speak or type to spell out words or even sentences.

These "brain-computer interfaces" focus on the part of the brain that controls the mouth when it tries to form words.

Alexander Huth, a neuroscientist at the University of Texas at Austin and co-author of a new study, said that his team's language decoder "works at a very different level".

"Our system really works at the level of ideas, of semantics, of meaning," Huth told an online press conference.

It is the first system to be able to reconstruct continuous language without an invasive brain implant, according to the study in the journal Nature Neuroscience.

- 'Deeper than language' -

For the study, three people spent a total of 16 hours inside an fMRI machine listening to spoken narrative stories, mostly podcasts such as the New York Times' Modern Love.

This allowed the researchers to map out how words, phrases and meanings prompted responses in the regions of the brain known to process language.

They fed this data into a neural network language model that uses GPT-1, the predecessor of the AI technology later deployed in the hugely popular ChatGPT.

The model was trained to predict how each person's brain would respond to perceived speech, then narrowed down the options until it found the closest response.

To test the model's accuracy, each participant then listened to a new story in the fMRI machine.

The study's first author Jerry Tang said the decoder could "recover the gist of what the user was hearing".

For example, when the participant heard the phrase "I don't have my driver's license yet", the model came back with "she has not even started to learn to drive yet".

The decoder struggled with personal pronouns such as "I" or "she," the researchers admitted.

But even when the participants thought up their own stories -- or viewed silent movies -- the decoder was still able to grasp the "gist," they said.

This showed that "we are decoding something that is deeper than language, then converting it into language," Huth said.

Because fMRI scanning is too slow to capture individual words, it collects a "mishmash, an agglomeration of information over a few seconds," Huth said.

"So we can see how the idea evolves, even though the exact words get lost."

- Ethical warning -

David Rodriguez-Arias Vailhen, a bioethics professor at Spain's Granada University not involved in the research, said it went beyond what had been achieved by previous brain-computer interfaces.

This brings us closer to a future in which machines are "able to read minds and transcribe thought," he said, warning this could possibly take place against people's will, such as when they are sleeping.

The researchers anticipated such concerns.

They ran tests showing that the decoder did not work on a person if it had not already been trained on their own particular brain activity.

The three participants were also able to easily foil the decoder.

While listening to one of the podcasts, the users were told to count by sevens, name and imagine animals or tell a different story in their mind. All these tactics "sabotaged" the decoder, the researchers said.

Next, the team hopes to speed up the process so that they can decode the brain scans in real time.

They also called for regulations to protect mental privacy.

"Our mind has so far been the guardian of our privacy," said bioethicist Rodriguez-Arias Vailhen.

"This discovery could be a first step towards compromising that freedom in the future."

T.Gilbert--TFWP