The Fort Worth Press - Scientists use brain scans and AI to 'decode' thoughts


Scientists use brain scans and AI to 'decode' thoughts / Photo: © AFP/File

Scientists said Monday they have found a way to use brain scans and artificial intelligence modelling to transcribe "the gist" of what people are thinking, in what was described as a step towards mind reading.

While the main goal of the language decoder is to help people who have lost the ability to communicate, the US scientists acknowledged that the technology raised questions about "mental privacy".

Aiming to assuage such fears, they ran tests showing that their decoder could not be used on anyone who had not allowed it to be trained on their brain activity over long hours inside a functional magnetic resonance imaging (fMRI) scanner.

Previous research has shown that a brain implant can enable people who can no longer speak or type to spell out words or even sentences.

These "brain-computer interfaces" focus on the part of the brain that controls the mouth when it tries to form words.

Alexander Huth, a neuroscientist at the University of Texas at Austin and co-author of a new study, said that his team's language decoder "works at a very different level".

"Our system really works at the level of ideas, of semantics, of meaning," Huth told an online press conference.

It is the first system to be able to reconstruct continuous language without an invasive brain implant, according to the study in the journal Nature Neuroscience.

- 'Deeper than language' -

For the study, three people spent a total of 16 hours inside an fMRI machine listening to spoken narrative stories, mostly podcasts such as the New York Times' Modern Love.

This allowed the researchers to map out how words, phrases and meanings prompted responses in the regions of the brain known to process language.

They fed this data into a neural network language model that uses GPT-1, the predecessor of the AI technology later deployed in the hugely popular ChatGPT.

The model was trained to predict how each person's brain would respond to perceived speech, then narrowed down candidate word sequences until it found the closest match.

To test the model's accuracy, each participant then listened to a new story in the fMRI machine.

The study's first author Jerry Tang said the decoder could "recover the gist of what the user was hearing".

For example, when the participant heard the phrase "I don't have my driver's license yet", the model came back with "she has not even started to learn to drive yet".

The decoder struggled with personal pronouns such as "I" or "she," the researchers admitted.

But even when the participants thought up their own stories -- or viewed silent movies -- the decoder was still able to grasp the "gist," they said.

This showed that "we are decoding something that is deeper than language, then converting it into language," Huth said.

Because fMRI scanning is too slow to capture individual words, it collects a "mishmash, an agglomeration of information over a few seconds," Huth said.

"So we can see how the idea evolves, even though the exact words get lost."

- Ethical warning -

David Rodriguez-Arias Vailhen, a bioethics professor at Spain's Granada University not involved in the research, said it went beyond what had been achieved by previous brain-computer interfaces.

This brings us closer to a future in which machines are "able to read minds and transcribe thought," he said, warning this could possibly take place against people's will, such as when they are sleeping.

The researchers anticipated such concerns.

They ran tests showing that the decoder did not work on a person if it had not already been trained on their own particular brain activity.

The three participants were also able to easily foil the decoder.

While listening to one of the podcasts, the users were told to count by sevens, name and imagine animals or tell a different story in their mind. All these tactics "sabotaged" the decoder, the researchers said.

Next, the team hopes to speed up the process so that they can decode the brain scans in real time.

They also called for regulations to protect mental privacy.

"Our mind has so far been the guardian of our privacy," said bioethicist Rodriguez-Arias Vailhen.

"This discovery could be a first step towards compromising that freedom in the future."

T.Gilbert--TFWP