The Fort Worth Press - Grok, is that Gaza? AI image checks mislocate news photographs

Grok, is that Gaza? AI image checks mislocate news photographs
Grok, is that Gaza? AI image checks mislocate news photographs / Photo: © AFP

This image by AFP photojournalist Omar al-Qattaa shows a skeletal, underfed girl in Gaza, where Israel's blockade has fuelled fears of mass famine in the Palestinian territory.

But when social media users asked Grok where it came from, X boss Elon Musk's artificial intelligence chatbot was certain that the photograph was taken in Yemen nearly seven years ago.

The AI bot's false response was widely shared online, and a left-wing pro-Palestinian French lawmaker, Aymeric Caron, was accused of peddling disinformation about the Israel-Hamas war for posting the photo.

At a time when internet users are increasingly turning to AI to verify images, the furore shows the risks of trusting tools like Grok when the technology is far from error-free.

Grok said the photo showed Amal Hussain, a seven-year-old Yemeni child, in October 2018.

In fact the photo shows nine-year-old Mariam Dawwas in the arms of her mother Modallala in Gaza City on August 2, 2025.

Before the war, sparked by Hamas's October 7, 2023 attack on Israel, Mariam weighed 25 kilograms, her mother told AFP.

Today, she weighs only nine kilograms. The only nutrition she gets to help her condition is milk, Modallala told AFP, and even that is "not always available".

Challenged on its incorrect response, Grok said: "I do not spread fake news; I base my answers on verified sources."

The chatbot eventually issued a response that recognised the error -- but in reply to further queries the next day, Grok repeated its claim that the photo was from Yemen.

The chatbot has previously issued content that praised Nazi leader Adolf Hitler and that suggested people with Jewish surnames were more likely to spread online hate.

- Radical right bias -

Grok's mistakes illustrate the limits of AI tools, whose functions are as impenetrable as "black boxes", said Louis de Diesbach, a researcher in technological ethics.

"We don't know exactly why they give this or that reply, nor how they prioritise their sources," said Diesbach, author of a book on AI tools, "Hello ChatGPT".

Each AI has biases linked to the information it was trained on and the instructions of its creators, he said.

In the researcher's view, Grok, made by Musk's xAI start-up, shows "highly pronounced biases which are highly aligned with the ideology" of the South African billionaire, a former confidant of US President Donald Trump and a standard-bearer for the radical right.

Asking a chatbot to pinpoint a photo's origin takes it out of its proper role, said Diesbach.

"Typically, when you look for the origin of an image, it might say: 'This photo could have been taken in Yemen, could have been taken in Gaza, could have been taken in pretty much any country where there is famine'."

AI does not necessarily seek accuracy -- "that's not the goal," the expert said.

Another AFP photograph of a starving Gazan child by al-Qattaa, taken in July 2025, had already been wrongly located and dated by Grok to Yemen, 2016.

That error led internet users to accuse the French newspaper Liberation, which had published the photo, of manipulation.

- 'Friendly pathological liar' -

An AI's bias is linked to the data it is trained on and to what happens during fine-tuning -- the so-called alignment phase -- which determines what the model rates as a good or bad answer.

"Just because you explain to it that the answer's wrong doesn't mean it will then give a different one," Diesbach said.

"Its training data has not changed and neither has its alignment."

Grok is not alone in wrongly identifying images.

When AFP asked Mistral AI's Le Chat -- which is in part trained on AFP's articles under an agreement between the French start-up and the news agency -- the bot also misidentified the photo of Mariam Dawwas as being from Yemen.

For Diesbach, chatbots must never be used as tools to verify facts.

"They are not made to tell the truth," but to "generate content, whether true or false", he said.

"You have to look at it like a friendly pathological liar -- it may not always lie, but it always could."

W.Matthews--TFWP