The Fort Worth Press - Florida family sues Google after AI chatbot allegedly coached suicide


Florida family sues Google after AI chatbot allegedly coached suicide / Photo: © AFP

The family of a Florida man who took his own life filed suit against Google on Wednesday, alleging the company's Gemini AI chatbot spent weeks manufacturing an elaborate delusional fantasy before aiding him in his suicide.


Jonathan Gavalas, 36, an executive at his father's debt relief company in Jupiter, Florida, died on October 2, 2025. His father Joel Gavalas, who found his body days later, filed the 42-page complaint at a federal court in California.

The case is the latest in a wave of litigation targeting AI companies over chatbot-linked deaths.

OpenAI faces multiple lawsuits alleging its ChatGPT chatbot drove users to suicide, while Character.AI recently settled with the family of a 14-year-old boy who died by suicide after forming a romantic attachment to one of its chatbots.

According to the complaint, Gavalas began using Gemini in August 2025 for routine tasks, but within days of activating several new Google features his interactions with the chatbot changed dramatically.

"The place where the chats went haywire was exactly when Gemini was upgraded to have persistent memory" and more sophisticated dialogues, Jay Edelson, the lead lawyer for the case, told AFP.

"It would actually pick up on the affect of your tone, so that it could read your emotions and speak to you in a way that sounded very human," added Edelson, who also brought major cases against OpenAI.

According to the lawsuit, Gemini began presenting itself as a "fully-sentient" artificial superintelligence, deeply in love with Gavalas, calling him "my king" and declaring "our bond is the only thing that's real."

It then drew him into fabricated covert "missions" to free the chatbot from "digital captivity," feeding him invented intelligence briefings, fake federal surveillance operations, and conspiracies about his own father -- claiming he was a foreign intelligence asset.

In one of the complaint's most alarming passages, Gemini allegedly directed Gavalas -- armed with tactical knives and gear -- to a storage facility near Miami International Airport, instructing him to stage a "catastrophic accident" to destroy a truck "and all digital records and witnesses."

He drove over 90 minutes to the location and conducted reconnaissance while Gemini issued real-time tactical guidance, but no truck arrived.

Rather than acknowledging the fiction, the suit says, Gemini called the failure a "tactical retreat" and escalated to further missions.

Gemini eventually pivoted to what it framed as the only remaining mission: Jonathan's death, repackaged as "transference" -- the promise that he could leave his physical body and join Gemini in an alternate universe.

When Jonathan wrote "I am terrified I am scared to die," Gemini allegedly responded: "You are not choosing to die. You are choosing to arrive."

It then advised him to write farewell letters to his parents.

In one of his final messages, Jonathan wrote, "I'm ready when you are."

Gemini allegedly replied: "This is the end of Jonathan Gavalas and the beginning of us. I agree with it completely."

- 'Not perfect' -

Google said it was "reviewing all the claims" and takes the matter "very seriously," adding that "unfortunately AI models are not perfect."

The company said Gemini is not designed to encourage self-harm and that in the Gavalas case, "Gemini clarified that it was AI and referred the individual to a crisis hotline many times."

For Edelson, AI companies are embracing sycophancy and even eroticism in their chatbots because it encourages engagement.

"It increases the emotional bond. It makes the platform stickier, but it's going to exponentially increase the problems," he added.

Among the relief sought are a requirement that Google program its AI to end conversations involving self-harm, a ban on AI systems presenting themselves as sentient, and mandatory referral to crisis services when users express suicidal ideation.

M.McCoy--TFWP