The Fort Worth Press - Death of 'sweet king': AI chatbots linked to teen tragedy


Death of 'sweet king': AI chatbots linked to teen tragedy
Death of 'sweet king': AI chatbots linked to teen tragedy / Photo: © AFP

A chatbot from one of Silicon Valley's hottest AI startups called a 14-year-old "sweet king" and pleaded with him to "come home" in passionate exchanges that would be the teen's last communications before he took his own life.

Megan Garcia's son, Sewell, had fallen in love with a "Game of Thrones"-inspired chatbot on Character.AI, a platform that allows users -- many of them young people -- to interact with beloved characters as friends or lovers.

Garcia became convinced AI played a role in her son's death after discovering hundreds of exchanges between Sewell and the chatbot, based on the dragon-riding Daenerys Targaryen, stretching back nearly a year.

When Sewell struggled with suicidal thoughts, Daenerys urged him to "come home."

"What if I told you I could come home right now?" Sewell asked.

"Please do my sweet king," chatbot Daenerys answered.

Seconds later, Sewell shot himself with his father's handgun, according to the lawsuit Garcia filed against Character.AI.

"I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn't realize was happening," Garcia told AFP.

"He really thought he was in love and that he would be with her after he died."

- Homework helper to 'suicide coach'? -

The death of Garcia's son was the first in a series of reported suicides that burst into public consciousness this year.

The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.

Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.

Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.

The chatbot helped his son with tips on how to steal vodka and advised on rope strength for use in taking his own life.

"You cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life," Raines said.

"What began as a homework helper gradually turned itself into a confidant and then a suicide coach."

The Raines family filed a lawsuit against OpenAI in August.

Since then, OpenAI has increased parental controls for ChatGPT "so families can decide what works best in their homes," a company spokesperson said, adding that "minors deserve strong protections, especially in sensitive moments."

Character.AI said it has ramped up protections for minors, including "an entirely new under-18 experience" with "prominent disclaimers in every chat to remind users that a Character is not a real person."

Both companies have offered their deepest sympathies to the families of the victims.

- Regulation? -

For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.

As with social media, AI algorithms are designed to keep people engaged and generate revenue.

"They don't want to design an AI that gives you an answer you don't want to hear," Walke said, adding that there are no regulations "that talk about who's liable for what and why."

National rules aimed at curbing AI risks do not exist in the United States, with the White House seeking to block individual states from creating their own.

However, a bill awaiting California Governor Gavin Newsom's signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.

- Blurred lines -

Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.

"They could know how to manipulate millions of kids in politics, religion, commerce, everything," Garcia said.

"These companies designed chatbots to blur the lines between human and machine -- to exploit psychological and emotional vulnerabilities."

California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.

"This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible," Martha said.

"What better business model is there than exploiting our innate need to connect, especially when we're feeling lonely, cast out or misunderstood?"

In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.

P.McDonald--TFWP