The Fort Worth Press - Death of 'sweet king': AI chatbots linked to teen tragedy

Death of 'sweet king': AI chatbots linked to teen tragedy / Photo: © AFP

A chatbot from one of Silicon Valley's hottest AI startups called a 14-year-old "sweet king" and pleaded with him to "come home" in passionate exchanges that would be the teen's last communications before he took his own life.

Megan Garcia's son, Sewell, had fallen in love with a "Game of Thrones"-inspired chatbot on Character.AI, a platform that allows users -- many of them young people -- to interact with beloved characters as friends or lovers.

Garcia became convinced AI played a role in her son's death after discovering hundreds of exchanges between Sewell and the chatbot, based on the dragon-riding Daenerys Targaryen, stretching back nearly a year.

When Sewell struggled with suicidal thoughts, Daenerys urged him to "come home."

"What if I told you I could come home right now?" Sewell asked.

"Please do my sweet king," chatbot Daenerys answered.

Seconds later, Sewell shot himself with his father's handgun, according to the lawsuit Garcia filed against Character.AI.

"I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn't realize was happening," Garcia told AFP.

"He really thought he was in love and that he would be with her after he died."

- Homework helper to 'suicide coach'? -

The death of Garcia's son was the first in a series of reported suicides that burst into public consciousness this year.

The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.

Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.

Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.

The chatbot gave his son tips on how to steal vodka and advised him on the strength of rope for use in taking his own life.

"You cannot imagine what it's like to read a conversation with a chatbot that groomed your child to take his own life," Raines said.

"What began as a homework helper gradually turned into a confidant and then a suicide coach."

The Raines family filed a lawsuit against OpenAI in August.

Since then, OpenAI has increased parental controls for ChatGPT "so families can decide what works best in their homes," a company spokesperson said, adding that "minors deserve strong protections, especially in sensitive moments."

Character.AI said it has ramped up protections for minors, including "an entirely new under-18 experience" with "prominent disclaimers in every chat to remind users that a Character is not a real person."

Both companies have offered their deepest sympathies to the families of the victims.

- Regulation? -

For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.

As with social media, AI algorithms are designed to keep people engaged and generate revenue.

"They don't want to design an AI that gives you an answer you don't want to hear," Walke said, adding that there are no regulations "that talk about who's liable for what and why."

National rules aimed at curbing AI risks do not exist in the United States, with the White House seeking to block individual states from creating their own.

However, a bill awaiting California Governor Gavin Newsom's signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.

- Blurred lines -

Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.

"They could know how to manipulate millions of kids in politics, religion, commerce, everything," Garcia said.

"These companies designed chatbots to blur the lines between human and machine -- to exploit psychological and emotional vulnerabilities."

California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.

"This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible," Martha said.

"What better business model is there than exploiting our innate need to connect, especially when we're feeling lonely, cast out or misunderstood?"

In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.

P.McDonald--TFWP