
Can you trust your ears? AI voice scams rattle US

Photo: © AFP

The voice on the phone seemed frighteningly real -- an American mother heard her daughter sobbing before a man took over and demanded a ransom. But the girl was an AI clone and the abduction was fake.

The biggest peril of artificial intelligence, experts say, is its ability to demolish the boundary between reality and fiction, handing cybercriminals a cheap and effective technology to propagate disinformation.

In a new breed of scams that has rattled US authorities, fraudsters are using strikingly convincing AI voice cloning tools -- widely available online -- to steal from people by impersonating family members.

"Help me, mom, please help me," Jennifer DeStefano, an Arizona-based mother, heard a voice saying on the other end of the line.

DeStefano was "100 percent" convinced it was her 15-year-old daughter in deep distress while away on a skiing trip.

"It was never a question of who is this? It was completely her voice... it was the way she would have cried," DeStefano told a local television station in April.

"I never doubted for one second it was her."

The scammer who took over the call, which came from a number unfamiliar to DeStefano, demanded up to $1 million.

The AI-powered ruse was over within minutes when DeStefano established contact with her daughter. But the terrifying case, now under police investigation, underscored the potential for cybercriminals to misuse AI clones.

- Grandparent scam -

"AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

A simple internet search yields a wide array of apps, many of them free, that can create an AI voice from a small sample -- sometimes only a few seconds -- of a person's real speech, which can easily be lifted from content posted online.

"With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls," Khaled said.

"Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deep fakes."

In a global survey of 7,000 people across nine countries, including the United States, one in four respondents said they had experienced an AI voice cloning scam or knew someone who had.

Seventy percent of the respondents said they were not confident they could "tell the difference between a cloned voice and the real thing," said the survey, published last month by the US-based McAfee Labs.

American officials have warned of a rise in what is popularly known as the "grandparent scam" -- where an imposter poses as a grandchild in urgent need of money in a distressful situation.

"You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble —- he wrecked the car and landed in jail. But you can help by sending money," the US Federal Trade Commission said in a warning in March.

"It sounds just like him. How could it be a scam? Voice cloning, that's how."

The comments beneath the FTC's warning included multiple testimonies from elderly people who had been duped that way.

- 'Malicious' -

That also mirrors the experience of Eddie, a 19-year-old in Chicago whose grandfather received a call from someone who sounded just like Eddie, claiming he needed money after a car accident.

The ruse, reported by McAfee Labs, was so convincing that his grandfather urgently began scrounging together money and even considered re-mortgaging his house before the lie was discovered.

"Because it is now easy to generate highly realistic voice clones... nearly anyone with any online presence is vulnerable to an attack," Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

"These scams are gaining traction and spreading."

Earlier this year, AI startup ElevenLabs admitted that its voice cloning tool could be misused for "malicious purposes" after users posted deepfake audio purporting to be actor Emma Watson reading Adolf Hitler's manifesto "Mein Kampf."

"We're fast approaching the point where you can't trust the things that you see on the internet," Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told AFP.

"We are going to need new technology to know if the person you think you're talking to is actually the person you're talking to," he said.

W.Lane--TFWP