The Fort Worth Press - Can you trust your ears? AI voice scams rattle US


Can you trust your ears? AI voice scams rattle US / Photo: © AFP

The voice on the phone seemed frighteningly real -- an American mother heard her daughter sobbing before a man took over and demanded a ransom. But the girl was an AI clone and the abduction was fake.

The biggest peril of artificial intelligence, experts say, is its ability to demolish the boundaries between reality and fiction, handing cybercriminals a cheap and effective technology to propagate disinformation.

In a new breed of scams that has rattled US authorities, fraudsters are using strikingly convincing AI voice cloning tools -- widely available online -- to steal from people by impersonating family members.

"Help me, mom, please help me," Jennifer DeStefano, an Arizona-based mother, heard a voice saying on the other end of the line.

DeStefano was "100 percent" convinced it was her 15-year-old daughter in deep distress while away on a skiing trip.

"It was never a question of who is this? It was completely her voice... it was the way she would have cried," DeStefano told a local television station in April.

"I never doubted for one second it was her."

The scammer who took over the call, which came from a number unfamiliar to DeStefano, demanded up to $1 million.

The AI-powered ruse was over within minutes when DeStefano established contact with her daughter. But the terrifying case, now under police investigation, underscored the potential for cybercriminals to misuse AI clones.

- Grandparent scam -

"AI voice cloning, now almost indistinguishable from human speech, allows threat actors like scammers to extract information and funds from victims more effectively," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

A simple internet search yields a wide array of apps, many available for free, to create AI voices with a small sample -- sometimes only a few seconds -- of a person's real voice that can be easily stolen from content posted online.

"With a small audio sample, an AI voice clone can be used to leave voicemails and voice texts. It can even be used as a live voice changer on phone calls," Khaled said.

"Scammers can employ different accents, genders, or even mimic the speech patterns of loved ones. [The technology] allows for the creation of convincing deep fakes."

In a global survey of 7,000 people from nine countries, including the United States, one in four people said they had experienced an AI voice cloning scam or knew someone who had.

Seventy percent of the respondents said they were not confident they could "tell the difference between a cloned voice and the real thing," said the survey, published last month by the US-based McAfee Labs.

American officials have warned of a rise in what is popularly known as the "grandparent scam" -- where an imposter poses as a grandchild in urgent need of money in a distressful situation.

"You get a call. There's a panicked voice on the line. It's your grandson. He says he's in deep trouble -- he wrecked the car and landed in jail. But you can help by sending money," the US Federal Trade Commission said in a warning in March.

"It sounds just like him. How could it be a scam? Voice cloning, that's how."

In the comments beneath the FTC's warning were multiple testimonies of elderly people who had been duped that way.

- 'Malicious' -

That also mirrors the experience of Eddie, a 19-year-old in Chicago whose grandfather received a call from someone who sounded just like his grandson, claiming he needed money after a car accident.

The ruse, reported by McAfee Labs, was so convincing that his grandfather urgently started scrounging together money and even considered re-mortgaging his house, before the lie was discovered.

"Because it is now easy to generate highly realistic voice clones... nearly anyone with any online presence is vulnerable to an attack," Hany Farid, a professor at the UC Berkeley School of Information, told AFP.

"These scams are gaining traction and spreading."

Earlier this year, AI startup ElevenLabs admitted that its voice cloning tool could be misused for "malicious purposes" after users posted a deepfake audio purporting to be actor Emma Watson reading Adolf Hitler's biography "Mein Kampf."

"We're fast approaching the point where you can't trust the things that you see on the internet," Gal Tal-Hochberg, group chief technology officer at the venture capital firm Team8, told AFP.

"We are going to need new technology to know if the person you think you're talking to is actually the person you're talking to," he said.

W.Lane--TFWP