The Fort Worth Press - Biden robocall: Audio deepfake fuels election disinformation fears


Biden robocall: Audio deepfake fuels election disinformation fears
Biden robocall: Audio deepfake fuels election disinformation fears / Photo: © AFP

The 2024 White House race faces the prospect of a firehose of AI-enabled disinformation, with a robocall impersonating US President Joe Biden already stoking particular alarm about audio deepfakes.

"What a bunch of malarkey," said the phone message, digitally spoofing Biden's voice and echoing one of his signature phrases.

The robocall urged New Hampshire residents not to cast ballots in the Democratic primary last month, prompting state authorities to launch a probe into possible voter suppression.

It also triggered demands from campaigners for stricter guardrails around generative artificial intelligence tools or an outright ban on robocalls.

Disinformation researchers fear rampant misuse of AI-powered applications in a pivotal election year, owing to the proliferation of voice cloning tools that are cheap, easy to use, and hard to trace.

"This is certainly the tip of the iceberg," Vijay Balasubramaniyan, chief executive and co-founder of cybersecurity firm Pindrop, told AFP.

"We can expect to see many more deepfakes throughout this election cycle."

A detailed analysis published by Pindrop said a text-to-speech system developed by the AI voice cloning startup ElevenLabs was used to create the Biden robocall.

The scandal comes as campaigners on both sides of the US political aisle harness advanced AI tools for effective campaign messaging and as tech investors pump millions of dollars into voice cloning startups.

Balasubramaniyan refused to say whether Pindrop had shared its findings with ElevenLabs, which last month announced a financing round from investors that, according to Bloomberg News, gave the firm a valuation of $1.1 billion.

ElevenLabs did not respond to repeated AFP requests for comment. Its website leads users to a free text-to-speech generator to "create natural AI voices instantly in any language."

Under its safety guidelines, the firm said users were allowed to generate voice clones of political figures such as Donald Trump without their permission if they "express humor or mockery" in a way that makes it "clear to the listener that what they are hearing is a parody, and not authentic content."

- 'Electoral chaos' -

US regulators have been considering making AI-generated robocalls illegal, with the fake Biden call giving the effort new impetus.

"The political deepfake moment is here," said Robert Weissman, president of the advocacy group Public Citizen.

"Policymakers must rush to put in place protections or we're facing electoral chaos. The New Hampshire deepfake is a reminder of the many ways that deepfakes can sow confusion."

Researchers worry about the impact of AI tools that create video and text so convincingly real that voters could struggle to distinguish truth from fiction, undermining trust in the electoral process.

But audio deepfakes used to impersonate or smear celebrities and politicians around the world have sparked the most concern.

"Of all the surfaces -- video, image, audio -- that AI can be used for voter suppression, audio is the biggest vulnerability," Tim Harper, a senior policy analyst at the Center for Democracy & Technology, told AFP.

"It is easy to clone a voice using AI, and it is difficult to identify."

- 'Election integrity' -

The ease of creating and disseminating fake audio content complicates an already hyperpolarized political landscape, undermining confidence in the media and enabling anyone to claim that fact-based "evidence has been fabricated," Wasim Khaled, chief executive of Blackbird.AI, told AFP.

Such concerns are rife as the proliferation of AI audio tools outpaces detection software.

China's ByteDance, owner of the wildly popular platform TikTok, recently unveiled StreamVoice, an AI tool for real-time conversion of a user's voice to any desired alternative.

"Even though the attackers used ElevenLabs this time, it is likely to be a different generative AI system in future attacks," Balasubramaniyan said.

"It is imperative that there are enough safeguards available in these tools."

Balasubramaniyan and other researchers recommended building audio watermarks or digital signatures into tools as possible protections as well as regulation that makes them available only for verified users.

"Even with those actions, detecting when these tools are used to generate harmful content that violates your terms of service is really hard and really expensive," Harper said.

"(It) requires investment in trust and safety and a commitment to building with election integrity centred as a risk."

J.P.Cortez--TFWP