The Fort Worth Press - Anthropic vows court fight in Pentagon row


Anthropic vows court fight in Pentagon row
Anthropic vows court fight in Pentagon row / Photo: © AFP/File

Anthropic chief executive Dario Amodei has said the company has "no choice" but to challenge in court the Pentagon's formal designation of the artificial intelligence firm as a risk to US national security.

The CEO, writing in a blog post on Thursday, insisted, however, that the ruling's practical scope is narrower than initially suggested, signaling that the designation would not have a catastrophic effect on the company.

Amodei said the Department of War -- the name preferred by the Trump administration for the Department of Defense -- confirmed in a letter that Anthropic and its products, including its widely-used Claude AI model, have been deemed a supply chain risk.

It is the first time a US company has ever been publicly given such a designation, a label typically reserved for organizations from foreign adversary countries, like Chinese tech company Huawei.

Amodei, in his blog post, said the company disputes the legal basis of the action but sought to reassure customers.

"It plainly applies only to the use of Claude by customers as a direct part of contracts with the Department of War, not all use of Claude by customers who have such contracts," he wrote.

The designation will require defense vendors and contractors to certify that they don't use Anthropic's models in their work with the Pentagon.

But Amodei argued that under the relevant statute, the intention is "to protect the government rather than to punish a supplier" and requires the Department of Defense to use "the least restrictive means necessary."

Microsoft, one of Anthropic's biggest partners, agreed with that reading, telling US media its lawyers studied the designation and concluded that Anthropic products, including Claude, can remain available to its customers other than the Department of War.

- 'Sloppy' -

The dispute erupted after Anthropic infuriated Pentagon chief Pete Hegseth by insisting its technology should not be used for mass surveillance or fully autonomous weapons systems.

Washington hit back, saying the Pentagon operates within the law and that contracted suppliers cannot dictate terms on how their products are used.

Amodei also used the statement to apologize for an internal company memo leaked to the press this week, in which he told staff the actions against the company were politically motivated.

"The real reasons" the Trump administration "do not like us is that we haven't donated to Trump (while OpenAI/Greg have donated a lot)," Amodei said, referring to Greg Brockman, the president of ChatGPT-maker OpenAI, who has donated $25 million to Trump.

Amodei called the memo an "out-of-date assessment of the current situation," written under duress on a day that saw his company under extreme pressure from the government.

OpenAI initially swooped in to replace Anthropic in its contract with the US military, but that move backfired when senior OpenAI staff expressed discomfort with the deal.

OpenAI CEO Sam Altman later said the deal was "sloppy" and that he was working to revise it.

The standoff with the Pentagon has had a silver lining for Anthropic, which was founded in 2021 by former staffers of OpenAI, with a focus on AI safety.

The conflict has helped propel the Claude app to the top of download rankings on Apple and Google smartphones.

Anthropic also indicated to AFP that the number of paying users of its Claude model had doubled since the beginning of the year and that its app is currently downloaded more than a million times a day.

S.Jones--TFWP