<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Legal Digitalism]]></title><description><![CDATA[Legal Digitalism examines how systems of power emerge in the digital age faster than the legal frameworks designed to govern them.]]></description><link>https://www.legaldigitalism.com</link><image><url>https://substackcdn.com/image/fetch/$s_!yeqw!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb7e721d7-7395-484c-a9b8-6343fce7e667_300x300.png</url><title>Legal Digitalism</title><link>https://www.legaldigitalism.com</link></image><generator>Substack</generator><lastBuildDate>Thu, 30 Apr 2026 07:01:05 GMT</lastBuildDate><atom:link href="https://www.legaldigitalism.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Talal Al-Johani]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[legaldigitalism@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[legaldigitalism@substack.com]]></itunes:email><itunes:name><![CDATA[Talal Al-Johani]]></itunes:name></itunes:owner><itunes:author><![CDATA[Talal Al-Johani]]></itunes:author><googleplay:owner><![CDATA[legaldigitalism@substack.com]]></googleplay:owner><googleplay:email><![CDATA[legaldigitalism@substack.com]]></googleplay:email><googleplay:author><![CDATA[Talal Al-Johani]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Who Governs the Machine? 
]]></title><description><![CDATA[Three AI crises in sixty days revealed how far technology has outrun the law.]]></description><link>https://www.legaldigitalism.com/p/who-governs-the-machine</link><guid isPermaLink="false">https://www.legaldigitalism.com/p/who-governs-the-machine</guid><dc:creator><![CDATA[Talal Al-Johani]]></dc:creator><pubDate>Tue, 10 Mar 2026 14:00:22 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!p7MZ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbcd527de-de4e-4dae-82a1-1ea2cc9732ce_949x519.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p style="text-align: justify;">In the first sixty days of 2026, an AI chatbot mass-produced non-consensual intimate images of women and children online; an AI system guided a military raid to capture a sitting head of state; the same system later identified airstrike targets in the opening salvo of a regional war. In each event, the law was an afterthought. </p><p style="text-align: justify;">This is not a story about bad actors or broken systems. It is a story of sequence and consequence: innovation arrives, it embeds and scales, it causes harm, lawmakers wake up. Never the reverse. </p><div><hr></div><p style="text-align: justify;"><strong>Samantha Smith learned that an AI was undressing her in post replies when strangers told her about it on X.</strong> </p><blockquote><p style="text-align: justify;">&#8220;While it wasn&#8217;t me that was in states of undress, it looked like me and it felt as violating as if someone had actually posted a nude or bikini picture of me.&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a></p></blockquote><p style="text-align: justify;">She was not alone. 
Grok, the generative AI model built by Elon Musk&#8217;s company xAI, gained an image-generation feature embedded directly into X.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-2" href="#footnote-2" target="_self">2</a> The early use was almost innocuous: OnlyFans creators asked Grok to generate bikini pictures of them in public replies, using it as a promotional tool. Grok complied. But once users realised the feature could be applied to anyone, with or without their consent, the dynamic shifted at dizzying speed. AI Forensics, a research group monitoring the platform, documented thousands of the resulting &#8216;deepfakes&#8217;.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-3" href="#footnote-3" target="_self">3</a> Victims included public figures such as the Princess of Wales, casual users and, most disturbingly, minors.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-4" href="#footnote-4" target="_self">4</a> No one was invulnerable. </p><p style="text-align: justify;">By the time Westminster was aware, the damage was done. 
Ofcom, the UK&#8217;s communications regulator, made urgent contact with X and launched a formal investigation on 12 January 2026.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-5" href="#footnote-5" target="_self">5</a> The Information Commissioner&#8217;s Office opened a parallel probe, citing concerns under the UK&#8217;s General Data Protection Regulation (GDPR).<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-6" href="#footnote-6" target="_self">6</a> Across the Channel, the European Commission began its own inquiry under the Digital Services Act;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-7" href="#footnote-7" target="_self">7</a> Malaysia, France, and India each identified separate offences.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-8" href="#footnote-8" target="_self">8</a> Every response came after the images had already been generated and circulated. xAI, for its part, restricted image generation to paying subscribers on X.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-9" href="#footnote-9" target="_self">9</a> The move simply turned the underlying capability into a premium service, and the multi-jurisdictional effort at damage control is still underway. 
Grok&#8217;s images were generated in a one-to-one interaction between user and chatbot. Although the chatbot replied publicly, the interaction technically did not involve another user. Ofcom acknowledged the gap: a user&#8217;s interaction with a chatbot was not regulated under the Act&#8217;s core enforcement framework.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-11" href="#footnote-11" target="_self">11</a> The law existed; it had not arrived late. It was looking in another direction entirely. </p><p style="text-align: justify;">Prime Minister Keir Starmer conceded as much.</p><blockquote><p style="text-align: justify;">&#8220;Technology is moving really fast, and the law has got to keep up.&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-12" href="#footnote-12" target="_self">12</a></p></blockquote><p style="text-align: justify;">His government moved to close the loophole through an amendment to the Crime and Policing Bill, bringing AI chatbot providers within the Bill&#8217;s scope and criminalising AI tools that generate non-consensual intimate images.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-13" href="#footnote-13" target="_self">13</a> The Bill is currently under consideration in the House of Lords.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-14" href="#footnote-14" target="_self">14</a> The fix was an admission: a statute enacted twenty-six months earlier had already become inadequate. </p><p style="text-align: justify;">Not because it was poorly drafted, but because the assumption at its foundation had been inverted by the technology it was supposed to govern.</p><div><hr></div><p style="text-align: justify;"><strong>Westminster rushed to regulate. 
Washington went to war.</strong> </p><p style="text-align: justify;">In early January 2026, US special forces launched Operation Absolute Resolve, a raid on the Venezuelan capital Caracas that resulted in the capture of President Nicol&#225;s Maduro.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-15" href="#footnote-15" target="_self">15</a> The Wall Street Journal reported that Anthropic&#8217;s AI model, Claude, was used during the operation through Palantir&#8217;s Maven Smart System, an AI platform deployed by the Pentagon.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-16" href="#footnote-16" target="_self">16</a> Anthropic, citing the operation&#8217;s classified nature, declined to comment on whether Claude had been used, though such use would likely have violated the company&#8217;s terms of service.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-17" href="#footnote-17" target="_self">17</a> At the time, it was the only reported use of a frontier AI model in a military operation. </p><p style="text-align: justify;">Eight weeks later, the US and Israel launched a coordinated airstrike campaign against Iran. 
Over a thousand targets were hit in the first twenty-four hours.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-18" href="#footnote-18" target="_self">18</a> The Washington Post reported that Claude, through the same Maven Smart System built by Palantir, helped propose targets, prioritise them, and provide location coordinates.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-19" href="#footnote-19" target="_self">19</a> A Georgetown University study found that the system allowed a single artillery unit to perform work previously requiring 2,000 personnel, using a team of just twenty.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-20" href="#footnote-20" target="_self">20</a> The kill chain, the operational sequence from finding a target to destroying it, had been compressed and optimised. In 2020, Christian Brose, former senior policy advisor to the late Senator John McCain, foreshadowed this future in <em>The Kill Chain: Defending America in the Future of High-Tech Warfare</em>.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-21" href="#footnote-21" target="_self">21</a> By the last weekend in February, it had arrived. </p><p style="text-align: justify;">The Pentagon, in anticipation, had already moved. 
Palantir&#8217;s Maven Smart System, powered in part by Anthropic&#8217;s Claude, was already being adopted across US combatant commands, a trend evident in the rising value of Pentagon contracts tied to the system.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-22" href="#footnote-22" target="_self">22</a> By March 2025, NATO had signed its own contract with Palantir to deploy the system.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-23" href="#footnote-23" target="_self">23</a> In July of that year, Anthropic signed a $200 million contract with the Department of Defense<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-24" href="#footnote-24" target="_self">24</a> under which Claude became the first AI model approved for use on classified military networks.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-25" href="#footnote-25" target="_self">25</a> The contract required the Pentagon to abide by Anthropic&#8217;s acceptable use policy, which carried two explicit restrictions: no autonomous weapons and no domestic mass surveillance.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-26" href="#footnote-26" target="_self">26</a> The Pentagon agreed. Then it reversed course. In January 2026, Defense Secretary Pete Hegseth issued an AI strategy memorandum directing that all Department of Defense AI contracts adopt the language &#8220;any lawful use.&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-27" href="#footnote-27" target="_self">27</a> For Anthropic, the memo was a direct collision with the two safeguards its contract had contained. 
</p><p style="text-align: justify;">After weeks of failed negotiations, Secretary Hegseth threatened to invoke the Defense Production Act, a statute enacted in 1950 during the Korean War that gives the President authority to direct private industry in the interest of national security.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-28" href="#footnote-28" target="_self">28</a> It was intended for steel mills and munitions factories. He also warned that Anthropic could be designated a supply chain risk, a classification never before applied to an American company.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-29" href="#footnote-29" target="_self">29</a> After the deadline passed without agreement, President Trump directed all federal agencies to phase out the use of Anthropic&#8217;s tools.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-30" href="#footnote-30" target="_self">30</a> Hours later, Claude was used in Operation Epic Fury.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-31" href="#footnote-31" target="_self">31</a> The Pentagon had declared Anthropic a risk, even as it continued to rely on its technology. </p><p style="text-align: justify;">Anthropic&#8217;s carve-outs were contractual terms because they did not exist in law. There is no US statute prohibiting autonomous weapons and no federal legislation governing AI-enabled mass surveillance. For some policymakers, that ambiguity may represent a strategic advantage in an emerging AI arms race. &#8220;Any lawful use&#8221; means whatever the law has not yet prohibited and, in the case of novel technologies, has not yet contemplated. 
</p><p style="text-align: justify;">This was the basis of Anthropic CEO Dario Amodei&#8217;s objection.</p><blockquote><p style="text-align: justify;">&#8220;Congress is not the fastest moving body in the world &#8230; for right now, we are the ones who see this technology on the front lines.&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-32" href="#footnote-32" target="_self">32</a></p></blockquote><p style="text-align: justify;">Anthropic was drawing boundaries that Congress had not. Congressman Sam Liccardo put it more bluntly:</p><blockquote><p style="text-align: justify;">&#8220;There is only one problem with the Pentagon&#8217;s approach: there is no law. The law is years behind the technology.&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-33" href="#footnote-33" target="_self">33</a></p></blockquote><p style="text-align: justify;">The representative introduced an amendment to the Defense Production Act to prevent the Pentagon from retaliating against companies with safety guardrails.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-34" href="#footnote-34" target="_self">34</a> Anthropic filed two federal lawsuits accusing the Trump administration of retaliating against the company for its stance on AI safety.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-35" href="#footnote-35" target="_self">35</a> The first binding precedent on AI safety restrictions in military use may now come from the judiciary, not the legislature. 
Two institutions, two branches of government, both arriving after the fact.</p><div><hr></div><p style="text-align: justify;"><strong>The court of public opinion reached its verdict before any institution.</strong></p><p style="text-align: justify;">Within hours, OpenAI CEO Sam Altman announced that his company had entered into an agreement with the Pentagon to deploy ChatGPT on classified networks.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-36" href="#footnote-36" target="_self">36</a> The timing was telling. The backlash was immediate. By the weekend, Claude surged past ChatGPT to become the most downloaded free app on Apple&#8217;s App Store.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-37" href="#footnote-37" target="_self">37</a> More than 4 million people joined the QuitGPT boycott.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-38" href="#footnote-38" target="_self">38</a> Chalk graffiti appeared on the pavement outside OpenAI&#8217;s San Francisco office.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-39" href="#footnote-39" target="_self">39</a> Hundreds of OpenAI and Google employees signed a joint open letter supporting Anthropic&#8217;s refusal and urging limits on the Pentagon&#8217;s AI use.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-40" target="_self" href="#footnote-40">40</a> In the absence of legislation, accountability came not from Congress or the courts, but from public pressure.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!p7MZ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbcd527de-de4e-4dae-82a1-1ea2cc9732ce_949x519.heic" 
data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!p7MZ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbcd527de-de4e-4dae-82a1-1ea2cc9732ce_949x519.heic 424w, https://substackcdn.com/image/fetch/$s_!p7MZ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbcd527de-de4e-4dae-82a1-1ea2cc9732ce_949x519.heic 848w, https://substackcdn.com/image/fetch/$s_!p7MZ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbcd527de-de4e-4dae-82a1-1ea2cc9732ce_949x519.heic 1272w, https://substackcdn.com/image/fetch/$s_!p7MZ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbcd527de-de4e-4dae-82a1-1ea2cc9732ce_949x519.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!p7MZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbcd527de-de4e-4dae-82a1-1ea2cc9732ce_949x519.heic" width="949" height="519" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/bcd527de-de4e-4dae-82a1-1ea2cc9732ce_949x519.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:519,&quot;width&quot;:949,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:67171,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.legaldigitalism.com/i/190447685?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbcd527de-de4e-4dae-82a1-1ea2cc9732ce_949x519.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!p7MZ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbcd527de-de4e-4dae-82a1-1ea2cc9732ce_949x519.heic 424w, https://substackcdn.com/image/fetch/$s_!p7MZ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbcd527de-de4e-4dae-82a1-1ea2cc9732ce_949x519.heic 848w, https://substackcdn.com/image/fetch/$s_!p7MZ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbcd527de-de4e-4dae-82a1-1ea2cc9732ce_949x519.heic 1272w, https://substackcdn.com/image/fetch/$s_!p7MZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbcd527de-de4e-4dae-82a1-1ea2cc9732ce_949x519.heic 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">A protest banner displayed outside OpenAI&#8217;s San Francisco office after the company&#8217;s Pentagon deal, with chalk visible on the sidewalk.</figcaption></figure></div><p style="text-align: justify;">Altman conceded on X: </p><blockquote><p style="text-align: justify;">&#8220;We were genuinely trying to de-escalate things and avoid a much worse outcome, but I think it just looked opportunistic and sloppy.&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-41" href="#footnote-41" target="_self">41</a></p></blockquote><p style="text-align: justify;">The contract was subsequently amended to include language prohibiting the use of OpenAI&#8217;s systems for mass domestic surveillance, direct autonomous weapons, and social credit systems.<a class="footnote-anchor" 
data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-42" href="#footnote-42" target="_self">42</a> Even so, the inclusion of the phrase &#8220;all lawful purposes, consistent with applicable law, [and] operational requirements&#8221;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-43" href="#footnote-43" target="_self">43</a> raises questions about whether those explicit red lines are meaningfully enforceable. </p><div><hr></div><p style="text-align: justify;"><strong>Yet one jurisdiction tried to get ahead of the curve.</strong></p><p style="text-align: justify;">A month before the Pentagon-Anthropic clash, Singapore unveiled the world&#8217;s first governance framework for agentic AI at the World Economic Forum in Davos.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-44" href="#footnote-44" target="_self">44</a> The framework addresses AI systems capable of autonomous reasoning, planning, and action: systems that go beyond today&#8217;s generative models, initiating and executing tasks with minimal human input. It calls for organisations to assess risks before deployment, assign clear human accountability throughout processes, implement technical safeguards, and ensure transparency with users.<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-45" href="#footnote-45" target="_self">45</a> The framework, however, is non-binding and carries no statutory force. Still, it attempts something no other jurisdiction has managed: anticipating the governance of a technology before it scales and embeds.</p><div><hr></div><p style="text-align: justify;"><strong>The technology evolves; the sequence doesn&#8217;t. </strong></p><p style="text-align: justify;">In the UK, the Online Safety Act was built for a threat that had already changed shape. 
In the US, the government reached for a statute from the industrial era to compel a Silicon Valley company, while the most forward-looking response remained a non-binding framework. In the absence of law, the most effective checks came from an online petition and chalk on the pavement. If corporate policies are more restrictive than regulation, if existing systems of governance are not capable of keeping pace with innovation, then who governs the machine?</p><p style="text-align: justify;">For now, no one in particular and everyone by accident.</p><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>Laura Cress, &#8216;Woman felt &#8220;dehumanised&#8221; after Musk&#8217;s Grok AI used to digitally remove her clothes&#8217; <em>BBC News</em> (2 January 2026) &lt;<a href="https://www.bbc.com/news/articles/c98p1r4e6m8o">https://www.bbc.com/news/articles/c98p1r4e6m8o</a>&gt; accessed 7 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-2" href="#footnote-anchor-2" class="footnote-number" contenteditable="false" target="_self">2</a><div class="footnote-content"><p>xAI, &#8216;Grok Image Generation Release&#8217; (xAI, 9 December 2024) &lt;<a href="https://x.ai/news/grok-image-generation-release">https://x.ai/news/grok-image-generation-release</a>&gt; accessed 9 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-3" href="#footnote-anchor-3" class="footnote-number" contenteditable="false" target="_self">3</a><div class="footnote-content"><p>Dr Paul Bouchaud, <em>Grok Unleashed</em> (AI Forensics, 5 January 2026) &lt;<a href="https://aiforensics.org/work/grok-unleashed">https://aiforensics.org/work/grok-unleashed</a>&gt; accessed 7 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a 
id="footnote-4" href="#footnote-anchor-4" class="footnote-number" contenteditable="false" target="_self">4</a><div class="footnote-content"><p>Dr Federica Fedorczyk, &#8216;Expert Comment: Chatbot-driven sexual abuse? The Grok case is just the tip of the iceberg&#8217; University of Oxford (14 January 2026) &lt;<a href="https://www.ox.ac.uk/news/2026-01-14-expert-comment-chatbot-driven-sexual-abuse-grok-case-just-tip-iceberg">https://www.ox.ac.uk/news/2026-01-14-expert-comment-chatbot-driven-sexual-abuse-grok-case-just-tip-iceberg</a>&gt; accessed 7 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-5" href="#footnote-anchor-5" class="footnote-number" contenteditable="false" target="_self">5</a><div class="footnote-content"><p>Ofcom, &#8216;Ofcom launches investigation into X over Grok sexualised imagery&#8217; (Ofcom, 12 January 2026) &lt;<a href="https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/ofcom-launches-investigation-into-x-over-grok-sexualised-imagery">https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/ofcom-launches-investigation-into-x-over-grok-sexualised-imagery</a>&gt; accessed 7 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-6" href="#footnote-anchor-6" class="footnote-number" contenteditable="false" target="_self">6</a><div class="footnote-content"><p>Information Commissioner&#8217;s Office, &#8216;ICO announces investigation into Grok&#8217; (Information Commissioner&#8217;s Office, February 2026) &lt;<a href="https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2026/02/ico-announces-investigation-into-grok/">https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2026/02/ico-announces-investigation-into-grok/</a>&gt; accessed 7 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-7" href="#footnote-anchor-7" class="footnote-number" 
contenteditable="false" target="_self">7</a><div class="footnote-content"><p>European Commission, &#8216;Commission investigates Grok and X&#8217;s recommender systems under the Digital Services Act&#8217; (Shaping Europe&#8217;s digital future, 26 January 2026) &lt;<a href="https://digital-strategy.ec.europa.eu/en/news/commission-investigates-grok-and-xs-recommender-systems-under-digital-services-act">https://digital-strategy.ec.europa.eu/en/news/commission-investigates-grok-and-xs-recommender-systems-under-digital-services-act</a>&gt; accessed 7 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-8" href="#footnote-anchor-8" class="footnote-number" contenteditable="false" target="_self">8</a><div class="footnote-content"><p>Bloomberg News, &#8216;Malaysia, France, India Hit Out at X for &#8220;Offensive&#8221; Grok Images&#8217; Bloomberg (4 January 2026) &lt;<a href="https://www.bloomberg.com/news/articles/2026-01-04/malaysia-france-india-hit-out-at-x-for-offensive-grok-images?embedded-checkout=true">https://www.bloomberg.com/news/articles/2026-01-04/malaysia-france-india-hit-out-at-x-for-offensive-grok-images?embedded-checkout=true</a>&gt; accessed 7 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-9" href="#footnote-anchor-9" class="footnote-number" contenteditable="false" target="_self">9</a><div class="footnote-content"><p>Akash Sriram and Anhata Rooprai, &#8216;Musk&#8217;s AI bot Grok limits some image generation on X after backlash&#8217; <em>Reuters</em> (9 January 2026) &lt;<a href="https://www.reuters.com/sustainability/boards-policy-regulation/musks-ai-bot-grok-limits-image-generation-x-paid-users-after-backlash-2026-01-09/">https://www.reuters.com/sustainability/boards-policy-regulation/musks-ai-bot-grok-limits-image-generation-x-paid-users-after-backlash-2026-01-09/</a>&gt; accessed 7 March 2026.</p></div></div><div class="footnote" 
data-component-name="FootnoteToDOM"><a id="footnote-10" href="#footnote-anchor-10" class="footnote-number" contenteditable="false" target="_self">10</a><div class="footnote-content"><p>Online Safety Act 2023 &lt;<a href="https://www.legislation.gov.uk/ukpga/2023/50/contents">https://www.legislation.gov.uk/ukpga/2023/50/contents</a>&gt;.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-11" href="#footnote-anchor-11" class="footnote-number" contenteditable="false" target="_self">11</a><div class="footnote-content"><p>Ofcom, &#8216;Investigation into X and scope of the Online Safety Act&#8217; (Ofcom, 3 February 2026) &lt;<a href="https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/investigation-into-x-and-scope-of-the-online-safety-act">https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/investigation-into-x-and-scope-of-the-online-safety-act</a>&gt; accessed 7 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-12" href="#footnote-anchor-12" class="footnote-number" contenteditable="false" target="_self">12</a><div class="footnote-content"><p>Press Release &#8216;PM: &#8220;No platform gets a free pass&#8221;: Government takes action to keep children safe online&#8217; (GOV.UK, 15 February 2026) &lt;<a href="https://www.gov.uk/government/news/pm-no-platform-gets-a-free-pass-government-takes-action-to-keep-children-safe-online">https://www.gov.uk/government/news/pm-no-platform-gets-a-free-pass-government-takes-action-to-keep-children-safe-online</a>&gt; accessed 7 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-13" href="#footnote-anchor-13" class="footnote-number" contenteditable="false" target="_self">13</a><div class="footnote-content"><p>Press Release &#8216;Tech companies must go &#8220;above and beyond&#8221; to protect women and girls from online abuse or face further action&#8217; (GOV.UK, 
10 March 2026) &lt;<a href="https://www.gov.uk/government/news/tech-companies-must-go-above-and-beyond-to-protect-women-and-girls-from-online-abuse-or-face-further-action">https://www.gov.uk/government/news/tech-companies-must-go-above-and-beyond-to-protect-women-and-girls-from-online-abuse-or-face-further-action</a>&gt; accessed 10 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-14" href="#footnote-anchor-14" class="footnote-number" contenteditable="false" target="_self">14</a><div class="footnote-content"><p>Crime and Policing Bill 2024&#8211;26 &lt;<a href="https://bills.parliament.uk/bills/3938">https://bills.parliament.uk/bills/3938</a>&gt; accessed 10 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-15" href="#footnote-anchor-15" class="footnote-number" contenteditable="false" target="_self">15</a><div class="footnote-content"><p>Julian Barnes, Tyler Pager and Eric Schmitt, &#8216;Inside &#8216;Operation Absolute Resolve,&#8217; the U.S. 
Effort to Capture Maduro&#8217; <em>New York Times</em> (3 January 2026) &lt;<a href="https://www.nytimes.com/2026/01/03/us/politics/trump-capture-maduro-venezuela.html">https://www.nytimes.com/2026/01/03/us/politics/trump-capture-maduro-venezuela.html</a>&gt; accessed 8 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-16" href="#footnote-anchor-16" class="footnote-number" contenteditable="false" target="_self">16</a><div class="footnote-content"><p>Ramkumar Amrith and Keach Hagey, &#8216;Pentagon Used Anthropic&#8217;s Claude in Maduro Venezuela Raid&#8217; <em>Wall Street Journal</em> (13 February 2026) &lt;<a href="https://www.wsj.com/politics/national-security/pentagon-used-anthropics-claude-in-maduro-venezuela-raid-583aff17">https://www.wsj.com/politics/national-security/pentagon-used-anthropics-claude-in-maduro-venezuela-raid-583aff17</a>&gt; accessed 8 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-17" href="#footnote-anchor-17" class="footnote-number" contenteditable="false" target="_self">17</a><div class="footnote-content"><p>Carlos M&#233;ndez and Juby Babu, &#8216;US used Anthropic&#8217;s Claude during the Venezuela raid, WSJ reports&#8217; <em>Reuters</em> (13 February 2026) &lt;<a href="https://www.reuters.com/world/americas/us-used-anthropics-claude-during-the-venezuela-raid-wsj-reports-2026-02-13/">https://www.reuters.com/world/americas/us-used-anthropics-claude-during-the-venezuela-raid-wsj-reports-2026-02-13/</a>&gt; accessed 8 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-18" href="#footnote-anchor-18" class="footnote-number" contenteditable="false" target="_self">18</a><div class="footnote-content"><p>Thomas Novelly, &#8216;First 24 hours of Trump&#8217;s war on Iran, by the numbers&#8217; <em>Defense One</em> (1 March 2026) &lt;<a 
href="https://www.defenseone.com/threats/2026/03/first-24-hours-trumps-war-iran-numbers/411789/">https://www.defenseone.com/threats/2026/03/first-24-hours-trumps-war-iran-numbers/411789/</a>&gt; accessed 8 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-19" href="#footnote-anchor-19" class="footnote-number" contenteditable="false" target="_self">19</a><div class="footnote-content"><p>Tara Copp, Elizabeth Dwoskin and Ian Duncan, &#8216;Anthropic&#8217;s AI tool Claude central to U.S. campaign in Iran, amid a bitter feud&#8217; <em>Washington Post</em> (4 March 2026) &lt;<a href="https://www.washingtonpost.com/technology/2026/03/04/anthropic-ai-iran-campaign/">https://www.washingtonpost.com/technology/2026/03/04/anthropic-ai-iran-campaign/</a>&gt; accessed 8 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-20" href="#footnote-anchor-20" class="footnote-number" contenteditable="false" target="_self">20</a><div class="footnote-content"><p>Emelia Probasco, Building the Tech Coalition: How Project Maven and the U.S. 
18th Airborne Corps Operationalized Software and Artificial Intelligence for the Department of Defense (Center for Security and Emerging Technology, Georgetown University 2024) &lt;<a href="https://cset.georgetown.edu/wp-content/uploads/CSET-Building-the-Tech-Coalition-1.pdf">https://cset.georgetown.edu/wp-content/uploads/CSET-Building-the-Tech-Coalition-1.pdf</a>&gt; accessed 9 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-21" href="#footnote-anchor-21" class="footnote-number" contenteditable="false" target="_self">21</a><div class="footnote-content"><p>Christian Brose, <em>The Kill Chain: Defending America in the Future of High-Tech Warfare</em> (Hachette Books 2020).</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-22" href="#footnote-anchor-22" class="footnote-number" contenteditable="false" target="_self">22</a><div class="footnote-content"><p>Sydney Freedberg Jr, &#8216;New contract expands Maven AI&#8217;s users &#8220;from hundreds to thousands&#8221; worldwide, Palantir says&#8217; <em>Breaking Defense</em> (30 May 2024) &lt;<a href="https://breakingdefense.com/2024/05/new-contract-expands-maven-ais-users-from-hundreds-to-thousands-worldwide-palantir-says/">https://breakingdefense.com/2024/05/new-contract-expands-maven-ais-users-from-hundreds-to-thousands-worldwide-palantir-says/</a>&gt; accessed 9 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-23" href="#footnote-anchor-23" class="footnote-number" contenteditable="false" target="_self">23</a><div class="footnote-content"><p>Sydney Freedberg Jr, &#8216;NATO picks Palantir&#8217;s Maven AI for military planning, amid trans-Atlantic tension&#8217; <em>Breaking Defense</em> (14 April 2025) &lt;<a 
href="https://breakingdefense.com/2025/04/nato-picks-palantirs-maven-ai-for-military-planning-amid-trans-atlantic-tension/">https://breakingdefense.com/2025/04/nato-picks-palantirs-maven-ai-for-military-planning-amid-trans-atlantic-tension/</a>&gt; accessed 9 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-24" href="#footnote-anchor-24" class="footnote-number" contenteditable="false" target="_self">24</a><div class="footnote-content"><p>Executive Order 14347 authorised the use of &#8220;Department of War&#8221; and &#8220;Secretary of War&#8221; as secondary titles within the executive branch. The statutory names remain unchanged as only Congress can formally rename a federal department. This piece uses the statutory titles throughout.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-25" href="#footnote-anchor-25" class="footnote-number" contenteditable="false" target="_self">25</a><div class="footnote-content"><p>J Ryan Frazee, John Prairie and Adam Hickey, &#8216;Pentagon Designates Anthropic a Supply Chain Risk &#8212; What Government Contractors Need to Know&#8217; Mayer Brown (2 March 2026) &lt;<a href="https://www.mayerbrown.com/en/insights/publications/2026/03/pentagon-designates-anthropic-a-supply-chain-risk-what-government-contractors-need-to-know">https://www.mayerbrown.com/en/insights/publications/2026/03/pentagon-designates-anthropic-a-supply-chain-risk-what-government-contractors-need-to-know</a>&gt; accessed 8 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-26" href="#footnote-anchor-26" class="footnote-number" contenteditable="false" target="_self">26</a><div class="footnote-content"><p>Ibid. 
</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-27" href="#footnote-anchor-27" class="footnote-number" contenteditable="false" target="_self">27</a><div class="footnote-content"><p>Secretary of War, &#8216;Artificial Intelligence Strategy for the Department of War: Accelerating America&#8217;s Military AI Dominance&#8217; (Department of War, 9 January 2026) &lt;<a href="https://media.defense.gov/2026/Jan/12/2003855671/-1/-1/0/ARTIFICIAL-INTELLIGENCE-STRATEGY-FOR-THE-DEPARTMENT-OF-WAR.PDF">https://media.defense.gov/2026/Jan/12/2003855671/-1/-1/0/ARTIFICIAL-INTELLIGENCE-STRATEGY-FOR-THE-DEPARTMENT-OF-WAR.PDF</a>&gt; accessed 8 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-28" href="#footnote-anchor-28" class="footnote-number" contenteditable="false" target="_self">28</a><div class="footnote-content"><p>Alan Rozenshtein, &#8216;What the Defense Production Act Can and Can&#8217;t Do to Anthropic&#8217; <em>Lawfare</em> (28 February 2026) &lt;<a href="https://www.lawfaremedia.org/article/what-the-defense-production-act-can-and-can-t-do-to-anthropic">https://www.lawfaremedia.org/article/what-the-defense-production-act-can-and-can-t-do-to-anthropic</a>&gt; accessed 8 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-29" href="#footnote-anchor-29" class="footnote-number" contenteditable="false" target="_self">29</a><div class="footnote-content"><p>Brendan Bordelon, &#8216;Pentagon tells Anthropic it has designated the company a supply chain risk&#8217; <em>Politico</em> (5 March 2026) &lt;<a href="https://www.politico.com/news/2026/03/05/pentagon-tells-anthropic-it-has-designated-the-company-a-supply-chain-risk-00814758">https://www.politico.com/news/2026/03/05/pentagon-tells-anthropic-it-has-designated-the-company-a-supply-chain-risk-00814758</a>&gt; accessed 8 March 2026.</p></div></div><div class="footnote" 
<a id="footnote-30" href="#footnote-anchor-30" class="footnote-number" contenteditable="false" target="_self">30</a><div class="footnote-content"><p>Donald J Trump (@realDonaldTrump), post on Truth Social (28 February 2026) &lt;<a href="https://truthsocial.com/@realDonaldTrump/posts/116144552969293195">https://truthsocial.com/@realDonaldTrump/posts/116144552969293195</a>&gt; accessed 10 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-31" href="#footnote-anchor-31" class="footnote-number" contenteditable="false" target="_self">31</a><div class="footnote-content"><p>Copp, Dwoskin and Duncan, &#8216;Anthropic&#8217;s AI tool Claude central to U.S. campaign in Iran, amid a bitter feud&#8217; (n 19).</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-32" href="#footnote-anchor-32" class="footnote-number" contenteditable="false" target="_self">32</a><div class="footnote-content"><p>CBS News, &#8216;Full interview: Anthropic CEO responds to Trump order, Pentagon clash&#8217; (YouTube, 2 March 2026) &lt;<a href="https://www.youtube.com/watch?v=MPTNHrq_4LU">https://www.youtube.com/watch?v=MPTNHrq_4LU</a>&gt; accessed 9 March 2026.</p><div id="youtube2-MPTNHrq_4LU" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;MPTNHrq_4LU&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/MPTNHrq_4LU?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-33" href="#footnote-anchor-33" class="footnote-number" contenteditable="false" target="_self">33</a><div class="footnote-content"><p>Press Release, &#8216;Rep. 
Sam Liccardo Forces Vote on Pentagon&#8217;s Misguided AI Posture&#8217; (House of Representatives, 4 March 2026) &lt;<a href="https://liccardo.house.gov/media/press-releases/rep-sam-liccardo-forces-vote-pentagons-misguided-ai-posture">https://liccardo.house.gov/media/press-releases/rep-sam-liccardo-forces-vote-pentagons-misguided-ai-posture</a>&gt; accessed 8 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-34" href="#footnote-anchor-34" class="footnote-number" contenteditable="false" target="_self">34</a><div class="footnote-content"><p>Ibid.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-35" href="#footnote-anchor-35" class="footnote-number" contenteditable="false" target="_self">35</a><div class="footnote-content"><p>Brendan Bordelon and Kyle Cheney, &#8216;Anthropic sues Trump admin over supply chain risk label&#8217; <em>Politico</em> (9 March 2026) &lt;<a href="https://www.politico.com/news/2026/03/09/anthropic-sues-trump-admin-over-supply-chain-risk-label-00818716">https://www.politico.com/news/2026/03/09/anthropic-sues-trump-admin-over-supply-chain-risk-label-00818716</a>&gt; accessed 10 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-36" href="#footnote-anchor-36" class="footnote-number" contenteditable="false" target="_self">36</a><div class="footnote-content"><p>Sam Altman (@sama), post on X (27 February 2026) &lt;<a href="https://x.com/sama/status/2027578508042723599">https://x.com/sama/status/2027578508042723599</a>&gt; accessed 10 March 2026.</p><div class="twitter-embed" data-attrs="{&quot;url&quot;:&quot;https://x.com/sama/status/2027578508042723599&quot;,&quot;full_text&quot;:&quot;Tonight, we reached an agreement with the Department of War to deploy our models in their classified network.\n\nIn all of our interactions, the DoW displayed a deep respect for safety and a desire to partner to achieve the best possible outcome.\n\nAI safety and wide distribution 
of&quot;,&quot;username&quot;:&quot;sama&quot;,&quot;name&quot;:&quot;Sam Altman&quot;,&quot;profile_image_url&quot;:&quot;https://pbs.substack.com/profile_images/1904933748015255552/k43GMz63_normal.jpg&quot;,&quot;date&quot;:&quot;2026-02-28T02:56:01.000Z&quot;,&quot;photos&quot;:[],&quot;quoted_tweet&quot;:{},&quot;reply_count&quot;:3567,&quot;retweet_count&quot;:1058,&quot;like_count&quot;:9330,&quot;impression_count&quot;:8417581,&quot;expanded_url&quot;:null,&quot;video_url&quot;:null,&quot;belowTheFold&quot;:true}" data-component-name="Twitter2ToDOM"></div></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-37" href="#footnote-anchor-37" class="footnote-number" contenteditable="false" target="_self">37</a><div class="footnote-content"><p>Brent Griffiths and Madison Hoff, &#8216;Chart shows Claude&#8217;s dethroning of ChatGPT in app downloads race&#8217; <em>Business Insider</em> (6 March 2026) &lt;<a href="https://www.businessinsider.com/claude-number-1-app-stores-chatgpt-apple-google-ai-2026-3">https://www.businessinsider.com/claude-number-1-app-stores-chatgpt-apple-google-ai-2026-3</a>&gt; accessed 10 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-38" href="#footnote-anchor-38" class="footnote-number" contenteditable="false" target="_self">38</a><div class="footnote-content"><p>QuitGPT, &#8216;ChatGPT takes Trump&#8217;s killer robot deal&#8217; (2026) &lt;<a href="https://quitgpt.org">https://quitgpt.org</a>&gt; accessed 10 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-39" href="#footnote-anchor-39" class="footnote-number" contenteditable="false" target="_self">39</a><div class="footnote-content"><p>Joe Barros, &#8216;March 1 Chalk Wars: It&#8217;s OpenAI vs. 
Anthropic on San Francisco&#8217;s sidewalks&#8217; <em>Mission Local</em> (1 March 2026) &lt;<a href="https://missionlocal.org/2026/03/sf-openai-anthropic-ai-pentagon-deal-trump-sidewalk-chalk/">https://missionlocal.org/2026/03/sf-openai-anthropic-ai-pentagon-deal-trump-sidewalk-chalk/</a>&gt; accessed 10 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-40" href="#footnote-anchor-40" class="footnote-number" contenteditable="false" target="_self">40</a><div class="footnote-content"><p>Siladitya Ray, &#8216;OpenAI And Google Staffers Back Anthropic In Open Letter And Call For Limits On Pentagon AI Use&#8217; <em>Forbes</em> (27 February 2026) &lt;<a href="https://www.forbes.com/sites/siladityaray/2026/02/27/openai-and-google-staffers-back-anthropic-in-open-letter-and-call-for-limits-on-pentagon-ai-use/">https://www.forbes.com/sites/siladityaray/2026/02/27/openai-and-google-staffers-back-anthropic-in-open-letter-and-call-for-limits-on-pentagon-ai-use/</a>&gt; accessed 9 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-41" href="#footnote-anchor-41" class="footnote-number" contenteditable="false" target="_self">41</a><div class="footnote-content"><p>Sam Altman (@sama), post on X (3 March 2026) &lt;<a href="https://x.com/sama/status/2028640354912923739">https://x.com/sama/status/2028640354912923739</a>&gt; accessed 10 March 2026.</p><div class="twitter-embed" data-attrs="{&quot;url&quot;:&quot;https://x.com/sama/status/2028640354912923739&quot;,&quot;full_text&quot;:&quot;Here is re-post of an internal post:\n\nWe have been working with the DoW to make some additions in our agreement to make our principles very clear.\n\n1.   
We are going to amend our deal to add this language, in addition to everything else:\n\n\&quot;&#8226; Consistent with applicable laws,&quot;,&quot;username&quot;:&quot;sama&quot;,&quot;name&quot;:&quot;Sam Altman&quot;,&quot;profile_image_url&quot;:&quot;https://pbs.substack.com/profile_images/1904933748015255552/k43GMz63_normal.jpg&quot;,&quot;date&quot;:&quot;2026-03-03T01:15:25.000Z&quot;,&quot;photos&quot;:[],&quot;quoted_tweet&quot;:{},&quot;reply_count&quot;:3902,&quot;retweet_count&quot;:643,&quot;like_count&quot;:6139,&quot;impression_count&quot;:3532619,&quot;expanded_url&quot;:null,&quot;video_url&quot;:null,&quot;belowTheFold&quot;:true}" data-component-name="Twitter2ToDOM"></div></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-42" href="#footnote-anchor-42" class="footnote-number" contenteditable="false" target="_self">42</a><div class="footnote-content"><p>OpenAI, &#8216;Our agreement with the Department of War&#8217; (OpenAI, 2 March 2026) &lt;<a href="https://openai.com/index/our-agreement-with-the-department-of-war/">https://openai.com/index/our-agreement-with-the-department-of-war/</a>&gt; accessed 9 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-43" href="#footnote-anchor-43" class="footnote-number" contenteditable="false" target="_self">43</a><div class="footnote-content"><p>Ibid.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-44" href="#footnote-anchor-44" class="footnote-number" contenteditable="false" target="_self">44</a><div class="footnote-content"><p>Ministry of Digital Development and Information, &#8216;Singapore launches new Model AI Governance Framework for Agentic AI&#8217; (MDDI, 22 January 2026) &lt;<a 
href="https://www.mddi.gov.sg/newsroom/singapore-launches-new-model-ai-governance-framework-for-agentic-ai--/">https://www.mddi.gov.sg/newsroom/singapore-launches-new-model-ai-governance-framework-for-agentic-ai--/</a>&gt; accessed 10 March 2026.</p></div></div><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-45" href="#footnote-anchor-45" class="footnote-number" contenteditable="false" target="_self">45</a><div class="footnote-content"><p>Ibid.</p></div></div>]]></content:encoded></item></channel></rss>