A Political Philosophy of Technology - or - From Mordor, with malice
A vision of the future from the CEO of Palantir
We’ve been treated to a ticker-tape parade’s worth of updates this week. So, with DOGE achieving “God Mode” access to Federal government systems, Trump counting his cobalt, and Canadians bracing, waiting, readying for US sanctions, you’d be forgiven for missing February’s “new book releases” — and one unique contribution to modern political philosophy: The Technological Republic: Hard Power, Soft Belief, and the Future of the West by Palantir CEO Alex Karp (w. Nicholas W. Zamiska).

Here’s Karp in the preface:
The central argument that we advance in the pages that follow is that the software industry should rebuild its relationship with government and redirect its effort and attention to constructing the technology and artificial intelligence capabilities that will address the most pressing challenges that we collectively face. The engineering elite of Silicon Valley has an affirmative obligation to participate in the defense of the nation and the articulation of a national project—what is this country, what are our values, and for what do we stand—and, by extension, to preserve the enduring yet fragile geopolitical advantage that the United States and its allies in Europe and elsewhere have retained over their adversaries. (My emphasis)
To be sure, observations about the fusion of big tech and government are increasingly common. Further, it would be foolish to assume that national security for any modern state wouldn’t demand highly capable, rapidly responsive technologies. What’s different here is the moralizing and fence-building Karp is attempting — a signal that his company, Palantir, unlike others, serves as the inevitable model for US sovereign technology in the years to come.
Such a book emerges amid trying times. Even if we ignore the firestorm of controversies of the Trump administration’s early days, Karp’s vision is christened just weeks after the Silicon Valley bros struck their pose at Trump’s inauguration and follows the latest summit on AI (in Paris), which signaled, for many, the sharp and scrappy return of realism (It’s back, baby!1) — and continually shrinking odds for global regulation of emerging technologies.
The consequent vitriol and tech-bashing are not unwarranted — Elon’s “technocratic” assault on the US federal government is a real-time experiment as to whether you can run a public goods organization like a private firm (Spoiler: you cannot).2 Yet the lament against the “Silicon Voligarchs” is only one among a range of concerns, including the wider structural reforms (i.e., the access and placement of tech firms inside the carapace of the state) that leading and increasingly embedded technology firms are working to advance.
This is where Karp’s argument is notable: it offers the clearest and most succinct example of securitizing the United States’ technological domain. Karp’s thesis is a clear-throated nationalist or state-centric philosophy for the new technological age: American technological advantage (not American technological development) is the only acceptable strategy for Silicon Valley leaders.
In this context, securitization refers to (at least) two different processes. First, it captures the specific speech act, where leaders (particularly those with the ability to shape discourse) outline what constitutes a security threat.3 With tech leaders’ increasing access to government elites and their prime positions within media and cultural spaces, Karp has the ability to proclaim — through his new book and behind closed doors — the necessity of seeing technology as both a national security threat and its solution, which ultimately increases attention, support, and funding for an America-first approach to technology.4
Second, securitizing tech formalizes the essential role it plays in state affairs and positions the creators of said technology as rightful recipients of enduring government support. Contrary to the commercial or market approach, Karp’s argument demands — nay, necessitates — that funding be sustained through the state, particularly when the soft- and hardware of advanced technology will be central to gaining military, competitive, and strategic advantage. For those first movers, like Palantir, who have embedded their services — and now organize critical data — within a growing number of government departments, this is tantamount to writing yourself a check.
Perhaps more importantly, Karp’s vision doesn’t end there. He positions private technology companies as critical stewards for our modern moment, suggesting they must help us(?) — it’s unclear — engage in “vital yet messy questions of what constitutes a good life, which collective endeavors society should pursue, and what a shared and national identity can make possible...”
Such an appeal feels a little like Nietzsche’s caricature of the priest, who heals his flock only after convincing them they’re sick.5 But the move is politically deft and strategically astute. His argument leverages the public perception that tech firms have pursued customers and cash instead of care or country, and have — wittingly or otherwise — created an ecosystem where safety is an afterthought, harm is an externality, and malevolent actors are able to take advantage of the digital ecosystem to the detriment of the United States. Apple, Google, Microsoft, and Meta, in this narrative, have built technological utopias (literally: no or un-places), compelled users to spend time there, and monetized their attention, without offering any greater purpose or end. The result, Karp suggests, is a kind of national (and civic) abandonment.
These are powerful rhetorical moves, and the sleight of hand — distancing himself (and Palantir) from other firms while remaining inside the tech bubble — may ultimately fail. Most surprising, however, is how Karp’s narrative doubles down on the need for private-public fusion — an admixture of interests that has historically rankled critics of both private and public power.
This shifting balance motivated former President Biden’s warning about the creeping oligarchy of big tech in America. Biden’s PSA was itself an echo of Eisenhower’s concern about the growing military-industrial complex, voiced in his own White House exit address in 1961. The capture of public — read: elected — power by the private sector once sparked protest. Karp’s narrative — and its positioning of American technologists as servants of national power and prosperity — is an attempt to sidestep these tensions under a (techno-optimist) argument that reifies a vision of the world where technology companies become inextricable from the nation state.
After OpenAI’s ChatGPT emerged in November 2022, I mused with friends about how rapid technological changes might influence politics. This was the moment when leading figures in tech were warning that AI was moving too fast, potentially breaking too many things. A drafted op-ed from that period (unplaced and still saved on my Google Drive) focused on the use of “open letters” — documents, we argued, that served as the commercial or private-sector version of the government “white paper,” usually published to shape policy:
These private-sector “open letters” are … symptomatic of an ongoing transfer of political power from the electorate, through our public officials, to the backrooms and boardrooms of technology firms. At stake in industry messaging on AI is the creeping normalization of the premise that new technologies are best understood, and their social implications best managed, by those who develop them. The consequence of this assumption is not merely the disenfranchisement of the modern citizen, but a kind of mechanization of our social and political sphere — a ruthless reduction of moral and ethical debates to a set of technical problems to be solved by technocrats or even by the technologies themselves.
Karp’s new book, and the argument it advances, suggests the ground war between private and public power endures — that the infection of public office by private interests runs ever deeper. The ultimate advantage for a technology firm, much like a petulant virus, is to convince its host that survival is impossible without its presence. After that, of course, the patient is terminal.
Quotes that stuck
“What first appears to be a wish for more time may turn out to be just one part of a simple, yet vast, desire for autonomy, meaning, and purpose.” - Jenny Odell, Saving Time: Discovering a Life Beyond the Clock
“We need to develop a better sense of the madness of the world.” - Benjamin Labatut, in interview.
“I believe that the information structure is one of the most determinative factors in any society. It shapes the landscape. It’s an ecological force… I think one of the things [Marshall McLuhan] was right about is that everything else is downstream from how we exchange information. Politics is downstream. Even culture is downstream. Because it gets exchanged in certain media.” - Martin Gurri, in conversation with Ezra Klein.
Credits: Links included this week feature reporting/thinking by
Matteo Wong, Maria Varenikova, Andrew E. Kramer, Fergus Linley-Mota, Elisabeth Siegel, Chloe Chadwick, Kelly Bronson, Mary-Jane Rubenstein, among others.
It never left, of course.
The Washington Post published an exclusive investigation this week, which shows Musk’s companies (including Tesla and SpaceX) have received as much as $38 billion USD in government funding over the years — suggesting “Musk is one of the greatest beneficiaries of the taxpayers’ coffers” (as DOGE works to trim “government largesse”).
Colleagues and I have been thinking about the idea of “narrative power” for the last few years. Some of those colleagues put together a good overview/summary of the concept in the context of big tech/AI here.
The “America First” framing is also synonymous with the current American administration’s philosophy more generally, of course.
Mary-Jane Rubenstein makes a similar argument about Elon Musk in her wonderful book, Astrotopia.