The Internet Regime

The internet functions as a global structure of governance and control that influences politics, culture, the economy and spatial organisation.

Electronic computer, Washington D.C. | NASA | Public domain

The internet has reshaped power relations in the political, economic and cultural spheres. Approaching the web as a regime allows us to better understand its internal dynamics, the effects of which also extend beyond its borders.

There are countless ways to imagine internet politics. A common sociological approach is to view the internet as a social field, where corporations, governments, activists and other stakeholders compete for power and influence. This is a useful lens through which to interpret the agency and struggle of these actors over specific social functions of the internet. For instance, it allows us to view Elon Musk’s acquisition of Twitter as a move to achieve hegemony over the public sphere, with multiple waves of user migrations to alternative platforms like Bluesky and Mastodon reflecting ideologically motivated exits from an untrusted digital space.

However, to comprehend the internet as a social field is to reduce it to a bounded sphere of social interaction, and thereby overlook its material foundations. This is a major oversight, because those material foundations structure the internet’s role in shaping governance, norms and power relations across multiple domains.

An alternative perspective could be to imagine the internet as a regime. Unlike a field, which is contained and shaped by its own internal dynamics, a regime operates as an overarching structure of governance and control, setting the rules and norms that affect multiple fields simultaneously. As a regime, the internet integrates corporate and state power within a specific logic, governing not only its own operations but also influencing politics, culture, economics and spatial organisation at a global scale. It is a system of overlapping sovereignties, where actors like states and corporations negotiate control over materials, infrastructure, data and users, reflecting broader territorial and geopolitical struggles.

To approach the internet as a regime is to look beyond the localised competition of social fields to examine how multiple fields are shaped and regulated simultaneously. Such a move recalls Reza Negarestani’s treatment of oil in his landmark work of speculative inquiry, Cyclonopedia: Complicity with Anonymous Materials. In this unusual book, the Iranian philosopher unravels the subterranean forces shaping geopolitical dynamics, presenting a haunting view of oil as both a material and a metaphysical entity that reshapes human existence. Like oil in Cyclonopedia, the internet can be viewed as a conspiratorial agent of distributed sovereignty: decentralised, pervasive and almost Lovecraftian in its indifference to human systems.

From this perspective, crucially, the internet does not “belong” to any single entity. Rather, it destabilises human notions of control and sovereignty, infecting the geopolitical landscape and compelling nations, economies and militaries to participate in its expansion. As a regime, the internet functions through at least three interlocking dynamics: the contestation of cultural hegemony, the intensification of social sorting, and the structuring influence of spatial and territorial power. These dynamics are not disjointed but mutually constitutive, spiralling in a cyclonic structure likely to amplify inequity and domination globally.

AI, inequality and the deepening crisis of democracy

As a computational intelligence embedded within this regime, Artificial Intelligence (AI) is both a byproduct and a driver of the dynamics described above, raising existential concerns about human freedom, democratic institutions and the ontological implications of machinic agency. Through its embedded biases, structural logic and alignment with corporate and state imperatives, AI operates as both a mirror and an amplifier of existing inequalities. At the heart of AI’s impact on democracy lie algorithmic bias and social sorting. AI systems, trained on historical datasets and programmed according to imperfect logics, embed systemic prejudices into their automated decisions. Whether through credit scoring, predictive policing or facial recognition, these technologies reproduce and intensify existing hierarchies, making inequalities both scalable and opaque.
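
The mechanism is easy to demonstrate in miniature. The sketch below is a toy model built on synthetic data (the “qualification” score, the group labels and the approval history are all invented for illustration and stand in for no real system): an ordinary classifier trained on historically skewed approval decisions learns to reproduce the skew, then applies it at scale behind an opaque numerical score.

```python
# Toy illustration only: a classifier trained on biased historical
# decisions reproduces the bias for equally qualified applicants.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Two groups with identical underlying qualification distributions.
group = rng.integers(0, 2, n)                 # 0 = group A, 1 = group B
qualification = rng.normal(0.0, 1.0, n)

# Historical decisions: the same qualification threshold, but group B
# was systematically penalised by past gatekeepers.
historically_approved = (qualification - 0.8 * group + rng.normal(0.0, 0.3, n)) > 0

X = np.column_stack([qualification, group])
model = LogisticRegression(max_iter=1000).fit(X, historically_approved)

# Two equally qualified applicants who differ only by group membership.
applicants = np.array([[0.5, 0], [0.5, 1]])
probs = model.predict_proba(applicants)[:, 1]
print(f"predicted approval, group A: {probs[0]:.2f}")  # high
print(f"predicted approval, group B: {probs[1]:.2f}")  # markedly lower
```

Nothing in this pipeline is malicious; the prejudice enters with the historical data and is simply automated, which is exactly what makes it scalable and difficult to see.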

In this sense, far from democratising opportunity, AI is poised to entrench exclusion, undermining the principles of fairness and equal representation central to democratic ideals. Yet the threat is not merely structural. AI-powered manipulation of public opinion, particularly on social media, weaponises algorithms to amplify dominant ideologies and suppress dissenting voices. By shaping discourse to favour corporate or state interests, AI distorts the free exchange of ideas, exacerbates polarisation and undermines the deliberative processes vital to democratic governance.

These dynamics are compounded by the centralisation of power in the hands of a few dominant actors – tech giants and authoritarian states – who prioritise profit, control and efficiency over transparency and accountability. The monopolistic nature of AI’s development ensures that its benefits are concentrated, while its risks are externalised onto the broader public. Simultaneously, AI-driven surveillance and control extend the reach of state and corporate actors into the most private realms of life. Framed as tools of security or convenience, AI surveillance systems monitor, predict and manipulate behaviour, curtailing freedoms of expression, association and privacy, all of which are cornerstones of any functioning democracy.

The erosion of autonomy and the rise of digital exclusion

The threats posed by AI to human freedom are similarly profound. Erosion of autonomy is perhaps the most insidious, as AI systems mediate everything from consumer choices to romantic relationships and, by extension, human reproduction. The logic of algorithmic recommendation subtly steers behaviours and constrains options, replacing agency with a curated simulation of choice. Coupled with this is the pervasive commodification and exploitation of personal data, where individuals are reduced to repositories of information, mined and monetised without meaningful consent. Privacy is not merely invaded; it is obliterated, as human lives are fragmented into datasets optimised for corporate gain.

Finally, AI intensifies digital exclusion, widening the chasm between those who can access and benefit from these technologies and those left behind. The digital divide, already a function of class, geography and education, becomes a site of structural marginalisation, further denying opportunities to those who lack the skills or resources to participate in the digital economy. For many, exclusion from AI-driven systems translates into exclusion from economic, social and political life.

The impact of AI, then, is not reducible to technological determinism or isolated misuse. It reflects and amplifies deeper logics of neoliberal governance, commodification and surveillance, posing existential threats to the ideals of democracy and the foundations of freedom. Its trajectory is not inevitable, but to challenge it requires confronting these logics head on – reimagining the development and governance of AI not as a tool of domination, but as a system that serves equality, transparency and autonomy.

The sovereign machine of algorithmic governmentality

The internet and artificial intelligence together crystallise what Mezzadra and Neilson refer to as “the sovereign machine of governmentality” – a mechanism that simultaneously enforces boundaries and manages populations. It is a dynamic apparatus that fuses the sovereign power to delineate exceptions and exclusions with the governmental imperative to control life itself. This is not a tidy process. It is a sprawling, uneven and deeply contested terrain where power operates through the infrastructures of digital capitalism and where resistance, though often fragmented, persists.

At the heart of this machine lies the capacity of sovereignty to impose boundaries – borders that are less territorial and more algorithmic, but no less real than Donald Trump’s wall. AI-powered platforms govern flows of visibility, access and inclusion. Algorithms decide who appears on our screens and who disappears into obscurity, enforcing exclusions that feel natural but are anything but neutral. Content moderation, predictive policing and algorithmic bias expose this process as a form of digital sovereignty, that is, the capacity to declare who belongs, who is seen and who is othered. In this sense, sovereignty is fundamentally about the exception – the power to decide who falls outside the rules and what this means. Whether through biased hiring algorithms, facial recognition tools that fail to “see” certain people, or opaque censorship regimes, AI repeatedly reproduces boundaries that map onto existing social hierarchies. These exclusions, far from being accidental glitches, are fundamental to the functioning of the system.

If sovereignty enforces social boundaries, then governmentality operates within them, managing populations and life processes in ways that are increasingly granular, automated and extractive. In this sense, AI does not merely process data; it governs behaviour. Every click, scroll or search is rendered measurable, predictable and actionable. Platforms deploy this knowledge to optimise attention, steer choices and extract value, creating a form of governance through data that shapes individuals’ subjectivities as much as it disciplines their actions.
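
What “governance through data” means in practice can be gestured at with a deliberately crude sketch. The content categories and click rates below are invented, and the loop is far simpler than any real recommender system, but it shows the basic circuit described above: every click is logged, measured and fed back into what gets shown next.

```python
# Toy sketch of engagement optimisation: an epsilon-greedy loop that
# steers exposure towards whatever is measurably clicked on.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical content categories and their (unknown to the system)
# probabilities of provoking a click.
click_prob = {"local news": 0.04, "hobby posts": 0.06, "outrage bait": 0.12}
items = list(click_prob)
clicks = {i: 0 for i in items}
shows = {i: 0 for i in items}

for step in range(50_000):
    if rng.random() < 0.1:                     # explore occasionally
        item = items[rng.integers(len(items))]
    else:                                      # exploit the best-measured item
        item = max(items, key=lambda i: clicks[i] / shows[i] if shows[i] else 0)
    shows[item] += 1
    clicks[item] += rng.random() < click_prob[item]   # simulated user behaviour

share = {i: shows[i] / sum(shows.values()) for i in items}
print(share)   # exposure concentrates on the highest-engagement category
```

The system never decides that one kind of content is valuable; it simply routes attention towards whatever is clicked on most, which is how steering can be made to feel like choice.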

This is where Foucault’s biopower meets the demands of digital capitalism. The internet, as a space of both freedom and surveillance, renders life itself a raw material to be mined. The proliferation of AI systems in everything from advertising to public services facilitates the management, segmentation and commodification of human behaviour. Here, people are not just users or citizens; they are data points, commodities and, ultimately, labourers whose activity generates profit but not recognition.

As a machine of extraction, the internet and AI expand what Mezzadra and Neilson would call the frontiers of capital. Value is no longer confined to traditional sites of production but is extracted from the very fabric of daily life. Social media posts, rideshare gigs, search histories – all become grist for the mill of digital capitalism. Sovereignty here is economic as much as it is political: the platforms act as both managers of circulation and enforcers of hierarchies, siphoning profit from unpaid labour while ensuring that flows of capital and data move smoothly across borders they themselves police.

Yet sovereignty is never absolute; it is constantly negotiated. Resistance, fragmented as it may be, continually challenges the boundaries the regime imposes, whether through direct action, state intervention or grassroots movements. The internet regime may appear to thrive on inequality, but its greatest weakness is precisely its asymmetry. By mapping the chokepoints of this regime – its monopolies, its algorithmic exclusions and its geopolitical vulnerabilities – one begins to understand where power is concentrated and where opportunities for transformation may lie. To illustrate this point, let us briefly consider a pivotal structure in the sovereign machine of algorithmic governmentality: the supply chain of the tech firm Nvidia.

Chokepoints and geopolitical vulnerabilities in Nvidia’s supply chain

Nvidia plays a critical role in global geopolitics due to its dominance in the semiconductor industry, most notably in the design and supply of the advanced graphics processing units (GPUs) that are essential for AI, gaming and automation. Yet the company’s highly complex supply chain exposes a global economic system riddled with inequalities, bottlenecks and escalating tensions. Indeed, semiconductor supply chains give us a fairly detailed picture of who pays the price for technological progress, where power is concentrated, and how this system produces structural vulnerabilities.

If we were to view Nvidia’s supply chain as a transnational urban map – a network of nodes linked by flows of goods, capital and labour – then the chokepoints would be the narrow streets where pressure builds and movements are constricted. Critically, these chokepoints are not accidental, but structural features of the system. As such, they serve specific sets of interests and exclude others.
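
The metaphor can be made concrete with a small network model. The graph below is deliberately simplified and partly hypothetical (it is not Nvidia’s actual supplier list), but it shows how chokepoints can be read off a supply chain once it is treated as a network, here using the networkx library.

```python
# Illustrative, simplified supply-chain graph: which nodes are chokepoints?
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("rare-earth mining", "wafer production"),
    ("silicon refining", "wafer production"),
    ("wafer production", "TSMC fabrication"),
    ("ASML lithography tools", "TSMC fabrication"),
    ("Nvidia chip design", "TSMC fabrication"),
    ("TSMC fabrication", "packaging & testing"),
    ("packaging & testing", "board assembly"),
    ("board assembly", "data centres / consumers"),
])

# Betweenness centrality: the share of shortest supply routes that must
# pass through each node. High values mark the "narrow streets".
centrality = nx.betweenness_centrality(G)
for node, score in sorted(centrality.items(), key=lambda kv: -kv[1])[:3]:
    print(f"{node}: {score:.2f}")

# Articulation points of the undirected network: nodes whose removal
# disconnects the chain outright.
print("single points of failure:", sorted(nx.articulation_points(G.to_undirected())))
```

In such a toy model the single fabrication node behaves exactly like a narrow street: most routes pass through it, and removing it severs the chain.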

What is most notable about Nvidia’s supply chain is the intense level of global interdependence it requires. From raw material extraction to design, manufacturing and assembly, Nvidia’s products move through a tremendously fragmented geopolitical landscape. Like other tech giants, the company has embraced just-in-time production and subcontracting – efficiency models that are celebrated by shareholders but which embed substantial levels of precariousness into the system itself, as became all too clear during the peak of the COVID-19 pandemic.

This fragility is no accident. It is a product of weaponised interdependence, which Beaumier and Cartwright (2023) define as the US’s ability to “coerce global production networks” by leveraging its control over design technologies (primarily via intellectual property protections), disrupting supply chains to strategically contain Chinese technological development. For Nvidia, this geopolitical chess game creates economic uncertainty, as markets like China simultaneously represent massive consumer demand and spaces of exclusion under US-imposed restrictions.

However, Nvidia’s best-known chokepoint is its main manufacturing partner, the Taiwan Semiconductor Manufacturing Company (TSMC), which produces over 90% of the world’s advanced logic chips. Nvidia’s reliance on this one node is a prime example of the asymmetrical risks embedded in global production networks. Taiwan’s position between the United States and China has transformed the island into a hotly contested territory where power struggles are waged through export controls, sanctions and nationalist industrial policies. As China intensifies its efforts to build a self-sufficient semiconductor industry, US export controls against Huawei and the CHIPS Act of 2022 reflect the declining Western superpower’s broader shift towards techno-nationalism. Indeed, both Trump and Biden sought to prioritise the security of the country’s tech industry during their presidencies, investing billions to “repatriate” semiconductor production.

Cracks in the regime

The internet has reshaped power relations across the political, economic and cultural domains. As infrastructures of extraction and control, the internet and artificial intelligence fuse the logic of capital with algorithmic governmentality, turning human life into raw material for profit. From algorithmic bias that deepens social inequalities to the monopolisation of AI by corporate entities and states, these technologies amplify existing hierarchies while further entrenching exclusions – digitally, economically and politically.

Yet these systems are not invincible. Their structural vulnerabilities are clearly visible in supply chain chokepoints, algorithmic failures, geopolitical tensions and labour conflict. The AI industry’s dependence on a single chip designer (Nvidia), and that firm’s dependence on a single manufacturer (TSMC), clearly illustrate this fragility. The power wielded by individual nodes in the semiconductor supply chain highlights how weaponised interdependence intensifies geopolitical contestation, making the internet and AI key battlegrounds in the struggle for technological and territorial sovereignty.

From production networks to platforms, from supply chains to surveillance systems, the internet regime operates unevenly, leaving fissures at multiple levels. Recognising these fissures – and understanding the spatial, infrastructural and political logics that underpin them – is crucial to identifying pathways for contestation and transformation. Reclaiming the internet and AI has less to do with dismantling the technology itself than with confronting the systems of power that shape it, and with building infrastructures rooted not in extraction and control, but in equity, autonomy and democratic governance.

All rights of this article reserved by the author
