A digital mundialization is underway, advancing rapidly and irreversibly, creating an interdependent reality from the village level to the global scale, with all the ups and downs this entails. Far from being a peripheral change, this state of interdependence, which both constitutes and transcends the digital dimension, is causing a major shift in the sociopolitical architecture.
From common pool resources to global commons
Considering the above panorama, and before exploring the main lines of a framework for action for a citizen internet, it is worth pausing a moment to consider the notions of common goods and global commons. Neither is new, particularly the first. In the case of digital goods, the scientific literature is relatively extensive, especially in the US, birthplace of the internet. It has also favoured a techno-centric approach, to the detriment of the other cultural, economic and political factors that come into play in the definition of the internet. Consequently, academic research has tended to compartmentalize international relations specialists and internet experts, whether the latter come from communication sciences or from IT. The present situation makes it necessary to break down these barriers between conceptual worlds and bring these perspectives together.
The internet is basically a network of information networks that allows the exchange of information between computers via a shared protocol: the TCP/IP protocol. It is also a complex system, inasmuch as it constitutes in itself a web of interconnected sociotechnical subsystems, where local, regional and global models of infrastructures, uses and content are juxtaposed. Consequently, the resources mobilized in the digital sphere are by definition combined and plural. There are mixed resources (assignation of domains, exchange points); public resources (energy, digital services); shared resources (protocols, standards, norms, servers, open code, content); and private resources (transoceanic fibres, data centres, proprietary code, content). Regarding the nature of these resources, commons specialists do not narrow the digital environment down to strictly one common good. For them, it is rather a common pool resource, a hybrid compound of shared resources. In this context, the notion of common good, applied to the internet, refers to a regulatory perspective or aim. This very current debate also occupies the field of telecommunications and other areas such as collective security and ecosystem services. As we shall see later, the characterization of resources, goals and regulatory models needs to be expanded. All the same, the explosion of electronic communication has driven the notion of universal commons, and it has done so beyond the scope of the internet. This idea has intensified in recent decades and has in some way become formalized with mundialization, along with the paradigm of common goods. Until quite recently, the architecture of world governance based on the United Nations system and the movement for common goods coexisted in two worlds that barely communicated with each other.
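To make the technical starting point concrete: the paragraph above describes machines that share nothing but the TCP/IP protocol suite and can nevertheless exchange information. The following minimal sketch, in Python and using only the standard library, opens a TCP connection to the illustrative host example.com (a demonstration name, not one taken from the text) and speaks HTTP, one of the many application protocols layered on top of TCP/IP.

```python
# Minimal sketch: two machines that share nothing but the TCP/IP
# protocol suite can still exchange data. We connect to the
# illustrative host example.com and speak HTTP over TCP.
import socket

HOST = "example.com"  # illustrative host, not from the original text
PORT = 80             # standard HTTP port

# create_connection resolves the name and establishes the TCP session:
# IP routes the packets; TCP provides ordered, reliable delivery.
with socket.create_connection((HOST, PORT), timeout=5) as sock:
    request = f"GET / HTTP/1.1\r\nHost: {HOST}\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode("ascii"))

    # Read the response until the server closes the connection.
    chunks = []
    while chunk := sock.recv(4096):
        chunks.append(chunk)

print(b"".join(chunks)[:200])  # first bytes of the server's reply
```

Everything beneath this small exchange, from the transoceanic fibres to the exchange points and the domain system, corresponds to the mixed, public, shared and private resource layers enumerated above.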
It is essential to remember that common goods have a philosophical and political genealogy. Classical political philosophy is predicated on the hypothesis of a social contract that supersedes the state of nature to which human beings are initially subject. This vision can be found in western philosophers who have strongly influenced political thought: Aristotle, Hobbes, Locke and Rousseau, to name just a few. The state of nature, where goods are common, is a sphere characterized by the absence of government, in which private property does not yet exist. When the notion of property emerges, the state of nature rapidly turns into a state of war, and individuals protect themselves by drawing up a social contract as a first step towards a social and political construction. Then governments appear, whose main reason for existing is to guarantee the goods that have been converted into individual property. Later, laws and institutions appear. The commercial interests of different peoples lead to a continuous retreat of the frontier of common goods, which are subject to the constant assault of predatory individuals, companies or states. In the long term, this phenomenon leads to what specialists call the capture of common goods: that is, the process through which shared resources come under private or semi-private control.
Native American philosophy, especially Andean philosophy, arrived at the notion of the commons via a different paradigm, through the ideas of we-ness and communalization. Various South American thinkers, such as Russel, Mejía and Quintanilla, stress that different indigenous world views developed a relational view of coexistence, based on principles of reciprocity, complementarity and sharing. In fact, indigenous peoples created forms of common management of certain goods, especially cognitive and natural goods, inconceivable outside of their belonging to a broader community. Thus there were mingas, tambos and the Qhapaq Ñan as infrastructure guaranteeing the continuity of the communities. Property was defined above all in relation to the collective. The researcher Sofia Chacaltan Cortez sums up the tambos as follows:
“The tambos were small to medium-sized buildings systematically built fifteen to twenty kilometres apart along the main roads of the Qhapaq Ñan, which unified ideologically and spatially the Tawantinsuyo territory. The tambos functioned under the system of reciprocity and redistribution characteristic of the Inca (and pre-Hispanic) economy. They were sustained and administered by imperial officials immersed in a bureaucratic hierarchical Inca system.”
The Nobel laureate in economics Elinor Ostrom found evidence of similar elements in African and Asian philosophies. In short, the commons movement seems to be as old as the first need to administer a shared resource. It is probably no coincidence that these philosophical roots were vividly expressed in the Manifesto for the Recovery of the Common Goods of Humanity, presented in 2009 at the World Social Forum in Belém, Brazil. The manifesto states in its preamble:
“The privatization and commercialization of elements vital to humanity and the planet are stronger than ever. After the exploitation of natural resources and human labour, the process has accelerated and extended to knowledge, cultures, health, education, communications, genetic assets, living beings and their modifications. The welfare of everyone and the preservation of the Earth are sacrificed for the immediate financial gain of the few.”
As early as the nineteenth century, the philosophers Proudhon and Frantz predicted precisely these problems. Both drew on observations of the creation of the unified Italian and German states. They understood the fundamental nature of these questions, as well as some of their ramifications for the ruling powers. They reached the conclusion that the main mission of governments is to generate economic growth, and that the all-powerful modern state is not naturally inclined to promote common goods. Their analysis is very similar to that of the commoners of the current century. In 1968, the biologist Garrett Hardin, and a little earlier Mancur Olson (1965), opened up an epistemological breach with their interpretation of the tragedy of the commons. Hardin worked from the principle of the state of nature to bury the notion of the collective management of common goods. Contrary to classical liberal theory, which since Adam Smith had seen in the selfishness of individual action the main driving force of the liberal economy, Hardin saw in it the root of all ills, leading to the tragedy of the commons. Yet despite these initial differences with Adam Smith and the liberal economists, Hardin defended private property as a solution to the problem of common goods, thereby legitimizing the neoliberal economy as the main guarantee of the common goods of the State. It is this vision of the commons that has become established in most academic circles.
Later, in the 1990s, Elinor Ostrom took the opposite approach in Governing the Commons. She used specific examples to demonstrate that the tragedy of the commons is not inevitable. Building on her pioneering study, other researchers confirmed her observations and showed the breadth of the phenomenon on a global scale. In the notable collection The Wealth of the Commons: A World Beyond Market and State (2012), Silke Helfrich and David Bollier made an important contribution to the subject, presenting many successful cases from all over the planet. David Bollier sees a commons as essentially the combination of a resource, a community and a set of social rules. The important thing is not only to determine what is common, but rather to establish a community that can administer a given resource, and to see whether that community is capable of drawing up appropriate norms, rules, institutions and sanctions. From the moment the commons go beyond territorial management, the question arises of their polycentric governance, that is, regulation through multiple, overlapping centres of regulation. One of the important characteristics of the commons is that they are generally rooted in the terrain, with the primacy of the practical dimension: having a theory of common goods, or even a doctrine of governance, has never been a priority.
Recently, geopolitical debates have introduced the notion of the global commons. The US geostrategist Zbigniew Brzezinski calls them strategic commons or strategic global commons. In more general terms, global commons are commons whose use and administration extend far beyond the scope of a single country, requiring the participation of multiple parties. Under this term, they are defined as ungoverned spaces that directly or indirectly affect the security of states, of people and sometimes of the whole planet. For commons specialists, this definition of global commons is incorrect. As we saw above with common goods, these goods are closer to the notion of shared resources as described by Elinor Ostrom, or common pool resources. Historically, the sea was the first strategic common. For a long time, the seas and oceans were subject to the laws of Realpolitik and power relations: the most powerful fleets controlled the maritime space, allowing the strongest nation to control maritime communications. In this way, England was able to pursue its expansionist policy at the expense of the Netherlands, its great trade rival. In time, international law gradually developed to provide a framework codifying navigation and the use of sea resources.
Today, airspace and cyberspace, as well as outer space with the role of satellites, occupy a central place in geostrategic questions. Max Weber wrote that states traditionally claim the monopoly of legitimate violence; we might add that they also claim the monopoly on strategic activities, a sphere in which even transnational corporations remain behind the most powerful countries. Zbigniew Brzezinski writes that “the strategic commons will probably be the area most affected by the change of paradigm of global power, in its relationship with the progressive growth of capacities and the activism of emerging powers like China and India, and the potential decline of the United States. The sea and the air, space and cyberspace, which are at the centre of the national interest of every country, are essentially dominated today by the United States. However, in the coming years a growing number of players will become involved and they will be the object of greater competition as the strength and ambitions of other countries grow.” It is therefore, a priori, states that will be the main contenders in an increasingly intense geostrategic competition over the strategic commons. Given that this domain knows no physical boundaries, nor clear limits between the public space and the strategic space, and that its regulatory regime is generally thin, preventing governments from encroaching on the public space and on civil and individual liberties will not be easy. In practice, the security policies pursued by China and by the United States, allied with other industrial countries, confirm this perspective. In this scenario, the question of a new regulation of the global commons becomes a central issue. Public awareness capable of pushing beyond the states' security-driven logic should be a key factor in the future.
A governance model in a growth crisis, in search of itself
The architecture of governance holding up the internet is thus a central issue and, as recent years have shown, one increasingly in dispute. One of the specificities of the internet, unlike other communication technologies, is that during its first twenty years it was administered under a horizontal model, founded on scientific cooperation between peers. This model subsequently evolved towards the creation of institutional organs along more hierarchical lines, but it always remained within a model irreducible to traditional multilateral logic. In fact, it would have been impossible to build a pioneering internet of this kind if an interstate framework had been imposed from the beginning. This is a young, original, unfinished architecture, whose principles prove, in theory, better suited to the characteristics of the global commons under analysis here.
Generally speaking, the governance model of the digital sphere resembles a polycentric model, layered in the image of the common pool resources that make it up. Its geometry juxtaposes various institutional arrangements around the critical functions to be regulated. One of these central questions has to do with the standards and domains of the internet; this is what guarantees its uniqueness. Around this function are grouped diverse institutionalized groups and organs, involving civilian, scientific, business and institutional actors. The other generally formalized areas concern: accesses and interconnections; online security; intermediaries of data and information; and intellectual property. Each of these areas supports various regulatory mechanisms that tie together national and regional actors, multilateral agencies or coordinated groups, and private and public international law. There is no single multilateral agency specializing in the digital question, nor a single legal organ with binding power over these issues. The internet governance model therefore resembles above all a transversal, multi-sector geometry. Given the reach of electronic communication, its regulation crosses horizontally many other levels of regulation, from local to international, whether in the social, cultural, economic or political sphere. It intersects, for example, with the policies of the World Trade Organization, the G20, the International Telecommunication Union (ITU), intelligence agencies, trade and intellectual property treaties, as well as national laws within states. All this configures a plural, loose architecture, whose functioning must be evaluated more by its capacity to assign responsibilities and coordinate relations than by its ability to demarcate frontiers and competences. Therein lies an important regulatory innovation. It is a model that must fundamentally handle the relations between scales, actors and thematic questions, combining diverse modalities of action (multilateral dialogue, sovereign decision-making, coproduction of norms, multi-sector participation, subsidiarity of civil and commercial law, etc.). This complex geometry is new and disturbing, in terms of both political practice and theory. Various theoretical currents have lent their weight to this field: regime theory in international relations, hegemonic stability theory, realism (dominant in global geopolitics today), the theory of the commons, and so on. However, none of these took primacy in the building of the current architecture of internet governance.
If we look more specifically at the critical area of the administration of standards and domains, the US association ICANN took responsibility from 1998 for the assignation and administration of domains, after the first cycle in which this function was self-administered among scientific peers, as indicated above. The United Nations attempted to internationalize this organ at the multilateral information society summits in Geneva (2003) and Tunis (2005). These summits made no progress on the priorities, means, and types of associations and instruments of deliberation needed to further regulation in the digital sphere. Their meagre results led the United Nations to create a new informal forum, the Internet Governance Forum (IGF), with the task of continuing the debate for several years. The security crisis in the wake of Edward Snowden's revelations in 2013 brought the issue back into debate. Owing to this profound crisis, more and more voices rose up in Europe and in emerging countries to demand the internationalization of the critical function of domain assignation. Further mobilizations in 2014 and 2015 broadened the scope of the demand. To date, however, US hegemony appears to remain unmoved in this matter.
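At the technical level, the critical function disputed above amounts to maintaining a single, coherent mapping from unique domain names to network addresses. As a purely illustrative sketch in Python (example.com is a demonstration name, not one discussed in the text), the following snippet shows the everyday operation that depends on that unique root: resolving a name through the DNS hierarchy.

```python
# Minimal sketch of the technical function behind the governance debate:
# resolving a unique domain name to an IP address through the DNS
# hierarchy, whose single root zone is the object of the disputes
# described above. The domain below is illustrative.
import socket

domain = "example.com"

# gethostbyname queries the resolver configured on this machine, which
# ultimately depends on the globally unique root of the domain system.
address = socket.gethostbyname(domain)
print(f"{domain} resolves to {address}")
```

However decentralized the resolvers themselves may be, every such lookup presupposes the uniqueness of the root, which is why the assignation of domains is the most contested single point of internet governance.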
This fracture line, perhaps the most visible at present in the regulatory organizations, is useful for understanding that internet governance consists fundamentally of conflictive processes of deliberation and jurisprudence in which rivalries and struggles between interests are played out. This conflictiveness goes hand in hand with the growing density of the internet. It is especially acute when there is no clear framework of arbitration, sanction and anticipation, as there may be, for example, in the Intergovernmental Panel on Climate Change (IPCC) or in the recent incorporation of environmental questions into the competence of the International Criminal Court. In the field of international justice, it is most of the time cases of rupture, and even international scandals, that generate new jurisprudence and drive the evolution of the legal framework. Analysing the model of internet governance, the researcher Françoise Massit-Follea stresses that its diffuse logic has led to the creation of power structures and coalitions of influence operating within the instances of participation. One of the conclusions of the report by the Global Commission on Internet Governance (2016) bears precisely on these two dimensions:
“to anticipate and approach new challenges arising from technological change and innovation; to improve coordination among actors and their activities in the realm of governance.”
However, beyond the critical (but not unique) function of standards and domain assignation, what broader assessment can we draw from this architecture of regulation after twenty years of experimentation? In structural terms, we can highlight a trend towards what one could call an inversion between the means and the ends of regulation: a phenomenon in which the technical dimension takes priority and imposes its logic on the political dimension of the administration of digital resources. This trend means that burning questions, such as the security crisis of the Snowden era, the mass violation of the right to privacy and the hypermonopolization of resources, are not raised as new questions to incorporate into regulation. Secondly, the modality of multi-sector governance tends to be presented as the single model for administering the diverse aspects of the digital world. In practice, this model leaves a net balance of many ambiguities in terms of the decisions and roles assumed by each of the actors involved. The technical approach could be seen once again in the preparatory document of the NetMundial organizers in 2014, where the word “multistakeholder” was used almost fifty times, while the word “democracy” was never mentioned. At that meeting, twenty-five civil associations declared that the debates “had not really helped to change the status quo in terms of the protection of fundamental rights, or to balance the powers and influence of the actors involved.”
According to the researcher Françoise Massit-Follea, at the level of concrete cooperation processes, various instances created with the intention of fostering the general interest end up moving decisions away from participation and construct, essentially, a pretence of collaboration. In the case of the Internet Governance Forum, after ten annual meetings, sometimes accompanied by regional and national forums, the unresolved issues are accumulating. Some analysts see in this scenario the challenge of moving from a governance focused on the technical infrastructure of the internet to a global governance of a digital common.
Beyond the goals assigned to digital regulation, we have yet to find a way to implement the broader approach to internet governance arising from the multilateral summits of 2003 and 2005. At that time, it was suggested that the management of technical resources and questions of public policy should adopt a multi-party, multipolar approach: that is, institutionalize greater international cooperation and raise new questions politically. However, while power relations between states, private operators and civil society intensify, only a technical consensus remains in place, with very uneven levels of depth. One of the consequences is an ideological polarization between, on one side, the supporters of a free and open internet; on another, the advocates of a governance based on territorial sovereignty; and finally the promoters of the current model, who defend its economic performance. Some conceive digital technology as a field of international relations orchestrated by states, considering digital sovereignty to be one of its avatars; they reject the idea that globalization and the internet weaken the regulatory power of states in economic terms. Others maintain that digital technology radically transforms the nature of the international system, as the internet has extended to all sectors of society. This dividing line takes us back in some way to the tensions that have historically developed between the state and the economy, which led to the four macro models: the Soviet model; the Hamiltonian and Chinese model; the social democratic model; and the liberal Reaganite/Thatcherite model. In all these experiences, civil society has, in the best of cases, been a passenger shaken by government decisions and economic avalanches.
As a predictable consequence of the above, the architecture of governance is marked by a growing privatization and instrumentalization of the spaces of negotiation in favour of particular interests. This trend can only remind us of the phenomenon of the capture of the commons. In the legal field, the jurist Olivier Itenu stresses how the hegemony of US law is consolidated through persistent lobbying in the different regulatory bodies. In the IT sector, private and non-governmental actors have always played an important, legitimate role, laying fibre optics, multiplying exchange points between networks and feeding the definition of technical standards. However, their current influence over digital services and their central role in mediating content make them top-level regulatory actors.
The actions of private operators directly influence policies on privacy, the control of financial flows, censorship and the monitoring of copyright. Indeed, the reports published by monopolistic companies like Google and Facebook show that governments are submitting more and more requests to these digital industries.
Furthermore, the interception policies implemented by the industrial powers have shown that cooperation with private operators is essential. It should also be remembered that the NetMundial summit of 2014 was jointly sponsored by the organizers of the World Economic Forum in Davos, another indication of the connivance between institutions and corporate actors, a selective connivance typical of private-club diplomacy. This form of diplomacy is increasingly being called into question, as well as revealing itself to be less influential on the global agenda. In this context, the private sector today stands at the crossroads of these tensions. It will play a crucial role in the way internet governance develops in the near future.