With the change of government in Barcelona, a discourse has been gaining ground that places the emphasis on technological sovereignty and talks about data commons, digital rights and free software. Even so, when we talk about citizens’ capacity to decide, we cannot overlook the collective dimension of our online rights. For this reason it is essential to link this sovereignty to terms such as equity, justice and redress.
For months now it has been common to hear discourses that relate technology and sovereignty, especially in Barcelona. From a council team that arrived in government with the promise, and the responsibility, of rethinking the city’s relationship with technology after paradigmatic cases of opacity, corruption and citizen rejection, the new principles aim to shape an alternative to the model of smart cities imposed top-down, with little or no consideration for social return, the common good, or the social and ethical impact of technology.
This definition of alternatives and new paradigms is urgently needed. The discourse around technology all too often seesaws between extremes, from a certain reinvented, technophile playfulness to the most absurd and irrational techno-solutionism, which uncritically equates data with oil or assumes that more information is always synonymous with better decisions. The only nuance is that while the right tends to trust in the market and deregulation, hoping that, as if by magic, technology will improve the economy, make us more efficient and conceal inequalities, the left keeps its distance from the market without actually committing to public initiative or the capacity to imagine different technological futures, and shares with the right a childish hope that technology will bring us closer to better societies. Abracadabra.
Thus, left, right and centre, technological discourse is often filled with chants to Big Data, the Internet of Things, “smart” devices, algorithms and Silicon Valley, as evidenced by Britain’s Labour leader when he referred to the “fourth industrial revolution” in these terms. And when someone calls for a thought about the social impacts of these processes, the risks to privacy, or the need to rethink rights, values and guarantees in this new data scenario, techno-solutionism drops in a qualifying adjective such as “ethical” or “responsible” and moves on.
In this context, the role that the city of Barcelona wants to play is important. Can we rethink technology from the angle of the common good? If so, what are the terms and contents of the new proposals? For the Deputy Mayor of Barcelona, “in a democratic city, technology should serve to digitally empower citizens, to protect their privacy from abuses by the public and private powers, to fight against corruption and to advance towards a more equitable and sustainable economy. That has a name: conquering technological, digital sovereignty, for the common good.” The city council’s documents, for their part, distinguish between technological sovereignty (breaking the dependence on proprietary software and encouraging public leadership), data sovereignty (safeguarding citizens’ privacy) and transparency (citizen audits). The model proposed for Barcelona, therefore, focuses on public leadership and value, privacy and transparency.
Being a pioneer in shaping a home-grown definition of the role that technology must play in urban environments and processes, while attending to the understandable balances and legacies of existing systems, represents a remarkable effort. However, there are other principles that other cities and initiatives are already working on and that remain outside the current discourse. Key questions linked to equity and justice, responsibility, redistribution and redress (yes, machines and algorithms make mistakes, and their victims fall into bottomless pits of legal and administrative incomprehension) have not yet found their place in the city’s new digital plans.
The boundaries of data sovereignty
Opting for technological sovereignty as the catalyst of a new paradigm is stimulating, but also risky and potentially limiting. Sovereignty is a complex and often controversial term that, in its Rousseauian tradition, refers to a republican power emerging from the people and under their control. In the context of geopolitical disputes over the control of transnational communication, sovereignty has in recent years been understood as the construction of governance – at different governmental and political-territorial levels – with a greater capacity to coordinate and regulate digital exchanges, increasing collective security and competitiveness alike. In the case of Barcelona it is presented as synonymous with the “capacity to decide”: the possibility of creating governance frameworks and technological solutions that do not abuse citizens’ data, that respect citizens and their capacity to know what is happening with their information, that work to tackle real and not only commercial problems, and that are based on open, auditable and customisable code. A kind of popular sovereignty of data, distanced from other notions linked to sovereignty, such as protectionism or the (absurd) attempt to tie technological infrastructures to territorial boundaries.
Furthermore, if the choice of the term sovereignty to structure a new technological discourse and practice rests on the idea of decision-making control and capacity (following the analogy of popular sovereignty), then the elements linked to the common good, precisely, fade into the background. Sovereignty allows the expression of individual opinions that, once aggregated, determine political futures. In the world of data, and in the individual relationship between citizens and the devices that capture their information, this decision-making capacity overlooks the collective dimension of the rights at play, such as privacy, and opens the door to terribly harmful data relations.
Can a citizen individually decide to share his or her data when it contains the data of other individuals? If a person freely and sovereignly decides to install an application on their mobile phone that captures data, what becomes of the decision-making capacity of the people in their contacts list, whose data is immediately transferred to a third party? Managing the collective aspect of the social impact of technology escapes the notion of sovereignty.
There are other key aspects that must form part of any new technological discourse, such as attention to non-discrimination and the digital divide – terms linked to equity, justice and redress. The data society pivots around algorithms that classify, and take decisions small and large about, the information they receive and the people from whom they receive it. These algorithms often reproduce discriminatory (sexist, racist) dynamics, as in the cases of voice recognition systems that fail to identify women’s voices, police algorithms that recommend concentrating police activity in black neighbourhoods, or Google’s job advertising system, which shows the best-paid jobs only to men. Is the capacity to decide, or free software, sufficient to tackle these issues?
When deciding to provide broadband to disadvantaged neighbourhoods, for example, who takes into account the impact of this policy on existing inequalities? Who ensures that these populations are not made more vulnerable by exposure to providers that abuse the information they share via this new infrastructure? How can the social and ethical impact be assessed, and better practices be built, before establishing these relationships with citizens?
Towards ethical data management
For some years now, different actors have talked about the need to give form to a new social contract on technology use and personal data. For these voices, the abuse of data to create new forms of manipulation, exploitation and control (public and private) threatens to erode fundamental values such as trust and democracy itself. Laying the foundations of this new framework agreement requires mobilisation in three fundamental areas: local government, the private sector and citizens.
In recent years, academia and significant parts of civil society have gradually recovered and underlined the rights and values that are suffering under the asphyxiating boot of techno-solutionism. As mentioned previously, key concepts such as equity, justice, transparency, privacy, responsibility, redistribution, redress, and public and citizen leadership and value emerge as elements to be safeguarded in data processes.
How can this be done? Firstly, local government has to be capable of giving form to the future while tackling the chaos of the past. For decades, complex organisations have incorporated systems without planning or control, building technological add-ons onto outdated processes and wasting the opportunity to rethink them from scratch. To tackle this chaos, local government needs to equip itself with a data architecture, data governance systems and specific roles for supervising and safeguarding best practice. Very few organisations today could stand up to a data audit, and with the entry into force of the new European data protection regulation and citizens’ demands for responsibility and transparency, this scenario is becoming more unsustainable by the day.
To shape the future, administrations have to rethink the way in which technology is budgeted, tendered and acquired. Committing to innovative and transparent processes, to free software and to contracts that clearly establish which uses may and may not be made of the data generated is essential. Avoiding algorithmic discrimination, cyber-attacks and investment without returns starts with the drafting of tender terms and clauses and cannot end until the final execution of the projects. The administration can also provide valuable data for companies, but any public “data commons” must be based on rigorous processes of data curation, anonymisation checks and responsible management by third parties.
The private sector has its own incentives to avoid being left behind in the race to incorporate ethical and responsible practices. Cities, meanwhile, are experimenting with different ways of promoting more diverse and innovative ecosystems around technology. On one side are those that uncritically accept the techno-optimist discourses and genuinely believe that data is the new oil. On the other are those that back civic technologies, responsible ecosystems and the rigorous study of the impact of technological policies, such as New York.
Finally, citizens must have the tools to affirm and defend their rights both online and offline, based on clear and applicable regulatory frameworks that allow abuses and discrimination to be identified and mechanisms of redress to be activated. Being able to use technology responsibly and to expect lawful behaviour from it, however, cannot depend on users’ capacity to understand and defend themselves. If catching a plane does not require knowledge of aeronautical engineering, handing data to an authority, company or service should not require knowledge of the legal framework. This passive protection is what public bodies have to encourage and guarantee.
An action plan on these three levels (back-end governance by local authorities, new forms of procurement and control of technology acquisition, and the protection of the individual and collective rights of citizens) should cover all that we know about the potential and risks of the data society in the short, medium and long term, and allow us to awaken from our current sleepwalk towards socially undesirable futures.
The construction of a new discourse around technology, therefore, has to incorporate elements of citizen control and empowerment, and so-called technological sovereignty may be one more piece in the puzzle of challenges to be tackled. Cities, labour relations, transparency, mobility, rights and a long list of other phenomena are being shaken up by the new possibilities of technology and data. Giving form to a new social contract for data that can establish and re-establish notions such as justice and redistribution in these new socio-technical realities requires control and decision-making capacity, yes, but also legality, ethics, acceptability, non-discrimination and a firm commitment to civic technologies capable of incorporating, in concrete ways, this concern for their impact and for the dynamics they reproduce.
So a big welcome to technological sovereignty, as another pillar on which to gradually construct and consolidate a new technological model that is ethical, responsible and civic.
 See: Reding, Viviane (2015). Digital Sovereignty: Europe at a Crossroads. EIB Institute; and Cattaruzza, A., Danet, D., Taillat, S. and Laudrain, A. (2016). “Sovereignty in cyberspace: Balkanization or democratization”. 2016 International Conference on Cyber Conflict (CyCon U.S.), Washington, DC, pp. 1-9.
 This is not the place to tackle the issue in depth, but the comparison between data and oil does not hold up. Oil stimulates economies at diverse scales because a small quantity of oil is enough to derive value from it on a small scale (a haulier does not need an oil well; with just a few litres he can generate business and value). Data, in contrast, structures a market that tends towards monopoly, and in which owning a small quantity of data generates no value. The data market is a winner-takes-all market, generating decreasing value for the actors who contribute to this process of accumulation.