Lawless: the secret rules that govern our digital lives
Full PDF version now available for free! Or: Cambridge Core, Booktopia, Book Depository, Amazon.
Digital intermediaries govern the internet. The telecommunications companies that provide the infrastructure, the standards organizations that design the protocols, the software companies that create the tools, the content hosts that store the data, the search engines that index that data, and the social media platforms that connect us all make decisions that shape how we communicate on a broad scale. They govern us, not in the way that nation states do, but through design choices that shape what is possible, through algorithms that sort what is visible, and through policies that control what is permitted. The choices these intermediaries make reflect our preferences but also those of advertisers, governments, lobby groups, and their own visions of right and wrong.
Technology companies now find themselves at the center of many different battles to control what people do and say online. They are the focal points of control of the internet, and governments and private organizations around the world are rapidly learning how to influence their rules and their code. Tech companies play a major role in governing our actions, but the power they have over us is wielded in a way that falls far short of the standards of legitimacy we have come to expect of governments. The way that we currently regulate internet intermediaries places them under no obligation to govern in a way that is accountable.
Internet intermediaries enjoy a broad discretion to create and enforce their rules in almost any way they see fit. They make decisions based on their own vision for how they want users to behave, their business plans, and commercial interests, as well as in response to their exposure to legal risk and potential bad publicity. They provide little in the way of due process, leaving their users to wonder how and why decisions affecting them were made and creating deep suspicions about hidden bias and overt discrimination.
This is what I mean when I say that intermediaries govern in a lawless way. The broad discretionary powers they exercise are the antithesis of legal means of making decisions. The role of law in democratic societies is to create a set of rules that reflect the public interest and the morals of the populace. Laws are made legitimate through democratic institutions that are supposed to work in the public interest and constitutional limitations that protect the rights of citizens. The hallmark of legitimacy in law is the rule of law: an underpinning principle that the rules of a society should be created and enforced in a way that is predictable and fair. The legislative system is designed to ensure that the rules themselves reflect the public interest and the will of the people, and the judicial system exists as a way to check that laws are validly made and fairly enforced. Legal systems are by no means perfect, but they create the infrastructure that allows for public oversight of the rules that we live by.
Change is coming
Technology companies govern, but they are losing popular support. Legitimacy, in governance terms, comes from the consent of the governed - a common acceptance that those who exert power over us have the right to govern us. For years, technology companies have been able to justify their prerogative to govern the same way other companies do - because we choose to use their services in a competitive marketplace. Until now, we might have said that tech companies’ right to govern comes from the contracts we enter into. The market provided legitimacy: we, as consumers, each choose to abide by their rules, no matter how poorly they are defined or how arbitrarily they are enforced. If these intermediaries are seen as just providing services to consumers, who are free to vote with their wallets, then their actions are almost certainly legitimate. But as the influence of technology companies on our lives becomes more clear, these companies need to do more to justify themselves and maintain their legitimacy.
Slowly, tech companies have been losing our collective consent. The tide of public opinion is now challenging the assumed right that technology companies have to govern our lives in the way that they do. The pressure on technology companies to be more accountable is growing steadily. This pressure has been building for years because technology companies have been making decisions that affect us all behind closed doors, without any real accountability. It increases with every shock and controversy that casts doubt on whether the industry has our best interests at heart or is doing as much as we would like to fight all manner of bad actors online. This pressure is fed by media industries that delight in attacking technology companies - particularly those parts of the mainstream media that have suffered the most in the shift to digital and blame big tech for their ongoing struggles. It’s stoked by governments that want to protect their citizens from the dangers of the internet (or at least look like they are) and by governments that want to better control their citizens through technology companies.
This pressure is not sustainable in the long term. No matter how benevolent and thoughtful tech executives appear to be, the lack of transparency and accountability will continue to breed allegations that they are uncaring, incompetent, biased, or even just downright evil. No matter how much technology companies protest, their central power as focal nodes on the internet makes them irresistible targets for people who want better control over users.
A new constitutionalism
The core argument of this book is that because online intermediaries play such a crucial role in regulating how users behave, we should find a way to ensure that their decisions are legitimately made. For this, we need what I call digital constitutionalism. Traditional constitutionalism focuses on power exercised by the state and is not well adapted to ensuring that the decisions of private actors are legitimately made. A more modern view of regulation can help us to understand that the type of power that intermediaries exercise over users is a type of governance power and that this power is subject to influence by a wide range of different actors. This recognition requires us to pay attention to the work that intermediaries do to govern the internet, as well as the different methods that state governments, the private sector, the media, and civil society use to influence the practices of intermediaries. Once we recognize how the internet is governed in practice, it becomes clearer that traditional ways of thinking about how the exercise of power is made legitimate are no longer adequate to protect people online.
Constitutionalism is the difference between lawlessness and a system of rules that are fairly, equally, and predictably applied.
There is no simple, single definition of what it means to govern legitimately; the concept depends fundamentally on context and is constantly changing. People who exercise power have legitimacy because we collectively give it to them. So whether social media platforms, search engines, content hosts, telecommunications companies, and other entities are acting legitimately when they shape our actions and our environment depends on how much we expect from them. This is still very much up for grabs; we are still in the early days of the commercial internet, and we do not yet have an easy answer or even common agreement on the exact shape of the limits people want to see imposed on the power of tech companies.
Working out what limits we, as a society, want to impose on the exercise of power in the digital age is the first challenge of digital constitutionalism. Constitutionalism is fundamentally about the limitation of governance power; digital constitutionalism requires us to think about not just national governments but also about how the power that platforms wield ought to be limited. It is important to emphasize that digital constitutionalism does not mean we would want to treat private intermediaries as if they were exactly like nation states. We hold governments to a higher standard of legitimacy because they control armies and police, can levy taxes and imprison us, and are responsible for maintaining and organizing our core social and physical infrastructures - from education, health care, and social security, to roads and public utilities. The high standard of legitimacy we hold governments to comes at a major bureaucratic cost. It would be disastrous to try to apply these standards directly to private platforms and telecom providers.
Human rights are probably the most powerful tool we have to encourage intermediaries to make their governance processes more legitimate. The language of human rights provides a universally agreed-upon set of values that governments and businesses should work to promote. These values - and the responsibilities that accompany them - provide a useful way of making explicit concerns over the constitution of our shared online social spaces. The voluntary component of human rights compliance is already helping to set standards for what intermediaries should do, and it provides a guide for civil society to work cooperatively to amplify the pressure for more legitimate processes. The frame of human rights can also guide governments to implement better laws, with binding legal obligations. Human rights do not enforce themselves, and they are not sufficient to hold technology companies accountable, but they do provide a common language that we can use to build consensus about what we expect from those who govern us.
The key next steps towards accountability in platform governance are both straightforward and very difficult. Platforms should immediately improve their transparency practices, focusing on how they can help people understand decisions that affect them and their systems as a whole. They should hire human rights lawyers and empower them to review and advise about improving technical features and business practices. Platforms will need to reach out more to others in working through some of the tough decisions they will have to make - they should cultivate stronger relationships with experts, civil society groups, and government regulators, and find new ways to encourage genuine participation from their user communities. The rules that platforms develop should be clearer and better justified, and they must start to experiment with new systems of independent review and appeals processes that adequately deal with inevitable mistakes.
The second challenge of digital constitutionalism is building enough consensus and enough social pressure to force technology companies to create their own constitutional limits. Rulers usually do not give up power voluntarily. We are at a constitutional moment now, where change might be possible but is by no means guaranteed. For all of us who care about how the internet is governed, now is the time to work together to hold power accountable. We need to make visible the influence that technology companies have on our lives and the influence that others have on them, in turn. We need to trace how governments and private interests regulate how we behave and communicate; what we can see and share; and how we live, love, and work through the technologies that we use. For those of us who are academic researchers, this means we need to devise new research methods that can help us understand complex regulatory systems, made up of human and technical components, at massive scales, over time, and across national borders and platform boundaries. For this we will need better data, and we should be working with technology companies and governments to ensure that good data is made available and accessible for ongoing research. We will also need new theory to understand how power can and should be held to account in a decentralized environment.
And then we will need to mobilize. We will need to seize this moment to marshal and coordinate pressure on technology companies to fundamentally change their cultures - to recognize that, as powerful governors of our social lives, they owe us real accountability. At the same time, we need to resist the efforts of governments around the world to introduce new restrictions that unjustifiably limit our freedoms or threaten the conditions for autonomy and innovation that make the internet so great.
All of this means that we need new collaborations. We do not yet have the institutions that are able to regularly and consistently hold power to account at scale. A digital constitutionalism requires not just change from platforms, but new structures that can monitor compliance, continue to exert pressure, and address wrongdoing. There is a role for courts and legislatures here, but there is also a need for new institutions that can more effectively marshal social pressure in day-to-day governance where the legal system is too cumbersome. These new institutions require some imagination - we will have to invent them. Academics, activists, and journalists will have to work together to engage tech companies, governments, and concerned users. As for concerned users, it’s easy to feel disempowered, but there is great power in collective action. For all of us, it’s time to participate in the emerging debates about how we want our shared social spaces to be governed, to make our concerns heard to governments and technology companies, and to lend our support to the activists and the civil society organizations fighting for our freedoms.
Achieving real change is not going to be easy, but what is at stake is the possibility of constructing an internet that is vibrant, diverse, and accountable. There’s a lot of work ahead of us, but there has never been a better opportunity to make serious change.