Digital Secretary Oliver Dowden's Times Red Box Op-ed
The Digital Secretary reacts to Twitter's suspension of Trump's account, and outlines how the UK Government plans to protect free speech while regulating social media.

If the last decade has been defined by anything, it’s the power of social media. Its opening saw the hope of the Arab Spring, whilst its closing witnessed last week’s disgraceful scenes at the US Capitol.
Both were the product of social media’s unprecedented ability to spread ideas and bring people together, for both good and bad. Put simply, we have a new printing press - but it’s an invention whose implications society and governments are just beginning to grapple with.
With so many of us now consuming our news and information through social media, a small number of companies wield vast power in shaping how we see the world. To an outsider, it doesn’t always seem this power is wielded transparently or consistently.
Iran’s Ayatollah has a Twitter account, whilst the elected President of the United States is permanently suspended from holding one. Trump’s supporters have labelled that move censorship; the other half of the country has asked what took so long.
Norway’s Prime Minister has had posts defending freedom of expression deleted on Facebook because they contained the iconic “Napalm Girl” photo - an unintentional violation of the site’s child nudity policy - whilst in Myanmar, the same platform has been used to whip up hatred towards Rohingya Muslims.
Those facts alone should make anyone who loves democracy pause for thought. The idea that free speech can be switched off with the click of a button in California is unsettling even for the people with their hands on the mouse.
Just this week, Twitter’s CEO, Jack Dorsey, said that while he felt it was right for his platform to ban Trump, leaving platforms to take these decisions “fragments” the public conversation and sets a dangerous precedent.
So as we enter a new era in our relationship with tech, who should decide its rules?
We need to be able to define what social media is and isn’t. Given it is now so crucial a part of public discourse, should we compare it to a utility? Or should we see social media companies as publishers, akin to newspapers - and therefore liable for everything they publish?
In reality, neither the passive “platform” nor the editorialised “publisher” truly hits the mark. Holding companies liable for every piece of content - for 500 hours a minute of uploads on YouTube alone - would break social media.
But equally, when these companies are curating, editorialising, and in some cases removing users, they can no longer claim to be bystanders with no responsibility whatsoever.
However we categorise social media, one thing is clear: as with other forms of mass communication, democratically elected governments must play a role in regulating it.
In the UK, we are leading the world by starting to deal with this dilemma. I have been clear that we are entering a new age of accountability for tech.
At the end of last year, we outlined plans for a groundbreaking new rulebook for social media companies: one that would make sites like Facebook and Twitter responsible for dealing with harmful content on their platforms, while also holding them answerable for their wider role and impact on democratic debate and free speech.
We can no longer outsource difficult decisions. There’s now a burning need for democratic societies to find ways to impose consistency, transparency and fairness in the online sphere.
And any regulation needs to be flexible enough to adapt as social media evolves. There’s always another platform somewhere else on the horizon. No-one in the UK had used TikTok before 2018, but now 17 million of us do: almost double the total newspaper circulation in 2019.
It also means navigating some complex philosophical disputes. How do you resolve the inherent tension, for example, between protecting people from dangerous misinformation in a global pandemic and protecting their right to express an opinion?
There are no easy answers. But we are setting out the parameters.
The first is that we need to do everything we can to protect our most vulnerable citizens, and particularly children, from harm. Our upcoming Online Safety Bill makes that its number one priority.
The second is the protection of free speech. As Lord Justice Sedley put it in 1999, free speech has to include “not only the inoffensive but the irritating, the contentious, the eccentric, the heretical, the unwelcome and the provocative.” Without the latter, the promise of free speech is an empty one.
So, under our legislation, social media giants will have to enforce their terms and conditions consistently and transparently. This will prevent them from arbitrarily banning any user for expressing an offensive or controversial viewpoint.
If users feel like they’ve been treated unfairly, they’ll be able to seek redress from the company. Right now, that process is slow, opaque and inconsistent.
And it’s absolutely vital that internet regulations can’t be used as a tool to silence an opponent or muzzle the free media. So news publishers’ content on their own sites will be exempt.
The decisions governments around the world take will shape democracies for decades to come. As the UK takes up the G7 Presidency this year, we want to work with our democratic allies to forge a coherent response.
We are just taking the first steps in this process. But decisions affecting democracy should be made democratically - by governments accountable to parliament, not executives accountable to shareholders.