Last weekend, Facebook founder and chief executive Mark Zuckerberg outlined the form of regulation he believes is needed to control social media.
Campaign interprets key passages from Zuckerberg’s manifesto, which was published first by The Washington Post and on his Facebook page on 30 March.
'I believe we need a more active role for governments and regulators.'
This is an acknowledgement that governments and regulators have finally realised Facebook needs to be regulated. This is despite Facebook, along with fellow tech giants Google and Amazon, spending tens of millions of dollars on lobbying the US government last year.
One might consider this very article by Zuckerberg as a piece of lobbying itself.
UK politicians have taken a lead in calling for major tech platforms to be regulated and the chancellor has asked the Competition & Markets Authority to investigate whether Facebook and Google have too much power in the digital media market.
'By updating the rules for the internet, we can preserve what’s best about it – the freedom for people to express themselves and for entrepreneurs to build new things – while also protecting society from broader harms.'
The word advertiser is never mentioned in Zuckerberg’s piece (except when referring to political ads), but advertising, on which Facebook almost entirely depends for survival, looms large behind this statement.
Facebook talks about "expression" in a newspaper article, but when it talks to advertisers it uses "engagement" to say the same thing. Facebook depends on people logging in, posting content, sharing content outside the platform and creating data about themselves.
The problem is, Facebook is incentivised to make us outraged as a sure-fire way to keep us "engaged". In order to keep users captive on its platform, the content that yields the strongest emotional responses is shared the most, and this is inevitably the most controversial.
'From what I’ve learned, I believe we need new regulation in four areas: harmful content, election integrity, privacy and data portability.'
These are the areas in which Facebook believes it has already made progress in lessening harm or increasing accountability. It is harder (at least politically, in free-market economies such as the US) to impose new restrictions on areas where a company has been trying to self-regulate.
'…we have a responsibility to keep people safe on our services. That means deciding what counts as terrorist propaganda, hate speech and more. We continually review our policies with experts, but at our scale we’ll always make mistakes and decisions that people disagree with.'
From this week, Facebook is going to start cracking down on white nationalist hate speech, having spent three months consulting with academic experts. One has to wonder with whom Facebook consulted about this for the 15 years it has been in existence.
However, Zuckerberg is muddying the waters here. The problem with Facebook is not really about where the line is between criminal behaviour and freedom of expression – something that democratic societies have grappled with ever since mass media was invented. The problem is that Facebook refuses to take on the editorial responsibility that comes with being a publisher. If lawmakers forced Facebook to take on these responsibilities, it would impose serious costs on its business model.
As well as using machine learning to screen for harmful content, Facebook hired 30,000 human moderators worldwide. Whether this is sufficient for a platform with more than two billion users is doubtful.
'Lawmakers often tell me we have too much power over speech, and frankly I agree. I’ve come to believe that we shouldn’t make so many important decisions about speech on our own.'
Facebook is willing to sub-contract its editorial responsibilities to regulators, but is it willing to pay national governments for policing its platform?
'Facebook already publishes transparency reports on how effectively we’re removing harmful content. I believe every major Internet service should do this quarterly, because it’s just as important as financial reporting.'
Facebook marks its own homework and wants other tech giants to do the same in order to give that practice more legitimacy.
'People around the world have called for comprehensive privacy regulation in line with the European Union’s General Data Protection Regulation, and I agree. I believe it would be good for the internet if more countries adopted regulation such as GDPR as a common framework.'
Facebook lobbied hard against GDPR before it was enacted by the EU last year. Now that it has become reality, it may as well say how good it is, and God forbid data protection laws in any other country should get even tighter.
'Finally, regulation should guarantee the principle of data portability. If you share data with one service, you should be able to move it to another. This gives people choice and enables developers to innovate and compete.'
Key to avoiding increased competition/antitrust regulation will be showing how Facebook users can take their data elsewhere. Facebook has a track record of buying out potential competitors before they’ve had a chance to grow (Instagram and WhatsApp).
To this end, it has already joined with Microsoft and Google for the Data Transfer Project, which makes it easy for people to move their data from one place to another. Again, Facebook is setting the terms of trade for being regulated, thereby making it more complicated for lawmakers to do it from scratch.
(This article first appeared on CampaignAsia.com)