It is one of the most famous photos in the world. Phan Thi Kim Phuc flees a napalm attack during the Vietnam War. It’s a horrifying symbol of the blind destruction of armed conflict.
If you saw that photo on Facebook, would you "like" it? Perhaps you might feel that "liking" such an image would be an improper response. But if you did not "like" it, Facebook’s algorithm – known as EdgeRank – would take note. It would make sure you did not see any more pictures like it in the future.
If that doesn’t seem right to you, don’t worry – because, in reality, Facebook might never have let you see it in the first place.
Earlier this month, the photo was removed from the site as it breached standards about nudity on the social network. As a result, the journalist who uploaded it was barred from using the site for three days.
Eventually, a Norwegian newspaper, Aftenposten, cried foul over the takedown of the picture, calling chairman and chief executive Mark Zuckerberg "the world’s most powerful editor".
Thousands of people responded to the takedown with an act of virtual civil disobedience by posting the image of Phuc on their Facebook pages and, in some cases, daring the company to act. Hours after the pushback, Facebook reinstated the photo across its site.
Now, remember the Ice Bucket Challenge? It was a great PR coup that raised $31.5m (£18.9m) for the ALS Association in 2014.
When it was blowing up on Facebook that summer, something else was happening in the US.
In Ferguson, Missouri, the fatal shooting of Michael Brown by white police officer Darren Wilson led to protests and riots. The unrest sparked a vigorous debate in the US about the relationship between police and African Americans. Twitter was full of it. But if you looked on Facebook, you would not have seen much news about Ferguson.
Facebook’s algorithm preferred the Ice Bucket Challenge over police protests because it wanted to show us things it thought we would "like" instead of what might actually be important.
Unlike, say, Twitter, Facebook is an environment in which users are structurally and architecturally encouraged to be positive and "like" things. And that means most people’s feeds are dominated by happy news.
The people at Facebook have discovered that happy news is what keeps us on the site. They have even run tests to see how long Facebook has to go down before users abandon it (the answer, they were surprised to learn, is that users never do). And the more time we spend on the site, the more ads Facebook can sell.
At the same time, Facebook is spooning more and more content into our news feeds. Its latest wheeze is to add a "Live" button to the top-left corner of the mobile site so users can watch more live streams.
Facebook isn’t deliberately manipulating the news to influence users or even events – but it could, as its own research on voter turnout in the 2010 US election proves.
And, let’s be frank, it has done this before. Facebook messes with our news feeds all the time. In August this year, it disabled ad-blockers on the site without users’ explicit permission. Before that, it de-emphasised news stories in news feeds in favour of posts from friends and family. As I write this, it is testing auto-playing all of its ad videos in Australia with the sound automatically turned on.
All these moves point in the same direction: Facebook is intent on making advertising dominate our online experience.
Zuckerberg’s defence is as simple as it is hollow. We are a platform, not a media company, he says. It’s not really up to us what people see on Facebook.
As writer Peter Kafka has pointed out: "When you gather people’s attention and sell that attention to advertisers, guess what? You’re a media company."
And, increasingly, Facebook appears to be a rather poor one – at least if you are a user. Changing rules, requesting ever-more personal data (under the guise of improving your experience of ads), serving ads that users can’t escape and showing only content that confirms users’ biases are not the work of a media company at the top of its game.
Does it matter? I think so. Our options are being shaped, manipulated and, yes, narrowed by clever people who just want us to carry on enjoying what we are already enjoying. That’s lame.
The dark side of the Force, according to Wookieepedia, "is a pathway to many abilities some consider to be unnatural".
Facebook just signed up.
Andy Pemberton is the director of Further
(This article first appeared on CampaignLive.co.uk)