Facebook: the real reason behind its privacy fixation

When it comes to making impressive and imposing personal statements, Mark Zuckerberg is in a class of his own. Back in 2010, the Facebook founder encouraged us all to share our lives with the world. After all, privacy was no longer a social norm. In this post I explore what went wrong.

Facebook and privacy

Zuckerberg has dramatically changed his mind about Facebook and privacy, as he elaborates in his 3,000-word manifesto. Privacy, he now says:

gives people the freedom to be themselves and connect more naturally, which is why we build social networks.

With this in mind, he plans to transform Facebook from an open conversation platform into a privacy-focused communications platform. Users will be able to message each other across Facebook and its subsidiary platforms, WhatsApp and Instagram, and all their communications will be encrypted end to end. The result, says Zuckerberg, will make Facebook feel less like a town square and more like a living room.

But I believe we shouldn’t fall for Zuckerberg’s line. This so-called change has little to do with privacy. He’s simply responding to changes in consumer behaviour: messaging apps such as WhatsApp are attracting ever more traffic, while Facebook has lost 15 million US users since 2017. Zuckerberg knows that his company will need to adapt to thrive in the long term. And until he works out how to make a profit from the living room version of Facebook, he’s not going to abandon the town square version.

The town square version is still the business that makes all the money, through targeted advertising. Despite all the bad press Facebook has attracted lately, it generated $16.9bn in revenue in the last quarter of 2018.

The true motive

Facebook’s true motive may be to bind all its platforms so tightly that neither users nor antitrust regulators can untangle them. Introducing more end-to-end encryption would also sweep some of Facebook’s biggest PR problems under the rug. Awkward issues such as fake news and hate speech will be harder to detect, and harder to hold Facebook accountable for. The new model could be a boon for terrorists, child abusers and cyberbullies too.

Zuckerberg says Facebook is working on its ability to identify bad actors by detecting patterns of activity or through other means. But it’s hard to have any confidence in his assurances. He and his company, after all, have shown an almost complete inability to recognise or respond to the dangers inherent in their existing open products. Why would they be any more conscientious when it comes to policing a closed model?

What do you think? Why not leave a comment below?

Photo by NeONBRAND on Unsplash
