The extensive reporting about Facebook over the past week by The Associated Press and a consortium of news organizations has been impressive in its quantity and depth.
But what it told the world is something that was already either established by other reporting or assumed: namely, that Facebook is driven by profit, not social responsibility, and that while it may profess a desire to be both profitable and responsible, the bottom line usually wins out when the two clash.
This most recent peek behind the curtain of the global social media giant was facilitated largely by a trove of internal company documents provided by Frances Haugen, a former Facebook employee who became a whistleblower. She turned on her former employer after becoming disenchanted with its lack of effort to reduce the flow of misinformation, slander and viciousness that Facebook not only enables but from which it richly profits.
Facebook, as the documents affirm, has written its computer programs to put a priority on what it calls “engagement.” The more interest that a post creates, the more it gets spread to others on the platform. The more a post is spread, the more engagement it receives, creating a vicious cycle, all of which produces the eyeballs that Facebook then sells to advertisers.
As has been true since the beginning of human communication, the less truthful and more derogatory a statement, the more likely it is to be passed on. As the maxim says, “a lie can travel halfway around the world while the truth is still putting on its shoes.” The problem is that the internet has exponentially increased the speed at which lies and calumny travel, and Facebook is the dominant disseminator of misinformation and incendiary speech.
Its efforts to rein in the bad actors have been mostly ineffectual. Why that is so is a matter of speculation.
If you want to give Facebook the benefit of the doubt, it’s that the task is impossible. With an estimated 3 billion users worldwide constantly posting news, comments and all manner of personal trivia, there is no way that Facebook can reliably separate the true from the false, or the dangerous from the innocuous.
If you have a darker suspicion of Facebook, the explanation would be that the cesspool within the social media world is exactly what the company’s business model intended. It has written algorithms that are designed to encourage the outrageous and offensive, and its efforts to moderate what it has spawned are half-hearted at best.
Haugen offers one antidote: to pressure Facebook to change its algorithms to rank posts — and thus their dissemination — based on truthfulness instead of how many likes, dislikes, comments and shares a post receives.
The company has, in fact, tested such a change. It took 6,000 of its users and made truthfulness the priority in ranking the vaccine-related posts fed to them. The results were significant: the test group saw a 12% decrease in debunked claims and an 8% increase in authoritative ones.
Such a change, though, would be voluntary and rely on Facebook’s own social conscience, which has already been proven to be lacking.
A better solution is to make Facebook responsible, as newspapers, broadcasters and other traditional media companies are, for everything it publishes. Lifting the 25-year-old legal shield, Section 230 of the Communications Decency Act, that currently protects it from defamation lawsuits would force Facebook to either hire enough human moderators, rather than relying on computer programs, to restore truth and civility to its platform, or go out of business.
The only permanent way to clean up Facebook is to make rumor-mongering and slander unprofitable. Anything less won’t do enough to protect society from its own worst instincts, and from Facebook’s.
- The Greenwood Commonwealth