In fake news, there's no such thing as ghosts
Despite the distorted narrative created by social platforms and indulged by key media figures, misinformation is not led by robots, foreign spies, or algorithmic aliens
David Kirkpatrick’s book The Facebook Effect attributes to Mark Zuckerberg a saying that became an instant classic. When trying to explain the News Feed mechanism to his colleagues, Facebook’s CEO said: “A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.”
You can think what you want about the phrase, it’s a free country (most of them anyway). And this was 2006, a time when social media’s greatest dilemma was the excess of kitten videos over hard news.
Transported 15 years into the future, the squirrel comparison would read quite differently, and it would yield a much more disturbing imagine-this exercise. So, imagine this:
It’s 2021, Facebook users still don’t give a damn about “people dying in Africa” — that part hasn’t changed — and they remain intrigued by squirrels dying in their front yards. Everyone’s News Feed is full of stories about the rodents — some of them are true, most are false. It turns out that the squirrels are actually being murdered by a militia group that believes these cute animals belong to a cabal that controls relevant events in the world. Moreover, there are hundreds of Facebook groups on the subject: some eagerly discuss the squirrels’ role in these events, while others urge patriots to hunt these little creatures. The tension all over the country creates an untenable and dangerous situation. Zuckerberg issues a statement reiterating his “commitment to working with federal officials” and saying that Facebook is “improving its ability to detect misinformation on the platform.” The same press release also points out that they “began to clear out several fake accounts” and have “removed 22 million pieces of hate speech” against squirrels…
How we (society) got to this eerily familiar scenario is easy to explain: targeted ad business model meets shadowy intentions. How they (social media companies) got away with it requires a bit more fanciful interpretation: bad robots, Russian villains, and angry living algorithms are to blame.
In order to dismantle the erroneous notion that misinformation on social platforms has a life of its own, it is vital to establish a clear (and elementary) distinction between algorithms, automation, and data trafficking. They’re not the same thing, they’re not always together and, more importantly, they’re not Skynet. Behind every byte of every fake news piece, no matter how tiny it is, there is always a real person. Every single faking time.
Over the last few years, much has been said, written and filmed about the direct correlation between social media platforms and the infamous fake news vortex, which includes hate speech, misinformation, propaganda, promotion of violent extremism, simulation of virtual crowds and an extensive list of “etc.” But if any of these lectures, articles, and movies came to the conclusion that the solution is “delete Facebook,” I’d say forget it: social media is not going anywhere.
Actually, the most recent Global Statshot Report, from October, shows that social media users passed the 4 billion mark by the end of 2020 “and an average of nearly 2 million new users are joining them every day.” We are now 4.14 billion social media consumers and counting. Deleting ________ (write the name of your Screen Time champ here) is a fantasy.
OK, but how did this futuristic 007-Russian-robots gimmick emerge in the first place? And how can the real fake news architects affect 4.14 billion people (and counting)?
The whole Russian saga is too long to explain here, but in 100 words or less it’s something like this:
- Some good old investigative journalism revealed that a British consulting firm called Cambridge Analytica (CA) had harvested data from 87 million Facebook profiles (without users’ knowledge) via third-party “harmless” apps on the platform. This data was compiled by CA and then used to produce Facebook ads in order to provoke tailor-made reactions regarding sensitive issues. This strategy was instrumental in influencing several election results worldwide, including Trump’s election and the Brexit referendum, both in 2016. Although it was widely known that Trump’s campaign had tested and posted numerous ads itself on Facebook, it turned out that there were also some mysterious interactions between the Trump staff and Russian private companies and intelligence officials. In addition, Facebook said it flagged suspicious Russian accounts posting ads during this election period. Hence the Russian association. (I’m pretty sure I’ve failed the 100-words challenge.)
Cornered by the U.S. Congress, Zuckerberg was determined to find a scapegoat for the crisis and its aftermath, so he immediately fell in love with the Russian villains narrative: “There will always be bad actors in the world. And we can’t prevent all governments from interference,” Zuckerberg insisted, in a 2017 video.
Naturally, this was second-class baloney. A month later, Facebook’s COO Sheryl Sandberg foiled her boss’ fiction by stating the obvious: “A lot of them (the Russian-linked ads), if they were run by legitimate people, we would let them run (anyway).” And in January 2018, Samidh Chakrabarti, Head of Civic Engagement at Facebook, admitted that the issue had always been much more serious: “If there’s one fundamental truth about social media’s impact on democracy it’s that it amplifies human intent — both good and bad. (…) At its worst, it allows people to spread misinformation and corrode democracy.”
In this same post, Chakrabarti also mentioned “Russian entities,” but at some point he gave up: “Foreign interference isn’t the only means of corrupting a democracy. The same tools that give people more voice can sometimes be used, by anyone, to spread hoaxes and misinformation.” No way, Samidh. Never would have imagined that.
The Cambridge Analytica case combined the core elements of fake news dissemination (algorithms, automation, and data harvesting), which helped in the creation of that sci-fi script with Russian aliens and self-conscious robots. Thankfully, the whole scheme was exposed due to good journalistic work and the initiative of a few whistleblowers. But the truth is, CA wasn’t the first, it wasn’t the only and it won’t be the last “bad actor” out there. The data broker market, in conjunction with the targeted ad business model from social media sites, can sell anything to anyone — from t-shirts and shoes to fear and insurgency. In this system, there’s no need for secret agents: one person is enough to promote any coordinated chaos he wants — anywhere, anytime. The depth of the rabbit hole is equivalent to the depth of the pocket of whomever is ordering the service.
Giuliano Da Empoli, author of Les ingénieurs du chaos (2019), describes how a similar game plan upended Italian politics over the last decade. “The Movimento 5 Stelle (M5S) was entirely founded on harvesting voter data about their demands regardless of any ideological base. It is as if, instead of being hired, a Big Data company such as Cambridge Analytica had seized power directly and chose its own candidate.” And he concluded his example with a disturbing observation: “Their algorithm forces people to support any cause — no matter if it’s reasonable or absurd, realistic or intergalactic — as long as it meets voters’ ambitions and fears (especially fears).”
Another book that helps to understand how targeted ads and data harvesting intertwine is Brittany Kaiser’s Targeted (2019). I won’t go into the merit of the book — whether good or bad — (it is bad), but when Kaiser is not making excuses, which takes half of the 400 pages, she’s actually providing very good intel. Grab some snacks and make yourself (un)comfortable:
- “The (Cambridge Analytica) database was prodigious and unprecedented in depth and breadth, and was growing ever bigger by the day. We had come about it by buying and licensing all the personal information held on every American citizen.”
- “We matched this data to their political information and then matched all that again to their Facebook data (what topics they had ‘liked’). From Facebook alone, we had some 570 individual data points on users, and so, combining all this gave us some 5,000 data points on every single American over the age of eighteen — some 240 million people.”
- “When people signed on to play games such as Candy Crush on Facebook, and clicked ‘yes’ to the terms of service (…), they were opting in to give their data and the data of all their friends for free (…)”
Perspective is really a funny thing. Today, all of this sounds as if the devil himself had emerged from the depths of hell and seized our digital souls (and, in a way, it kinda did). But four, five years ago, nobody even cared.
Here’s Alexander Nix, Kaiser’s former boss and CEO of Cambridge Analytica, during the US presidential primaries in 2016: “The degree of granularity that can be achieved when you have the right data and the tactical operation to put it into action is incredible. Using CA’s creative guidance and voter targeting, the campaign created Facebook ads (…) to communicate with the right voters in the right way.” Semantics is another really funny thing.
One of the most curious outcomes from this data scandal and its subsequent investigations is that, in the end, nobody has actually been held accountable. Recently, Nix got a 7-year ban “from acting as a director, in the promotion, formation or management of a company,” Cambridge Analytica filed for insolvency proceedings and shut down in 2018, and Facebook escaped with a $5 billion fine and the promise that it wouldn’t do it again (not a joke). Is five billion dollars a lot of money? Sure, but it is also what Facebook makes in 26 days.
More importantly, the end of Cambridge Analytica doesn’t mean the end of the cataclysm. The $200 billion data broker industry keeps growing every year, companies specializing in targeted ads were a smash hit during the 2020 U.S. election (they had no problem advertising it), and Facebook and its peers have never been so popular.
Make no mistake about the final “Trump bump” that social media companies tried to pull off when they banned the former U.S. president less than two weeks before he left The White House. While we have been debating over misinformation being spread on a massive scale in the last few years, they have been avidly working on something much more important (to them, naturally): their engagement machine. And here’s where the much-publicized algorithms come in.
Among other things, algorithms are responsible for serving those YouTube recommendations, referring new Facebook groups or showing Instagram “suggested posts” on your News Feed. Over 60% of the people who joined extremist groups on Facebook did so because of its recommendation system, for example. The same goes for YouTube: more than 70% of the 1 billion hours watched daily are also driven by its recommendation system. And I hate to break it to you, but algorithms are not conscious robots that look a lot like that Mad Men actor.
All of these gears — data, algorithms, recommendations — are not casually interconnected: they were built to act like this, to generate more ads and therefore more money. Here’s how Brittany Kaiser’s book describes the process: “(Any) company could use its own data and select from those data sets the kind of people it wanted to reach and then pay Facebook to onboard those lists and do a look-alike search. Facebook would then find ten thousand (or a hundred thousand, or even a million) look-alikes. The company would then send its advertising over the Facebook platform directly to those look-alikes.”
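Stripped of the marketing gloss, the mechanics Kaiser describes are mundane: take a seed list of users, represent each user as a vector of traits, and find the people in a larger pool who sit closest to that seed. The toy sketch below illustrates the idea with cosine similarity — all data, feature names, and the function itself are invented for illustration; Facebook’s actual look-alike system is proprietary and far more elaborate:

```python
import numpy as np

def lookalike_search(seed_vectors, pool_vectors, k):
    """Return indices of the k pool users most similar (by cosine
    similarity) to the centroid of the advertiser's seed list."""
    centroid = seed_vectors.mean(axis=0)
    # Cosine similarity of every pool user against the seed centroid
    norms = np.linalg.norm(pool_vectors, axis=1) * np.linalg.norm(centroid)
    sims = pool_vectors @ centroid / np.where(norms == 0, 1, norms)
    # Highest similarity first
    return np.argsort(sims)[::-1][:k]

# Toy data: 3 traits per user (imagine page likes mapped to numbers)
seed = np.array([[1.0, 0.9, 0.1],      # advertiser's customer list
                 [0.9, 1.0, 0.0]])
pool = np.array([[0.95, 0.9, 0.05],    # very similar -> gets the ad
                 [0.1, 0.0, 1.0],      # dissimilar   -> ignored
                 [0.8, 1.0, 0.1]])     # similar      -> gets the ad
print(lookalike_search(seed, pool, k=2))  # -> [0 2]
```

The point of the sketch is how little machinery is needed: with 5,000 data points per person, as Kaiser claims CA had, the same arithmetic simply runs in a 5,000-dimensional space.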
Back in 2018, Facebook wanted to embrace the Russian spies tale so badly that it hired a former CIA officer (again, not a joke). Yael Eisenstat, who had also worked as a national security adviser to then-Vice President Joe Biden, endured five months as Head of Elections Integrity Operations at Facebook. She quit when she realized the obvious: “They were never going to make the fundamental changes that address the key systemic issues that make Facebook ripe for manipulation, viral misinformation and other ways that the platform can be used to affect democracy.”
And it will get worse, because the fake news industry is already packing to move its apparatus to another location: the encrypted dungeons of the unified “messengers,” which — hopefully — you have already read about here.
During the release of the Netflix doc The Social Dilemma, journalist Adi Robertson smartly pointed out that algorithmic recommendation engines are far from being the only problem in this farce. “The film briefly mentions that Facebook-owned WhatsApp has spread misinformation that inspired grotesque lynchings in India. (…) It’s a highly private, encrypted messaging service with no algorithmic interference, and it’s still fertile ground for false narratives.”
Unlike me, however, Robertson did like the movie. My greatest issue with it is that The Social Dilemma upholds the myth that fake news magically comes out of nowhere, and when there is someone behind it, it’s an enigmatic squirrel-killing robot.
But I also believe that the documentary’s many problems do not nullify its message — especially because The Social Dilemma bears the merit of presenting some solid interviews. In one of them, Justin Rosenstein, a former Big Tech engineer, establishes an extremely shrewd comparison that not even the movie itself seems to have absorbed dutifully. He said: “We live in a world in which a tree is worth more dead than alive, in a world in which a whale is worth more dead than alive. For so long as our economy works in that way, and corporations go unregulated, they’re going to continue to destroy trees, to kill whales (…) Now we’re the tree, we’re the whale. We are more profitable to a corporation if we’re spending time staring at an ad than living our life.”
And that’s it. The wry plot twist, not in the documentary but in real life, is that in this current world, packed with fake villains and ghost stories, now we’re the tree, now we’re the whale. Essentially, now we’re the squirrel.