YouTube and ‘fake news’: The end of algorithm opacity?
There’s a predictable rhythm to the online aftermath of any mass shooting. After the initial outpouring of sympathy and grief, you can expect the media coverage to turn to the reactions of the alt-right, both out of morbid curiosity and in a genuine attempt to explain how conspiracy theories get disseminated online. As surely as The Onion publishes its savage takedown of the Republican response to mass shootings, you can expect the exposés of Reddit’s r/The_Donald subreddit to follow.
Things were no different in the immediate aftermath of the shooting at Marjory Stoneman Douglas High School on Valentine’s Day. Members of the alt-right immediately began espousing the idea that some of the teenagers caught up in the incident were so-called ‘crisis actors’, and that the entire event had been staged to promote a gun-control agenda. And, regular as clockwork, the newspapers began reporting on that too. What was different this time was that most of the coverage focused on the means by which the theories were spread rather than on the theories themselves, and commentators zeroed in on one particular vector for the misinformation: YouTube. The Register’s Kieren McCarthy said:
“Despite months of in-depth investigations into the distortion of social media platforms, and several formal hearings by lawmakers in both the US and UK, it appears that Google-owned YouTube remains unable to prevent the manipulation of its own systems.”
YouTube has, for some time, been the overlooked player among the platforms that spread online misinformation. While Facebook and Twitter are the names most frequently associated with the problem, scrutiny is increasingly falling on YouTube and its parent company Google over the recommendation algorithms that allow dangerous misinformation of this kind to propagate.
The algorithm I worked on at Google recommended Alex Jones’ videos more than 15,000,000,000 times, to some of the most vulnerable people in the nation.
— Guillaume Chaslot (@gchaslot) February 25, 2018
Since the shooting, people with knowledge of the algorithm have been arguing that, far from being an aberration, this is YouTube’s algorithm performing exactly as intended, and that YouTube’s business priorities have created the environment in which misinformation flourishes. AlgoTransparency, a project built by people with insider knowledge of YouTube’s recommendation engine, was created to show how the system does just that (a simplified sketch of the dynamic follows the tweets below).
1) YT conspiracy genre is thriving & grows w/every terror event, shooting, etc.
2) Youtube is propaganda’s “endgame” b/c it’s where content can be created & fully monetized
3) YT biz model algorithmically incentivizes scaled disinformation campaigns targeting victims of events
— J0nathan A1bright (@d1gi) February 25, 2018
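To make the critics’ argument concrete, here is a deliberately toy sketch of a recommender whose only objective is predicted engagement. To be clear, this is not YouTube’s actual system: the Video fields, the predicted_watch_minutes score and the recommend function are all invented for illustration. The point is simply that when nothing in the objective penalises falsehood, the most gripping content wins, true or not.

```python
# A toy illustration, not YouTube's real system: every name here is
# hypothetical. It shows that a ranker optimising predicted engagement
# alone has no reason to demote misinformation.
from dataclasses import dataclass
from typing import List

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # output of some engagement model
    is_misinformation: bool         # ground truth the ranker never sees

def recommend(candidates: List[Video], k: int = 3) -> List[Video]:
    # The score is engagement and nothing else; accuracy never enters it.
    return sorted(candidates,
                  key=lambda v: v.predicted_watch_minutes,
                  reverse=True)[:k]

candidates = [
    Video("Local news report on the shooting", 2.1, False),
    Video("Official police statement (full)", 3.4, False),
    Video("What THEY don't want you to know", 9.7, True),
    Video("'Crisis actors' EXPOSED?!", 8.2, True),
]

for video in recommend(candidates):
    print(video.title)
# The two conspiracy videos top the list, because they score highest on
# the only metric the ranker is told to maximise.
```

Real recommendation systems are vastly more complex, but this is the crux of AlgoTransparency’s case: a business model that optimises for watch time will, algorithmically, tend to reward whatever keeps people watching.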
This follows the recent public exploration of the dark side of YouTube content aimed at children, in which disturbing videos created for, and surfaced by, the algorithm led to a significant backlash. James Bridle, the author of the piece that started that backlash, wrote:
“Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatise, and abuse children, automatically and at scale, and it forces me to question my own beliefs about the internet, at every level.”
As a result of the negative publicity surrounding that controversy, and the fresh focus on its role in promoting false-flag conspiracy theories after mass shootings, it’s likely that YouTube will start making noises about fixing the system, as much for economic reasons as for moral ones. Big advertisers are increasingly savvy about the environments in which their ads appear, and are demanding ‘safe’ advertising space far more vociferously than they used to.
It’s too much to hope that YouTube will open its algorithm up to any real scrutiny, just as Facebook refused to when it first began grappling with its own misinformation crisis. We’ll probably still have to rely on outside efforts like AlgoTransparency to work out exactly how misinformation gets propagated. So, in the meantime, expect the cycle to repeat after the next event the alt-right seizes on.
Martin Tripp Associates is a London-based executive search consultancy. While we are best-known for our work in the TMT (technology, media, and telecoms) space, we have also worked with some of the world’s biggest brands on challenging senior positions. Feel free to contact us to discuss any of the issues raised in this blog.
Image via Christer van der Meeren from Flickr.