How to Cure The Amoral Social Video Platform Algorithms

Mary Shelley's Frankenstein. Public domain image.

I'll never forget the conversation I had with a YouTube engineer several years ago. It was like the scene at the beginning of a sci-fi movie where the unwitting scientist describes his invention, unaware of the chaos he might be unleashing.  He enthusiastically explained to me how YouTube's algorithm would be designed to surface videos with lots of views that are similar to videos a viewer had previously watched.

I remember at the time thinking how hard it would be for our business (high-cost animation) to compete on YouTube because we would have so little control over how our content was presented, branded and promoted.  But my problems were parochial compared to the larger impact YouTube's decision would have.

YouTube's engineers made endless small, incremental tweaks, using A/B testing to find the right formula to maximize watch-time and surface videos similar to the ones you'd already watched.  There's no denying the empirical results: the world watches over 1 billion hours of YouTube per day.

More watch-time on popular videos and more of what you've already shown an interest in - what's not to like?  Like an amoral robot run amok, the algorithms started producing unwanted results - not just for YouTube but for all the social platforms - giving rise to the popularity of extreme, offensive and controversial videos.  The platforms have become the playground for shock-jock vloggers, conspiracy theorists and Russian meddlers.

The “we-just-give-people-what-they-like” hands-off ethos is a distinctly Silicon Valley modus operandi.  Google can still maintain that it's simply "organizing the world's information," and Facebook can still claim it is just "connecting people." To be fair, the social platforms are about a lot more than controversial content, and they're all trying to address the problems their algorithms have inadvertently created.  But there is no denying there is a problem - for advertisers and, more importantly, for our culture.

In early 2017, advertisers started figuring out that their ads were running alongside questionable videos. Like Harrison Ford tracking down rogue replicants, YouTube responded by unleashing its own counter-algorithm and clumsily "demonetizing" (removing ads from) thousands of videos in what is now known as the YouTube Adpocalypse.  This caused a 70% drop in ad revenue for many content providers on YouTube.  It also inadvertently demonetized a lot of uncontroversial videos.  The demonetization algorithm was as blind as the algorithm that favored the controversial videos in the first place.

More recently, YouTube announced that it is hiring 10,000 editors to search out and remove non-ad-friendly videos, limiting the editors to 4 hours per session because 8 hours would be too traumatizing.  The job sounds less physically demanding than driving an Uber, but the time limit gives you an idea of how distressing some of the content on YouTube can be.  Maybe these editors can sort out the extreme content from the content that is simply meant for adult audiences, but my hunch is that 10,000 editors will err on the side of heavy-handedness.

Facebook's solution?  Bury the offending videos.  It has adjusted its feed to focus less on third-party published content and more on posts from friends and family.  And Facebook is implementing a new three-pronged attack - targeting spammers, targeting bots, and adding tools to flag fake news - to address some of its problems.


I personally have an aversion to censorship.  Full disclosure - I make cartoons for 18-34's, and by their nature they push boundaries, so I have a dog in this fight.  The problem is that content intended for adults often gets conflated with gratuitous violence, hate speech and conspiracy theories.  Is YouTube really going to become so sanitized that a twelve-year-old can safely watch it?  We've seen this movie before.  Up until the late sixties, the ad-supported networks censored themselves so heavily that they created the headroom for cable and HBO.  CBS aired The Beverly Hillbillies while the nation burned.  Maybe the current dynamic will create opportunities for the emergence of new 18+ social platforms?


Advertisers have the financial clout to change a company’s behavior.  They have been successful at censoring all kinds of political voices in television - from Bill Maher post-9/11 to the recent Laura Ingraham ad boycott at FOX News.  Interestingly, Maher recently came to Ingraham's defense.  He understands that advertisers can be reactionary and that they have their own private commercial interests.  It was advertisers that pressured YouTube into the Adpocalypse, not the offending videos themselves.


I think the government would botch any solution to these problems, and so does Sen. Mark Warner (D-VA), who at this year's SXSW said, “We’re going to need their [the social platforms'] cooperation because if not, and you simply leave this to Washington, we’ll probably mess it up.”


Asking the platforms, advertisers or the government to play censor cop might be missing the point.  A new video platform for adult audiences could be part of the solution.  It would be ironic if YouTube became the family-friendly place.  The platforms have grown accustomed to their multi-billion-dollar ad deals, and competition might be what it takes to change their behavior.

Perhaps the social video platforms need to rely less on advertising and more on subscriptions.  It's working for the music platforms.  Or micro-transactions - would you be willing to pay a penny per video?  The growing adoption of frictionless virtual currencies could become a new means of content monetization.

What if the social platforms adjust their algorithms to favor videos that are higher quality, less base, and true?  Is an algorithm smart enough to make that call?  Do the social platforms need more human editorial scrutiny?  Is this level of human intervention economically feasible?  Stay tuned.

Favoring less controversial videos might decrease watch-time but increase the quality of the content, and therefore the value of the social platforms' ad inventory.  But I'm not sure that's what people are looking for when they visit YouTube.  It was partly YouTube's wild-west appeal that was its attraction from day one.  Whatever combination of solutions emerges, I think it's fair to say that the technology got out ahead of the social platforms' and society's ability to cope with the problems it inadvertently created.

John Evershed is the founder and former CEO of Mondo Media and serves on several digital media company boards.