Several days ago, Bloomberg reported on the pleadings of several YouTube employees who had been trying for years to get the media giant to take more direct action against violent, obscene, or extremist videos. Yet allegedly, the company decided to prioritize views over these concerns and actively discouraged employees from finding or even acknowledging the presence of these videos on its platform.
The early days of YouTube were not like this. Micah Schafer, a policy writer who joined the company in 2006 – before it was bought by Google – recalls its rapid response to problematic videos, such as those that encouraged anorexia. The team would quickly put age restrictions on videos, stop including them in recommendations, or simply take them down entirely. Yet not long afterwards, this hands-on approach was abandoned.
It would seem that the buyout by Google changed things. Although YouTube was never profitable (and still may not be), stemming the loss of cash became a priority. Once users expect a service to be free, it is very difficult to change that assumption without alienating and losing them. Since users wouldn't accept having to pay, YouTube turned to advertising revenue as the next best strategy, deciding in 2012 to maximize viewership in order to run more ads.
This approach changed how videos were recommended and autoplayed so as to show as many ads as possible. Unfortunately, it also elevated extremist and disturbing content. According to over twenty YouTube employees, proposals to reduce or eliminate the rise of toxic content were ignored in favor of promoting higher viewership. Five employees eventually left because of YouTube's unwillingness to tackle – or even largely acknowledge – its growing problem.
Where are we today? YouTube is slowly starting to change its policies to restrict or limit access to toxic content, such as demonetizing anti-vaccination videos or adding informational boxes from Wikipedia next to conspiracy videos. While some may describe this response as better late than never, others will call it too little, too late. Undoubtedly, the presence of these videos has already influenced the thinking – and perhaps the voting choices – of many individuals. While YouTube's efforts appear to be a step in the right direction, one can only hope that those negatively affected will benefit from these positive changes, and that good ideas will eventually win out over bad. Time will tell.