Last week, YouTube launched a crackdown on white supremacists and purveyors of hoaxes. It took down thousands of videos and channels that featured Holocaust denial and promoted Nazi ideologies.
But instead of winning praise, the rollout of the new hate speech policy managed to offend a wide array of would-be supporters: some of the advocates who had lobbied YouTube to change its practices protested that their own video clips had been wrongly caught up in the sweep. Among the videos YouTube removed were clips of Hitler's speeches and videos of historical and educational value that explained the origins and dangers of white supremacist ideas.
YouTube wields enormous power as the gatekeeper of 5 billion hours of video uploaded daily. Its role is part social media service, part real-time broadcaster, and part archive - meaning censorship on YouTube is especially likely to raise difficult questions about erasing history.
Far more than Facebook or Twitter, YouTube's vast video library has made it a first destination for countless students to research their term papers. Academics and journalists use the archival footage uploaded onto the site to analyse the past.
Because the service has a "quasi-educational role," said Adam Neufeld, vice president at the Anti-Defamation League, it is all the more important that the company be vigilant about not pushing misinformation.
But unlike a traditional library, YouTube's algorithms are designed to recommend related content and reward "watch time," a formula that has too often led unwitting users down a rabbit hole of conspiracies and hateful ideas. Instead of solving that problem, the company's new policies appeared to take down the good along with the bad in one sweep.
The company has "a big problem with blanket or ham-handed applications of rules," said Heidi Beirich, director of the Intelligence Project at the Southern Poverty Law Center (SPLC), an anti-hate advocacy group. One of the group's videos had been removed in the purge.
By Thursday, YouTube had reinstated some of the videos, including the SPLC's clip, and had even put its own warning labels on some educational content.
But the company also emphasised that the onus was on users to provide context when uploading sensitive content, or their videos would be taken down. YouTube, which until recently took an anything-goes approach to user-generated content, now argues that the public may not be able to readily discern the difference between the promotion of a hateful ideology and the act of teaching about it.
YouTube's new policy prohibits videos in which a user asserts superiority over a vulnerable group, such as women, veterans, gay people, people of colour, and victims of violent crime. The policy also bans videos alleging that a well-documented violent event, such as the Holocaust or the Sandy Hook school shooting, did not take place. Previously, YouTube banned only videos in which users directly called for violence against a protected group.
The company uses a combination of human monitoring and software in its takedown efforts, and says that every video removed in the sweep was subject to human review.
"We aren't quite where we want to be," said Sundar Pichai, the chief executive of YouTube parent Google, in a Sunday interview with Axios on HBO, describing YouTube's efforts to remove hate speech,. "YouTube is the scale of the entire internet. But I think we are making a lot of progress."
"It's a hard computer science problem," he added. "It's also a hard societal problem because we need better frameworks around what is hate speech, what's not, and how do we as a company make those decisions at scale and get it right without making mistakes."
YouTube said that with a service so large - more than 1.8 billion people log in each month - there were bound to be mistakes, and that it was looking at ways to make content with academic and research value available to researchers in the future. "Policy updates are always complicated, especially at the beginning as teams get up to speed," said YouTube spokesman Farshad Shadloo. "Our policies apply to all creators equally."
Educational videos swept up in YouTube's takedown included clips of Hitler's speeches uploaded by teachers who focus on World War II. A channel run by Cal State San Bernardino's Center for the Study of Hate and Extremism also disappeared from the platform but was reinstated after inquiries from the Los Angeles Times. (YouTube confirmed the reinstatement.)
Another removed video came from the channel of the SPLC, which has lobbied Google for years to take a more aggressive stance against white supremacy. The video featured a journalist interviewing prominent British Holocaust denier David Irving.
"The video was likely flagged as Holocaust denial propaganda, but what it is is an exploration of those views and why they are problematic," said Beirich, who appealed the takedown. When the video was reinstated several days later, it had a warning label which said, "The following content has been identified by the YouTube community as inappropriate or offensive to some audiences."
The video service is also following in the footsteps of Google's search engine, which has changed its algorithm for certain terms and now curates content it deems authoritative alongside some search results.
In 2015, for example, the SPLC complained to Google that the top results of searches for the term "Martin Luther King" yielded hate sites and disguised white supremacist sites. Such sites contributed to the radicalisation of Dylann Roof, who was convicted of fatally shooting nine African Americans at a South Carolina church and who described his internet-inspired conversion to white supremacy in a manifesto.
At the time, Beirich said, Google employees told her that no changes could be made for those search terms, but shortly afterwards, the results began to change. Google declined to comment on the meeting.
Danielle Citron, a professor at the University of Maryland Carey School of Law who focuses on censorship and free expression, said that educational efforts can be critical tools for countering hateful ideologies, particularly in an age of algorithms. Because advocacy videos use many of the same terms as hate videos, they have a good shot at ending up in the same feeds, potentially reaching the users most vulnerable to radicalisation.
The argument for keeping up some kinds of demeaning or derogatory speech is that "if you take it down, you lose chances to combat hate as well - you lose opportunities to try to persuade," she said.
But ADL's Neufeld pointed out that such attempts to talk people out of radicalisation often fall flat. Researchers have found that many people who seek out political views online and elsewhere are looking to confirm what they already believe, and attempts to turn them away from those views can make them dig in even further because they resist being told that they are wrong.
Critics say YouTube has contributed to the radicalisation behind several recent massacres, such as the mosque shootings in Christchurch, New Zealand. The video-broadcasting giant began discussing changes to its hate speech policy roughly a year ago as part of a systematic effort to review its policies on topics such as violent extremism and misinformation. But conversations about the dangers of white supremacy accelerated after the Christchurch shooting, a person familiar with the discussions said.
Beirich was surprised to learn that YouTube had been working on the new policy for a year. "If it was that long, why were there these basic errors?"
© The Washington Post 2019