Stop harassment, but don't kill YouTube in the process

YouTube’s latest attempt to crack down on hate speech has produced significant collateral damage. Policies designed to limit the spread of Holocaust denialism wound up censoring videos that attacked denialism; a history teacher who dedicated his life to chronicling Nazi abuses found himself banned.

These errors were more than predictable, and they are more than the latest illustration of how hard it is to apply the standards and practices of old media to new media. What they show is the fundamental difference between a social media company and a media company.

Major publishers

Society expects major publishers to censor odious views, and there can be little doubt that companies such as YouTube and Facebook are major publishers. Yet professional censorship depends on a publisher’s control of the production of content. The system doesn’t work in a world of user-created content.

YouTube’s crackdown was spurred by a campaign from Carlos Maza, a journalist at Vox Media, against Steven Crowder, a YouTube “shock jock” who persistently directed homophobic and racist slurs at Maza. YouTube issued a series of confused and contradictory responses that ultimately resulted in a new policy banning any videos “alleging that a group is superior in order to justify discrimination, segregation or exclusion.”

That rule seems innocuous enough. But Crowder’s misconduct took the form of mockery and condescension, tools of subversion often used by marginalised groups against the privileged or powerful. More to the point in this current dispute: Efforts to educate each successive generation on the horrors of Nazism will necessarily involve exposure to its worst vitriol.

Thus publishers will have to make distinctions — between spiteful and subversive rhetoric, between incitement and education. These are the kind of decisions that traditional publishers, those in control of their own editorial content, have always been expected to make (and mostly do).

Small fraction

Now consider YouTube, which releases about 500 hours of content every minute, the vast majority of which is ignored. A small fraction, as determined largely by user engagement, is elevated to a mass audience. As a publisher, then, YouTube is not only depending on its users for its content, it is depending on them to help decide which content other users see. In newspaper terms, the user is both reporter and editor.

This system is not without costs. Yet it is also essential to the massive democratisation of video content that YouTube has empowered. Content that might have struck traditional editors as trivial, controversial or just plain weird can, if given the opportunity, strike a chord with millions of users and move into the mainstream.

The psychologist Jordan Peterson, for example, first found fame with a contentious denunciation of Ontario’s hate speech laws. He won legions of fans, however, with his voluminous collection of YouTube videos on self-help and biblical interpretation. Now a bestselling author, Peterson has given lectures about self-improvement to audiences around the world.

This is the great genius and flaw of YouTube: It publishes a wildly eccentric and sometimes belligerent psychology professor that a sophisticated human editor might ignore. It also employs sophisticated algorithms that could still label an anti-Nazi documentary as hate speech.

Open platform

The only solution is for YouTube to become a truly open platform. As much as possible, it should train both its humans and its machines to err on the side of publication. If it is going to establish some principles for what it allows, as Felix Salmon and others suggest, they should be very permissive.

That doesn’t mean YouTube can’t fight genuine harassment, which often comes from consumers rather than producers of content. For one, it can take technological steps to limit its harm, such as making aggressive blocking software available to keep comments from reaching those who don’t want to see them. It should also work with law enforcement to pursue serious threats that move offline and into the real world. It’s not always easy to draw that line, but with time it will get easier.

These are admittedly imperfect solutions. Those are usually the only kind at first when a new technology comes along and disrupts old ways of doing things. What works for the news media won’t work for social media. Whatever the solution, however, the goal should be to reduce harms while protecting benefits. On that score, some degree of vile and offensive online speech may be a cost we have to manage in order to have a free and digitally democratised society.

Karl W. Smith is a Bloomberg columnist. Views expressed are his own.
