
YouTube seems to be playing continual catch-up to reduce the spread and influence of hateful content and bad actors on the platform. This week YouTube made two announcements: they posted an overview of their policies designed to protect kids, and updated their hate speech policy.
In brief, today's update expanded YouTube's hate speech policy to include:
- Claims that one group is superior to justify discrimination.
- Claims that the covered attributes are an illness or mental deficiency.
- All videos that promote or glorify Nazi ideology.
- Claims that deny well-documented violent events took place.
- Caste as a protected attribute.
What groups and individuals are covered by YouTube's hate speech policy?
YouTube's hate speech policy prohibits promoting hatred or violence against individuals or groups based on the following attributes:
- Age
- Caste
- Disability
- Ethnicity
- Gender Identity
- Nationality
- Race
- Immigration Status
- Religion
- Sex/Gender
- Sexual Orientation
- Victims of a major violent event and their kin
- Veteran Status
What does YouTube mean when it says you aren't allowed to promote hatred?
YouTube spells out what they mean when they say you are not allowed to promote hatred of individuals or groups based on the attributes listed above:
- You are not allowed to encourage, praise or glorify violence based on any of the listed attributes, including implied calls to violence. YouTube also prohibits threats and harassment.
- You are not allowed to dehumanize individuals or groups by calling them subhuman, comparing them to animals, insects, pests, disease, or any other non-human entity.
- You are not allowed to use racial, ethnic, religious, or other slurs with the purpose of promoting hatred.
- You are not allowed to promote stereotypes that incite hatred, or to treat such stereotypes as factual.
- You are not allowed to use the attributes to justify violence, discrimination, segregation, or exclusion.
- You are not allowed to promote conspiracy theories ascribing evil, corrupt, or malicious intent based on the listed attributes.
In today's update, YouTube expanded their policy:
- You are not allowed to post content claiming a race, religion, or other group is superior in order to justify discrimination, even if it does not explicitly call for violence. This includes videos that promote or glorify Nazi ideology, "which is inherently discriminatory".
- You are not allowed to claim a person or group is physically or mentally inferior, deficient, or diseased based on the listed attributes. This includes using pseudoscience or statistics to promote hatred.
- You are not allowed to post content that denies that a well-documented, violent event took place, such as the Sandy Hook shootings or the Holocaust. That includes claiming that the event was "staged" or that the victims are "actors".
- The policy has been expanded to prohibit hate based on caste.
How is the YouTube hate speech policy enforced?
What makes a difference is not that YouTube has these policies, but how they actually enforce them. YouTube says they will be "ramping up their systems" over the next few months. Here is what they say they will be doing:
- Removing content that violates the policy. Content uploaded before the policy update will not get a strike when it is removed. Otherwise, removal will result in a Community Guidelines strike unless it's the first video you've had removed. Learn more about Community Guidelines strikes.
- Reducing recommendations of videos with "borderline" content.
- Increasing recommendations of "authoritative" content.
- While hateful content has always been considered not "advertiser-friendly", and so should not show ads, YouTube says they will remove "channels that repeatedly brush up against their hate speech policies" from the YouTube Partner Program entirely.
If you had a video removed for violating YouTube's hate speech policy, and you feel that was in error, you can appeal.
YouTube notes that some videos that include otherwise prohibited content may not be removed.
... some videos could remain up because they discuss topics like pending legislation, aim to condemn or expose hate, or provide analysis of current events.
But they caution that this is not a "free pass":
Context matters. YouTube provides guidelines for adding context to your videos, including editing tips, adding a relevant title, and providing additional information in the video's description and Cards.
While I think limiting hate speech is generally a good thing, it remains to be seen how effective these policy updates will be. What I think will be important to watch:
- How well does YouTube enforce this policy when a video with hate speech is posted by a politician, public figure or popular YouTuber? If a politician posts a campaign video with hateful content, will it be removed?
- How well does YouTube distinguish commentary on hate speech from actual hate speech? Will the policy be used by bad actors to help flag and remove content criticizing their actions?
In what may be a sign of things to come, YouTube has come under criticism in a recent high-profile dispute between Vox video host Carlos Maza and "edgy" YouTube commentator Steven Crowder. Crowder has regularly mocked Maza's sexual orientation and ethnicity, which Maza says has resulted in direct personal harassment. Crowder claims it was "harmless ribbing". YouTube reviewed the content and determined that "while we found language that was clearly hurtful, the videos as posted don't violate our policies."
But while the videos were not removed, YouTube did suspend Crowder's monetization because of a "pattern of egregious actions [that] has harmed the broader community and is against our YouTube Partner Program Policies."
The policy against "preventing harm to the broader YouTube community" was announced in February 2018, in response to Logan Paul's posting a video of an apparent suicide.
Edited to add: YouTube has posted more details about their assessment of Crowder's content and their harassment policy. Their bottom line:
For hate speech, we look at whether the primary purpose of the video is to incite hatred toward or promote supremacism over a protected group; or whether it seeks to incite violence. To be clear, using racial, homophobic, or sexist epithets on their own would not necessarily violate either of these policies. For example, as noted above, lewd or offensive language is often used in songs and comedic routines. It's when the primary purpose of the video is hate or harassment. And when videos violate these policies, we remove them.
They also say they will be assessing the current harassment policy "in the coming months".
So it's still not entirely clear where YouTube draws the line between free speech, hate speech and harassment, especially in cases where the videos themselves might not violate policy, but the content encourages more explicit harassment by fans.
Resources and additional information
- Announcement: YouTube's ongoing work to tackle hate
- FAQ: Update to YouTube's Hate Speech Policies
- YouTube Help Center: YouTube's Hate Speech Policy
- YouTube Help Center: YouTube's Harassment and Cyberbullying Policy
- Announcement: Taking a harder look at harassment
- Announcement: Preventing harm to the broader YouTube community
- AdSense Content Policies: Prohibited Content
- More recent updates on YouTube Policy
Is ranting against pedophiles also a form of harassment, or is it legitimate activism?
There's no policy against ranting. The policy prohibits harassment and hate speech. Pedophilia is not a protected attribute, best I can tell.
So if I make a vid calling out a child predator for his behavior and if the vid encourages people to report the pedo and not trust him, false flags from trolls won't get it taken down for harassment?
Here's a vid I did on March 30, 2017 about Christian Hendricks. As my YT account is terminated I uploaded it to Twitch so people can still see it there. No insults or racial slurs were used to describe him; also my talking accent doesn't sound perfect and I had mic volume issues due to an accident as I recorded it with OBS.
https://www.twitch.tv/videos/286946792