In June 2022, Paul M. Barrett and Justin Hendrix of NYU’s Stern Center for Business and Human Rights published a timely report: “A Platform ‘Weaponized’: How YouTube Spreads Harmful Content—And What Can Be Done About It“. YouTube, with more than 2 billion users, is the most popular social media site not just in the United States but in India and Russia as well. Yet because long-form video is harder to analyze than text or still images, YouTube has received less scrutiny from researchers and policymakers than the other major platforms. This in-depth report addresses that knowledge gap.
Like other major platforms, YouTube has a dual nature: it provides more than two billion users with access to news, entertainment, and do-it-yourself videos, but it also serves as a venue for political disinformation, public health myths, and incitement to violence.

YouTube’s role in Russia illustrates this duality. Since Russia launched its invasion of Ukraine in February 2022, YouTube has offered ordinary Russians factual information about the war, even as the Kremlin has blocked or restricted other Western-based social media platforms and pressured foreign journalists in the country to silence themselves. But for years before the brutal incursion, YouTube served as a megaphone for Vladimir Putin’s disinformation about Ukraine and its relations with the West.
Does YouTube send unwitting users down a ‘rabbit hole’ of extremism?
In response to reports that the platform’s own recommendations were “radicalizing” impressionable individuals, YouTube and its parent, Google, altered the platform’s recommendation algorithm, apparently reducing the volume of recommended misinformation and conspiratorial content. But platform recommendations aren’t the only way people find potentially harmful material. Some, like the white 18-year-old accused of shooting and killing 10 Black people in a Buffalo, N.Y., grocery store, seek out videos depicting violence and bigotry. These self-motivated extremists can find affirmation and encouragement to turn their resentments into dangerous action.
A social media venue with global reach
Roughly 80% of YouTube traffic comes from outside the United States, and because of language and cultural barriers, the platform’s content moderation efforts are less successful abroad than at home. The report explores how YouTube is exploited by Hindu nationalists persecuting Muslims in India, right-wing anti-vaccine advocates in Brazil, and supporters of the military junta in Myanmar.
In Part 2, we examine YouTube’s role as the internet’s vast video library, one which has contributed to the spread of misinformation and other harmful content. In 2019, for example, YouTube reacted to complaints that its recommendations were pushing impressionable users toward extremist right-wing views. The company made a series of changes to its algorithms, resulting in a decline in recommendations of conspiratorial and false content. But recommendations are not the only way that people find videos on YouTube. A troubling amount of extremist content remains available for users who search for it. Moreover, YouTube’s extensive program for sharing advertising revenue with popular creators means that purveyors of misinformation can make a living while amplifying the grievances and resentments that foment partisan hatred, particularly on the political right.
In Part 3, we turn our attention to YouTube’s role in countries outside of the U.S., where more than 80% of the platform’s traffic originates and where a profusion of languages, ethnic tensions, and cultural variations make the company’s challenges more complicated than in its home market. Organized misogynists in South Korea, far-right ideologues in Brazil, anti-Muslim Hindu nationalists in India, and supporters of Myanmar’s oppressive military regime have all exploited YouTube’s extraordinary reach to spread pernicious messages and rally like-minded users. [see also: https://humanrightsdefenders.blog/2020/11/02/bbc-podcast-on-the-framing-of-video-monk-luon-sovath/]
Recommendations to the U.S. government
Allocate political capital to reduce the malign side effects of social media: President Biden’s off-the-cuff expressions of impatience with the industry aren’t sufficient. He ought to make a carefully considered statement and lend his authority to legislative efforts that would extend federal oversight. Former President Obama’s recent speech at Stanford about disinformation provided a helpful foundation.
Enhance the FTC’s authority to oversee social media: Some of the issues raised in this report could be addressed by a proposal we made in a February 2022 white paper, namely that Congress should authorize the Federal Trade Commission to use its consumer protection authority to require social media companies to disclose more data about their business models and operations, as well as provide procedurally adequate content moderation.
To YouTube:
Disclose more information about how the platform works: A place to start is explaining the criteria its algorithms use to rank, recommend, and remove content, as well as how those criteria are weighted relative to one another.
Facilitate greater access to data that researchers need to study YouTube: The platform should ease its resistance to providing social scientists with information for empirical studies, including random samples of videos.
Expand and improve human review of potentially harmful content: YouTube’s parent company, Google, says that it has more than 20,000 people around the world working on content moderation, but it declines to specify how many do hands-on review of YouTube videos. Whatever that number is, it needs to grow, and outsourced moderators should be brought in-house.
Invest more in relationships with civil society and news organizations: In light of their contribution to the collapse of the advertising-based business model of many U.S. news-gathering organizations, the platforms should step up current efforts to ensure the viability of the journalism business, especially at the local level.
The NYU Center for Business and Human Rights began publishing reports on the effects of social media on democracy in the wake of Russia’s exploitation of Facebook, Twitter, and YouTube during the 2016 U.S. presidential campaign. We initially advocated for heightened industry self-regulation, in part to forestall government intervention that could lead to First Amendment complications. As the inadequacy of industry reforms has become clear, we have supplemented our calls for self-regulation with a proposal to enhance the Federal Trade Commission’s consumer protection authority to oversee the industry.
In Part 4, we offer a concise version of the FTC proposal, as well as a series of recommendations to YouTube itself. The report does not address the problem of YouTube hosting potentially harmful videos aimed at children and teenagers. This persistent phenomenon deserves continued scrutiny but is beyond the scope of our analysis.