piqer for: Global finds, Technology and society
Prague-based media development worker from Poland with a journalistic background. Previously worked on digital issues in Brussels. Piqs about digital issues, digital rights, data protection, new trends in journalism, and anything else that grabs my attention.
Two days ago, in an effort to tackle the spread of disinformation on the platform, YouTube CEO Susan Wojcicki announced that the company would start adding Wikipedia information to videos about known conspiracy theories. The move comes after YouTube faced heavy criticism for promoting conspiracy videos and hateful content: just last month, the platform was forced to remove one of its top trending videos, which alleged that David Hogg, a survivor of the Parkland school shooting, was a paid crisis actor.
Critics doubt that providing alternative viewpoints alongside controversial videos will solve the problem, especially if YouTube itself encourages consumption of extremist content through its algorithmic recommendations, as techno-sociologist Zeynep Tufekci suggests. In a sharp op-ed for The New York Times, she delineates YouTube's monetization process, in which the company feeds viewers increasingly radical content because doing so tends to keep them watching more videos. In other words, "YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales."
Although YouTube has faced intense scrutiny over the way terrorist groups have used the site for recruitment and propaganda, the platform only recently cracked down on videos posted by the far right. But YouTube's ability to radicalize goes beyond any specific topic or community, Tufekci's insightful article points out.
“Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons. It seems as if you are never ‘hard core’ enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century,” writes Tufekci.