YouTube bias exposed by ‘TheirTube’ project that shows how platform looks to conspiracy theorists

A new tool has demonstrated how YouTube’s secretive recommendation algorithm directs users to videos promoting extremist ideologies and conspiracy theories.

The TheirTube project, developed by creative designer Tomo Kihara, allows people to “step inside someone else’s YouTube bubble” by showing them videos recommended by YouTube.

The tool shows how the world’s most popular video-sharing platform looks to people from different political demographics, such as conservatives and liberals, as well as other personas like fruitarians, climate change deniers and ‘prepper’ survivalists.

“The proverb ‘Fish discover water last’ also describes how we are blind to the recommendation bubbles we are in,” Mr Kihara said. “Nowadays with an AI curating almost all of what we see, the only way for a person to get a better perspective on their own media environment is to see what others’ bubbles look like.”

Viewing TheirTube through the ‘Conspiracist’ persona, for example, surfaces recommendations for videos about unsolved mysteries and unexplained natural phenomena.

There are also suggestions for videos with more sinister undertones, such as a 40-minute-long video about Microsoft founder Bill Gates that draws on harmful anti-vaccination conspiracy theories.

The video, which has been viewed more than 600,000 times since it was posted in May, describes Gates as an “unelected global health czar and population control advocate” with an agenda “driven by a eugenicist ideology.”

Videos listed among the recommendations at the end of the video include those that describe the coronavirus pandemic as a hoax.

Videos recommended for the ‘Climate denier’ persona include one outlining “the moral case for fossil fuels”, and another from the official channel of The American Petroleum Institute titled ‘Benefits of Pipelines’.

The project appears to demonstrate that YouTube’s algorithm panders to a psychological theory known as confirmation bias, whereby people have a tendency to favour information that supports their existing beliefs or values.
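The feedback loop described here can be illustrated with a toy simulation (a hypothetical sketch, not YouTube's actual algorithm): a recommender that mostly suggests whatever topic a user has watched most quickly narrows their feed to a single topic, while one that explores freely keeps the feed diverse.

```python
import random

random.seed(0)

TOPICS = ["mystery", "science", "politics", "music", "conspiracy"]

def recommend(history, exploration):
    """Toy recommender: with probability `exploration`, pick a random
    topic; otherwise suggest the topic the user has watched most.
    This is an illustrative model of a confirmation-bias loop, not
    any real platform's algorithm."""
    if not history or random.random() < exploration:
        return random.choice(TOPICS)
    return max(set(history), key=history.count)

def simulate(clicks=200, exploration=0.1):
    """Simulate a user who always watches what is recommended, and
    return the share of views taken by the single dominant topic."""
    history = []
    for _ in range(clicks):
        history.append(recommend(history, exploration))
    top = max(set(history), key=history.count)
    return history.count(top) / len(history)

# Low exploration lets one topic crowd out everything else;
# pure random exploration keeps the feed spread across topics.
narrow = simulate(exploration=0.05)
broad = simulate(exploration=1.0)
```

Under these assumptions, the low-exploration feed ends up dominated by one topic, which is the "bubble" dynamic the project sets out to make visible.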

“TheirTube shows how YouTube’s recommendations can drastically shape someone’s experience on the platform and, as a result, shape their worldview,” the project’s website states.

Each of the TheirTube personas was developed through interviews with YouTube users inside similar bubbles, which informed the channels and videos each persona follows and, in turn, the recommendations YouTube serves it.

Around 70 per cent of all videos viewed on YouTube are recommended by the AI algorithm – the equivalent of 700 million hours of video content each day globally.

“These recommendation bots are useful, but it can reinforce the wrong kind of ideas too,” states a video explaining TheirTube. “Once you are in a recommendation bubble, it is difficult to realise you’re in one.”

YouTube did not reply to a request for comment about the project, though the Google-owned firm does not typically publicly disclose details about how its algorithms work.

The TheirTube project was this week awarded a Mozilla Creative Media Award, a scheme that aims to encourage “more trustworthy AI in consumer technology.”
