YouTube’s algorithm reportedly doesn’t care if you ‘thumbs down’ videos


A photo of a YouTube screen with the cursor hovering over the Dislike button.

YouTube already hides how many “dislikes” a video has received, but apparently giving a video a thumbs down doesn’t change how many similar videos the platform recommends to you.
Photo: wachiwit (Shutterstock)

My YouTube recommendations are full of old reruns of Gordon Ramsay’s Kitchen Nightmares. It’s partly my own fault for getting drunk one night and watching a whole episode. Let me tell you, if there’s one thing I don’t want on my feed anymore, it’s the famous British braggart berating yet another chef while the world’s nastiest sound effects (braaaa-reeeee) play in the background. I’ve disliked plenty of these videos, but now Hell’s Kitchen is showing up on my page, and I feel more and more like a “raw” steak that Ramsay is prodding and scolding.

But apparently I’m not alone in my YouTube recommendation woes. A report from the Mozilla Foundation released Monday argues, based on a survey and crowdsourced data, that the “dislike” and “don’t recommend channel” feedback tools don’t actually change video recommendations.

Well, there are two points here. One is that users consistently feel the controls offered by Google-owned YouTube don’t really make a difference. Two is that, based on data collected from users, the controls have a “negligible” impact on recommendations, meaning “most unwanted videos still slip through.”

The foundation relied on data from its own RegretsReporter browser add-on, which lets users block selected YouTube videos from appearing in their feed. The report says it based its analysis on roughly 2,757 survey respondents and 22,722 users who gave Mozilla access to more than 567 million video recommendations collected from late 2021 through June 2022.

Although the researchers admit their respondents are not a representative sample of YouTube’s wide and diverse audience, a third of them said that using YouTube’s controls didn’t seem to change their video recommendations at all. One user told Mozilla that they would flag videos as misleading or spam, only to see them back in their feed later. Respondents often said that blocking a channel merely led to recommendations from similar channels.

YouTube’s algorithm keeps recommending videos users don’t want to watch, and the results are often worse than old Ramsay reruns. A 2021 report from Mozilla, again based on crowdsourced user data, claimed that people browsing the video platform are regularly recommended violent content, hate speech, and political misinformation.

In this latest report, Mozilla researchers found that rejecting a video, such as a Tucker Carlson screed, would often just result in another video from the Fox News YouTube channel being recommended. Based on a review of 40,000 video pairs, the researchers found that when a channel is blocked, the algorithm often simply recommends very similar videos from similar channels. Using the “Dislike” or “Not interested” buttons prevented only 12% and 11% of unwanted recommendations, respectively, compared with a control group. The “Don’t recommend channel” and “Remove from watch history” buttons were more effective at correcting user feeds, but only prevented 43% and 29% of unwanted recommendations, respectively.
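To make those percentages concrete: a minimal sketch, in Python, of how a “percent of unwanted recommendations prevented” figure can be computed against a control group. The function and every count in it are invented for illustration; Mozilla’s actual methodology lives in its report, not here.

```python
# Toy illustration of a "percent of unwanted recommendations prevented"
# figure measured against a do-nothing control group. All numbers are
# invented; they are not Mozilla's data.

def prevention_rate(unwanted_treated: int, total_treated: int,
                    unwanted_control: int, total_control: int) -> float:
    """Relative reduction in the unwanted-recommendation rate
    for users who pressed a control, versus the control group."""
    treated_rate = unwanted_treated / total_treated
    control_rate = unwanted_control / total_control
    return (1 - treated_rate / control_rate) * 100

# Hypothetical counts: the "Dislike" group saw 880 unwanted videos per
# 10,000 recommendations vs. 1,000 per 10,000 for the control group,
# i.e. roughly the 12% reduction the report attributes to "Dislike".
print(f"{prevention_rate(880, 10_000, 1_000, 10_000):.0f}% prevented")
```

The point of the relative comparison is that a button doesn’t need to eliminate unwanted videos outright to register an effect; it only needs to push their rate below the do-nothing baseline.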

“In our analysis of the data, we determined that YouTube’s user control mechanisms are inadequate tools to prevent unwanted recommendations,” the Mozilla researchers wrote in their study.

YouTube spokesperson Elena Hernandez told Gizmodo in an email statement that “Our controls do not filter entire topics or viewpoints, as this could have negative effects on viewers, such as creating echo chambers.” The company has said it doesn’t prevent all content related to a topic from being recommended, but it also claims to push “authoritative” content while suppressing “borderline” videos that come close to violating its content moderation policies.

In a 2021 blog post, Cristos Goodrow, YouTube’s vice president of engineering, wrote that its system is “constantly evolving” but that providing transparency into its algorithm “is not as simple as listing a formula for recommendations,” since its systems take into account clicks, watch time, survey responses, sharing, likes, and dislikes.
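Goodrow’s list of signals hints at why a single thumbs down can get drowned out: the score behind a recommendation blends many inputs. The sketch below is purely hypothetical; the weights, field names, and linear formula are invented for illustration, since YouTube’s actual model is private.

```python
from dataclasses import dataclass

@dataclass
class EngagementSignals:
    # The signal names mirror those Goodrow lists publicly; the weights
    # and the linear formula below are invented for illustration.
    clicks: float
    watch_time_minutes: float
    survey_score: float   # e.g. a "How was this recommendation?" rating
    shares: float
    likes: float
    dislikes: float

def toy_recommendation_score(s: EngagementSignals) -> float:
    """A made-up linear blend showing how one negative signal
    (dislikes) can be outweighed by several positive ones."""
    return (0.2 * s.clicks
            + 0.4 * s.watch_time_minutes
            + 0.1 * s.survey_score
            + 0.15 * s.shares
            + 0.1 * s.likes
            - 0.05 * s.dislikes)

# A disliked video the user nonetheless watched to the end still scores high:
print(toy_recommendation_score(
    EngagementSignals(clicks=1, watch_time_minutes=42, survey_score=0,
                      shares=0, likes=0, dislikes=1)))
```

In a blend like this, one dislike is easily outweighed by 42 minutes of watch time, which is consistent with users’ complaints that disliked content keeps resurfacing.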

Of course, like all social media platforms, YouTube has struggled to create systems that can combat the full breadth of bad, even predatory, content being uploaded to the site. An upcoming book shared exclusively with Gizmodo reported that YouTube once came close to giving up billions of dollars in ad revenue to deal with the strange and disturbing videos being recommended to children.

Although Hernandez said the company has expanded its data API, the spokesperson added: “Mozilla’s report doesn’t take into account how our systems actually work, and so it’s hard for us to get a lot of insights.”

But this is a criticism Mozilla also lays at Google’s feet, saying the company doesn’t provide enough access to allow researchers to assess what affects YouTube’s secret sauce, aka its algorithms.

