When You ‘Dislike’ A Video, YouTube Keeps Recommending Similar Ones, Research Shows


When YouTube’s powerful recommendation algorithm pushes users toward videos they don’t like, the company encourages those people to click on the “dislike” or “not interested” buttons to tell YouTube’s software not to display such content.

But according to a study released Tuesday by Mozilla, the foundation behind the Firefox web browser, the buttons do little to weed out unwanted videos from YouTube’s personalized recommendations for users.

For example, the “Not interested” button prevented only 11 percent of similar video recommendations, Mozilla said, and the “dislike” button worked only 12 percent of the time. The most effective control was the “Don’t recommend channel” button, which still worked less than half the time, at 43 percent. Mozilla worked with research associates at the University of Exeter to define video similarity, building a machine learning model that checked for elements such as shared themes or political views.
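Mozilla has not published the model itself, but a minimal sketch of how such a pairwise similarity check might work, assuming text embeddings over video titles and descriptions (the sentence-transformers model name and the 0.7 threshold below are illustrative assumptions, not Mozilla’s actual classifier):

```python
# Hypothetical sketch: flag a recommended video as "similar" to a rejected one
# by comparing text embeddings of their titles/descriptions. The model name
# and threshold are illustrative assumptions, not Mozilla's actual method.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

def is_similar(rejected_text: str, recommended_text: str,
               threshold: float = 0.7) -> bool:
    """Return True if the recommendation looks similar to the rejected video."""
    emb = model.encode([rejected_text, recommended_text], convert_to_tensor=True)
    score = util.cos_sim(emb[0], emb[1]).item()  # cosine similarity in [-1, 1]
    return score >= threshold

# Example: two videos on closely related themes should score above threshold.
rejected = "Why the moon landing was staged: the full story"
candidate = "Top 10 proofs NASA faked Apollo 11"
print(is_similar(rejected, candidate))
```

Mozilla’s real model reportedly compared richer signals, such as political viewpoint, which a text-embedding proxy like this would only roughly approximate.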

The study is based on data from more than 20,000 participants who voluntarily downloaded Mozilla’s RegretsReporter, a browser extension that gives Mozilla researchers access to people’s YouTube activity. The project aims to bring transparency to the recommendation software of the Google-owned video site, an opaque system accused of driving people towards conspiracy theories, disinformation and extremist content.

In addition to the RegretsReporter data, Mozilla surveyed more than 2,700 people who downloaded the extension about their experiences trying to shape YouTube’s recommendations. Nearly 40 percent of respondents said they didn’t think YouTube’s controls had any impact on their recommendations.

“The main finding is that some of the feelings that people are expressing that they can’t control — actually emerge in the data,” Becca Ricks, a senior researcher at Mozilla and co-author of the study, said in an interview. “In general, a lot of unpopular videos do get missed.”

YouTube spokeswoman Elena Hernandez disputed the study’s claims. “Mozilla’s report doesn’t take into account how our systems actually work, making it difficult for us to gather many insights,” Hernandez said in a statement. “We give viewers control over their recommendations, including the ability to prevent videos or channels from being recommended to them in the future. Importantly, our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers.”

YouTube’s algorithm recommends content in a column on the right side of a video page, in the video player after content ends, and on the site’s home page. Each recommendation is tailored to the individual viewer, taking into account factors such as their viewing history, channel subscriptions and location. The recommendations can be benign, like another live performance by the band someone is watching, but critics say YouTube’s suggestions can also steer viewers toward fringe and harmful content.

YouTube’s “dislike” button has also been controversial. Users criticized the company after YouTube stopped displaying public dislike counts on videos last November, upset that hiding the number meant people could no longer express public dissatisfaction with a video.

For Mozilla’s research, the organization created a “Stop Recommending” button for users who downloaded the RegretsReporter extension, which appears in the upper-left corner of recommended videos. Clicking the button triggered one of YouTube’s existing controls, such as the “dislike” button or the “remove from watch history” option, allowing Mozilla to test each feedback feature separately. (To create a control group for the experiment, some users got a version of the button that did nothing at all.)
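As a rough illustration of that experimental design (a hedged sketch with simulated data; the arm names, per-arm rates and sample sizes below are assumptions for demonstration, not Mozilla’s pipeline), randomizing participants into feedback arms plus a do-nothing control and comparing how often similar videos keep appearing might look like this:

```python
import random
from collections import defaultdict

# Hypothetical sketch of the randomized experiment: each participant is
# assigned to one feedback arm (or a placebo control whose button sends no
# signal), and we compare how often similar videos are still recommended.
ARMS = ["dislike", "not_interested", "dont_recommend_channel",
        "remove_from_history", "control"]

# Simulated probability that a similar video is still recommended after the
# button press (illustrative numbers only; control is the baseline).
TRUE_RATES = {"dislike": 0.88, "not_interested": 0.89,
              "dont_recommend_channel": 0.57, "remove_from_history": 0.71,
              "control": 1.00}

random.seed(0)
counts = defaultdict(lambda: [0, 0])  # arm -> [similar_recs, total_recs]

for participant in range(20_000):
    arm = random.choice(ARMS)          # random assignment at install time
    for _ in range(50):                # recommendations observed afterward
        still_similar = random.random() < TRUE_RATES[arm]
        counts[arm][0] += still_similar
        counts[arm][1] += 1

baseline = counts["control"][0] / counts["control"][1]
for arm in ARMS:
    rate = counts[arm][0] / counts[arm][1]
    blocked = 1 - rate / baseline      # share of similar recs prevented vs. control
    print(f"{arm:>24}: {blocked:6.1%} of similar recommendations prevented")
```

Measuring effectiveness against a placebo control, rather than against each user’s own history, is what lets a study like this separate the effect of the feedback signal from ordinary drift in recommendations.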

Mozilla first launched the RegretsReporter extension in September 2020. Tuesday’s report is the tool’s second major study. The first set of findings, published last July, claimed that YouTube’s algorithms sometimes recommend videos that violate the platform’s own policies.

Mozilla has long scrutinized YouTube’s algorithms. In 2019, the group awarded a fellowship to Guillaume Chaslot, a former YouTube engineer and outspoken critic of the company, to support his research on the platform’s artificial intelligence systems. Two years ago, Mozilla announced a project it funded called TheirTube, which lets people see how YouTube’s recommendations might differ for users with different ideological views.

YouTube’s content moderation has drawn renewed attention lately. Last week, executives from YouTube, TikTok, Meta and Twitter testified at a Senate Homeland Security hearing about posts and videos that appeared on their platforms. YouTube’s chief product officer, Neal Mohan, answered questions about hate speech and harassment. A day later, Mohan announced a change to YouTube’s policy on violent extremism, prohibiting content that recruits or fundraises for extremist groups even if the content is not affiliated with a known terrorist organization.




