Why YouTube’s ‘Dislike’ and ‘Not Interested’ Buttons Have Barely Any Effect

Mozilla's analysis of YouTube's algorithms showed that pressing feedback buttons did little to stop future similar video recommendations.

It’s no secret that YouTube’s recommendation algorithm is a mystery to both creators and viewers. A recent study by Mozilla contends that when viewers use options like “dislike” and “not interested” to stop YouTube from suggesting similar videos, the recommendations they receive barely change. The researchers found that YouTube keeps serving up similar videos even after users tell the company they aren’t interested in particular types of content.

What does the Mozilla study involve?

More than 20,000 genuine YouTube viewers contributed recommendation data to the Mozilla study. Participants were recruited through RegretsReporter, a browser add-on that places a general “stop recommending” button on the YouTube videos participants watch.

When users clicked the button Mozilla set up, a signal was sent to YouTube that depended on the group the participant had been randomly assigned to on the back end: “dislike,” “not interested,” “don’t recommend channel,” or “remove from history,” plus a control group for which no feedback was sent to the platform.
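That routing logic can be pictured roughly as in the following sketch. This is a minimal Python illustration only, not the RegretsReporter code (the actual add-on is a browser extension), and the arm names, function names, and the send_feedback callback are assumptions made for illustration.

```python
import random

# Experiment arms described in the study; the control arm sends no feedback.
ARMS = ["dislike", "not_interested", "dont_recommend_channel",
        "remove_from_history", "control"]

def assign_arm(user_id: str) -> str:
    """Randomly assign a participant to one experiment arm (deterministic per user)."""
    return random.Random(user_id).choice(ARMS)

def handle_stop_recommending_click(user_id: str, video_id: str, send_feedback) -> None:
    """Route a 'stop recommending' click to the feedback signal that matches
    the user's assigned arm; the control arm deliberately sends nothing."""
    arm = assign_arm(user_id)
    if arm == "control":
        return
    send_feedback(video_id=video_id, signal=arm)

if __name__ == "__main__":
    # Print the would-be signal instead of contacting YouTube.
    handle_stop_recommending_click(
        "user-123", "video-abc",
        send_feedback=lambda video_id, signal: print(f"send {signal} for {video_id}"),
    )
```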

More than 44,000 pairs of videos—each pairing a “rejected” video with a video YouTube later recommended—were assembled by study assistants, and data was gathered on more than 567 million recommended videos.

The researchers also surveyed 2,757 RegretsReporter users to better understand their experience with the feedback tools. To decide whether a recommendation was too similar to a video a user had rejected, the researchers either examined the pairs manually or used machine learning.
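Mozilla has not published the similarity model itself, so the sketch below only illustrates the general idea: comparing the text of a rejected video with that of a recommended video using a simple bag-of-words cosine similarity and an arbitrary threshold. The function names and the 0.5 cutoff are assumptions, not the study’s actual method.

```python
import math
import re
from collections import Counter

def tokens(text: str) -> Counter:
    """Lowercase bag-of-words representation of a title or description."""
    return Counter(re.findall(r"[a-z0-9']+", text.lower()))

def cosine_similarity(a: str, b: str) -> float:
    """Cosine similarity between two bags of words (0 = unrelated, 1 = identical)."""
    ta, tb = tokens(a), tokens(b)
    dot = sum(ta[w] * tb[w] for w in ta)
    norm = math.sqrt(sum(v * v for v in ta.values())) * math.sqrt(sum(v * v for v in tb.values()))
    return dot / norm if norm else 0.0

def too_similar(rejected_text: str, recommended_text: str, threshold: float = 0.5) -> bool:
    """Flag a recommended video as 'too similar' to a rejected one.
    The threshold here is an arbitrary illustrative value."""
    return cosine_similarity(rejected_text, recommended_text) >= threshold

# Example pair: a rejected video and a later recommendation on the same topic.
print(too_similar("flat earth proof compilation part 3",
                  "proof the earth is flat: full compilation"))  # True
```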

What the researchers discovered

Mozilla’s analysis found that even when users relied on the feedback tools or adjusted their settings, YouTube still presented them with videos similar to the ones they had rejected.

It found that 78.3% of users had used YouTube’s feedback buttons, adjusted their settings, or avoided specific videos in an effort to “teach” the algorithm to suggest better content. Of those who took some action to manage YouTube’s recommendations, 39.3% said their efforts were unsuccessful.

Clicking “not interested” and “dislike” prevented only 11% and 12% of bad recommendations, respectively. Options like “don’t recommend channel” and “remove from watch history” were more effective, reducing unwanted recommendations by 43% and 29%, respectively.
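For readers wondering what “stopped X% of bad recommendations” means in practice, the sketch below shows one plausible way such figures can be computed: as the percentage reduction in the rate of too-similar recommendations relative to the no-feedback control group. The rates used here are invented for illustration and are not Mozilla’s data.

```python
def reduction_vs_control(control_rate: float, tool_rate: float) -> float:
    """Percentage reduction in the bad-recommendation rate relative to
    the no-feedback control group."""
    return 100.0 * (control_rate - tool_rate) / control_rate

# Hypothetical rates (fraction of recommendations judged 'too similar'):
control_rate = 0.20          # users who sent no feedback
dislike_rate = 0.176         # about 12% lower than control
dont_recommend_rate = 0.114  # about 43% lower than control

print(f"dislike: {reduction_vs_control(control_rate, dislike_rate):.0f}% fewer bad recommendations")
print(f"don't recommend channel: {reduction_vs_control(control_rate, dont_recommend_rate):.0f}% fewer bad recommendations")
```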

“Nothing changed,” one survey respondent said. “Sometimes when I flagged something as spam or deceptive, the following day it would return. I almost have the impression that the more criticism I give of their proposals, the higher the bullshit mountain rises. Even if you block some sources, they eventually reappear.”

Another 23% of those who tried to change YouTube’s recommendations reported mixed results, mentioning things like unwanted videos resurfacing in their feed or having to put in a lot of consistent time and effort to improve the suggestions.

“Yes, they changed, but not for the better. I feel punished for actively attempting to alter the algorithm’s behavior,” another survey participant said. “In some ways, less engagement gives it less information on which to base suggestions.”

Respect for users’ opinions

The researchers concluded that even YouTube’s most effective controls were not adequate to meaningfully change viewers’ feeds. The report argued that YouTube is not genuinely interested in hearing what its users want, preferring instead to rely on opaque methods that drive engagement regardless of users’ best interests.

Like, Dislike and Comment Removal Patterns Among YouTube Videos with 1M Dislikes by Gentle under CC BY-SA 4.0

The researchers recommended that YouTube respect the feedback it receives from users and treat it as a meaningful signal about how people want to spend their time on the service.

Elena Hernandez, a YouTube spokesperson, said this behavior is intentional: the platform does not try to filter out all content related to a particular topic. She also challenged the findings, arguing that they overlook how YouTube’s controls are designed.

“We provide users with control over their recommendations, including the ability to prevent future recommendations of a certain channel or video. Importantly, our controls don’t eliminate whole subjects or points of view because doing so can have unfavorable consequences for viewers, including forming echo chambers,” Hernandez told The Verge.

“We support scholarly research on our platform, and that’s why the YouTube Researcher Program recently expanded Data API access. We can’t really draw many conclusions from Mozilla’s analysis because it ignores how our systems truly operate.”

Hernandez said that Mozilla’s definition of “similar” disregards how YouTube’s recommendation engine works. According to her, the “not interested” option removes a specific video, and the “don’t recommend channel” button stops that channel from being suggested in the future. The company maintains that it does not aim to block recommendations of all content related to a viewpoint or topic.

YouTube HQ 11 by BrokenSphere under CC BY 3.0

A Mozilla study published last year found that 71% of the videos viewers said they “regretted” watching—including misinformation and spam—had been recommended by YouTube’s algorithm. A few months after that study came out, YouTube published a blog post defending the design of its recommendation system and its efforts to filter out “low-quality” videos.

Other platforms have feedback tools

Social networks like Twitter, TikTok, and Instagram, after relying on algorithms for years to suggest content, are trying to give users more options to customize their feeds. However, users frequently complain that similar recommendations keep appearing even after they flag something as unwanted.

Lawmakers around the world are also considering how the opaque recommendation systems used by social networks may affect users. The European Union is further ahead: it recently passed the Digital Services Act, which requires platforms to explain how their recommendation algorithms work and to open them up to outside researchers. In the US, a bipartisan Filter Bubble Transparency Act is under consideration to address a similar problem.

According to Mozilla researcher Becca Ricks, platforms are opaque about how feedback is used, and it’s not always clear what different controls do.

According to Ricks, who spoke with The Verge via email, “I believe that in the case of YouTube, the platform is balancing user engagement with user satisfaction, which is ultimately a tradeoff between recommending content that encourages users to spend more time on the site and content the algorithm thinks users will like.”

“According to our analysis, user feedback may not always be the most essential signal. However, the platform can change which of these signals receives the most weight in its algorithm.”


Featured Image credit: ReturnYoutubeDislike.png (outlined) by Dmitrii Selivanov under CC BY-SA 4.0 and YouTube Sticker by Unknown author under CC0 1.0
