An investigation by Mozilla shows that the “Dislike” button and other YouTube controls are largely ineffective when a user wants the platform’s algorithm to stop suggesting certain types of videos. The foundation behind Firefox concluded that such controls “prevent less than half of unwanted algorithmic recommendations”.
To conduct the research, which involved 22,722 volunteers, Mozilla used data collected through RegretsReporter, its browser extension that lets users voluntarily “donate” their recommendation data for studies like this one. Thanks to the large number of volunteers, the foundation was able to base its findings on millions of recommended videos, along with anecdotal reports from thousands of people.
Mozilla tested the effectiveness of four different controls available on YouTube: the “Dislike” button, the classic thumbs-down whose counter has long been hidden by default; “Not interested”; “Don’t recommend channel”; and “Remove from watch history”. The researchers found that the controls had varying degrees of effectiveness in stopping certain content from being suggested, but that overall their impact was “small and insufficient”.
Diving into the data, the most effective control was “Don’t recommend channel”, which prevented 43% of unwanted recommendations. At the opposite extreme is “Not interested”, with a meager 11% effectiveness. The classic “Dislike” button doesn’t fare much better at 12%, while removing videos from the watch history raises performance to 29%.
In addition to demonstrating how ineffective YouTube’s controls are, Mozilla’s report also shows that users are willing to go to great lengths to avoid unwanted recommendations, resorting to methods such as logging out or viewing content through a VPN. In light of this, the researchers emphasize that the video platform should better explain its controls and provide more proactive ways for users to define what they want to see.
Mozilla says YouTube and similar platforms rely on large amounts of passively collected data to infer user preferences. The foundation behind Firefox considers this approach somewhat paternalistic: it is the platforms that ultimately decide what content is shown, when users should instead be asked what they want from the platform.
Mozilla’s report on YouTube did not emerge in a vacuum; it is part of a broader wave of concern surrounding the algorithms used by major platforms. The European Union’s Digital Services Act requires platforms to explain how their recommendation algorithms work and to open them up to outside researchers, while similar, though not necessarily as ambitious, initiatives are being pursued in the United States.
In short, a user’s activity, at least through these opt-out controls, doesn’t have much of an impact on the recommendations YouTube ultimately makes. What Mozilla has done is put numbers and specifics to something that the vast majority of YouTube users could already perceive.