Mozilla report finds YouTube’s ‘Dislike’ button ineffective

A new report from Mozilla, the maker of the privacy-focused Firefox browser, suggests that YouTube’s user controls are ineffective at managing what people see on the platform – despite what Google claims. Using data from nearly 23,000 volunteers, Mozilla was able to show that YouTube continued to recommend similar videos even when people used the various options to indicate that they did not want to see that type of content.

YouTube is the second most popular website in the world (after Google), and according to Mozilla, an estimated 70 percent of the 1 billion hours watched on the platform daily are the result of algorithmic recommendations. Several reports have shown how the algorithm can polarize people and recommend misinformation and malicious content – something Google claims it has worked hard to fix. In this study, Mozilla wanted to test the effectiveness of the controls YouTube gives users to manage the videos recommended to them.

In an earlier report released last July, Mozilla found that people were routinely recommended videos they didn’t want to see and that the controls available to them were ineffective. This new study used a Mozilla-developed browser extension called RegretsReporter to test whether that was true.

Mozilla looked at four different controls suggested by Google: clicking the thumbs-down “Dislike” button, “Not interested,” “Don’t recommend channel,” and “Remove from watch history.” Participants using the RegretsReporter extension saw a “Stop Recommending” button on YouTube videos. When they clicked it, the control corresponding to their test group (such as the “Dislike” button) was triggered on YouTube, while data about subsequently recommended videos was sent to Mozilla. (There was also a control group for which clicking the button did nothing.)
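To give a sense of that experimental design, here is a minimal sketch in TypeScript of the test-group dispatch described above. All names here (`sendFeedbackToYouTube`, `reportRecommendationsToMozilla`) are hypothetical stand-ins; the real RegretsReporter extension is open source and differs in its details.

```typescript
// One of five randomly assigned test groups per participant.
type TestGroup =
  | "dislike"
  | "not-interested"
  | "dont-recommend-channel"
  | "remove-from-history"
  | "control"; // clicking the button does nothing

interface VideoContext {
  videoId: string;
  channelId: string;
}

// Hypothetical helpers standing in for the extension's real plumbing.
declare function sendFeedbackToYouTube(action: TestGroup, video: VideoContext): void;
declare function reportRecommendationsToMozilla(video: VideoContext): void;

function onStopRecommendingClicked(group: TestGroup, video: VideoContext): void {
  if (group !== "control") {
    // Trigger the one YouTube control assigned to this participant's group.
    sendFeedbackToYouTube(group, video);
  }
  // In every group, later recommendations are logged for Mozilla's analysis.
  reportRecommendationsToMozilla(video);
}
```

The control group matters: it gives a baseline rate of unwanted recommendations against which each of the four controls can be compared.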

[Related: Why YouTube is hiding dislikes on videos]

Over the course of the study, 22,722 participants used RegretsReporter, allowing Mozilla to analyze 567,880,195 recommended videos. To assess this massive amount of data, the researchers reviewed 40,000 pairs of recommended videos and rated their similarity. This let the team quantitatively examine whether the videos recommended to participants were similar to videos they had previously rejected. In other words, it let them see whether YouTube’s tools effectively reduced the number of bad recommendations.

For example, if someone was recommended an anti-vax video, clicked “Not interested,” and was later recommended a cat video, that would count as a good recommendation. On the other hand, if they kept getting suggested anti-vax videos after saying they weren’t interested, those would count as bad recommendations. Page 22 of the report [PDF] has some good visual examples.
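To make that yardstick concrete, here is a minimal sketch of how bad-recommendation rates could be tallied per test group. The types and the boolean `isSimilar` check are hypothetical stand-ins for Mozilla’s actual pipeline, which relied on the human-rated video pairs described above rather than a simple true/false call.

```typescript
interface RecommendationEvent {
  group: string;               // e.g. "dislike", "not-interested", "control"
  rejectedVideoId: string;     // the video the participant said no to
  recommendedVideoId: string;  // a video YouTube recommended afterward
}

// Hypothetical similarity judgment between two videos.
declare function isSimilar(a: string, b: string): boolean;

// Returns, for each group, the fraction of post-rejection recommendations
// that still resembled the rejected video. Lower means the control worked.
function badRecommendationRates(events: RecommendationEvent[]): Map<string, number> {
  const bad = new Map<string, number>();
  const total = new Map<string, number>();
  for (const e of events) {
    total.set(e.group, (total.get(e.group) ?? 0) + 1);
    if (isSimilar(e.rejectedVideoId, e.recommendedVideoId)) {
      bad.set(e.group, (bad.get(e.group) ?? 0) + 1);
    }
  }
  const rates = new Map<string, number>();
  for (const [group, n] of total) {
    rates.set(group, (bad.get(group) ?? 0) / n);
  }
  return rates;
}
```

Comparing each group’s rate against the control group’s baseline is presumably how one arrives at “prevented X percent of bad recommendations” figures like those below.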

Mozilla’s report found that none of the user controls were particularly effective at preventing unwanted recommendations. The “Don’t recommend channel” option had the greatest impact, preventing 43 percent of bad recommendations; “Remove from watch history” prevented 29 percent, while “Dislike” and “Not interested” prevented just 12 percent and 11 percent, respectively. Mozilla states that its “research suggests that YouTube isn’t really interested in hearing what its users really want, but rather relies on opaque methods that encourage engagement regardless of its users’ interests.”

As a result of its findings, Mozilla is calling on people to sign a petition asking YouTube to fix its feedback tools and give users actual control over the videos they get recommended. It also has four specific recommendations for YouTube and policymakers based on its research.

Mozilla suggests that YouTube’s user controls should be easy to use and understand, and designed to “put people in the driver’s seat.” It also wants YouTube to give researchers better access to data (so they don’t have to use browser extensions to study this sort of thing). Finally, it calls on policymakers to pass laws that provide legal protection for those engaged in public interest research.

Whether this report is enough to get Google to add real user features to YouTube remains to be seen. For now, it’s a pretty damning indictment of the ineffective controls currently in place.