“The YouTube recommendations were designed to waste your time.” So says, clearly and concisely, Guillaume Chaslot, a former Google engineer who helped develop YouTube’s recommendation algorithm. The platform, with its practically infinite catalog of videos, is designed, according to the former employee, to get us addicted to it, regardless of what content the user actually wants to watch.
According to YouTube’s own site, the algorithm generates 70% of the platform’s views. Likewise, Chaslot claims that its main problem is that it does not focus on the needs of the user but on keeping them watching, even when the quality of the videos is questionable.
Borderline content: attractive, but not forbidden
According to Chaslot, YouTube focuses on showing us borderline content: videos that come close to violating YouTube’s own policies and play with that limit precisely to appear more attractive. The former Google engineer says this type of content is toxic, and that it is precisely what generates the most income for creators.
“We have to realize that YouTube’s recommendations are toxic and pervert civic discussion. At this point, the incentive is to create this kind of content that is very attractive but not forbidden. Basically, the more extravagant the content, the more likely people are to keep watching it, which in turn makes it more likely to be recommended by the algorithm, which generates more revenue for the creator and for YouTube.”
Since words without facts are useless, after leaving Google Chaslot created the AlgoTransparency page, with the aim of reporting on YouTube’s recommendations more transparently. The example he focuses on most is the recommendations YouTube made after the news of the publication of the Mueller report, which provided information on ties between Russia and Donald Trump’s 2016 campaign.
This video funded by the Russian government was recommended more than half a million times from more than 236 different channels. https://t.co/aRNUx2WIOm
— Guillaume Chaslot (@gchaslot) April 26, 2019
More than 236 YouTube channels started recommending a video funded by the Russian government itself, something Chaslot does not fully understand, given how few views the video had and how much larger, more visited channels had videos on this same topic.
Google itself, in statements to The Next Web, denies Chaslot’s claim that YouTube focuses on promoting borderline content, stating that the algorithm is based on likes, dislikes, how many times a video is shared, and the authority of the channel itself.
“As with most of the statements made by AlgoTransparency, we have not been able to reproduce the results. We have designed our systems to ensure that content from more authoritative sources appears prominently in search results and watch-next recommendations in certain contexts, including when a viewer is watching news-related content on YouTube.”
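Purely as an illustration of what an engagement-driven ranking of the kind described above might look like, here is a minimal sketch. The signal names, weights, and scoring formula are all invented for this example; the real YouTube algorithm is not public, and nothing here comes from Google.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    likes: int
    dislikes: int
    shares: int
    channel_authority: float  # hypothetical 0..1 trust score for the channel

def engagement_score(v: Video) -> float:
    """Hypothetical score combining the signals Google cites.

    The weights below are invented for illustration only.
    """
    return (v.likes - v.dislikes) + 2.0 * v.shares + 100.0 * v.channel_authority

def recommend(videos: list[Video], k: int = 3) -> list[str]:
    """Return the titles of the top-k videos by this made-up score."""
    ranked = sorted(videos, key=engagement_score, reverse=True)
    return [v.title for v in ranked[:k]]
```

Under a scheme like this, a highly shared video from a low-authority channel can still outrank better-sourced content, which is the tension the article describes.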
An algorithm that cannot shake off controversy
Regardless of how right the former Google employee may or may not be, YouTube’s algorithm seems unable to shake off controversy. As we reported some time ago, it was not hard to find clickbait and content of questionable quality in YouTube’s Trending list, like the examples shown below.
- I RETURN TO THE MANSION AND GIVE ME A SURPRISE GIFT [Dualcoc]
- I AM HOME AND I GIVE UP MY GIRLFRIEND ..
- TELEPHONE JOKE TO OUR CHIEF | Ft. Joaquin PA
- REGGAETON PHRASES IN REAL LIFE 2 | Celopan
- PANCAKE “ART” CHALLENGE – CLASH ROYALE
- We created the BIGGEST COFFEE in the world in the BATHTUB and we MET inside!
- YOU GUESS YOUR AGE ??? 100% IMPOSSIBLE
In February of this year, YouTube made profound changes to the platform, admitting that users were complaining about “junk” content, easy-click bait, and deceptive titles and descriptions. Chaslot himself considered this change a victory, since the platform promised to stop recommending videos as controversial as those dedicated to explaining that the Earth is flat.
“An example of this vicious circle: two years ago I discovered that the AI promoted conspiracy content in a far greater proportion than factual content; for example, flat Earth videos were promoted around ten times more than round Earth ones.”
The former Google engineer had discovered long ago that more flat Earth videos were recommended than ones telling the truth, and that promoting views of the former generated more visits, more ads, and more money.
YouTube announced they will stop recommending some conspiracy theories such as flat earth.
I worked on the AI that promoted them by the *billions*.
Here is why it’s a historic victory. Thread. 1/ https://t.co/wJ1jbUcvJE
— Guillaume Chaslot (@gchaslot) February 9, 2019
Likewise, YouTube has drawn strong criticism for having served as a way for a network of pedophiles to communicate through the platform, via videos of 10-year-old girls trying on bikinis, stretching, or playing. These videos, with thousands of views, were recommended by the platform to those users, prompting large companies such as Nestlé and Epic Games to withdraw advertising campaigns from YouTube and Google.
Via | The Next Web