
What’s happened to YouTube’s algorithm?

These are the videos you should be wary of, according to Guillaume Chaslot. He’s the founder of AlgoTransparency, a project demanding greater transparency from online platforms, and he used to work at Google on YouTube’s recommendation algorithm. He says the motivations behind the algorithm are deeply flawed, as it isn’t really about what the viewer wants.

“It isn’t inherently awful that YouTube uses AI to recommend videos for you, because if the AI is well tuned it can help you get what you want. This would be amazing,” Chaslot told TNW. “But the problem is that the AI isn’t built to help you get what you want. It’s built to get you addicted to YouTube. Recommendations were designed to waste your time.”

Chaslot explains that the metric the algorithm uses to determine a ‘successful’ recommendation is watch time. This might be great for a company trying to sell ads, but it doesn’t necessarily reflect what the user wants, and it has grave side effects.
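To make that incentive concrete, here is a minimal sketch of a recommender that optimizes watch time alone. The candidate videos and predicted watch times are made up, and this is not YouTube’s actual system; it only illustrates why “most-watched wins” tends to favor sensational content.

    # Toy illustration of a watch-time-only recommender. The candidate
    # videos and predicted watch times below are hypothetical.
    candidates = [
        {"title": "Calm news explainer", "predicted_watch_minutes": 4.2},
        {"title": "Cat compilation", "predicted_watch_minutes": 6.1},
        {"title": "Outrage-bait conspiracy", "predicted_watch_minutes": 11.8},
    ]

    # If 'success' means watch time, the most engaging video always wins
    # the recommendation slot, regardless of its quality.
    def recommend(videos):
        return max(videos, key=lambda v: v["predicted_watch_minutes"])

    print(recommend(candidates)["title"])  # -> Outrage-bait conspiracy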

Engaging content gets recommended, which is bad

During his talk at the DisinfoLab Conference last month, Chaslot noted that divisive and sensational content is often recommended widely: conspiracy theories, fake news, and flat-Earth videos, for example. Basically, the closer content stays to the edge of what’s allowed under YouTube’s policy, the more engagement it gets. “Right now the incentive is to create this type of borderline content that’s very engaging, but not forbidden.” The more outlandish the content, the more likely it is to keep people watching, which in turn makes it more likely to be recommended by the algorithm, which results in greater revenue for the creator, and for YouTube.

The basic structure of YouTube’s recommendation algorithm might’ve worked fine for its core types of content, like cat videos, gaming, and music. But as YouTube becomes more central to people’s information and news consumption, Chaslot worries recommendations will push people further to extremes, whether they want it or not, simply because it’s in YouTube’s interest to keep us watching for as long as possible.

Mark Zuckerberg admitted last year that borderline content is more engaging. Chaslot’s take on Facebook’s natural engagement pattern: “The best way to use social media is to surf the policy line.” Google did not want to answer TNW’s questions as to whether the same is true for YouTube, but the company’s spokesperson said in a discussion at the DisinfoLab conference that the company’s studies showed people actually engage more with quality content. Chaslot says this is something the big tech companies will have to debate between themselves, but based on his own experience, he’s more inclined to believe Zuckerberg at this point.

But what about actual, concrete examples of problematic recommendations?

When recommendations go wrong

“We’ve got to realize that YouTube recommendations are toxic and that they pervert civic discussion,” says Chaslot. In Chaslot’s mind, it should be enough to point out that the algorithm’s incentives are completely broken (i.e. watch time doesn’t equal quality) as an example of why it’s bad for us as a society. But to actually show its effects, he built the AlgoTransparency tool after he left Google. Basically, it tries to find out which videos are recommended from the most channels, to provide an overview you can’t get through your personal browsing. The tool is meant to give people a better view of what’s actually being recommended on YouTube; a rough sketch of the counting idea follows below.
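This is a rough sketch of the idea behind AlgoTransparency, not Chaslot’s actual code: given (channel, recommended video) pairs collected by a crawler, count how many distinct channels each video is recommended from and rank by that count. The observation data is hypothetical.

    # Sketch of AlgoTransparency's core idea: rank videos by how many
    # distinct channels recommend them. The observations are made up;
    # the real tool crawls YouTube's recommendations.
    from collections import defaultdict

    observations = [
        ("channel_a", "rt_mueller_video"),
        ("channel_b", "rt_mueller_video"),
        ("channel_c", "rt_mueller_video"),
        ("channel_a", "cat_video"),
        ("channel_c", "news_video"),
    ]

    recommended_from = defaultdict(set)
    for channel, video in observations:
        recommended_from[video].add(channel)

    # Videos recommended from the most distinct channels come first.
    for video, channels in sorted(recommended_from.items(),
                                  key=lambda kv: len(kv[1]), reverse=True):
        print(video, "recommended from", len(channels), "channels")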

Chaslot points out that most often, the top recommended videos are innocuous, but every now and again, problematic videos pop up. When the Mueller report, detailing whether there was any collusion between Russia and Donald Trump’s presidential campaign, was released, Chaslot noticed that the video recommended from the most channels was from RT, a state-sponsored Russian propaganda outlet. This video, funded by the Russian government, was recommended more than half a million times from more than 236 different channels.

That means that if Chaslot is correct, YouTube’s algorithm amplified a video explaining the findings on possible Russian collusion made by… Russia. The video upholds what could be considered a Kremlin-friendly narrative and slams mainstream media. Naturally, Chaslot’s claim caught the attention of the media and was covered widely.

Google completely disagrees with Chaslot, but we’ll get to that later.












