April 29, 2025

https://www.xataka.com/aplicaciones/tiktok-sabe-que-su-algoritmo-genera-adiccion-jovenes-hace-poco-para-evitarlo-investigacion-npr

  • October 14, 2024


Last Tuesday, 14 attorneys general from 14 different US states filed a lawsuit against TikTok, alleging that the platform violated consumer protection laws and contributed to the youth mental health crisis. There are 14 cases, all involving confidential information (the typical black bars covering phrases or paragraphs in an official document) related to internal communications, documents and investigative data from TikTok executives.

A faulty redaction allowed the censored content to be read. In short, TikTok knows that its application is addictive and that the measures the platform offers are not effective. But let's take it step by step.

Document. Kentucky Public Radio was able to access the confidential portions of the state attorney general's 119-page lawsuit thanks to a faulty redaction. This allowed KPR and NPR (National Public Radio) to read around 30 pages of censored information. After the content was published, a state judge ordered the complaint sealed.

What does the investigation say? Many things, and none of them in TikTok's favor. NPR says it reviewed "all redacted portions of the case that involve TikTok executives speaking candidly about a range of dangers to children on the popular video app." The suit also claims that some prevention measures, such as the time-management tools, are not only ineffective, but that TikTok measures their success by the good press their release generates.

TikTok's algorithm only needs 260 videos to hook a user

260 videos. According to the documents reviewed, this is the number of videos a user must watch to become "addicted to the platform." In the words of the Kentucky authorities: "While this may seem like a lot, TikTok videos can be as short as 8 seconds and play for viewers automatically, in rapid succession. […] Thus, in under 35 minutes, an average user is likely to become addicted to the platform."
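The figures above are easy to sanity-check. A minimal back-of-the-envelope calculation, using only the numbers quoted from the lawsuit (260 videos at roughly 8 seconds each, the assumed worst case), confirms the "less than 35 minutes" claim:

```python
# Figures quoted in the Kentucky lawsuit; assumes every video is
# at the 8-second minimum length mentioned by the officials.
videos_to_hook = 260       # videos needed to become "addicted"
seconds_per_video = 8      # TikTok videos "can be as short as 8 seconds"

total_seconds = videos_to_hook * seconds_per_video
total_minutes = total_seconds / 60

print(f"{total_seconds} seconds = {total_minutes:.1f} minutes")
# 2080 seconds = 34.7 minutes, i.e. "less than 35 minutes"
```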

NPR notes that internal research found that compulsive use of TikTok "is associated with a range of negative mental health effects, including loss of analytical skills, memory formation, contextual thinking, depth of conversation, empathy, and increased anxiety." Similarly, TikTok acknowledged that "compulsive use also interferes with basic personal responsibilities like adequate sleep, work and school responsibilities, and connecting with loved ones."


Image: Solen Feyissa on Unsplash

Useless tools. TikTok offers some tools that let users limit or adjust the time they spend in the app, but the documents show the platform is aware of their ineffectiveness. In fact, TikTok's own research is cited, which determined that the effect amounts to only about 1.5 minutes: with these tools in place, average usage time dropped from 108.5 to 107 minutes per day.
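To put those internal figures in perspective, the drop amounts to well under two percent of daily usage. A quick calculation using only the numbers cited from the documents:

```python
# Average daily usage before and after the screen-time tools,
# as cited in TikTok's internal research (minutes per day).
before = 108.5
after = 107.0

reduction = before - after
relative = reduction / before

print(f"Reduction: {reduction} min/day ({relative:.1%} of daily usage)")
# Reduction: 1.5 min/day (1.4% of daily usage)
```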

What’s more, the documents state that TikTok measures the success of these tools not by how much they reduce usage time, but by how they increase “public trust in the TikTok platform through media coverage.” The investigation also includes the words of a manager who, in conversations with another employee, said “our goal is not to reduce time spent.”

Beautiful people. The documents also show that the algorithm prioritizes beautiful people. According to an internal report cited in the filings, TikTok noticed "a large number of unattractive subjects" appearing in the app and tweaked the algorithm to boost users the company found attractive.

"By changing TikTok's algorithm to show fewer 'unattractive subjects' in the For You feed, [TikTok] took active steps to promote narrow beauty standards even though it could negatively impact its young users," Kentucky officials wrote in the lawsuit, according to NPR. This promotes beauty standards that can cause personal problems in the long run.

Image: Cottonbro studio


Lack of moderation. The investigation also shows that TikTok was aware of how its app creates filter bubbles: it takes the algorithm only 30 minutes to place a user inside one. This is dangerous, especially when the content involved is harmful, and that kind of content can be found on the platform. Various experiments have been carried out on the subject.

First, TikTok uses a layer of artificial intelligence to analyze videos and detect those that violate its rules, then passes them on to human reviewers. AI is used to filter content that is pornographic, violent or political in nature. Human moderators only come into play once a video reaches a certain number of views, and the problem is that some videos on sensitive topics such as suicide and self-harm slip past the AI filter.

One of the cases mentioned is a self-harm video that was detected and removed only after reaching 75,000 views. In addition, NPR notes that, according to the documents, content banned by the platform can still be found on it: it does not just appear or get recommended in the "For You" feed, it is simply there. In fact, TikTok acknowledges a rather alarming leakage rate in some cases, such as content that normalizes or fetishizes child abuse.

Image: Alexander Shatov

Trouble with minors. Finally, the investigation reveals that TikTok does little to remove the accounts of children under 13, who are not allowed to have a standard account. Although there are accounts aimed at younger users, NPR notes that internal documents instruct moderators to exercise caution before deleting an account suspected of belonging to a child under 13.

Prosecutors also argue that livestreams are under-regulated, since they allow minors to stream and receive money (in the form of virtual gifts) from other users. Kentucky's lawsuit, cited by the local public radio station, states that "the existence of these virtual rewards greatly increases the risk that adult predators will target teenage users for sexual exploitation." This issue has been on the table for some time, and it is not the first time such live shows have been accused of encouraging the sexual exploitation of minors.

Platform's stance. At Xataka we contacted TikTok to learn its position. Company sources told this outlet the following:

"It is deeply irresponsible of NPR to publish sealed information. Unfortunately, this complaint cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety. We have strong safeguards in place, including the proactive removal of user accounts suspected of belonging to minors, and we have voluntarily launched safety features such as screen-time limits, family pairing tools, and privacy settings enabled by default for children under 16."

Another (hard) blow for TikTok. This information is just one more of the obstacles TikTok faces in the US, where it is up against a possible ban: unless TikTok in the United States separates from its China-based parent company ByteDance, the app will be banned starting January 19.

Cover image | Solen Feyissa

In Xataka | We thought OpenAI was crawling the web at an unsustainable rate. Then TikTok decided to get into artificial intelligence

Source: Xataka
