TikTok is reportedly aware of its bad effects on teen users
TikTok's executives and employees were well aware that the app's features foster compulsive use, and of the negative mental health effects that come with it, according to NPR. The broadcaster reviewed unredacted documents from the lawsuit filed by the Kentucky Attorney General's Office, as published by Kentucky Public Radio. More than a dozen states sued TikTok a few days ago, accusing it of "falsely claiming [that it's] safe for young people." Kentucky Attorney General Russell Coleman said the app was "specifically designed to be an addiction machine, targeting children who are still in the process of developing appropriate self-control."
Most of the documents submitted in these lawsuits were redacted, but Kentucky's filing had faulty redactions. Apparently, TikTok's own research found that "compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety." TikTok's executives also knew that compulsive use can interfere with sleep, work and school responsibilities, and even "connecting with loved ones."
They reportedly knew, as well, that the app's time-management tool does little to keep young users off the app. While the tool sets a default limit of 60 minutes of use a day, teens still spent 107 minutes a day on the app with it switched on, only 1.5 minutes less than the 108.5-minute daily average before the tool launched. According to the internal documents, TikTok measured the tool's success by how it "improv[ed] public trust in the TikTok platform via media coverage." The company knew the tool wasn't going to be effective, with one document saying that "[m]inors do not have executive function to control their screen time, while young adults do." Another document reportedly said that "across most engagement metrics, the younger the user, the better the performance."
In addition, TikTok reportedly knows that "filter bubbles" exist and understands how dangerous they can be. According to the documents, employees conducted internal studies in which they found themselves pulled into negative filter bubbles shortly after following certain accounts, such as those focused on painful ("painhub") and sad ("sadnotes") content. They're also aware of content and accounts promoting "thinspiration," which is associated with disordered eating. Because of the way TikTok's algorithm works, its researchers found that users can be placed into filter bubbles after just 30 minutes of use in one sitting.
TikTok is struggling with moderation as well, according to the documents. An internal investigation found that underage girls on the app were receiving "gifts" and "coins" in exchange for live stripping. And higher-ups at the company reportedly instructed moderators not to remove users reported to be under 13 unless their accounts explicitly stated that they were under 13. NPR says TikTok also acknowledged that a substantial amount of content violating its rules gets through its moderation techniques, including videos that normalize pedophilia and glorify sexual assault of minors and physical abuse.
TikTok spokesman Alex Haurek defended the company and told the organization that the Kentucky AG's complaint "cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety." He also said that TikTok has "robust safeguards, which include proactively removing suspected underage users" and that it has "voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16."

This article originally appeared on Engadget at https://www.engadget.com/apps/tiktok-is-reportedly-aware-of-its-bad-effects-on-teen-users-150030241.html?src=rss