TikTok Boosts Posts About Eating Disorders, Suicide, Report Says
What to Know
- Researchers at the nonprofit Center for Countering Digital Hate created TikTok accounts for fictional teens in the US, UK, Canada and Australia.
- Within minutes of the accounts “liking” self-harm and eating disorder videos, the algorithm began recommending self-harm and weight-loss videos.
- Critics of social media say the same algorithms that promote content about a particular sports team, hobby or dance fad can send users down a rabbit hole of harmful content.
TikTok’s algorithms are promoting self-harm and eating disorder videos to vulnerable teens, according to a report released Wednesday highlighting concerns about social media and its impact on young people’s mental health.
Researchers at the nonprofit Center for Countering Digital Hate created TikTok accounts for fictional teens in the US, UK, Canada and Australia. The researchers running the accounts “liked” videos about self-harm and eating disorders to see how TikTok’s algorithm would respond.
Within minutes, the popular platform was recommending videos about weight loss and self-harm, including some featuring images of idealized models and body types, images of razor blades, and discussions of suicide.
When researchers created accounts with usernames that suggested a particular vulnerability to eating disorders (names that included the words “lose weight,” for example), the accounts received even more harmful content.
“It’s like being trapped in a room of distorted mirrors where you’re constantly being told that you’re ugly, that you’re not good enough, that maybe you should kill yourself,” said the center’s executive director, Imran Ahmed, whose organization has offices in the US and UK. “You are literally sending the most dangerous messages possible to young people.”
Social media algorithms work by identifying topics and content of interest to a user, who is then sent more of the same as a way to maximize their time on the site. But critics of social media say the same algorithms that promote content about a particular sports team, hobby or dance fad can send users down a rabbit hole of harmful content.
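That feedback loop is easy to illustrate with a toy model. The Python sketch below is purely illustrative: the ToyFeed class, the topic names and the 3x boost per “like” are all invented for this example and do not reflect TikTok’s actual system. It simply shows how, in a weighted recommender, a handful of likes on one topic can quickly come to dominate the feed:

```python
import random
from collections import Counter

TOPICS = ["sports", "dance", "cooking", "weight_loss"]

class ToyFeed:
    """Toy engagement-driven recommender (illustrative only)."""

    def __init__(self):
        # Start with equal interest in every topic.
        self.weights = {topic: 1.0 for topic in TOPICS}

    def recommend(self):
        # Pick a topic with probability proportional to its current weight.
        topics = list(self.weights)
        return random.choices(topics, weights=[self.weights[t] for t in topics])[0]

    def like(self, topic):
        # Each "like" sharply boosts that topic's future share of the feed.
        self.weights[topic] *= 3.0

feed = ToyFeed()
for _ in range(3):
    feed.like("weight_loss")  # three likes on one topic...

print(Counter(feed.recommend() for _ in range(1000)).most_common())
# ...and that topic now gets weight 27 vs. 1 for each of the other three,
# i.e. roughly 90% of all recommendations
```

The same mechanism that surfaces more dance videos after a user likes one is what, in the researchers’ account, surfaces more self-harm content after a vulnerable user engages with it.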
It’s a particular problem for teens and kids, who tend to spend more time online and are more vulnerable to bullying, peer pressure or negative content about eating disorders or suicide, according to Josh Golin, CEO of Fairplay, a nonprofit that advocates for stronger online protections for children.
He added that TikTok isn’t the only platform that doesn’t protect young users from harmful content and aggressive data collection.
“All of these harms are linked to the business model,” Golin said. “It doesn’t matter what the social media platform is.”
In a statement from a company spokesperson, TikTok disputed the findings, arguing that because the researchers did not use the platform like typical users, the results were skewed. The company also said that a user’s account name should not affect the type of content the user receives.
TikTok prohibits users under the age of 13, and its official rules prohibit videos that encourage eating disorders or suicide. US users searching for eating disorder content on TikTok receive a notice offering mental health resources and contact information for the National Eating Disorders Association.
“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need,” said the statement from TikTok, which is owned by ByteDance Ltd., a Chinese company now headquartered in Singapore.
Despite the platform’s efforts, researchers at the Center for Countering Digital Hate found that content about eating disorders had been viewed on TikTok billions of times. In some cases, young TikTok users were using coded language about eating disorders in an effort to evade TikTok’s content moderation.
The large amount of harmful content being sent to teens on TikTok shows that self-regulation has failed, Ahmed said, adding that federal rules are needed to force platforms to do more to protect children.
Ahmed noted that the version of TikTok offered to domestic Chinese audiences is designed to promote math and science content to young users, and limits the time 13- and 14-year-olds can spend on the site each day.
A proposal before Congress would impose new rules limiting the data social media platforms can collect about young users and would create a new office within the Federal Trade Commission focused on protecting the privacy of young social media users.
One of the bill’s sponsors, Sen. Edward Markey, D-Mass., said Wednesday that he is optimistic that lawmakers from both parties can agree on the need for stricter regulations on how platforms access and use information from young users.
“Data is the raw material that Big Tech uses to track, manipulate and traumatize the youth of our country every day,” Markey said.