Emma Lembke did not know what an algorithm was when she started using social media.

The then-12-year-old was thrilled when her parents gave her permission to join Instagram. She quickly followed all kinds of accounts — from Kim Kardashian to Olive Garden, she said — and was soon spending five to six hours a day on the app. Then one day, she searched for “ab workouts,” and her feed shifted. She started seeing 200-calorie recipes, pro-anorexia posts and exercise routines that “no 12-year-old should be doing in their bedroom,” she said.

Lembke, now 21, testified before the Senate Judiciary Committee in February 2023 about how social media led her to disordered eating, and what she and other advocates see as a dire need for stronger regulation to protect social media’s youngest users.

Social media platforms have promised to take more action. On Friday, TikTok enacted what some experts called one of the most clearly defined policies yet issued by a social media company on weight and dieting posts. The company’s updated guidelines, which come as TikTok faces a potential ban in the United States, include new guardrails on posts that show “potentially harmful weight management behaviors” and excessive exercise.

TikTok said it will work to ensure the “For You” page, which serves as the main content feed on TikTok and is driven by an algorithm that caters to a user’s interests, no longer shows videos that promote “extended intermittent fasting,” exercises designed for “rapid and significant weight loss” or medications or supplements that promote muscle gain. The new rules also aim to crack down on posts from influencers and other users promoting products used for weight loss or to suppress appetite, such as Ozempic, and to curb content promoting anabolic steroid use.

Under the new policy, machine learning models will attempt to flag and remove content that is considered potentially dangerous; a human moderation team will then review those posts and decide whether they should be kept off the For You feed, removed from age-restricted feeds or removed from the platform altogether, said Tara Wadhwa, TikTok’s director of policy in the United States.
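
TikTok has not published the internals of this system, but the two-stage flow Wadhwa describes (an automated first pass, then human triage into one of three outcomes) can be sketched roughly as follows. Every name, score and threshold below is a hypothetical illustration, not TikTok’s actual implementation:

```python
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    """Hypothetical labels for the three outcomes the policy describes."""
    KEEP_OFF_FOR_YOU = auto()      # ineligible for the For You feed
    AGE_RESTRICT = auto()          # removed from feeds shown to younger users
    REMOVE_FROM_PLATFORM = auto()  # taken down entirely
    NO_ACTION = auto()


@dataclass
class Post:
    post_id: str
    risk_score: float  # assumed output of an ML classifier, 0.0 to 1.0


FLAG_THRESHOLD = 0.7  # hypothetical cutoff for sending a post to human review


def machine_flag(post: Post) -> bool:
    """Stage 1: a model flags potentially dangerous content."""
    return post.risk_score >= FLAG_THRESHOLD


def human_review(post: Post) -> Action:
    """Stage 2: stand-in for the human moderation decision.

    In the described pipeline this is a judgment call by a moderation
    team; the score bands below exist only to make the sketch runnable.
    """
    if post.risk_score >= 0.95:
        return Action.REMOVE_FROM_PLATFORM
    if post.risk_score >= 0.85:
        return Action.AGE_RESTRICT
    return Action.KEEP_OFF_FOR_YOU


def moderate(post: Post) -> Action:
    """Route a post through both stages."""
    if not machine_flag(post):
        return Action.NO_ACTION
    return human_review(post)
```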

The elimination of problematic TikToks from the main feed is meant in part to “interrupt repetitive content patterns,” the new guidelines said. Wadhwa said the company wants to ensure users aren’t exposed to diet and weight loss content “in sequential order, or repeatedly over and over again.”
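
The company has not said how that dispersal works. One simple way a feed ranker could enforce it, sketched here with assumed names and an assumed gap size, is to postpone any post that would appear too soon after another post on the same sensitive topic:

```python
def disperse(feed: list[str], topics: dict[str, str], min_gap: int = 5) -> list[str]:
    """Reorder a ranked feed so posts sharing a sensitive topic sit at
    least `min_gap` positions apart (a hypothetical heuristic).

    feed:    post IDs in ranked order
    topics:  post ID -> sensitive-topic label (e.g. "weight_loss")
    """
    result: list[str] = []
    deferred: list[str] = []          # posts postponed to break up a run
    last_placed: dict[str, int] = {}  # topic -> index of its latest placement

    for post in feed:
        topic = topics.get(post)
        if topic in last_placed and len(result) - last_placed[topic] < min_gap:
            deferred.append(post)     # too close to the last post on this topic
            continue
        result.append(post)
        if topic is not None:
            last_placed[topic] = len(result) - 1

    # Demoted posts fall to the back; a real system would re-space them too.
    return result + deferred


feed = ["a", "b", "c", "d"]
print(disperse(feed, {"a": "weight_loss", "b": "weight_loss"}))
# ['a', 'c', 'd', 'b'] — "b" is pushed behind the unrelated posts
```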

Experts said the new policy offers more specificity about the types of content that will be removed than guidelines set by other social media platforms, such as Facebook and YouTube, which have also said they use human and machine learning-based moderation to keep eating disorder content in check.

But some are also skeptical that TikTok’s new guardrails will be able to reliably identify and reduce potentially harmful posts. S. Bryn Austin, a professor of social and behavioral sciences at the Harvard T.H. Chan School of Public Health and a specialist in eating disorders, said these regulations may be little more than a Band-Aid on an algorithm that is behaving exactly as it was designed.

“The For You feed is still designed to be able to boost their revenue, to increase engagement,” Austin said.

It may also be difficult to examine the impact of the new policy, she added. Researchers have long complained that platforms like TikTok make it difficult to study what users are seeing, how their algorithms work or how policy changes affect content feeds or user behavior.

On algorithm-driven platforms, the path from wellness and health content to posts with the potential to encourage disordered eating can be remarkably short: In a 2022 study conducted by the Center for Countering Digital Hate, researchers set up profiles to pose as 13-year-old users and found that those accounts were served content the researchers considered related to self-harm and eating disorders within minutes of signing up. The researchers also found that TikTok hashtags linked to what they classified as eating disorder content had more than 13.2 billion views.

More than 29 million Americans experience a clinically significant eating disorder in their lifetime, and people of any age, race, gender or body type can develop eating disorders, according to the National Alliance for Eating Disorders. For people who are especially at risk of developing these issues, seeing a feed flooded with body image or diet content can be “a proverbial trigger pull” that could set disordered eating behavior in motion, said Johanna Kandel, the CEO of the National Alliance for Eating Disorders.

One challenge for social media platforms has been how and where to draw the line between posts about health and posts that could be harmful. What can be dangerous for some users might not affect others at all.

“It’s not drawing a line in the sand and saying, ‘This is OK for people to see; this content is not,’” Kandel said, adding, “I don’t think it’s so cut and dry. There will have to be malleability.”