The Washington Post

Is TikTok biased against non-White, curvy bodies? This lingerie brand says so.

Adore Me called out the popular social media app for discriminating against Black, disabled and plus-sized models

February 6, 2021 at 11:02 a.m. EST

This story was updated Feb. 7 at 10:50 a.m. Eastern Time to include comments from Adore Me.

TikTok swore it would do better after facing rampant criticism last year that the app was restricting content that came from certain users. But according to lingerie company Adore Me, the fast-growing app has failed to deliver on its promises.

On Thursday afternoon, the lingerie brand posted several tweets calling out the social media platform for taking down content featuring Black, plus-sized and disabled users.

“As a lingerie company, we understand that our products and marketing can push the boundaries of what’s allowed on social media platforms,” Adore Me wrote. However, with other social media platforms, such as Snapchat or Instagram, the brand said it had a firm understanding of the guidelines for posting.

“TikTok is a different story,” it continued.

“Adore Me has regularly seen the removal of our content on TikTok that features plus-size, Black, and/or differently abled models and women of color. This is unacceptable and discriminatory, and we will not stand for it.”

The thread shared several posts it said were taken down by the platform: videos showing women of color and plus-sized women wearing the brand’s lingerie. One video simply shows a young Black woman, her body out of the frame for most of the video, talking about lingerie she wanted to wear “when the CDC says it’s okay for me to swipe in your house and take your man.”

Adore Me pointed out that a similarly constructed video featuring a White woman was not taken down.

The company also says this isn’t the first time that TikTok — especially popular with young millennial and Gen Z users — has been called out for discrimination. It cited a 2020 report from the Intercept, which contained internal TikTok guidelines on suppressing posts from people who are “obese,” have “ugly facial looks” or “too many wrinkles.” A feature piece in Wired magazine from last year documented how the app relied on the popularity of its Black creators while simultaneously suppressing or removing their content that explicitly talked about racism.

Singer Lizzo also spoke out against TikTok, accusing the app of removing videos of her wearing bathing suits while allowing slimmer users to post their swimsuit content (the app later restored her videos).

“The core algorithmic problem at the very root of the platform remains problematic and opaque,” Adore Me said. “The more these removals occur, the more we wonder if we’ll ever be able to grow on the platform or — if it will even matter, if TikTok continues to drive fat, Black and/or differently abled creators off the platform.”


TikTok did not respond to the allegations directly, but in a statement emailed to The Lily, the company lauded “the incredible diversity of our users.”

“TikTok strives to foster a community where everyone feels welcome and comfortable expressing themselves exactly as they are,” TikTok spokesperson Jamie Favazza said. “Let us be clear: TikTok does not moderate content on the basis of shape, size, or ability, and we continually take steps to strengthen our policies and promote body acceptance.”

The company cited its ad restrictions for fasting apps and weight loss supplements, as well as its partnership with the National Eating Disorders Association.

Cora Harrington, founder and editor in chief of the Lingerie Addict, said Adore Me’s complaints rang true compared to what she has seen and heard from other brands and models. Harrington, who is Black, writes frequently about the history and culture of intimate apparel, including the industry’s long-standing inclusivity issues.

Harrington says that for years she has witnessed and experienced how different social media apps bury or remove her content, as well as posts from lingerie brands featuring “nontraditional” models.

“ ‘Acceptable’ bodies tend to be White. They tend to be thin, they tend to be young. They tend to be able-bodied or to not have any physical scars or marking,” she said.

The further a person’s body strays from that “social ideal,” she says, the more likely it seems that social platforms will bury their posts through their algorithms, making it hard for followers to find that content.


Censorship and suppression function differently, but both are driven by algorithms designed by these apps’ engineers, said Tazin Khan, chief executive and founder of Cyber Collective, a cybersecurity, privacy and data ethics research firm. Those formulas are manually created based on “rule sets” of what the company deems acceptable, offensive or worthy of promotion. The leaked TikTok memos were an example of these kinds of guidelines.
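The distinction Khan draws — removal versus suppression, both flowing from a manually authored rule set — can be sketched in a few lines. This is purely illustrative: the tags, thresholds and three-way outcome below are invented for the example and do not reflect TikTok’s actual rules or code.

```python
# Toy version of a manually authored "rule set" moderation pipeline.
# A post is either removed outright, kept but demoted (suppressed in
# ranking), or allowed. All rule names here are hypothetical.

BANNED_TAGS = {"nudity"}                   # content removed outright
DEMOTED_TAGS = {"lingerie", "swimwear"}    # content kept but downranked

def moderate(post):
    """Return 'remove', 'demote' or 'allow' based on a post's tags."""
    tags = set(post["tags"])
    if tags & BANNED_TAGS:
        return "remove"
    if tags & DEMOTED_TAGS:
        return "demote"
    return "allow"

print(moderate({"tags": ["lingerie", "dance"]}))  # demote
print(moderate({"tags": ["dance"]}))              # allow
```

The point of the sketch is Khan’s: nothing here is an emergent “algorithmic” property — every outcome traces back to a human choice about which tags go in which set.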

“This problem with TikTok, it’s not necessarily an algorithmic bias problem. It’s a TikTok bias problem,” Khan said.

These content filtration systems could ban content deemed pornographic or porn-adjacent, an issue that has disproportionately impacted Black women and Asian women because of how they’re hyper-sexualized on the Internet, Khan explained. They can also suppress content, making it harder to find on your personal timeline or in “discovery” functions, like TikTok’s #ForYou feature.

This suppression has a domino effect, especially for small businesses and brands, Harrington added. Posts that are harder to find receive less engagement and fewer likes. This in turn reinforces bias: The algorithm will be less inclined to promote similar content, while both users and creators may internalize the message that no one wants to see Black, queer, fat or disabled people wearing lingerie.
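The domino effect Harrington and Khan describe — less exposure leads to less engagement, which the ranker then reads as a reason to show the content even less — can be illustrated with a toy simulation. The coefficients below are invented for illustration only; no real platform’s ranking formula is implied.

```python
# Toy feedback loop: a ranker that folds engagement back into a post's
# score. An initial suppression halves exposure, so engagement drops,
# so the score decays round after round. All numbers are hypothetical.

def simulate(initial_score, suppressed, rounds=5):
    score = initial_score
    for _ in range(rounds):
        exposure = score * (0.5 if suppressed else 1.0)  # suppression halves reach
        engagement = exposure * 0.1                      # engagement tracks exposure
        score = 0.9 * score + engagement                 # ranker reinvests engagement
    return score

print(simulate(100, suppressed=False))  # holds steady at 100.0
print(simulate(100, suppressed=True))   # decays toward zero
```

With these made-up coefficients an unsuppressed post holds its score, while a suppressed one loses about 5 percent of its reach every round — a compact picture of how a one-time moderation decision compounds into ongoing invisibility.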


Harrington was initially surprised that the company would call out TikTok.

“The fact that Adore Me is just now speaking on something that’s been going on for years is perhaps indicative of the fact that these things have only just now begun affecting them,” Harrington said.

Her comments highlight evolving expectations of brands on social media, particularly those whose imagery suggests a commitment to diversity and inclusivity. Advertisers on social media are keenly aware that Gen Z and millennials, in particular, want brands to be authentic, transparent and socially conscious in action as well as presentation. Notably, the TikTok users Adore Me highlighted in its Twitter thread look quite different from the models on its site, most of whom are White and slender. Given that history, Harrington questioned how deep the company’s concerns about bias ran.

“This isn’t a thread that makes me go, ‘Oh, this is a company that really understands and is sympathetic to the experience of marginalized people.’ This is a thread that makes me go, ‘Okay, somebody is mad because they’re losing some money.’ ”

Ranjan Roy, a senior executive at Adore Me, said the company didn’t come forward because it was concerned about losing money, noting that the brand could “simply hire models that are ‘TikTok safe’ and just move on.” But Adore Me employees were moved by interactions with their influencers and models on the platform who, when posts were taken down, would apologize to the company because they thought they’d done something wrong. In turn, Adore Me received little clarity from TikTok about why those posts were removed when it inquired.

“That boiled over, especially when we all visually saw the discriminatory patterns,” Roy said in an email.

“We’re aware that pissing off a powerful force like TikTok will not come without some certain cost,” Roy continued, citing the “long memories” of the app’s partnership and communications teams. “We consider this a win if TikTok ends up becoming more transparent about how its algorithm works.”