
Despite laws in place, children are subject to sex, drugs, violence and data collection on the Internet

The borders between sites for adults and those for children have all but disappeared

June 17, 2019 at 1:51 p.m. EDT

Adapted from a story by The Washington Post’s Craig Timberg.

Surveys show that four out of five American preteens use some form of social media, with YouTube the most popular, though Instagram, Facebook and Snapchat are also widely used — even though all four services officially prohibit users younger than 13.

Other popular online offerings — such as the game Fortnite, which has proved to be so engrossing to preteen boys that parents worry about addiction — maintain they are “not directed” at children. But the services also don’t ask users how old they are.

What does that mean? Sex, drugs, violence, hate speech, conspiracy theories and blunt talk about suicide rarely are more than a few clicks away. Even when children are viewing benign content, they face aggressive forms of data collection that allow tech companies to gather the names, locations and interests of young users.

At the heart of the issue is how companies sidestep the Children’s Online Privacy Protection Act, a 1998 law known by the acronym COPPA that restricts the tracking and targeting of those younger than 13 but requires “actual knowledge” of a user’s age as a key trigger for enforcement.

Consumer and privacy advocates have alleged rampant COPPA violations by leading technology companies, including in a highly detailed 59-page complaint against YouTube last year. Even when federal authorities take action, the penalties typically come many years after the violations began and make little dent in corporate profit margins, the advocates say.

“We’ve got a crisis of enforcement,” said Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, an advocacy group based in Boston. “Right now we are incentivizing companies to not know that children are on their sites.”

As researchers and consumer advocates spotlight the weaknesses of federal protections for children, some members of Congress are pushing to toughen the federal privacy law and to impose legal restrictions on what can be shown to children online. But such efforts are struggling to advance in a Congress consumed by partisan battles.

Safeguards by industry, which for years argued that “self-regulation” was an effective alternative to government’s heavy hand, also have proved weak. Even content that nearly everyone agrees should be off-limits to children, such as pornography and sites celebrating drug, tobacco and alcohol consumption, can be seen by underage users who enter fake birth dates or tap online buttons that allow them to claim to be adults. Rarely do sites or apps employ systems that routinely verify ages.

This leaves parents with few choices for helping kids navigate an online world in which the borders between sites for adults and those for children have all but disappeared. Short of round-the-clock vigilance — in playrooms, on school buses, wherever children gather with their ever-present mobile devices — there are few effective ways to shield them from corporate data collection or from encountering content traditionally kept from young eyes.

“There has been a complete and utter failure to protect children in this entire society by the Washington infrastructure,” said James Steyer, chief executive of Common Sense Media, a San Francisco-based advocacy group pushing for several new measures to make the Internet safer for children. “It’s a disgrace. And the losers have been children, parents and their families.”

A case study

One children’s website, roman-numerals.org, featured educational games, cartoon characters in togas and a decidedly adult advertisement along the bottom, a Princeton researcher recently found. In the ad a dark-haired woman in a low-cut dress smiled warmly just above the words “Ashley Madison,” with a link to the online dating service whose slogan is “Life is short. Have an affair.”

Minutes later, on a different children’s math site, another Ashley Madison ad appeared, only this time the woman’s hair was curlier and the dress more revealing.

Delivering the ads on both occasions was Google, the world’s largest digital advertising company, which acknowledged in a statement that the ads violated company policy.

“I was shocked,” said Gunes Acar, the researcher for Princeton’s Center for Information Technology Policy who discovered the ads.

The shocks kept coming as he spent several weeks last winter reviewing the ads on children’s websites.

Acar was studying the effectiveness of COPPA, which was once hailed as a landmark in protecting kids online. But recent research by Acar and others has demonstrated significant limits in the reach and enforcement of COPPA, suggesting that the law has been overrun by the very industry it was supposed to regulate.

In addition to the Ashley Madison ads, Acar’s survey of websites labeled as “child-directed” found a Google ad for a dating service featuring Qatari women and another touting pictures of “Hot Survivor Contestants.” Some ads served by Google offered downloads that included malicious software. Another Google ad caused his computer to emit a high-pitched alarm as a robotic voice announced an “Important security message” and urged him to call a number for tech support — all signs of a likely online scam.

All of these ads complied with COPPA, meaning they didn’t track or target children. But the law also had another apparent effect, one not intended by its creators: By barring personalized advertising, COPPA can prompt advertising companies to deliver a hodgepodge of untargeted ads on children’s sites, resulting in a mix that can be curiously adult in nature.


Acar’s survey involved repeatedly visiting children’s websites while he was not signed into any Google service, so that he could see what advertising appeared. He also collected Google’s explanations of why it displayed certain ads, to make sure that the factors weren’t particular to his browsing history or anything else that might indicate an adult user. Google’s explanations of the Ashley Madison ads, for example, indicated that they were not personalized to Acar but were displayed for general reasons, such as his location, in Princeton, N.J., and the time of day.

Google said that it has policies against ads delivering malicious software and that it does not allow adult advertising on children’s sites.

“Our policies prohibit serving personalized ads to children. We also restrict the kind of ad content that is served on primarily child-directed sites and in this case, our systems didn’t work as intended and we have removed those ads,” Google spokeswoman Shannon Newberry said.

John Barth, director of digital marketing for IXL Learning, which owns the children’s math sites that showed the Ashley Madison ads, wrote in an email that the sites are flagged to Google as “child-directed” and run ads that comply with COPPA. When provided with images of the ads that Acar found, Barth said, “This is concerning. ... I plan to investigate this issue.” The sites are no longer online.

Ashley Madison, meanwhile, expressed frustration that some of its ads had reached audiences unlikely to be in the market for its core service of helping people find sexual relationships outside their marriages.

“Unless there’s been a sudden surge of affairs at recess, this is not in our interest,” said Paul Keable, chief strategy officer for Ashley Madison’s parent company, ruby Inc.

‘The violations are rampant’

When researchers from the University of California at Berkeley tested 5,855 apps marketed to children and their parents on the “Designed for Families” portion of Google’s Play store, they found that 57 percent showed signs that they may be violating COPPA, including its limits on collecting the personal data of young users. The researchers published their findings in a peer-reviewed journal last year and furnished the list of apps to Google, hoping it would address the claims.

A year later, many of the identified apps still are available on the Play store in the “Designed for Families” section, the researchers say. New Mexico Attorney General Hector Balderas (D) has sued Google in federal court for alleged COPPA violations, including for not addressing problems the researchers found.


Google was also among the companies whose online trackers were probably collecting children’s data, the Berkeley researchers found. Independent developers installed trackers from Google and other advertising companies into their apps, allowing them to collect such data as user locations and interests, based on what apps they used or websites they visited.

This helped app makers understand their audiences better while also providing the data necessary to attract more lucrative targeted ads. But on children’s apps, this kind of tracking probably violated COPPA, said Serge Egelman, one of the researchers and director of the Berkeley Laboratory for Usable and Experimental Security.

“The platforms have an incentive to not investigate,” Egelman said. “The violations are rampant.”

Google has argued in its response to the New Mexico attorney general’s lawsuit that “the app developer bears sole responsibility for ensuring COPPA compliance.” But the FTC also has broad authority to investigate and institute enforcement actions when companies engage in “deceptive practices.”

Congress’s push

Sen. Edward J. Markey (D-Mass.), one of the original sponsors of COPPA, has proposed a bill this year that would strengthen it. His COPPA update, co-sponsored by Sen. Josh Hawley (R-Mo.), would set a broader standard requiring compliance if there is substantial evidence that children are using a website or app. The bill also would bring the United States closer to the children’s privacy standards in the European Union by raising the age of those covered by COPPA to include anyone younger than 16. European law prohibits collecting personal data from children younger than 16 in most cases.

“We believe that parents in the United States want the same protection for their children as Europeans want for their children,” Markey said.

He also is writing a bill that would implement new standards for children’s online content, echoing previous generations’ rules for kids’ shows on broadcast television. Markey said the portability of mobile devices makes it harder than ever for parents to monitor what their children are watching — or receiving through advertising.

Although Markey predicted bipartisan support for his bills, they already are generating resistance from some in the technology industry, whose lobbying corps is among the largest and best funded in Washington.

Markey’s legislative push, they warn, will inhibit innovation and push developers toward offering fewer free services online, limiting access to poorer families.

“I think it will push us further in this direction of limiting the availability of services and apps [for] kids,” said Daniel Castro, the vice president at the Information Technology & Innovation Foundation, whose board includes Apple and Microsoft executives. “It’s going to raise the cost of compliance. It’s going to make it so you have more paid apps.”

Tony Romm, Alice Crites, Julie Tate and Emily Guskin contributed to this report.