BY BEN BRODY AND MARK BERGEN
WASHINGTON – Google’s YouTube agreed on Wednesday to pay a $170 million fine and limit ads on kids’ videos to settle claims that the company violated children’s privacy laws.
The world’s largest video-sharing site agreed to pay the fine, a record for a children’s privacy case, to the U.S. Federal Trade Commission and New York State for failing to obtain parental consent before collecting data on kids under the age of 13, the FTC said.
Starting in four months, Google also will limit data collection and turn off commenting on videos aimed at kids, YouTube announced at the same time, moves that will hamstring its ability to sell advertising against a massive portion of its media library.
The settlement under the 1998 Children’s Online Privacy Protection Act, or COPPA, represents the most significant U.S. enforcement action against a big technology company in at least five years over its practices involving minors.
Washington is stepping up privacy and antitrust scrutiny of the big internet platforms that have largely operated with few regulatory constraints.
“The $170 million total monetary judgment is almost 30 times higher than the largest civil penalty previously imposed under COPPA,” FTC Chairman Joe Simons said in a joint statement with fellow Republican Commissioner Christine Wilson. “This significant judgment will get the attention of platforms, content providers, and the public.”
The commission’s two Democrats broke from its three Republicans, however, saying the settlement did not go far enough to fix the problems. Some consumer advocates have slammed earlier reports of the fine as an insufficient deterrent, given the size of the company.
YouTube said it will rely on both machine learning and video creators themselves to identify what content is aimed at children. The algorithms will look at cues such as kids’ characters and toys, although the identification of youth content can be tricky.
Content creators are being given four months to adjust before changes take effect, the company said.
STRIPPING ‘TARGETED’ ADS
The company will also spend more to promote its kids app and establish a $100 million fund, disbursed over three years, “dedicated to the creation of thoughtful, original children’s content,” Chief Executive Officer Susan Wojcicki wrote in a blog post.
“Today’s changes will allow us to better protect kids and families on YouTube,” Wojcicki wrote in the post, which acknowledged the growing likelihood that children are watching the site on their own.
“In the coming months, we’ll share details on how we’re rethinking our overall approach to kids and families, including a dedicated kids experience on YouTube,” she said.
YouTube has already begun plans to strip videos aimed at kids of “targeted” ads, which rely on information such as web-browsing cookies, Bloomberg has reported.
The company violated COPPA with data collection to serve these ads, the FTC alleged.
Some consumer advocates say the move away from targeted ads would do little to stop tracking of kids when they watch content aimed at general audiences, and that relying on video creators to make the changes could hurt compliance.
The FTC has been cracking down on firms that violate COPPA. It fined the popular teen app now known as TikTok $5.7 million in February to resolve claims the video service failed to obtain parental consent before collecting names, email addresses and other information from children under 13. The agency is also planning to revamp its rules around children’s online privacy.
Alphabet Inc.’s Google doesn’t break out sales for the video site, but the company has reported that YouTube is its second-largest source of revenue behind search advertising. Research firm Loup Ventures estimates that 5% of YouTube’s annual revenue, or roughly $750 million a year, comes from content aimed at children.
YouTube had long maintained that children under 13 don’t use its site without parental supervision, as its terms of service stipulate, but according to the FTC, it touted young users in advertising materials. There’s ample evidence that these young viewers flock to the site, and consumer groups complained about the practice last year.
The site has already made tweaks as it tries to create a safer destination for children.
In recent months, it changed its algorithm to promote what it called “quality” kids’ videos, a shift that alarmed many of its video creators. Wojcicki said the newest transitions “won’t be easy for some creators” and the company would work with them and provide resources to navigate the changes.