CDD

Digital Youth

  • Groups urge Congress to stop Big Tech’s manipulation of young people

BOSTON – Thursday, December 2, 2021 – Today a coalition of leading advocacy groups launched Designed With Kids in Mind, a campaign demanding a design code in the US to protect young people from online manipulation and harm. The campaign seeks to secure protections for US children and teens similar to the UK’s groundbreaking Age-Appropriate Design Code (AADC), which went into effect earlier this year. The campaign brings together leading advocates for child development, privacy, and a healthier digital media environment, including Fairplay, Accountable Tech, American Academy of Pediatrics, Center for Digital Democracy, Center for Humane Technology, Common Sense, ParentsTogether, RAINN, and Exposure Labs, creators of The Social Dilemma. The coalition will advocate for legislation and new Federal Trade Commission rules that protect children and teens from a business model that puts young people at risk by prioritizing data collection and engagement.

The coalition has launched a website that explains how many of the most pressing problems faced by young people online are directly linked to platforms’ design choices. They cite features that benefit platforms at the expense of young people’s wellbeing, such as:

- Autoplay: increases time on platforms; excessive time on screens is linked to mental health challenges and physical risks like less sleep, and promotes family conflict.
- Algorithmic recommendations: risk exposure to self-harm, racist content, pornography, and mis/disinformation.
- Location tracking: makes it easier for strangers to track and contact children.
- Nudges to share: lead to loss of privacy and risks of sexual predation and identity theft.

The coalition is promoting three bills which would represent a big step forward in protecting US children and teens online: the Children and Teens’ Online Privacy Protection Act (S. 1628); the Kids Internet Design and Safety (KIDS) Act (S. 2918); and the Protecting the Information of our Vulnerable Children and Youth (PRIVCY) Act (H.R. 4801). Taken together, these bills would expand privacy protections to teens for the first time and incorporate key elements of the UK’s AADC, such as requiring the best interests of children to be a primary design consideration for services likely to be accessed by young people. The legislation backed by the coalition would also protect children and teens from manipulative design features and harmful data processing.

Members of the coalition on the urgent need for a US design code to protect children and teens:

Josh Golin, Executive Director, Fairplay:
We need an internet that helps children learn, connect, and play without exploiting their developmental vulnerabilities; respects their need for privacy and safety; helps young children disconnect at the appropriate time rather than manipulating them into spending even more time online; and prioritizes surfacing high-quality content instead of maximizing engagement. The UK’s Age-Appropriate Design Code took an important step towards creating that internet, and children and teens in the US deserve the same protections and opportunities. It’s time for Congress and regulators to insist that children come before Big Tech’s profits.

Nicole Gill, Co-Founder and Executive Director of Accountable Tech:
You would never put your child in a car seat that wasn't designed for them and met all safety standards, but that's what we do every day when our children go online using a network of apps and websites that were never designed with them in mind. Our children should be free to learn, play, and connect online without manipulative platforms like Facebook and Google's YouTube influencing their every choice.
We need an age-appropriate design code that puts kids and families first and protects young people from the exploitative practices and perverse incentives of social media.

Lee Savio Beers, MD, FAAP, President of the American Academy of Pediatrics:
The American Academy of Pediatrics is proud to join this effort to ensure digital spaces are safe for children and supportive of their healthy development. It is in our power to create a digital ecosystem that works better for children and families; legislative change to protect children is long overdue. We must be bold in our thinking and ensure that government action on technology addresses the most concerning industry practices while preserving the positive aspects of technology for young people.

Jeff Chester, Executive Director, Center for Digital Democracy:
The “Big Tech” companies have long treated young people as just a means to generate vast profits – creating apps, videos and games designed to hook them to an online world designed to surveil and manipulate them. It’s time to stop children and teens from being victimized by the digital media industry. Congress and the Federal Trade Commission should adopt commonsense safeguards that ensure America’s youth reap all the benefits of the online world without having to constantly expose themselves to the risks.

Randima Fernando, Executive Director, Center for Humane Technology:
We need technology that respects the incredible potential – and the incredible vulnerability – of our kids' minds. And that should guide technology for adults, who can benefit from those same improvements.

Irene Ly, Policy Counsel, Common Sense:
This campaign acknowledges harmful features of online platforms and apps like autoplay, algorithms amplifying harmful content, and location tracking for what they are: intentional design choices. For too long, online platforms and apps have chosen to exploit children’s vulnerabilities through these manipulative design features. Common Sense has long supported designing online spaces with kids in mind, and strongly supports US rules that would finally require companies to put kids’ well-being first.

Julia Hoppock, The Social Dilemma Partnerships Director, Exposure Labs:
For too long, Big Social has put profits over people. It's time to put our kids first and build an online world that works for them.

Dalia Hashad, Online Safety Director, ParentsTogether:
From depression to bullying to sexual exploitation, tech companies knowingly expose children to unacceptable harms because it makes the platforms billions in profit. It's time to put kids first.

Scott Berkowitz, President of RAINN (Rape, Abuse & Incest National Network):
Child exploitation has reached crisis levels, and our reliance on technology has left children increasingly vulnerable. On our hotline, we hear from children every day who have been victimized through technology. An age-appropriate design code will provide overdue safeguards for children across the U.S.
  • Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100); David Monahan, CCFC (david@commercialfreechildhood.org)

Advocates Ask FTC to Protect Youth From Manipulative “Dark Patterns” Online

BOSTON, MA and WASHINGTON, DC — May 28, 2021 — Two leading advocacy groups protecting children from predatory practices online filed comments today asking the FTC to create strong safeguards to ensure that internet “dark patterns” don’t undermine children’s well-being and privacy. Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) cited leading authorities on the impacts of internet use on child development in their comments, prepared by the Communications & Technology Law Clinic at Georgetown University Law Center. These comments follow testimony given by representatives of both groups last month at an FTC workshop spearheaded by FTC Acting Chair Rebecca Slaughter.

CCFC and CDD say tech companies are preying upon vulnerable kids, capitalizing on their fear of missing out, desire to be popular, and inability to understand the value of misleading e-currencies, as well as putting them on an endless treadmill on their digital devices. They urged the FTC to take swift and strong action to protect children from the harms of dark patterns. Key takeaways include:

- A range of practices, often called “dark patterns,” are pervasive in the digital marketplace, manipulate children, are deceptive and unfair, and violate Section 5 of the FTC Act. They take advantage of a young person’s psycho-social development, such as the need to engage with peers.
- The groups explained the ways children are vulnerable to manipulation and other harms from “dark patterns,” including that they have “immature and developing executive functioning,” which leads to impulsive behaviors.
- The FTC should prohibit the use of dark pattern practices in the children’s marketplace; issue guidance to companies to ensure they do not develop or deploy such applications; and include new protections under its Children’s Online Privacy Protection Act (COPPA) rulemaking authority to better regulate them. The commission must bring enforcement actions against developers using child-directed dark patterns.
- The FTC should prohibit the use of micro-transactions in apps serving children, including the buying of virtual currency to participate in game playing.
- The FTC should adopt a definition of dark patterns that includes all “nudges” designed to use a range of behavioral techniques to foster desired responses from users.

The groups’ filing was in response to the FTC’s call for comments on the use of digital “dark patterns” — deceptive and unfair user interface designs — on websites and mobile apps.

Comment of Jeff Chester, Executive Director of the Center for Digital Democracy:
“Dark patterns” are being used in the design of child-directed services to manipulate children to spend more time and money on games and other applications, as well as give up more of their data. It’s time the FTC acted to protect young people from being unfairly treated by online companies. The commission should issue rules that prohibit the use of these stealth tactics that target kids and bring legal action against the companies promoting their use.

Comment of Josh Golin, Executive Director of the Campaign for a Commercial-Free Childhood:
In their rush to monetize children, app and game developers are using dark patterns that take advantage of children’s developmental vulnerabilities. The FTC has all the tools it needs to stop unethical, harmful, and illegal conduct. Doing so would be a huge step forward towards creating a healthy media environment for children.
Comment of Michael Rosenbloom, Staff Attorney & Clinical Teaching Fellow, Communications and Technology Law Clinic, Georgetown University Law Center:
Software and game companies are using dark patterns to pressure children into playing more and paying more. Today, many apps and games that children play use dark patterns like arbitrary virtual currencies, encouragement from in-game characters, and ticking countdown timers to get children to spend more time and money on microtransactions. These dark patterns harm children and violate Section 5 of the FTC Act, and we urge the FTC to act to stop these practices.

###
  • Press Statement, Center for Digital Democracy (CDD) and Campaign for a Commercial-Free Childhood (CCFC), 12-14-20

Today, the Federal Trade Commission announced it will use its 6(b) authority to launch a major new study into the data collection practices of nine major tech platforms and companies: ByteDance (TikTok), Amazon, Discord, Facebook, Reddit, Snap, Twitter, WhatsApp and YouTube. The Commission’s study includes a section on children and teens. In December 2019, the Campaign for a Commercial-Free Childhood (CCFC), the Center for Digital Democracy (CDD), and their attorneys at Georgetown Law’s Institute for Public Representation urged the Commission to use its 6(b) authority to better understand how tech companies collect and use data from children. Twenty-seven consumer and child advocacy organizations joined that request. Below are statements from CDD and CCFC on today’s announcement.

Josh Golin, Executive Director, CCFC: “We are extremely pleased that the FTC will be taking a hard look at how platforms like TikTok, Snap, and YouTube collect and use young people’s data. These 6(b) studies will provide a much-needed window into the opaque data practices that have a profound impact on young people’s wellbeing. This much-needed study will not only provide critical public education, but lay the groundwork for evidence-based policies that protect young people’s privacy and vulnerabilities when they use online services to connect, learn, and play.”

Jeff Chester, Executive Director, CDD: "The FTC is finally holding the social media and online video giants accountable, by requiring leading companies to reveal how they stealthily gather and use information that impacts our privacy and autonomy. It is especially important that the commission is also concerned about protecting teens—who are the targets of a sophisticated and pervasive marketing system designed to influence their behaviors for monetization purposes."
For questions, please contact: jeff@democraticmedia.org

See also: https://www.markey.senate.gov/news/press-releases/senator-markey-stateme...
  • General Comment submission: Children’s rights in relation to the digital environment

- Professor Amandine Garde, Law & Non-Communicable Diseases Research Unit, School of Law and Social Justice, University of Liverpool
- Dr Mimi Tatlow-Golden, Senior Lecturer, Developmental Psychology and Childhood, The Open University
- Dr Emma Boyland, Senior Lecturer, Psychology, University of Liverpool
- Professor Emerita Kathryn C. Montgomery, School of Communication, American University; Senior Strategist, Center for Digital Democracy
- Jeff Chester, Center for Digital Democracy
- Josh Golin, Campaign for a Commercial Free Childhood
- Kaja Lund-Iversen and Ailo Krogh Ravna, Norwegian Consumer Council
- Pedro Hartung and Marina Reina, Alana Institute
- Dr Marine Friant-Perrot, University of Nantes
- Professor Emerita Wenche Barth Eide, University of Oslo; Coordinator, FoHRC
- Professor Liv Elin Torheim, Oslo Metropolitan University
- Professor Alberto Alemanno, HEC Paris Business School and The Good Lobby
- Marianne Hammer, Norwegian Cancer Society
- Nikolai Pushkarev, European Public Health Alliance

13 November 2020

Dear Members of the Committee on the Rights of the Child,

We very much welcome the Committee’s Draft General Comment No. 25 on children’s rights in relation to the digital environment (the Draft) and are grateful for the opportunity to comment. We are a group of leading scholars and NGO experts on youth, digital media, child rights and public health who work to raise awareness and promote regulation of marketing (particularly of harmful goods, services and brands) to which children are exposed. We argue this marketing infringes many of the rights enshrined in the UN Convention on the Rights of the Child (CRC) and other international instruments and should be strictly regulated.
Based on our collective expertise, we call on the Committee to recognise more explicitly the fundamentally transformed nature of marketing in new digital environments, the harms stemming therefrom, and the corresponding need to protect children from targeting and exposure. Without such recognition, children will not be able to fully enjoy the many opportunities for learning, civic participation, creativity and communication that the digital environment offers for their development and the fulfilment of their rights. Facilitating children’s participation in this environment should not come at the price of violations of any children's rights.

Before making specific comments, we wish to highlight our support for much of this Draft. In particular, we strongly support the provisions in the following paragraphs of the General Comment: 11, 13, 14, 52, 54, 62, 63, 64, 67, 72, 74, 75, 88, 112, and 119. We also note concerns regarding provisions that would require mandatory age verification (e.g., paragraphs 56, 70, 120, 122). We call on the Committee to include provisions ensuring that age verification is applied proportionately, as it will certainly increase the processing of children’s personal data, which should not happen to the detriment of the best interests of the child.

The rest of this contribution, following the structure of the Draft, proposes specific additions/modifications (underlined, in italics), with brief explanations (in boxes). Numbers refer to original paragraphs in the Draft; XX indicates a new proposed paragraph. Hoping these comments are useful to finalise the General Comment, we remain at your disposal for further information.

Yours faithfully,
Amandine Garde and Mimi Tatlow-Golden
On behalf of those listed above

[See full comments in attached document]
  • October 9, 2020

Susan Wojcicki
CEO, YouTube
901 Cherry Avenue
San Bruno, CA 94066

Dear Ms. Wojcicki:

We commend Google/YouTube’s plan to create a $100 million investment fund for children’s content, announced in 2019 following the FTC settlement to address YouTube’s violations of COPPA. This fund has the potential to leave an imprint on children’s online content that will have influence for years to come. We ask that YouTube adopt policies to ensure this fund will operate in the best interests of children worldwide. The programming supported by the fund should:

- Reflect the perspectives and interests of children from different countries and cultures
- Underwrite content makers who are diverse and independent, with at least 50% of funding dedicated to historically underrepresented communities
- Promote educational content and content which reflects the highest values of civil society, including diversity
- Not support content which promotes commercialism
- Facilitate union representation of creators of scripted and nonfiction content for YouTube
- Be advised by a team of leading independent experts who can ensure programming is commissioned that truly serves the educational, civic, and developmental needs of young people

As the leading global online destination for many millions of children, as well as the most powerful digital marketing entity, Google should be at the forefront of providing financial resources for quality content that is innovative, takes creative risks, and supports emerging program makers from many different backgrounds. For example, programming supported by the fund should reflect a major commitment to diversity by commissioning producers from around the world who represent diverse cultures and perspectives. The fund is also an opportunity for Google to make a significant contribution to the development of a distinct programming vision for young people that is primarily driven to foster their wellbeing.
We urge Google to fund only programming free of commercial content, including influencer marketing, product and brand integration, and licensed characters or products. In addition, each program or series should have a robust release window that provides access to all children without requiring them to view digital advertising and other forms of commercial marketing. The expert commissioning board we advise you to adopt will help ensure that the fund operates fairly and will help eliminate potential conflicts of interest.

Operating the fund using these principles will allow YouTube to cement its place as a leader in children’s programming and, more importantly, make a world of difference—ensuring that time spent watching YouTube will enrich children. We stand ready to confer with you on these suggestions and your development of the fund, and would welcome the opportunity to meet with you in the near future to discuss these items.

Sincerely,

Jeffrey Chester, Executive Director, Center for Digital Democracy
Jessica J. González, Co-CEO, Free Press
Josh Golin, Executive Director, Campaign for a Commercial-Free Childhood
Justin Ruben, Co-Director, ParentsTogether
Lowell Peterson, Executive Director, Writers Guild of America, East, AFL-CIO
  • The Campaign for a Commercial-Free Childhood (CCFC) and CDD filed comments with the UN’s Special Rapporteur on privacy, as part of a consultation designed to propose global safeguards for young people online. Both CCFC and CDD, along with allies in the U.S. and throughout the world, are working to advance stronger international protections for young people, especially related to their privacy and the impacts that digital marketing has on their development.
    Jeff Chester
  • Press Release

    Advocates Call on TikTok Suitors to Clean Up Kids’ Privacy Practices

    Groups had filed complaint at FTC documenting how TikTok flouts children’s privacy law, tracks millions of kids without parental consent.

    Contact: Katharina Kopp, CDD (kkopp@democraticmedia.org; 202-836-4621); David Monahan, CCFC (david@commercialfreechildhood.org)

    WASHINGTON, DC and BOSTON, MA—September 3, 2020—The nation’s leading children’s privacy advocates are calling on potential buyers of TikTok “to take immediate steps to comprehensively improve its privacy and data marketing practices for young people” should they purchase the platform. In separate letters to Microsoft, Walmart, and Oracle, Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) detail TikTok’s extensive history of violating the Children’s Online Privacy Protection Act (COPPA), including a recent news report that TikTok internally classified more than one-third of its 49 million US users as fourteen or under. Given the likelihood that millions of these users are also under thirteen, the advocates urged Microsoft, Walmart, and Oracle to pledge to immediately stop collecting and processing data from any account flagged as or believed to be under thirteen if they acquire TikTok’s US operations, and only restore accounts that can be affirmatively verified as belonging to users who are thirteen or older. COPPA requires apps and websites to obtain verifiable parental consent before collecting the personal information of anyone under 13, but TikTok has not done so for its millions of accounts held by children.

    “Whoever purchases TikTok will have access to a treasure trove of ill-gotten, sensitive children’s data,” said Josh Golin, Executive Director of CCFC. “Any new owner must demonstrate their commitment to protecting young people’s privacy by immediately deleting any data that was illegally obtained from children under thirteen.
With the keys to one of the most popular platforms for young people on the planet must come a commitment to protect children’s privacy and wellbeing.”

    In February 2019, TikTok was fined $5.7 million by the Federal Trade Commission (FTC) for COPPA violations and agreed to delete children’s data and properly request parental consent before allowing children under 13 on the site and collecting more data from them. This May, CCFC, CDD, and a coalition of 20 advocacy groups filed an FTC complaint against TikTok for ignoring its promises to delete kids’ data and comply with the law. To this day, the groups say, TikTok plays by its own rules, luring millions of kids under the age of 13, illegally collecting their data, and using it to manipulatively target them with marketing. In addition, they wrote to the companies today that, “By ignoring the presence of millions of younger children on its app, TikTok is putting them at risk for sexual predation; news reports and law enforcement agencies have documented many cases of inappropriate adult-to-child contact on the app.”

    In August, the groups’ allegations that TikTok had actual knowledge that millions of its users were under thirteen were confirmed by the New York Times. According to internal documents obtained by the Times, TikTok assigns an age range to each user utilizing a variety of methods, including “facial recognition algorithms that scrutinize profile pictures and videos,” “comparing their activity and social connections in the app against those of users whose ages have already been estimated,” and drawing “upon information about users that is bought from other sources.” Using these methods, more than one-third of TikTok’s 49 million users in the US were estimated to be under fourteen. Among daily users, the proportion that TikTok has designated as under fourteen rises to 47%.

    “The new owners of TikTok in the U.S. must demonstrate they take protecting the privacy and well-being of young people seriously,” said Katharina Kopp, policy director of the Center for Digital Democracy. “The federal law protecting kids’ privacy must be complied with and fully enforced. In addition, the company should implement a series of safeguards that prohibit manipulative, discriminatory, and harmful data and marketing practices that target children and teens. Regulators should reject any proposed sale without ensuring a robust set of safeguards for youth is in place,” she noted.

    ###
  • Press Release

    Children’s privacy advocates call on FTC to require Google, Disney, AT&T and other leading companies to disclose how they gather and use data to target kids and families

    Threats to young people from digital marketing and data collection are heightened by home schooling and increased video and mobile streaming in response to COVID-19

    Contact: Jeffrey Chester, CDD (jeff@democraticmedia.org; 202-494-7100); Josh Golin, CCFC (josh@commercialfreechildhood.org; 339-970-4240)

    WASHINGTON, DC and BOSTON, MA – March 26, 2020 – With children and families even more dependent on digital media during the COVID-19 crisis, the Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) called on the Federal Trade Commission (FTC) to require leading digital media companies to turn over information on how they target kids, including the data they collect. In a letter to the FTC, the advocates proposed a series of questions to shed light on the array of opaque data collection and digital marketing practices which the tech companies employ to target kids. The letter includes a proposed list of numerous digital media and marketing companies and edtech companies that should be the targets of the FTC’s investigation—among them Google, Zoom, Disney, Comcast, AT&T, Viacom, and the edtech companies Edmodo and Prodigy. The letter—sent by the Institute for Public Representation at Georgetown Law, attorneys for the advocates—is in response to the FTC’s early review of the rules protecting children under the Children’s Online Privacy Protection Act (COPPA).
    The groups said “children’s privacy is under siege more than ever,” and urged the FTC “not to take steps that could undermine strong protections for children’s privacy without full information about a complex data collection ecosystem.” The groups ask the Commission to request vital information from two key sectors that greatly impact the privacy of children: the edtech industry, which provides information and technology applications in the K-12 school setting, and the commercial digital data and marketing industry that provides the majority of online content and communications for children, including apps, video streaming, and gaming. The letter suggests numerous questions for the FTC to get to the core of how digital companies conduct business today, including contemporary Big Data practices that capture, analyze, track, and target children across platforms.

    “With schools closed across the country, American families are more dependent than ever on digital media to educate and occupy their children,” said CCFC’s Executive Director, Josh Golin. “It’s now urgent that the FTC use its full authority to shed light on the business models of the edtech and children’s digital media industries so we can understand what Big Tech knows about our children and what they are doing with that information. The stakes have never been higher.”

    “Although children’s privacy is supposed to be protected by federal law and the FTC, young people remain at the epicenter of a powerful data-gathering and commercial online advertising system," said Dr. Katharina Kopp, Deputy Director of the Center for Digital Democracy. “We call on the FTC to investigate how companies use data about children, how these data practices work against children’s interests, and also how they impact low-income families and families of color. Before it proposes any changes to the COPPA rules, the FTC needs to obtain detailed insights into how contemporary digital data practices pose challenges to protecting children.
Given the outsize intrusion of commercial surveillance into children’s and families’ lives via digital services for education, entertainment, and communication, the FTC must demonstrate it is placing the welfare of kids as its highest priority.”

    In December, CCFC and CDD led a coalition of 31 groups—including the American Academy of Pediatrics, Center for Science in the Public Interest, Common Sense Media, Consumer Reports, Electronic Privacy Information Center, and Public Citizen—in calling on the FTC to use its subpoena authority. The groups said the Commission must better assess the impacts on children from today’s digital data-driven advertising system, and features such as cross-device tracking, artificial intelligence, machine learning, virtual reality, and real-time measurement.

    “Childhood is more digital than ever before, and the various ways that children's data is collected, analyzed, and used have never been more complex or opaque,” said Lindsey Barrett, Staff Attorney and Teaching Fellow at IPR’s Communications and Technology Law Clinic at Georgetown Law. “The Federal Trade Commission should shed light on how children's privacy is being invaded at home, at school, and throughout their lives by investigating the companies that profit from collecting their data, and cannot undertake an informed and fact-based revision of the COPPA rules without doing so.”

    "Children today, more than ever, have an incredible opportunity to learn, play, and socialize online,” said Celia Calano, student attorney at the Institute for Public Representation. “But these modern playgrounds and classrooms come with new safety concerns, including highly technical and obscure industry practices. The first step to improving the COPPA Rule and protecting children online is understanding the current landscape—something the FTC can achieve with a 6(b) investigation."

    ###
  • Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100); David Monahan, CCFC (david@commercialfreechildhood.org; 617-896-9397)

Groups Praise Sen. Markey and Google for Ensuring Children on YouTube Receive Key Safeguards

BOSTON, MA & WASHINGTON, DC—December 18, 2019—The organizations that spurred the landmark FTC settlement with Google over COPPA violations applauded the announcement of additional advertising safeguards for children on YouTube today. The Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) commended Google for announcing it would apply most of the robust marketing protections it uses on YouTube Kids, including no advertising of food or beverages or harmful products, to all child-directed content on its main YouTube platform. The groups also lauded Senator Markey for securing a public commitment from Google to implement these long-overdue safeguards. The advocates expressed disappointment, however, that Google did not agree to prohibit paid influencer marketing and product placement to children on YouTube as it does on YouTube Kids.

“Sen. Ed Markey has long been and remains the champion for kids,” said Jeff Chester, CDD’s executive director. “Through the intervention of Sen. Markey, Google has finally committed to protecting children whether they are on the main YouTube platform or using the YouTube Kids app. Google has acted responsibly in announcing that its advertising policies now prohibit any food and beverage marketing on YouTube Kids, as well as ads involving ‘sexually suggestive, violent or dangerous content.’ However, we remain concerned that Google may try to weaken these important child- and family-friendly policies in the near future. Thus we call on Google to commit to keeping these rules in place, and to implement other needed safeguards that children deserve,” added Chester.
Josh Golin, Executive Director of CCFC, said, “We are so grateful to Senator Markey for his leadership on one of the most crucial issues faced by children and families today. And we commend Google for implementing a robust set of advertising safeguards on the most popular online destination for children. We urge Google to take another critical step and prohibit child-directed influencer content on YouTube; if this manipulative marketing isn’t allowed on children’s TV or YouTube Kids, it shouldn’t be targeted to children on the main YouTube platform either.” ###
  • In the aftermath of Google’s settlement with the FTC over its COPPA violations, some independent content producers on YouTube have expressed unhappiness with the decision. They are unclear how to comply with COPPA and believe their revenue will diminish considerably. Some also worry that Google’s recently announced system to meet the FTC settlement—under which producers must identify whether their content is child-directed—will affect their overall ability to “monetize” their productions even if they aren’t aiming primarily to serve a child audience. These YouTubers have directed their frustration at the FTC and have mobilized to file comments in the current COPPA proceedings. As Google has rolled out its new requirements, it has abetted this misdirected focus on the FTC and created much confusion and panic among YouTube content producers. Ultimately, their campaign, designed to weaken the lone federal law protecting children’s privacy online, could lead to even more violations of children’s privacy.

While we sympathize with many of the YouTubers’ concerns, we believe their anger and sole focus on the FTC is misplaced. It is Google that is at fault here, and it needs finally to own up and step up. The truth is that Google’s YouTube has violated the 2013 COPPA rule practically since that rule took effect. The updated rule made it illegal to collect persistent identifiers from children under 13 without parental consent. Google did so while purposefully developing YouTube into the leading site for children. It encouraged content creators to go all in and to be complicit in the fiction that YouTube is only for those aged 13 and above. Even though Google knew that this new business model was a violation of the law, it benefited financially by serving personalized ads to children (and especially by creating the leading online destination for children in the U.S. and worldwide).
All the while, small independent YouTube content creators built their livelihoods on this illegitimate revenue stream. The corporate content brand channels of Hasbro, Mattel, and the like, which do not rely on YouTube revenue, as well as corporate advertisers, also benefited handsomely from this arrangement, which allowed them to market to children unencumbered by COPPA regulations.

But let’s review further how Google is handling the post-settlement world. Google chose to structure the solution to its own COPPA violation in a way that continues to place the burden and consequences of COPPA compliance on independent content creators. Rather than acknowledging wrongdoing and culpability in the plight of content creators who built their livelihoods on the sham that Google had created, Google produced an instructional video for content creators that emphasizes the consequences of non-compliance and the potential negative impact on creators’ ability to monetize. The video also appears to have scared those who do not create “for kids” content. Google requires content creators to self-identify their content as “for kids,” and it will use automated algorithms to detect and flag “for kids” content. Google appears to have provided little useful information to content providers on how to comply, and confusion now seems rampant. Some YouTubers also fear that the automated flagging of content is a blunt instrument “based on oblique overly broad criteria.” Google also declared that content designated as “for kids” will no longer serve personalized ads.

The settlement and Google’s implementation are designed to assume the least risk for Google while maximizing its monetary benefits. Google will start limiting the data it collects on “for kids” content (something it should have done long ago) and, as a result, will no longer show personalized ads on that content.
However, the incentives for content creators to self-identify as “for kids” are weak, given that disabling behavioral ads “may significantly reduce your channel’s revenue.” Although Google declares that it is “committed to help you with this transition,” it has shown no willingness to reduce its own significant cut of the ad revenue when it comes to children’s content. While the incentives for child-directed content creators to mislabel their content are high, and equally high for Google to encourage them in this subterfuge, the consequences of non-compliance now rest squarely with content creators alone.

Let’s be clear here. Google should comply with COPPA as soon as possible wherever content is clearly child-directed. Google has already developed a robust set of safeguards and policies on YouTube Kids to protect children from advertising for harmful products and from exploitative influencer marketing. It should apply the same protections to all child-directed content, regardless of which YouTube platform kids are using.

When CCFC and CDD filed our COPPA complaint in 2018, we focused on how Google was shirking its responsibilities under the law by denying that portions of YouTube were child-directed (and thus governed by COPPA). The channels we cited in our complaint were not gray-area channels that might be child-attractive but also draw lots of teen and adult viewers. Our complaint discussed such channels as Little Baby Bum, ChuChu TV Nursery Rhymes and Kids Songs, and Ryan’s Toy Reviews. We did not ask the FTC to investigate or sanction any channel owners, because Google makes the rules on YouTube, particularly with regard to personal data collection and use, and therefore it was the party that chose to violate COPPA. (Many independent content creators concur indirectly when they say that they should not be held accountable under COPPA.
They maintain that they don’t actually have access to detailed audience data and do not know whether their YouTube audience is under 13 at all; Google structures what data they have access to.)

For other content in the so-called “gray zone,” such as general-audience content that children under 13 also watch, or content that cannot be easily classified, we need more information about Google’s internal data practices. Do content creators have detailed access to demographic audience data, and are they thus accountable, or does Google hold on to that data? Should accountability for COPPA compliance be shifted more appropriately to Google? Can advertising restrictions be applied at the user level once a user is identified as likely to be under 13, regardless of what content they watch? We need Google to open up its internal processes, and we are asking the FTC to develop rules that share accountability appropriately between Google and its content creators.

The Google settlement has been a significant victory for children and their parents. For the first time, Google has been forced to take seriously COPPA, a U.S. law passed by Congress to express the will of the electorate. Of course, the FTC is also complicit in this problem: it waited six years to enforce the updated rule, watching Google’s COPPA violations mount and allowing a monster to grow. What’s worse, the quality of kids’ content on YouTube was, to most observers and particularly to parents, more than questionable, and at times it even placed children seriously at risk. What parents saw in the offering for their children was quantity rather than quality. Now, however, after six years, the FTC is finally requiring Google and creators to abide by the law. Just like that. Still, this change should not come as a complete surprise to content creators.
We sympathize with the independent YouTube creators and understand their frustration, but they have been complicit in this arrangement as well. The children’s digital entertainment industry has discussed COPPA compliance for years behind closed doors, and many knew that YouTube was not in compliance. The FTC has failed to address the misinformation that Google is propagating among content creators, its recent guidance notwithstanding. Moreover, the FTC has allowed Google to settle its COPPA violation with a solution that lets Google abdicate responsibility for COPPA compliance while continuing to maximize revenue. It’s time for the FTC to better examine Google’s data practices and capabilities, and to put the onus squarely on Google to comply with COPPA. As a result of the current COPPA proceedings, rules must be put in place to hold platforms like YouTube accountable.