CDD

Program Areas: Digital Youth

  • CDD and Advocates Call on the FTC to Begin Rulemaking to Prohibit Surveillance Advertising

    January 26, 2022

    Federal Trade Commission
    Office of the Secretary
    600 Pennsylvania Avenue NW
    Washington, DC 20580

    Re: Comment on Petition for Rulemaking by Accountable Tech, FTC-2021-0070

    INTRODUCTION

    Center for Digital Democracy, Common Sense, Fairplay, Parent Coalition for Student Privacy, and ParentsTogether strongly support the Petition for Rulemaking to Prohibit Surveillance Advertising filed by Accountable Tech.[1] We agree that this action is necessary to stop the exploitation of children and teens.[2]

    Surveillance advertising, also known as behavioral or targeted advertising, has become the standard business model for a wide array of online platforms, with companies using this practice to micro-target all consumers, including children and teens. Surveillance advertising involves collecting vast amounts of personal data about online users (their demographics, behaviors, preferences, and characteristics) and producing inferences from that data. To create detailed advertising profiles from this data, users are tracked across websites and devices; they are classified, sorted, and even discriminated against via targeting and exclusion; and ultimately they are left vulnerable to manipulation and exploitation.

    Young people are especially susceptible to the risks posed by surveillance advertising, which is why leading public health advocates like the American Academy of Pediatrics have called for a ban on surveillance advertising to children under 18 years old.[3] Children's and teens' online experiences are shaped by the affordances of surveillance marketing, which entrap them in a complex system purposefully designed to manipulate their behaviors and emotions while leveraging their data in the process.

    Young people are a significant audience for the real-time ad profiling and targeting apparatus operated through programmatic platforms and technologies, which poses fundamental risks to their privacy, safety, and well-being. Surveillance advertising is harmful to young people in several ways. First, young people are already more susceptible to advertising's negative effects, and surveillance advertising allows marketers to manipulate children and teens even more effectively. Second, surveillance advertising allows advertisers to target children's individual vulnerabilities. Third, surveillance advertising can exacerbate inequities by allowing advertisers to target (or abstain from targeting) marginalized communities. Fourth, behavioral advertising is the driving force behind a complex system of data collection and surveillance that tracks all of children's online activity, undermining young people's privacy and wellbeing. Finally, the Children's Online Privacy Protection Act has failed to effectively protect children under thirteen from surveillance advertising, and a more expansive prohibition is needed to protect the youngest and most vulnerable users online.

    For these reasons, we urge the Commission to protect children and teens by prohibiting surveillance advertising.

    Please read the full petition; see PDF below.

    [1] 86 Fed. Reg. 73206 (Dec. 27, 2021).
    [2] Pet'n for Rulemaking at 32-33.
    [3] Jenny Radesky, Yolanda (Linda) Reid Chassiakos, Nusheen Ameenuddin, Dipesh Navsaria, Council on Communications and Media; Digital Advertising to Children. Pediatrics July 2020; 146 (1): e20201681. 10.1542/peds.2020-1681.

    Attachment: childrens_coalition_survadv_1-26-22.pdf
  • Groups urge Congress to stop Big Tech's manipulation of young people

    BOSTON – Thursday, December 2, 2021 – Today a coalition of leading advocacy groups launched Designed With Kids in Mind, a campaign demanding a design code in the US to protect young people from online manipulation and harm. The campaign seeks to secure protections for US children and teens similar to the UK's groundbreaking Age-Appropriate Design Code (AADC), which went into effect earlier this year. The campaign brings together leading advocates for child development, privacy, and a healthier digital media environment, including Fairplay, Accountable Tech, American Academy of Pediatrics, Center for Digital Democracy, Center for Humane Technology, Common Sense, ParentsTogether, RAINN, and Exposure Labs, creators of The Social Dilemma. The coalition will advocate for legislation and new Federal Trade Commission rules that protect children and teens from a business model that puts young people at risk by prioritizing data collection and engagement.

    The coalition has launched a website that explains how many of the most pressing problems faced by young people online are directly linked to platforms' design choices. They cite features that benefit platforms at the expense of young people's wellbeing, such as:

    - Autoplay: increases time on platforms; excessive screen time is linked to mental health challenges and physical risks like less sleep, and promotes family conflict.
    - Algorithmic recommendations: risk exposure to self-harm, racist content, pornography, and mis/disinformation.
    - Location tracking: makes it easier for strangers to track and contact children.
    - Nudges to share: lead to loss of privacy and risks of sexual predation and identity theft.

    The coalition is promoting three bills which would represent a big step forward in protecting US children and teens online: the Children and Teens' Online Privacy Protection Act (S. 1628); the Kids Internet Design and Safety (KIDS) Act (S. 2918); and the Protecting the Information of our Vulnerable Children and Youth (PRIVCY) Act (H.R. 4801). Taken together, these bills would expand privacy protections to teens for the first time and incorporate key elements of the UK's AADC, such as requiring the best interests of children to be a primary design consideration for services likely to be accessed by young people. The legislation backed by the coalition would also protect children and teens from manipulative design features and harmful data processing.

    Members of the coalition on the urgent need for a US Design Code to protect children and teens:

    Josh Golin, Executive Director, Fairplay: "We need an internet that helps children learn, connect, and play without exploiting their developmental vulnerabilities; respects their need for privacy and safety; helps young children disconnect at the appropriate time rather than manipulating them into spending even more time online; and prioritizes surfacing high-quality content instead of maximizing engagement. The UK's Age-Appropriate Design Code took an important step towards creating that internet, and children and teens in the US deserve the same protections and opportunities. It's time for Congress and regulators to insist that children come before Big Tech's profits."

    Nicole Gill, Co-Founder and Executive Director, Accountable Tech: "You would never put your child in a car seat that wasn't designed for them and met all safety standards, but that's what we do every day when our children go online using a network of apps and websites that were never designed with them in mind. Our children should be free to learn, play, and connect online without manipulative platforms like Facebook and Google's YouTube influencing their every choice. We need an age-appropriate design code that puts kids and families first and protects young people from the exploitative practices and the perverse incentives of social media."

    Lee Savio Beers, MD, FAAP, President, American Academy of Pediatrics: "The American Academy of Pediatrics is proud to join this effort to ensure digital spaces are safe for children and supportive of their healthy development. It is in our power to create a digital ecosystem that works better for children and families; legislative change to protect children is long overdue. We must be bold in our thinking and ensure that government action on technology addresses the most concerning industry practices while preserving the positive aspects of technology for young people."

    Jeff Chester, Executive Director, Center for Digital Democracy: "The 'Big Tech' companies have long treated young people as just a means to generate vast profits – creating apps, videos and games designed to hook them to an online world designed to surveil and manipulate them. It's time to stop children and teens from being victimized by the digital media industry. Congress and the Federal Trade Commission should adopt commonsense safeguards that ensure America's youth reap all the benefits of the online world without having to constantly expose themselves to the risks."

    Randima Fernando, Executive Director, Center for Humane Technology: "We need technology that respects the incredible potential – and the incredible vulnerability – of our kids' minds. And that should guide technology for adults, who can benefit from those same improvements."

    Irene Ly, Policy Counsel, Common Sense: "This campaign acknowledges harmful features of online platforms and apps like autoplay, algorithms amplifying harmful content, and location tracking for what they are: intentional design choices. For too long, online platforms and apps have chosen to exploit children's vulnerabilities through these manipulative design features. Common Sense has long supported designing online spaces with kids in mind, and strongly supports US rules that would finally require companies to put kids' well-being first."

    Julia Hoppock, The Social Dilemma Partnerships Director, Exposure Labs: "For too long, Big Social has put profits over people. It's time to put our kids first and build an online world that works for them."

    Dalia Hashad, Online Safety Director, ParentsTogether: "From depression to bullying to sexual exploitation, tech companies knowingly expose children to unacceptable harms because it makes the platforms billions in profit. It's time to put kids first."

    Scott Berkowitz, President, RAINN (Rape, Abuse & Incest National Network): "Child exploitation has reached crisis levels, and our reliance on technology has left children increasingly vulnerable. On our hotline, we hear from children every day who have been victimized through technology. An age-appropriate design code will provide overdue safeguards for children across the U.S."

    Attachment: launch_-_design_code_to_protect_kids_online.pdf
  • Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100); David Monahan, CCFC (david@commercialfreechildhood.org)

    Advocates Ask FTC to Protect Youth From Manipulative "Dark Patterns" Online

    BOSTON, MA and WASHINGTON, DC — May 28, 2021 — Two leading advocacy groups protecting children from predatory practices online filed comments today asking the FTC to create strong safeguards to ensure that internet "dark patterns" don't undermine children's well-being and privacy. Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) cited leading authorities on the impacts of internet use on child development in their comments, prepared by the Communications & Technology Law Clinic at Georgetown University Law Center. These comments follow testimony given by representatives of both groups last month at an FTC workshop spearheaded by FTC Acting Chair Rebecca Slaughter. CCFC and CDD say tech companies are preying upon vulnerable kids, capitalizing on their fear of missing out, their desire to be popular, and their inability to understand the value of misleading e-currencies, as well as putting them on an endless treadmill on their digital devices. They urged the FTC to take swift and strong action to protect children from the harms of dark patterns. Key takeaways include:

    - A range of practices, often called "dark patterns," is pervasive in the digital marketplace; these practices manipulate children, are deceptive and unfair, and violate Section 5 of the FTC Act. They take advantage of a young person's psycho-social development, such as the need to engage with peers.
    - The groups explained the ways children are vulnerable to manipulation and other harms from "dark patterns," including that they have "immature and developing executive functioning," which leads to impulsive behaviors.
    - The FTC should prohibit the use of dark pattern practices in the children's marketplace; issue guidance to companies to ensure they do not develop or deploy such applications; and include new protections under its Children's Online Privacy Protection Act (COPPA) rulemaking authority to better regulate them. The commission must bring enforcement actions against developers using child-directed dark patterns.
    - The FTC should prohibit the use of micro-transactions in apps serving children, including the buying of virtual currency to participate in game playing.
    - The FTC should adopt a definition of dark patterns that includes all "nudges" designed to use a range of behavioral techniques to foster desired responses from users.

    The groups' filing was in response to the FTC's call for comments on the use of digital "dark patterns" — deceptive and unfair user interface designs — on websites and mobile apps.

    Comment of Jeff Chester, Executive Director of the Center for Digital Democracy: "'Dark patterns' are being used in the design of child-directed services to manipulate children to spend more time and money on games and other applications, as well as give up more of their data. It's time the FTC acted to protect young people from being unfairly treated by online companies. The commission should issue rules that prohibit the use of these stealth tactics that target kids and bring legal action against the companies promoting their use."

    Comment of Josh Golin, Executive Director of the Campaign for a Commercial-Free Childhood: "In their rush to monetize children, app and game developers are using dark patterns that take advantage of children's developmental vulnerabilities. The FTC has all the tools it needs to stop unethical, harmful, and illegal conduct. Doing so would be a huge step forward towards creating a healthy media environment for children."

    Comment of Michael Rosenbloom, Staff Attorney & Clinical Teaching Fellow, Communications and Technology Law Clinic, Georgetown University Law Center: "Software and game companies are using dark patterns to pressure children into playing more and paying more. Today, many apps and games that children play use dark patterns like arbitrary virtual currencies, encouragement from in-game characters, and ticking countdown timers to get children to spend more time and money on microtransactions. These dark patterns harm children and violate Section 5 of the FTC Act, and we urge the FTC to act to stop these practices."

    ###
  • Reports

    “Big Food” and “Big Data” Online Platforms Fueling Youth Obesity Crisis as Coronavirus Pandemic Rages

    New Report Calls for Action to Address Saturation of Social Media, Gaming Platforms, and Streaming Video with Unhealthy Food and Beverage Products

    The coronavirus pandemic triggered a dramatic increase in online use. Children and teens whose schools closed have relied on YouTube for educational videos, attended virtual classes on Zoom and Google Classroom, and flocked to TikTok, Snapchat, and Instagram for entertainment and social interaction. This constant immersion in digital culture has exposed them to a steady flow of marketing for fast foods, soft drinks, and other unhealthy products, much of it under the radar of parents and teachers. Food and beverage companies have made digital media ground zero for their youth promotion efforts, employing a growing spectrum of new strategies and high-tech tools to penetrate every aspect of young people's lives.

    Our latest report, Big Food, Big Tech, and the Global Childhood Obesity Pandemic, takes an in-depth look at this issue. Below we outline just three of the many tactics the food industry is using to market unhealthy products to children and teens in digital settings.

    1. Influencer marketing - Travis Scott & McDonald's

    McDonald's enlisted rapper Travis Scott to promote the "Travis Scott Meal" to young people, featuring "a medium Sprite, a quarter pounder with bacon, and fries with barbecue sauce." The campaign was so successful that some restaurants in the chain sold out of supplies within days of its launch. This and other celebrity endorsements have helped boost McDonald's stock price, generated a trove of valuable consumer data, and triggered enormous publicity across social media.

    2. Gaming platforms - MTN DEW Amp Game Fuel - Twitch

    PepsiCo's energy drink, MTN DEW Amp Game Fuel, is specifically "designed with gamers in mind." Each 16 oz can of MTN DEW Amp Game Fuel delivers a powerful "vitamin-charged and caffeine-boosted" formula, whose ingredients of high fructose corn syrup, grape juice concentrate, caffeine, and assorted herbs "have been shown to improve accuracy and alertness." The can itself features a "no-slip grip that mirrors the sensory design of accessories and hardware in gaming." It is also "easier to open and allows for more uninterrupted game play." To attract influencers, the product was featured on Twitch's "Bounty Board," a one-stop-shopping tool for "streamers," enabling them to accept paid sponsorships (or "bounties") from brands that want to reach the millions of gamers and their followers.

    3. Streaming and digital video - "It's a Thing" campaign - Fanta

    Concerned that teens were "drinking less soda," Coca-Cola's Fanta brand developed a comprehensive media campaign to trigger "an ongoing conversation with teen consumers through digital platforms," creating four videos based on the brand's most popular flavors and targeting youth on YouTube, Hulu, Roku, Crackle, and other online video platforms. "From a convenience store dripping with orange flavor and its own DJ cat, to an 8-bit videogame-ified pizza parlor, the digital films transport fans to parallel universes of their favorite hangout spots, made more extraordinary and fantastic once a Fanta is opened." The campaign, which was aimed at Black and Brown teens, also included the use of Snapchat's augmented-reality technology to create immersive experiences, as well as promotional efforts on Facebook-owned Instagram, which generated more than half a million followers.
  • Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100); David Monahan, CCFC (david@commercialfreechildhood.org)

    Advocates say Google Play continues to disregard children's privacy law and urge FTC to act

    BOSTON, MA and WASHINGTON, DC — March 31, 2021 — Today, advocacy groups Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) called on the Federal Trade Commission (FTC) to investigate Google's promotion of apps that violate the Children's Online Privacy Protection Act (COPPA). In December 2018, CCFC and CDD led a coalition of 22 consumer and health advocacy groups in asking the FTC to investigate these same practices. Since then, Google has made changes to the Play Store, but the advocates say these changes fail to address the core problem: Google is certifying as safe and appropriate for children apps that violate COPPA and put children at risk. Recent studies found that a significant number of apps in Google Play violate COPPA by collecting and sharing children's personal information without obtaining parental consent. For instance, a JAMA Pediatrics study found that 67% of apps used by children aged 5 and under were transmitting personal identifiers to third parties.

    Comment of Angela Campbell, Chair of the Board of Directors, Campaign for a Commercial-Free Childhood; Professor Emeritus, Communications & Technology Law Clinic, Georgetown University Law Center: "Parents reasonably expect that Google Play Store apps designated as 'Teacher approved' or appropriate for children under age 13 comply with the law protecting children's privacy. But far too often, that is not the case. The FTC failed to act when this problem was brought to its attention over two years ago. Because children today are spending even more time using mobile apps, the FTC must hold Google accountable for violating children's privacy."

    Comment of Jeff Chester, Executive Director of the Center for Digital Democracy: "The Federal Trade Commission must swiftly act to stop Google's ongoing disregard of the privacy and well-being of children. For too long, the Commission has allowed Google's app store, and the data marketing practices that are its foundation, to operate without enforcing the federal law that is designed to protect young people under 13. With children using apps more than ever as a consequence of the pandemic, the FTC should enforce the law and ensure Google engages with kids and families in a responsible manner."

    ###
  • Press Statement, Center for Digital Democracy (CDD) and Campaign for a Commercial-Free Childhood (CCFC), 12-14-20

    Today, the Federal Trade Commission announced it will use its 6(b) authority to launch a major new study into the data collection practices of nine major tech platforms and companies: ByteDance (TikTok), Amazon, Discord, Facebook, Reddit, Snap, Twitter, WhatsApp, and YouTube. The Commission's study includes a section on children and teens. In December 2019, the Campaign for a Commercial-Free Childhood (CCFC), the Center for Digital Democracy (CDD), and their attorneys at Georgetown Law's Institute for Public Representation urged the Commission to use its 6(b) authority to better understand how tech companies collect and use data from children. Twenty-seven consumer and child advocacy organizations joined that request. Below are statements from CDD and CCFC on today's announcement.

    Josh Golin, Executive Director, CCFC: "We are extremely pleased that the FTC will be taking a hard look at how platforms like TikTok, Snap, and YouTube collect and use young people's data. These 6(b) studies will provide a much-needed window into the opaque data practices that have a profound impact on young people's wellbeing. This much-needed study will not only provide critical public education, but lay the groundwork for evidence-based policies that protect young people's privacy and vulnerabilities when they use online services to connect, learn, and play."

    Jeff Chester, Executive Director, CDD: "The FTC is finally holding the social media and online video giants accountable, by requiring leading companies to reveal how they stealthily gather and use information that impacts our privacy and autonomy. It is especially important that the commission is concerned about also protecting teens — who are the targets of a sophisticated and pervasive marketing system designed to influence their behaviors for monetization purposes."

    For questions, please contact: jeff@democraticmedia.org

    See also: https://www.markey.senate.gov/news/press-releases/senator-markey-stateme...
  • General Comment submission: Children's rights in relation to the digital environment

    • Professor Amandine Garde, Law & Non-Communicable Research Unit, School of Law and Social Justice, University of Liverpool
    • Dr Mimi Tatlow-Golden, Senior Lecturer, Developmental Psychology and Childhood, The Open University
    • Dr Emma Boyland, Senior Lecturer, Psychology, University of Liverpool
    • Professor Emerita Kathryn C. Montgomery, School of Communication, American University; Senior Strategist, Center for Digital Democracy
    • Jeff Chester, Center for Digital Democracy
    • Josh Golin, Campaign for a Commercial-Free Childhood
    • Kaja Lund-Iversen and Ailo Krogh Ravna, Norwegian Consumer Council
    • Pedro Hartung and Marina Reina, Alana Institute
    • Dr Marine Friant-Perrot, University of Nantes
    • Professor Emerita Wenche Barth Eide, University of Oslo; Coordinator, FoHRC
    • Professor Liv Elin Torheim, Oslo Metropolitan University
    • Professor Alberto Alemanno, HEC Paris Business School and The Good Lobby
    • Marianne Hammer, Norwegian Cancer Society
    • Nikolai Pushkarev, European Public Health Alliance

    13 November 2020

    Dear Members of the Committee on the Rights of the Child,

    We very much welcome the Committee's Draft General Comment No. 25 on children's rights in relation to the digital environment (the Draft) and are grateful for the opportunity to comment. We are a group of leading scholars and NGO experts on youth, digital media, child rights, and public health who work to raise awareness and promote regulation of marketing (particularly of harmful goods, services, and brands) to which children are exposed. We argue that such marketing infringes many of the rights enshrined in the UN Convention on the Rights of the Child (CRC) and other international instruments and should be strictly regulated.

    Based on our collective expertise, we call on the Committee to recognise more explicitly the fundamentally transformed nature of marketing in new digital environments, the harms stemming therefrom, and the corresponding need to protect children from targeting and exposure. Without such recognition, children will not be able to fully enjoy the many opportunities for learning, civic participation, creativity, and communication that the digital environment offers for their development and the fulfilment of their rights. Facilitating children's participation in this environment should not come at the price of violations of any children's rights.

    Before making specific comments, we wish to highlight our support for much of this Draft. In particular, we strongly support the provisions in the following paragraphs of the General Comment: 11, 13, 14, 52, 54, 62, 63, 64, 67, 72, 74, 75, 88, 112, and 119. We also note concerns regarding provisions that would require mandatory age verification (e.g., paragraphs 56, 70, 120, 122). We call on the Committee to include provisions ensuring that age verification is applied proportionately, as it will certainly increase the processing of children's personal data, which should not happen to the detriment of the best interests of the child.

    The rest of this contribution, following the structure of the Draft, proposes specific additions/modifications (underlined, in italics), with brief explanations (in boxes). Numbers refer to original paragraphs in the Draft; XX indicates a new proposed paragraph. Hoping these comments are useful in finalising the General Comment, we remain at your disposal for further information.

    Yours faithfully,
    Amandine Garde and Mimi Tatlow-Golden, on behalf of those listed above

    [See full comments in attached document]
  • October 9, 2020

    Susan Wojcicki
    CEO, YouTube
    901 Cherry Avenue
    San Bruno, CA 94066

    Dear Ms. Wojcicki:

    We commend Google/YouTube's plan to create a $100 million investment fund for children's content, announced in 2019 following the FTC settlement addressing YouTube's violations of COPPA. This fund has the potential to leave an imprint on children's online content that will have influence for years to come. We ask that YouTube adopt policies to ensure this fund will operate in the best interests of children worldwide. The programming supported by the fund should:

    - Reflect the perspectives and interests of children from different countries and cultures
    - Underwrite content makers who are diverse and independent, with at least 50% of funding dedicated to historically underrepresented communities
    - Promote educational content and content which reflects the highest values of civil society, including diversity
    - Not support content which promotes commercialism
    - Facilitate union representation of creators of scripted and nonfiction content for YouTube
    - Be advised by a team of leading independent experts who can ensure programming is commissioned that truly serves the educational, civic, and developmental needs of young people

    As the leading global online destination for many millions of children, as well as the most powerful digital marketing entity, Google should be at the forefront of providing financial resources for quality content that is innovative, takes creative risks, and supports emerging program makers from many different backgrounds. For example, programming supported by the fund should reflect a major commitment to diversity by commissioning producers from around the world who represent diverse cultures and perspectives. The fund is also an opportunity for Google to make a significant contribution to the development of a distinct programming vision for young people that is primarily driven to foster their wellbeing.

    We urge Google to fund only programming free of commercial content, including influencer marketing, product and brand integration, and licensed characters or products. In addition, each program or series should have a robust release window that provides access to all children without requiring them to view digital advertising and other forms of commercial marketing. The expert commissioning board we advise you to adopt will help ensure that the fund operates fairly and help eliminate potential conflicts of interest. Operating the fund using these principles will allow YouTube to cement its place as a leader in children's programming and, more importantly, make a world of difference by ensuring that time spent watching YouTube will enrich children. We stand ready to confer with you on these suggestions and your development of the fund, and would welcome the opportunity to meet with you in the near future to discuss these items.

    Sincerely,

    Jeffrey Chester, Executive Director, Center for Digital Democracy
    Jessica J. González, Co-CEO, Free Press
    Josh Golin, Executive Director, Campaign for a Commercial-Free Childhood
    Justin Ruben, Co-Director, ParentsTogether
    Lowell Peterson, Executive Director, Writers Guild of America, East, AFL-CIO
  • The Campaign for a Commercial-Free Childhood (CCFC) and CDD filed comments with the UN's Special Rapporteur on privacy as part of a consultation designed to propose global safeguards for young people online. Both CCFC and CDD, along with allies in the U.S. and throughout the world, are working to advance stronger international protections for young people, especially related to their privacy and the impacts that digital marketing has on their development.
    Jeff Chester