CDD

Press Releases

  • Consumer Advocates Urge Action: Walmart Deceptively Marketing to Kids on Roblox

    MADISON, CONN. January 23, 2023 – A coalition of advocacy groups led by ad watchdog truthinadvertising.org (TINA.org) is urging the Children’s Advertising Review Unit (CARU) – a BBB National Program – to immediately audit the Walmart Universe of Play advergame, a recent addition to the self-regulatory group’s COPPA Safe Harbor Program and bearer of one of the Program’s certification seals. According to a letter from TINA.org, Fairplay, the Center for Digital Democracy and the National Association of Consumer Advocates, a copy of which was sent to Walmart, Roblox and the FTC, the retail giant is exposing children to deceptive marketing on Roblox, the online gaming and creation platform used by millions of kids on a daily basis.

    Walmart’s first foray into the Roblox metaverse came last September, when it premiered two experiences, Walmart Universe of Play and Walmart Land, which collectively have been visited more than 12 million times. Targeted at – and accessible to – young children on Roblox, Universe of Play features virtual products and characters from L.O.L. Surprise!, Jurassic World, Paw Patrol, and more, and is advertised as letting kids play with the “year’s best toys” and make a “wish list” of toys that can then be purchased at Walmart.

    As the consumer groups warn, Walmart completely blurs the distinction between advertising content and organic content, while failing to provide clear or conspicuous disclosures that Universe of Play (or the content within the virtual world) is advertising. In addition, as kids’ avatars walk through the game, they are manipulated into opening additional undisclosed advertisements disguised as surprise wrapped gifts.

    To make matters worse, Walmart is using the CARU COPPA Safe Harbor Program seal to convey the false message that its children’s advergame complies not only with COPPA (the Children’s Online Privacy Protection Act) but also with CARU’s Advertising Guidelines and truth-in-advertising laws, and to use the seal as a shield against enforcement action.

    “Walmart’s brazen use of stealth marketing directed at young children, who are developmentally unable to recognize the promotional content, is not only appalling, it’s deceptive and against truth-in-advertising laws. We urge CARU to take swift action to protect the millions of children being manipulated by Walmart on a daily basis.” – Laura Smith, TINA.org Legal Director

    “Walmart’s egregious and rampant manipulation of children on Roblox – a platform visited by millions of children every day – demands immediate action. The rise of the metaverse has enabled a new category of deceptive marketing practices that are harmful to children. CARU must act now to ensure that children are not collateral damage in Walmart’s digital drive for profit.” – Josh Golin, Executive Director, Fairplay

    “Walmart’s and Roblox’s practices demonstrate that self-regulation is woefully insufficient to protect children and teens online. Today, young people are targeted by a powerful set of online marketing tactics that are manipulative, unfair, and harmful to their mental and physical health. Digital advertising operates in a ‘wild west’ world where anything goes in terms of reaching and influencing the behaviors of kids and teens. Congress and the Federal Trade Commission must enact safeguards to protect the privacy and well-being of a generation of young people.” – Katharina Kopp, Director of Policy, Center for Digital Democracy

    To read more about Walmart’s deceptive marketing on Roblox, see: /articles/tina-org-urges-action-against-walmarts-undisclosed-advergame-on-roblox

    About TINA.org (truthinadvertising.org): TINA.org is a nonprofit organization that uses investigative journalism, education, and advocacy to empower consumers to protect themselves against false advertising and deceptive marketing.

    About Fairplay: Fairplay is the leading nonprofit organization committed to helping children thrive in an increasingly commercialized, screen-obsessed culture, and the only organization dedicated to ending marketing to children.

    About the Center for Digital Democracy: The Center for Digital Democracy is a nonprofit organization using education, advocacy, and research into commercial data practices to ensure that digital technologies serve and strengthen democratic values, institutions, and processes.

    About the National Association of Consumer Advocates: The National Association of Consumer Advocates is a nonprofit association of more than 1,500 attorneys and consumer advocates committed to representing consumers’ interests.

    For press inquiries contact: Shana Mueller at 203.421.6210 or press@truthinadvertising.org.
  • Josh Golin, executive director, Fairplay:

    The FTC’s landmark settlement against Epic Games is an enormous step forward towards creating a safer, less manipulative internet for children and teens. Not only is the Commission holding Epic accountable for violating COPPA by illegally collecting the data of millions of under-13-year-olds, but the settlement is also a shot across the bow against game makers who use unfair practices to drive in-game purchases by young people. The settlement rightly recognizes not only that unfair monetization practices harm young people financially, but that design choices used to drive purchases subject young people to a wide array of dangers, including cyberbullying and predation.

    Today’s breakthrough settlement underscores why it is so critical that Congress pass the privacy protections for children and teens currently under consideration for the Omnibus bill. These provisions give teens privacy rights for the first time, address unfair monetization by prohibiting targeted advertising, and empower regulators by creating a dedicated youth division at the FTC.

    Jeff Chester, executive director, Center for Digital Democracy:

    Through this settlement with Epic Games, using its vital power to regulate unfair business practices, the FTC has extended long-overdue and critically important online protections for teens. This tells online marketers that from now on, teenagers cannot be targeted using unfair and manipulative tactics designed to take advantage of their young age and other vulnerabilities.

    Kids should also have their data privacy rights better respected through this enforcement of the federal kids’ data privacy law (COPPA). Gaming is a “wild west” when it comes to its data gathering and online marketing tactics, placing young people – among the half of the US population who play video games – at especially greater risk.

    While today’s FTC action creates new safeguards for young people, Congress has a rare opportunity to pass legislation this week ensuring all kids and teens have strong digital safeguards, regardless of what online service they use.
  • Press Statement regarding today’s FTC Notice of Proposed Rulemaking Regarding Commercial Surveillance and Data Security

    Katharina Kopp, Deputy Director, Center for Digital Democracy:

    Today, the Federal Trade Commission issued its long-overdue advance notice of proposed rulemaking (ANPRM) regarding a trade regulation rule on commercial surveillance and data security. The ANPRM aims to address the prevalent and increasingly unavoidable harms of commercial surveillance. Civil society groups – including civil rights groups, privacy and digital rights advocates, and children’s advocates – had previously called on the commission to initiate this trade regulation rule to address the commission’s decades-long failure to rein in predatory corporate practices online. CDD has called on the commission repeatedly over the last two decades to address the out-of-control surveillance advertising apparatus that is the root cause of increasingly unfair, manipulative, and discriminatory practices harming children, teens, and adults, and that has a particularly negative impact on equal opportunity and equity.

    The Center for Digital Democracy welcomes this important initial step by the commission and looks forward to working with the FTC. CDD urges the commission to move forward expeditiously with the rulemaking and to ensure fair participation of stakeholders, particularly those who are disproportionately harmed by commercial surveillance.
  • Advocates call on FTC to investigate manipulative design abuses in popular FIFA game

    Groups say FIFA: Ultimate Team preys on children’s vulnerability with loot boxes, “funny money”

    Contact: David Monahan, Fairplay, david@fairplayforkids.org; Jeff Chester, CDD, jeff@democraticmedia.org, 202-494-7100

    BOSTON and WASHINGTON, DC – Thursday, June 2, 2022 – Today, advocacy groups Fairplay and Center for Digital Democracy (CDD) led a coalition of 15 advocacy groups in calling on the Federal Trade Commission (FTC) to investigate video game company Electronic Arts (EA) for unfairly exploiting young users in EA’s massively popular game, FIFA: Ultimate Team. In a letter sent to the FTC, the advocates described how the use of loot boxes and virtual currency in FIFA: Ultimate Team exploits the many children who play the game, especially given their undeveloped financial literacy skills and poor understanding of the odds of receiving the most desirable loot box items.

    Citing the Norwegian Consumer Council’s recent report, Insert Coin: How the Gaming Industry Exploits Consumers Using Lootboxes, the advocates’ letter details how FIFA: Ultimate Team encourages gamers to engage in a constant stream of microtransactions as they play the game. Users are able to buy FIFA points, a virtual in-game currency, which can then be used to purchase loot boxes called FIFA packs, containing mystery team kits, badges, and player cards for soccer players who can be added to a gamer’s team.

    In their letter, the advocates noted the game’s use of manipulative design abuses – such as “lightning round” sales of premium packs – to promote the purchase of FIFA packs, tactics to which children are particularly vulnerable. The advocates also cite the use of virtual currency in the game, which obscures the actual cost of FIFA packs even to adult users, let alone children. Additionally, the actual probability of unlocking the best loot box prizes in FIFA: Ultimate Team is practically inscrutable to anyone who is not an expert in statistics, according to the advocates and the NCC report. In order to unlock a specific desirable player in the game, users would have to pay around $14,000 or spend three years continuously playing the game.

    “By relentlessly marketing pay-to-win loot boxes, EA is exploiting children’s desire to compete with their friends, despite the fact that most adults, let alone kids, could not determine their odds of receiving a highly coveted card or what cards cost in real money. The FTC must use its power to investigate these design abuses and determine just how many kids and teens are being fleeced by EA.” – Josh Golin, Executive Director, Fairplay

    “Loot boxes, virtual currencies, and other gaming features are often designed deceptively, aiming to exploit players’ known vulnerabilities. Due to their unique developmental needs, children and teens are particularly harmed. Their time and attention are stolen from them, they are financially exploited, and they are purposely socialized to adopt gambling-like behaviors. Online gaming is a key online space where children and teens gather in the millions, and regulators must act to protect them from these harmful practices.” – Katharina Kopp, Deputy Director, Center for Digital Democracy

    “As illustrated in our report, FIFA: Ultimate Team uses aggressive in-game marketing and exploits gamers’ cognitive biases – adults and children alike – to manipulate them into spending large sums of money. Children especially are vulnerable to EA’s distortion of the real-world value of its loot boxes and the complex, misleading probabilities given to describe the odds of receiving top prizes. We join our US partners in urging the Federal Trade Commission to investigate these troubling practices.” – Finn Lützow-Holm Myrstad, Digital Policy Director, Norwegian Consumer Council

    “The greed of these video game companies is a key reason why we’re seeing a new epidemic of child gambling in our families. Thanks to this report, the FTC has more than enough facts to take decisive action to protect our kids from these predatory business practices.” – Les Bernal, National Director, Stop Predatory Gambling and the Campaign for Gambling-Free Kids

    “Exploiting consumers, especially children, by manipulating them into buying loot boxes that, in reality, rarely contain the coveted items they are seeking is a deceptive marketing practice that causes real harm and needs to stop. TINA.org strongly urges the FTC to take action.” – Laura Smith, Legal Director, TINA.org

    Advocacy groups signing today’s FTC complaint include Fairplay; the Center for Digital Democracy; Campaign for Accountability; Children and Screens: Institute of Digital Media and Child Development; Common Sense Media; Consumer Federation of America; Electronic Privacy Information Center (EPIC); Florida Council on Compulsive Gambling, Inc.; Massachusetts Council on Gaming and Health; National Council on Problem Gambling; Parent Coalition for Student Privacy; Public Citizen; Stop Predatory Gambling and the Campaign for Gambling-Free Kids; TINA.org (Truth in Advertising, Inc.); and U.S. PIRG.

    ###
  • Press Statement regarding today’s FTC Policy Statement on Education Technology and the Children’s Online Privacy Protection Act

    Jeff Chester, Executive Director, Center for Digital Democracy:

    Today, the Federal Trade Commission adopted a long-overdue policy designed to protect children’s privacy. By shielding school children from the pervasive forces of commercial surveillance, which gather their data for ads and marketing, the FTC is expressly using a critical safeguard from the bipartisan Children’s Online Privacy Protection Act (COPPA). Fairplay, the Center for Digital Democracy, and a coalition of privacy, children’s health, civil and consumer rights groups had previously called on the commission to enact policies that make this very edtech safeguard possible.

    We look forward to working with the FTC to ensure that parents can be confident that their child’s online privacy and security is protected in – or out of – the classroom. However, the Commission must also ensure that adolescents receive protections from what is now an omniscient and manipulative data-driven complex that profoundly threatens their privacy and well-being.
  • Diverse coalition of advocates urges Congress to pass legislation to protect kids and teens online

    60 leading advocacy organizations say unregulated Big Tech business model is “fundamentally at odds with children’s wellbeing”

    Contact: David Monahan, Fairplay, david@fairplayforkids.org; Jeff Chester, Center for Digital Democracy, jeff@democraticmedia.org, 202-494-7100

    BOSTON, MA and WASHINGTON, DC – March 22, 2022 – Congressional leaders in the House and Senate were urged today to enact much-needed protections for children and teens online. In a letter to Senate Majority Leader Chuck Schumer, Senate Minority Leader Mitch McConnell, House Speaker Nancy Pelosi and House Minority Leader Kevin McCarthy, a broad coalition of health, safety, privacy and education groups said it was time to ensure that Big Tech can no longer undermine the wellbeing of America’s youth. The letter reiterated the call in President Biden’s State of the Union address for increased online protections for young people.

    In their letter, the advocates outlined how the prevailing business model of Big Tech creates a number of serious risks facing young people on the internet today, including mental health struggles, loss of privacy, manipulation, predation, and cyberbullying. The advocates underscored the dangers posed by rampant data collection on popular platforms, including algorithmic discrimination and the targeting of children at particularly vulnerable moments.

    The reforms called for by the advocates include:

    - Protections for children and teens wherever they are online, not just on “child-directed” sites;
    - Privacy protections for all minors;
    - A ban on targeted advertising to young people;
    - Prohibition of algorithmic discrimination against children and teens;
    - Establishment of a duty of care that requires digital service providers to make the best interests of children a primary design consideration and to prevent and mitigate harms to minors;
    - A requirement that platforms turn on the most protective settings for minors by default;
    - Greater resources for enforcement by the Federal Trade Commission.

    United by the desire to see Big Tech’s harmful business model regulated, the advocates’ letter represents a landmark moment for the movement to increase privacy protections for children and teenagers online, especially given the wide-ranging fields and focus areas represented by its signatories. Among the 60 signatories to the advocates’ letter are: Fairplay, Center for Digital Democracy, Accountable Tech, American Academy of Pediatrics, American Association of Child and Adolescent Psychiatry, American Psychological Association, Center for Humane Technology, Common Sense, Darkness to Light, ECPAT-USA, Electronic Privacy Information Center (EPIC), National Alliance to Advance Adolescent Health, National Center on Sexual Exploitation, National Eating Disorders Association, Network for Public Education, ParentsTogether, Public Citizen, Society for Adolescent Health and Medicine, and Exposure Labs, creators of The Social Dilemma.

    Signatories on the need for legislation to protect young people online:

    “Congress last passed legislation to protect children online 24 years ago – nearly a decade before the most popular social media platforms even existed. Big Tech’s unregulated business model has led to a race to the bottom to collect data and maximize profits, no matter the harm to young people. We agree with the president that the time is now to update COPPA, expand privacy protections to teens, and put an end to the design abuses that manipulate young people into spending too much time online and expose them to harmful content.” – Josh Golin, Executive Director, Fairplay

    “It’s long past time for Congress to put a check on Big Tech’s pervasive manipulation of young people’s attention and exploitation of their personal data. We applaud President Biden’s call to ban surveillance advertising targeting young people and are heartened by the momentum to rein in Big Tech and establish critical safeguards for minors engaging with their products.” – Nicole Gill, Co-Founder and Executive Director, Accountable Tech

    “Digital technology plays an outsized role in the lives of today’s children and adolescents, exacerbated by the dramatic changes to daily life experienced during the pandemic. Pediatricians see the impact of these platforms on our patients and recognize the growing alarm about the role of digital platforms, in particular social media, in contributing to the youth mental health crisis. It has become clear that, from infancy through the teen years, children’s well-being is an afterthought in developing digital technologies. Strengthening privacy, design, and safety protections for children and adolescents online is one of many needed steps to create healthier environments that are more supportive of their mental health and well-being.” – Moira Szilagyi, MD, PhD, FAAP, President, American Academy of Pediatrics

    “Children and teens are at the epicenter of a pervasive data-driven marketing system that takes advantage of their inherent developmental vulnerabilities. We agree with President Biden: now is the time for Congress to act and enact safeguards that protect children and teens. It’s also long overdue for Congress to enact comprehensive legislation that protects parents and other adults from unfair, manipulative, discriminatory and privacy-invasive commercial surveillance practices.” – Katharina Kopp, Ph.D., Policy Director, Center for Digital Democracy

    “President Biden’s powerful State of the Union plea to Congress to hold social media platforms accountable for the ‘national experiment’ they’re conducting on our kids and teens could not be more important. It is clear that young people are being harmed by these platforms, which continue to prioritize profits over the wellbeing of their youngest users. Children and teens’ mental health is at stake. Congress and the Administration must act now to pass legislation to protect children’s and teens’ privacy and well-being online.” – Jim Steyer, Founder and CEO, Common Sense

    “Online protections for children are woefully outdated, and it’s clear tech companies are more interested in profiting off of vulnerable children than in taking steps to prevent them from getting hurt on their platforms. American kids are facing a mental health crisis partly fueled by social media, and parents are unable to go it alone against these billion-dollar companies. We need Congress to update COPPA, end predatory data collection on children, and regulate design practices that are contributing to social media addiction, mental health disorders, and even death.” – Justin Ruben, Co-Founder and Co-Director, ParentsTogether

    “A business model built on extracting our attention at the cost of our well-being is bad for everyone, but especially bad for children. No one knows this better than young people themselves, many of whom write to us daily about the ways in which Big Social is degrading their mental health. Left unregulated, Big Social will put profits over people every time. It’s time to put our kids first. We urge Congress to act swiftly and enact reforms like strengthening privacy, banning surveillance advertising, and ending algorithmic discrimination for kids so we can begin to build a digital world that supports, rather than demotes, child wellbeing.” – Julia Hoppock, Partnerships Director, The Social Dilemma, Exposure Labs

    # # #
  • Congresswomen Anna G. Eshoo (D-CA) and Jan Schakowsky (D-IL) and Senator Cory Booker (D-NJ) Introduce Bill to Ban Surveillance Advertising

    Washington, DC – January 18, 2022

    “Identifying, tracking, discriminating, sorting, targeting, and manipulating online users lies at the heart of all that is toxic about today’s digital world. Surveillance advertising drives discrimination and compounds inequities, destroys democratic institutions and rights, strengthens the monopoly power of Big Tech platforms, and is harmful to children, teens, families, and communities. If enacted, the Banning Surveillance Advertising Act would put a stop to surveillance advertising and would be an important first step in building a digital world that is less toxic to our democracy, economy, and collective well-being,” said Katharina Kopp, Ph.D., Director of Policy for the Center for Digital Democracy.
  • Groups urge Congress to stop Big Tech’s manipulation of young people

    BOSTON – Thursday, December 2, 2021 – Today a coalition of leading advocacy groups launched Designed With Kids in Mind, a campaign demanding a design code in the US to protect young people from online manipulation and harm. The campaign seeks to secure protections for US children and teens similar to the UK’s groundbreaking Age-Appropriate Design Code (AADC), which went into effect earlier this year. The campaign brings together leading advocates for child development, privacy, and a healthier digital media environment, including Fairplay, Accountable Tech, American Academy of Pediatrics, Center for Digital Democracy, Center for Humane Technology, Common Sense, ParentsTogether, RAINN, and Exposure Labs, creators of The Social Dilemma. The coalition will advocate for legislation and new Federal Trade Commission rules that protect children and teens from a business model that puts young people at risk by prioritizing data collection and engagement.

    The coalition has launched a website that explains how many of the most pressing problems faced by young people online are directly linked to platforms’ design choices. They cite features that benefit platforms at the expense of young people’s wellbeing, such as:

    - Autoplay: increases time on platforms; excessive time on screens is linked to mental health challenges and physical risks like less sleep, and promotes family conflict.
    - Algorithmic recommendations: risk exposure to self-harm, racist content, pornography, and mis/disinformation.
    - Location tracking: makes it easier for strangers to track and contact children.
    - Nudges to share: lead to loss of privacy and risks of sexual predation and identity theft.

    The coalition is promoting three bills which would represent a big step forward in protecting US children and teens online: the Children and Teens’ Online Privacy Protection Act (S. 1628); the Kids Internet Design and Safety (KIDS) Act (S. 2918); and the Protecting the Information of our Vulnerable Children and Youth (PRIVCY) Act (H.R. 4801). Taken together, these bills would expand privacy protections to teens for the first time and incorporate key elements of the UK’s AADC, such as requiring the best interests of children to be a primary design consideration for services likely to be accessed by young people. The legislation backed by the coalition would also protect children and teens from manipulative design features and harmful data processing.

    Members of the coalition on the urgent need for a US design code to protect children and teens:

    Josh Golin, Executive Director, Fairplay:

    We need an internet that helps children learn, connect, and play without exploiting their developmental vulnerabilities; respects their need for privacy and safety; helps young children disconnect at the appropriate time rather than manipulating them into spending even more time online; and prioritizes surfacing high-quality content instead of maximizing engagement. The UK’s Age-Appropriate Design Code took an important step towards creating that internet, and children and teens in the US deserve the same protections and opportunities. It’s time for Congress and regulators to insist that children come before Big Tech’s profits.

    Nicole Gill, Co-Founder and Executive Director of Accountable Tech:

    You would never put your child in a car seat that wasn’t designed for them and didn’t meet all safety standards, but that’s what we do every day when our children go online using a network of apps and websites that were never designed with them in mind. Our children should be free to learn, play, and connect online without manipulative platforms like Facebook and Google’s YouTube influencing their every choice.
We need an age-appropriate design code that puts kids and families first and protects young people from the exploitative practices and perverse incentives of social media.

    Lee Savio Beers, MD, FAAP, President of the American Academy of Pediatrics:

    The American Academy of Pediatrics is proud to join this effort to ensure digital spaces are safe for children and supportive of their healthy development. It is in our power to create a digital ecosystem that works better for children and families; legislative change to protect children is long overdue. We must be bold in our thinking and ensure that government action on technology addresses the most concerning industry practices while preserving the positive aspects of technology for young people.

    Jeff Chester, Executive Director, Center for Digital Democracy:

    The “Big Tech” companies have long treated young people as just a means to generate vast profits – creating apps, videos and games designed to hook them to an online world built to surveil and manipulate them. It’s time to stop children and teens from being victimized by the digital media industry. Congress and the Federal Trade Commission should adopt commonsense safeguards that ensure America’s youth reap all the benefits of the online world without having to constantly expose themselves to the risks.

    Randima Fernando, Executive Director, Center for Humane Technology:

    We need technology that respects the incredible potential – and the incredible vulnerability – of our kids’ minds. And that should guide technology for adults, who can benefit from those same improvements.

    Irene Ly, Policy Counsel, Common Sense:

    This campaign acknowledges harmful features of online platforms and apps – like autoplay, algorithms amplifying harmful content, and location tracking – for what they are: intentional design choices. For too long, online platforms and apps have chosen to exploit children’s vulnerabilities through these manipulative design features.
Common Sense has long supported designing online spaces with kids in mind, and strongly supports US rules that would finally require companies to put kids’ well-being first.

    Julia Hoppock, The Social Dilemma Partnerships Director, Exposure Labs:

    For too long, Big Social has put profits over people. It’s time to put our kids first and build an online world that works for them.

    Dalia Hashad, Online Safety Director, ParentsTogether:

    From depression to bullying to sexual exploitation, tech companies knowingly expose children to unacceptable harms because it makes the platforms billions in profit. It’s time to put kids first.

    Scott Berkowitz, President of RAINN (Rape, Abuse & Incest National Network):

    Child exploitation has reached crisis levels, and our reliance on technology has left children increasingly vulnerable. On our hotline, we hear from children every day who have been victimized through technology. An age-appropriate design code will provide overdue safeguards for children across the U.S.
  • Press Release

    Against surveillance-based advertising

    CDD joins an international coalition of more than 50 NGOs and scholars calling on the EU to ban surveillance-based advertising in its Digital Services Act and on the U.S. to enact a federal digital privacy and civil rights law

    International coalition calls for action against surveillance-based advertising

    Every day, consumers are exposed to extensive commercial surveillance online. This leads to manipulation, fraud, discrimination and privacy violations. Information about what we like, our purchases, our mental and physical health, sexual orientation, location and political views is collected, combined and used under the guise of targeted advertising.

    In a new report, the Norwegian Consumer Council (NCC) sheds light on the negative consequences that this commercial surveillance has on consumers and society. Together with [XXX] organizations and experts, the NCC is asking authorities on both sides of the Atlantic to consider a ban. In Europe, the upcoming Digital Services Act can lay the legal framework to do so. In the US, legislators should seize the opportunity to enact comprehensive privacy legislation that protects consumers.

    - The collection and combination of information about us not only violates our right to privacy, but renders us vulnerable to manipulation, discrimination and fraud. This harms individuals and society as a whole, says the director of digital policy in the NCC, Finn Myrstad.

    In a Norwegian population survey conducted by YouGov on behalf of the NCC, consumers clearly state that they do not want commercial surveillance. Just one out of ten respondents were positive about commercial actors collecting personal information about them online, while only one out of five thought that ads based on personal information are acceptable.

    - Most of us do not want to be spied on online, or to receive ads based on tracking and profiling. These results mirror similar surveys from Europe and the United States, and should be a powerful signal to policymakers looking at how to better regulate the internet, Myrstad says.

    Policymakers and civil society organisations on both sides of the Atlantic are increasingly standing up against these invasive practices.
For example, the European Parliament and the European Data Protection Supervisor (EDPS) have already called for phasing out and banning surveillance-based advertising. A coalition of consumer and civil rights organizations in the United States has called for a similar ban.

Significant consequences

The NCC report, “Time to ban surveillance-based advertising”, exposes a variety of harmful consequences that surveillance-based advertising can have on individuals and on society:

1. Manipulation: Companies with comprehensive and intimate knowledge about us can shape their messages in attempts to reach us when we are susceptible, for example to influence elections or to advertise weight-loss products, unhealthy food or gambling.

2. Discrimination: The opacity and automation of surveillance-based advertising systems increase the risk of discrimination, for example by excluding consumers based on income, gender, race, ethnicity, sexual orientation or location, or by making certain consumers pay more for products or services.

3. Misinformation: The lack of control over where ads are shown can promote and finance false or malicious content. This also poses significant challenges to publishers and advertisers regarding revenue, reputational damage, and opaque supply chains.

4. Undermining competition: The surveillance business model favours companies that collect and process information across different services and platforms. This makes it difficult for smaller actors to compete, and negatively impacts companies that respect consumers’ fundamental rights.

5. Security risks: When thousands of companies collect and process enormous amounts of personal data, the risk of identity theft, fraud and blackmail increases. NATO has described this data collection as a national security risk.

6. Privacy violations: The collection and use of personal data is happening with little or no control, both by large companies and by companies that are unknown to most consumers. Consumers have no way to know what data is collected, who the information is shared with, and how it may be used.

- It is very difficult to justify the negative consequences of this system. A ban will contribute to a healthier marketplace that helps protect individuals and society, Myrstad comments.

Good alternatives

In the report, the NCC points to alternative digital advertising models that do not depend on the surveillance of consumers, and that provide advertisers and publishers more oversight and control over where ads are displayed and which ads are being shown.

- It is possible to sell advertising space without basing it on intimate details about consumers. Solutions already exist to show ads in relevant contexts, or where consumers self-report what ads they want to see, Myrstad says.

- A ban on surveillance-based advertising would also pave the way for a more transparent advertising marketplace, diminishing the need to share large parts of ad revenue with third parties such as data brokers. A level playing field would help advertisers and content providers retain more control and keep a larger share of the revenue.

The coordinated push behind the report and letter illustrates the growing determination of consumer, digital rights, human rights and other civil society groups to end the widespread business model of spying on the public.
  • Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100) David Monahan, CCFC (david@commercialfreechildhood.org)

    Advocates Ask FTC to Protect Youth From Manipulative “Dark Patterns” Online

    BOSTON, MA and WASHINGTON, DC — May 28, 2021 — Two leading advocacy groups protecting children from predatory practices online filed comments today asking the FTC to create strong safeguards to ensure that internet “dark patterns” don’t undermine children’s well-being and privacy. Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) cited leading authorities on the impacts of internet use on child development in their comments, prepared by the Communications & Technology Law Clinic at Georgetown University Law Center. These comments follow testimony given by representatives of both groups last month at an FTC workshop spearheaded by FTC Acting Chair Rebecca Slaughter. CCFC and CDD say tech companies are preying upon vulnerable kids, capitalizing on their fear of missing out, desire to be popular, and inability to understand the value of misleading e-currencies, as well as putting them on an endless treadmill on their digital devices. They urged the FTC to take swift and strong action to protect children from the harms of dark patterns.

    Key takeaways include:

    - A range of practices, often called “dark patterns,” are pervasive in the digital marketplace, manipulate children, are deceptive and unfair, and violate Section 5 of the FTC Act. They take advantage of a young person’s psycho-social development, such as the need to engage with peers.

    - The groups explained the ways children are vulnerable to manipulation and other harms from “dark patterns,” including that they have “immature and developing executive functioning,” which leads to impulsive behaviors. 
    - The FTC should prohibit the use of dark pattern practices in the children’s marketplace; issue guidance to companies to ensure they do not develop or deploy such applications; and include new protections under its Children’s Online Privacy Protection Act (COPPA) rulemaking authority to better regulate them. The commission must bring enforcement actions against developers using child-directed dark patterns.

    - The FTC should prohibit the use of micro-transactions in apps serving children, including the buying of virtual currency to participate in game play.

    - The FTC should adopt a definition of dark patterns that includes all “nudges” designed to use a range of behavioral techniques to foster desired responses from users.

    The groups’ filing was in response to the FTC’s call for comments on the use of digital “dark patterns” — deceptive and unfair user interface designs — on websites and mobile apps.

    Comment of Jeff Chester, Executive Director of the Center for Digital Democracy: “‘Dark patterns’ are being used in the design of child-directed services to manipulate children into spending more time and money on games and other applications, as well as giving up more of their data. It’s time the FTC acted to protect young people from being unfairly treated by online companies. The commission should issue rules that prohibit the use of these stealth tactics that target kids and bring legal action against the companies promoting their use.”

    Comment of Josh Golin, Executive Director of the Campaign for a Commercial-Free Childhood: “In their rush to monetize children, app and game developers are using dark patterns that take advantage of children’s developmental vulnerabilities. The FTC has all the tools it needs to stop unethical, harmful, and illegal conduct. Doing so would be a huge step forward toward creating a healthy media environment for children.”
Comment of Michael Rosenbloom, Staff Attorney & Clinical Teaching Fellow, Communications and Technology Law Clinic, Georgetown University Law Center: “Software and game companies are using dark patterns to pressure children into playing more and paying more. Today, many apps and games that children play use dark patterns like arbitrary virtual currencies, encouragement from in-game characters, and ticking countdown timers to get children to spend more time and money on microtransactions. These dark patterns harm children and violate Section 5 of the FTC Act, and we urge the FTC to act to stop these practices.” ###
  • Press Release

    “Big Food” and “Big Data” Online Platforms Fueling Youth Obesity Crisis as Coronavirus Pandemic Rages

    New Report Calls for Action to Address Saturation of Social Media, Gaming Platforms, and Streaming Video with Unhealthy Food and Beverage Products

    Contact: Jeff Chester (202-494-7100)

    For Immediate Release

    Washington, DC, May 12, 2021 — A report released today calls for federal and global action to check the growth of digital marketing of food and beverage products that targets children and teens online. Tech platforms especially popular with young people — including Facebook’s Instagram, Amazon’s Twitch, ByteDance’s TikTok, and Google’s YouTube — are working with giant food and beverage companies, such as Coca-Cola, KFC, Pepsi and McDonald’s, to promote sugar-sweetened soda, energy drinks, candy, fast food, and other unhealthy products across social media, gaming, and streaming video. The report offers fresh analysis and insight into the most recent industry practices, documenting how “Big Food” and “Big Tech” are using AI, machine learning, and other data-driven techniques to ensure that food marketing permeates all of the online cultural spaces where children and teenagers congregate. The pandemic has dramatically increased exposure to these aggressive new forms of marketing, further increasing young people’s risk of becoming obese. Black and Brown youth are particularly vulnerable to new online promotional strategies. Noting that concerns about youth obesity have recently fallen off the public radar in the U.S., the report calls for both international and domestic policies to rein in the power of the global technology and food industries. 
The report and an executive summary are available at the Center for Digital Democracy’s (CDD) website, along with other background material.

“Our investigation found that there is a huge amount of marketing for unhealthy foods and beverages all throughout the youth digital media landscape, and it has been allowed to flourish with no government oversight,” explained Kathryn C. Montgomery, PhD, the report’s lead author, Professor Emerita at American University and CDD’s Senior Strategist. “We know from decades of research that marketing of these products contributes to childhood obesity and related illnesses. And we’ve witnessed how so many children, teens, and young adults suffering from these conditions have been particularly vulnerable to the coronavirus. Both the technology industry and the food and beverage industry need to be held accountable for creating an online environment that undermines young people’s health.”

The report examines an array of Big Data strategies and AdTech tools used by the food industry, focusing on three major sectors of digital culture that attract large numbers of young people: the so-called “influencer economy,” gaming and esports platforms, and the rapidly expanding streaming and online video industry.

Dozens of digital campaigns by major food and beverage companies, many of which have won prestigious ad industry awards, illustrate some of the latest trends and techniques in digital marketing:

The use of influencers is one of the primary ways that marketers reach and engage children and teens. Campaigns are designed to weave branded material “seamlessly into the daily narratives” shared on social media. Children and teens are particularly susceptible to influencer marketing, which taps into their psycho-social development. 
Marketing researchers closely study how young people become emotionally attached to celebrities and other influencers through “parasocial” relationships.

McDonald’s enlisted rapper Travis Scott to promote the “Travis Scott Meal” to young people, featuring “a medium Sprite, a quarter pounder with bacon, and fries with barbecue sauce.” The campaign was so successful that some restaurants in the chain sold out of supplies within days of its launch. This and other celebrity endorsements have helped boost McDonald’s stock price, generated a trove of valuable consumer data, and triggered enormous publicity across social media.

Food and beverage brands have flocked to Facebook-owned Instagram, which is considered one of the best ways to reach and engage teens. According to industry research, nearly all influencer campaigns (93%) are conducted on Instagram. Cheetos’ Chester Cheetah is now an “Instagram creator,” telling his own “stories” along with millions of other users on the platform. One Facebook report, “Quenching Today’s Thirsts: How Consumers Find and Choose Drinks,” found that “64% of people who drink carbonated beverages use Instagram for drinks-related activities, such as sharing or liking posts and commenting on drinks content,” and more than a third of them report following or “liking” soft drink “brands, hashtags, or influencer posts.”

The online gaming space generates more revenue than TV, film or music, and attracts viewers and players — including many young people — who are “highly engaged for a considerable length of time.” Multiplayer online battle arena (MOBA) and first-person shooter games are considered among the best marketing environments, offering a wide range of techniques for “monetization,” including in-game advertising, sponsorship, product placement, use of influencers, and even “branded games” created by advertisers. Twitch, the leading gaming platform, owned by Amazon, has become an especially important venue for food and beverage marketers. 
Online gamers and fans are considered prime targets for snack, soft drink, and fast food brands, all products that lend themselves to uninterrupted game play and spectatorship.

PepsiCo’s energy drink, MTN DEW Amp Game Fuel, is specifically “designed with gamers in mind.” To attract influencers, it was featured on Twitch’s “Bounty Board,” a one-stop-shopping tool for “streamers,” enabling them to accept paid sponsorships (or “bounties”) from brands that want to reach the millions of gamers and their followers.

Red Bull recently partnered with Ninja, “the most popular gaming influencer in the world with over 13 million followers on Twitch, over 21 million YouTube subscribers, and another 13 million followers on Instagram.”

Dr. Pepper featured the faces of players of the popular Fortnite game on its bottles, announcing on Twitter that this campaign resulted in “the most engaged tweet” the soft-drink company had ever experienced.

Wendy’s partnered with “five of the biggest Twitch streamers,” as well as food delivery app Uber Eats, to launch its “Never Stop Gaming” menu, with the promise of “five days of non-stop gaming, delicious meal combos and exclusive prizes.” Branded meals were created for each of the five streamers, who offered their fans the opportunity to order directly through their Twitch channels and have the food delivered to their doors.

One of the newest marketing frontiers is streaming and online video, which have experienced a boost in viewership during the pandemic. 
Young people are avid users, accessing video on their mobile devices, gaming consoles, personal computers, and online connections to their TV sets.

Concerned that teens “are drinking less soda,” Coca-Cola’s Fanta brand developed a comprehensive media campaign to trigger “an ongoing conversation with teen consumers through digital platforms” by creating four videos based on the brand’s most popular flavors, and targeting youth on YouTube, Hulu, Roku, Crackle, and other online video platforms. “From a convenience store dripping with orange flavor and its own DJ cat, to an 8-bit videogame-ified pizza parlor, the digital films transport fans to parallel universes of their favorite hangout spots, made more extraordinary and fantastic once a Fanta is opened.”

New video ad formats allow virtual brand images to be inserted into content and tailored to specific viewers. “Where one customer sees a Coca-Cola on the table,” explained a marketing executive, “the other sees green tea. Where one customer sees a bag of chips, another sees a muesli bar… in the exact same scene.”

The major technology platforms are facilitating and profiting from the marketing of unhealthy food and beverage products.

Facebook’s internal “creative shop” has helped Coca-Cola, PepsiCo, Unilever, Nestle and hundreds of other brands develop global marketing initiatives to promote their products across its platform. 
The division specializes in “building data-driven advertising campaigns, branded content, branded entertainment, content creation, brand management, social design,” and similar efforts.

Google regularly provides a showcase for companies such as Pepsi, McDonald’s and Mondelez to tout their joint success promoting their respective products throughout the world. For example, Pepsi explained in a “Think with Google” post that it used Google’s “Director’s Mix” video personalization advertising technology to further what it calls its ability to “understand the consumer’s DNA,” meaning their “needs, context, and location in the shopping journey.” Pepsi could leverage Google’s marketing tools to help meet its goal of combining “insights with storytelling and drive personalized experiences at scale.”

Hershey’s has been working closely with Amazon to market its candy products via streaming video, as well as through its own ecommerce marketplace. In a case study published online, Amazon explained that “…as viewing consumption began to fragment, the brand [Hershey’s] realized it was no longer able to reach its audience with linear TV alone.” Amazon gave Hershey’s access to its storehouse of data so the candy company could market its products on Amazon’s streaming services, such as IMDb TV. 
Amazon allowed Hershey’s to use Amazon’s data to ensure the candy brands would “be positioned to essentially ‘win’ search in that category on Amazon and end up as the first result….” Hershey’s also made use of “impulse buy” strategies on the Amazon platform, including “cart intercepts,” which prompt a customer to “add in snacks as the last step in their online shopping trip, mimicking the way someone might browse for candy during the checkout at a physical store.”

Some of the largest food and beverage corporations — including Coca-Cola, McDonald’s, and Pepsi — have, in effect, transformed themselves into Big Data businesses.

Coca-Cola operates over 40 interconnected social media monitoring facilities worldwide, which use AI to follow customers, analyze their online conversations, and track their behaviors. PepsiCo has developed a “fully addressable consumer database” (called “Consumer DNA”) that enables it to “see a full 360-degree view of our consumers.” McDonald’s made a significant investment in Plexure, a “mobile engagement” company specializing in giving fast food restaurants the ability “to build rich consumer profiles” and leverage the data “to provide deeply personalized offers and content that increase average transaction value” and help generate other revenues. One of its specialties is designing personalized messaging that triggers the release of the brain chemical dopamine.

The report raises particularly strong concerns about the impact of all these practices on youth of color, noting that food and beverage marketers “are appropriating some of the most powerful ‘multicultural’ icons of youth pop culture and enlisting these celebrities in marketing campaigns for sodas, ‘branded’ fast-food meals, and caffeine-infused energy drinks.” These promotions can “compound health risks for young Blacks and Hispanics,” subjecting them to “multiple layers of vulnerability, reinforcing existing patterns of health disparity that many of them experience.”

“U.S. 
companies are infecting the world’s young people with invasive, stealth, and incessant digital marketing for junk food,” commented Lori Dorfman, DrPH, director, Berkeley Media Studies Group, one of CDD’s partners on the project. “And they are targeting Black and Brown youth because they know kids of color are cultural trendsetters,” she explained. “Big Food and Big Tech run away with the profits after trampling the health of children, youth, and families.”

The Center for Digital Democracy and its allies are calling for a comprehensive and ambitious set of policies for limiting the marketing of unhealthy food and beverages to young people, arguing that U.S. policymakers must work with international health and youth advocacy organizations to develop a coordinated agenda for regulating these two powerful global industries. As the report explains, other governments in the UK, Europe, Canada, and Latin America have already developed policies for limiting or banning the promotion of foods that are high in fat, sugar, and salt, including on digital platforms. 
Yet the United States has continued to rely on an outdated self-regulatory model that does not take into account the full spectrum of Big Data and AdTech practices in today’s digital marketplace, places too much responsibility on parents, and offers only minimal protections for the youngest children.

“Industry practices have become so sophisticated, widespread, and entangled that only a comprehensive public policy approach will be able to produce a healthier digital environment for young people,” explained Katharina Kopp, PhD, CDD’s Deputy Director and Director of Research.

The report lays out an eight-point research-based policy framework:

1. Protections for adolescents as well as young children.
2. Uniform, global, science-based nutritional criteria.
3. Restrictions on brand promotion.
4. Limits on the collection and use of data.
5. Prohibition of manipulative and unfair marketing techniques and design features.
6. Market research protections for children and teens.
7. Elimination of digital racial discrimination.
8. Transparency, accountability, and enforcement.

###
  • Contact: Jeff Chester, CDD jeff@democraticmedia.org; 202-494-7100 David Monahan, CCFC, david@commercialfreechildhood.org

    Advocates say Google Play continues to disregard children’s privacy law and urge FTC to act

    BOSTON, MA and WASHINGTON, DC — March 31, 2021 — Today, advocacy groups Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) called on the Federal Trade Commission (FTC) to investigate Google’s promotion of apps that violate the Children’s Online Privacy Protection Act (COPPA). In December 2018, CCFC and CDD led a coalition of 22 consumer and health advocacy groups in asking the FTC to investigate these same practices. Since then Google has made changes to the Play Store, but the advocates say these changes fail to address the core problem: Google is certifying as safe and appropriate for children apps that violate COPPA and put children at risk. Recent studies found that a significant number of apps in Google Play violate COPPA by collecting and sharing children’s personal information without obtaining parental consent. For instance, a JAMA Pediatrics study found that 67% of apps used by children aged 5 and under were transmitting personal identifiers to third parties.

    Comment of Angela Campbell, Chair of the Board of Directors, Campaign for a Commercial-Free Childhood, Professor Emeritus, Communications & Technology Law Clinic, Georgetown University Law Center: “Parents reasonably expect that Google Play Store apps designated as ‘Teacher approved’ or appropriate for children under age 13 comply with the law protecting children’s privacy. But far too often, that is not the case. The FTC failed to act when this problem was brought to its attention over two years ago. 
Because children today are spending even more time using mobile apps, the FTC must hold Google accountable for violating children’s privacy.”

Comment of Jeff Chester, Executive Director of the Center for Digital Democracy: “The Federal Trade Commission must swiftly act to stop Google’s ongoing disregard of the privacy and well-being of children. For too long, the Commission has allowed Google’s app store, and the data marketing practices that are its foundation, to operate without enforcing the federal law that is designed to protect young people under 13. With children using apps more than ever as a consequence of the pandemic, the FTC should enforce the law and ensure Google engages with kids and families in a responsible manner.”

###
  • Press Statement, Center for Digital Democracy (CDD) and Campaign for a Commercial-Free Childhood (CCFC), 12-14-20

    Today, the Federal Trade Commission announced it will use its 6(b) authority to launch a major new study into the data collection practices of nine major tech platforms and companies: ByteDance (TikTok), Amazon, Discord, Facebook, Reddit, Snap, Twitter, WhatsApp and YouTube. The Commission’s study includes a section on children and teens. In December 2019, the Campaign for a Commercial-Free Childhood (CCFC), Center for Digital Democracy (CDD) and their attorneys at Georgetown Law’s Institute for Public Representation urged the Commission to use its 6(b) authority to better understand how tech companies collect and use data from children. Twenty-seven consumer and child advocacy organizations joined that request. Below are statements from CDD and CCFC on today’s announcement.

    Josh Golin, Executive Director, CCFC: “We are extremely pleased that the FTC will be taking a hard look at how platforms like TikTok, Snap, and YouTube collect and use young people’s data. These 6(b) studies will provide a much-needed window into the opaque data practices that have a profound impact on young people’s wellbeing. This study will not only provide critical public education, but lay the groundwork for evidence-based policies that protect young people’s privacy and vulnerabilities when they use online services to connect, learn, and play.”

    Jeff Chester, Executive Director, CDD: “The FTC is finally holding the social media and online video giants accountable, by requiring leading companies to reveal how they stealthily gather and use information that impacts our privacy and autonomy. It is especially important that the commission is also concerned about protecting teens, who are the targets of a sophisticated and pervasive marketing system designed to influence their behaviors for monetization purposes.” 
For questions, please contact: jeff@democraticmedia.org See also: https://www.markey.senate.gov/news/press-releases/senator-markey-stateme...
  • For Immediate Release September 24, 2020 Contact: Jeff Chester (202-494-7100) jeff@democraticmedia.org

    A Step Backwards for Consumer Privacy: Why Californians Should Vote No on Proposition 24

    Ventura, CA, and Washington, DC: The Center for Digital Democracy (CDD) announced today its opposition to the California Privacy Rights Act (CPRA), also known as Proposition 24, which will appear on the November 2020 California general election ballot. Prop 24 does not sufficiently strengthen Californians’ privacy and may, in fact, set a new, lower, and thus more dangerous standard for privacy protection in the U.S., according to its analyses.

    “We need strong and bold privacy legislation, not weaker standards and tinkering at the margins,” declared CDD Policy Director Katharina Kopp. “Prop 24 fails to significantly limit data uses that undermine our privacy, increase corporate manipulation and exploitation, and exacerbate racial and economic inequality. This initiative allows the much more powerful companies to set unfair terms by default. It also condones pay-for-privacy schemes, where corporations would be allowed to charge a premium (or eliminate a discount) in exchange for privacy. These schemes tend to hurt the already disadvantaged the most,” she explained.

    CDD intends to work with allies from the consumer and privacy communities to inform voters about Prop 24 and how best to protect their privacy. The Center for Digital Democracy is a leading nonprofit organization focused on empowering and protecting the rights of the public in the digital era.
  • Press Release

    Advocates Call on TikTok Suitors to Clean Up Kids’ Privacy Practices

    Groups had filed complaint at FTC documenting how TikTok flouts children’s privacy law, tracks millions of kids without parental consent.

    Contact: Katharina Kopp, CDD (kkopp@democraticmedia.org; 202-836-4621) David Monahan, CCFC (david@commercialfreechildhood.org)

    WASHINGTON, DC and BOSTON, MA — September 3, 2020 — The nation’s leading children’s privacy advocates are calling on potential buyers of TikTok “to take immediate steps to comprehensively improve its privacy and data marketing practices for young people” should they purchase the platform. In separate letters to Microsoft, Walmart, and Oracle, Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) detail TikTok’s extensive history of violating the Children’s Online Privacy Protection Act (COPPA), including a recent news report that TikTok internally classified more than one-third of its 49 million US users as fourteen or under. Given the likelihood that millions of these users are also under thirteen, the advocates urged Microsoft, Walmart, and Oracle to pledge to immediately stop collecting and processing data from any account flagged as or believed to be under thirteen if they acquire TikTok’s US operations, and only restore accounts that can be affirmatively verified as belonging to users who are thirteen or older. COPPA requires apps and websites to obtain verifiable parental consent before collecting the personal information of anyone under 13, but TikTok has not done so for its millions of accounts held by children.

    “Whoever purchases TikTok will have access to a treasure trove of ill-gotten, sensitive children’s data,” said Josh Golin, Executive Director of CCFC. “Any new owner must demonstrate their commitment to protecting young people’s privacy by immediately deleting any data that was illegally obtained from children under thirteen. 
With the keys to one of the most popular platforms for young people on the planet must come a commitment to protect children’s privacy and wellbeing.”

In February 2019, TikTok was fined $5.7 million by the Federal Trade Commission (FTC) for COPPA violations and agreed to delete children’s data and properly request parental consent before allowing children under 13 on the site and collecting more data from them. This May, CCFC, CDD, and a coalition of 20 advocacy groups filed an FTC complaint against TikTok for ignoring its promises to delete kids’ data and comply with the law. To this day, the groups say, TikTok plays by its own rules, luring millions of kids under the age of 13, illegally collecting their data, and using it to manipulatively target them with marketing. In addition, they wrote to the companies today that, “By ignoring the presence of millions of younger children on its app, TikTok is putting them at risk for sexual predation; news reports and law enforcement agencies have documented many cases of inappropriate adult-to-child contact on the app.”

In August, the groups’ allegations that TikTok had actual knowledge that millions of its users were under thirteen were confirmed by the New York Times. According to internal documents obtained by the Times, TikTok assigns an age range to each user utilizing a variety of methods, including “facial recognition algorithms that scrutinize profile pictures and videos,” “comparing their activity and social connections in the app against those of users whose ages have already been estimated,” and drawing “upon information about users that is bought from other sources.” Using these methods, more than one-third of TikTok’s 49 million users in the US were estimated to be under fourteen. Among daily users, the proportion that TikTok has designated as under fourteen rises to 47%.

“The new owners of TikTok in the U.S. 
must demonstrate they take protecting the privacy and well-being of young people seriously,” said Katharina Kopp, policy director of the Center for Digital Democracy. “The federal law protecting kids’ privacy must be complied with and fully enforced. In addition, the company should implement a series of safeguards that prohibit manipulative, discriminatory, and harmful data and marketing practices that target children and teens. Regulators should reject any proposed sale without ensuring that a robust set of safeguards for youth is in place,” she noted. ###
  • Press Release

    USDA Online Buying Program for SNAP Participants Threatens Their Privacy and Can Exacerbate Racial and Health Inequities, Says New Report

    Digital Rights, Civil Rights and Public Health Groups Call for Reforms from USDA, Amazon, Walmart, Safeway/Albertson’s and Other Grocery Retailers - Need for Safeguards Urgent During Covid-19 Crisis

    Contact: Jeff Chester (jeff@democraticmedia.org; 202-494-7100); Katharina Kopp (kkopp@democraticmedia.org); https://www.democraticmedia.org/

    Washington, DC, July 16, 2020—A pilot program designed to enable the tens of millions of Americans who participate in the USDA’s Supplemental Nutrition Assistance Program (SNAP) to buy groceries online is exposing them to a loss of their privacy through “increased data collection and surveillance,” as well as risks involving “intrusive and manipulative online marketing techniques,” according to a report from the Center for Digital Democracy (CDD).

    The report reveals how online grocers and retailers use an orchestrated array of digital techniques—including granular data profiling, predictive analytics, geolocation tracking, personalized online coupons, AI, and machine learning—to promote unhealthy products, trigger impulsive purchases, and increase overall spending at check-out. While these practices affect all consumers engaged in online shopping, the report explains, “they pose greater threats to individuals and families already facing hardship.” E-commerce data practices “are likely to have a disproportionate impact on SNAP participants, which include low-income communities, communities of color, the disabled, and families living in rural areas.
The increased reliance on these services for daily food and other household purchases could expose these consumers to extensive data collection, as well as unfair and predatory techniques, exacerbating existing disparities in racial and health equity.”

The report was funded by the Robert Wood Johnson Foundation, as part of a collaboration among four civil rights, digital rights, and health organizations: Color of Change, UnidosUS, Center for Digital Democracy, and Berkeley Media Studies Group. The groups issued a letter today to Secretary of Agriculture Sonny Perdue, urging the USDA to take immediate action to strengthen online protections for SNAP participants.

USDA launched its e-commerce pilot last year in a handful of states, with an initial set of eight retailers approved for participation: Amazon, Dash’s Market, FreshDirect, Hy-Vee, Safeway, ShopRite, Walmart and Wright’s Market. The program has rapidly expanded to a majority of states, in part as a result of the current Covid-19 health crisis, in order to enable SNAP participants to shop more safely from home while following “shelter-in-place” rules.

Through an analysis of the digital marketing and grocery e-commerce practices of the eight companies, as well as an assessment of their privacy policies, CDD found that SNAP participants and other online shoppers confront an often manipulative and nontransparent online grocery marketplace, which is structured to leverage the tremendous amounts of data gathered on consumers via their mobile devices, loyalty cards, and shopping transactions. E-commerce grocers deliberately foreground the brands and products that partner with them (which include some of the most heavily advertised, processed foods and beverages), making them highly visible on store home pages and on “digital shelves,” as well as through online coupons and well-placed reminders at the point of sale.
Grocers working with the SNAP pilot have developed an arsenal of “adtech” (advertising technology) techniques, including those that use machine learning and behavioral science to foster “frictionless shopping” and impulsive purchasing of specific foods and beverages. The AI and Big Data operations documented in the report may also lead to unfair and discriminatory data practices, such as targeting low-income communities and people of color with aggressive promotions for unhealthy food. Data collected and profiles created during online shopping may be applied in other contexts as well, leading to increased exposure to additional forms of predatory marketing, or to denial of opportunities in housing, education, employment, and financial services.

“The SNAP program is one of our nation’s greatest success stories because it puts food on the table of hungry families and money in the communities where they live,” explained Dr. Lori Dorfman, Director of the Berkeley Media Studies Group. “Shopping for groceries should not put these families in danger of being hounded by marketers intent on selling products that harm health. Especially in the time of coronavirus, when everyone has to stay home to keep themselves and their communities safe, the USDA should put digital safeguards in place so SNAP recipients can grocery shop without being manipulated by unfair marketing practices.”

CDD’s research also found that the USDA relied on the flawed and misleading privacy policies of the participating companies, which fail to provide sufficient data protections. According to the pilot’s requirements for participating retailers, privacy policies should clearly explain how a consumer’s data is gathered and used, and provide “optimal” protections. A review of these long, densely worded documents, however, reveals the failure of the companies to identify the extent and impact of their actual data operations, or the risks to consumers.
The pilot’s requirements also do not adequately limit the use of SNAP participants’ data for marketing. In addition, CDD tested the companies’ data practices for tracking customers’ behavior online, and compared them to the USDA’s requirements. The research found widespread use of so-called “third-party” tracking software (such as “cookies”), which can expose an individual’s personal data to others.

“In the absence of strong baseline privacy and e-commerce regulations in the US, the USDA’s weak safeguards are placing SNAP recipients at substantial risk,” explained Dr. Katharina Kopp, one of the report’s authors. “The kinds of e-commerce and Big Data practices we have identified through our research could pose even greater threats to communities of color, including increased commercial surveillance and further discrimination.”

“Being on SNAP, or any other assistance program, should not give corporations free rein to use intrusive and manipulative online marketing techniques on Black communities,” said Jade Magnus Ogunnaike, Senior Campaign Director at Color of Change. “Especially in the era of COVID, where online grocery shopping is a necessity, Black people should not be further exposed to a corporate surveillance system with unfair and predatory practices that exacerbate disparities in racial and health equity just because they use SNAP. The USDA should act aggressively to protect SNAP users from unfair, predatory, and discriminatory data practices.”

“The SNAP program helps millions of Latinos keep food on the table when times are tough, and our nation’s public health and economic crises have highlighted that critical role,” said Steven Lopez, Director of Health Policy at UnidosUS. “Providing enhanced access to healthy and nutritious foods at the expense of the privacy and health of communities of color is too high a price. Predatory marketing practices have been linked to increased health disparities for communities of color.
The USDA must not ignore that fact and should take strong and meaningful steps to treat all participants fairly, without discriminatory practices based on the color of their skin.”

The report calls on the USDA to “take an aggressive role in developing meaningful and effective safeguards” before moving the SNAP online purchasing system beyond its initial trial. The agency needs to ensure that contemporary e-commerce, retail, and digital marketing applications treat SNAP participants fairly, with strong privacy protections and safeguards against manipulative and discriminatory practices. The USDA should work with SNAP participants, civil rights, consumer and privacy groups, as well as retailers like Amazon and Walmart, to restructure its program to ensure the safety and well-being of the millions of people enrolled in the program. ###
  • Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100); David Monahan, CCFC (david@commercialfreechildhood.org)

    Statement from Campaign for a Commercial-Free Childhood and Center for Digital Democracy on Comments filed with FTC regarding Endorsement Guides

    WASHINGTON, DC and BOSTON, MA—June 23, 2020—Advocacy groups Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) filed comments on Monday in response to the FTC’s request for public comment on its Endorsement Guides.

    Jeff Chester, executive director, Center for Digital Democracy: "Influencer marketing should be declared an unfair and deceptive practice when it comes to children. The FTC is enabling so-called ‘kidfluencers,’ ‘brand ambassadors,’ and other ‘celebrity’ marketers to stealthily pitch kids junk food, toys and other products, despite the known risks to their privacy, personal health and security. Kids and teens are being targeted by a ‘wild west’ influencer marketing industry wherever they go online, including when they watch videos, play games, or use social media. It's time for the FTC to place the interests of America's youth before the manipulative commercial activities of influencers."

    Josh Golin, Executive Director, Campaign for a Commercial-Free Childhood: “The FTC’s failure to act has helped create an entire ecosystem of unfair and deceptive influencer marketing aimed at children. It’s past time for the Commission to send a strong message to everyone – platforms, brands, ad agencies and the influencers themselves – that children should not be targets for this insidious and manipulative marketing.”

    Angela J.
Campbell, Director Emeritus of the Institute for Public Representation’s Communications and Technology Clinic at Georgetown Law, currently chair of CCFC’s Board, and counsel to CCFC and CDD: "Influencer videos full of hidden promotions and sometimes blatant marketing have largely displaced actual programs for children. The FTC must act now to stop these deceptive and unfair practices." ###
  • Supporting the Call for Racial Justice

    The Center for Digital Democracy supports the call for racial justice and the fight against police violence, against the systemic injustices that exist in all parts of our society – inferior educational opportunities; lack of affordable, equitable health care; an unjust justice system; housing and employment discrimination; and discriminatory marketing practices.

    We grieve for the lives lost and the opportunities denied! We grieve for the everyday injustices people of color have to endure and have had to endure for centuries. We grieve for an America that could be so much more!

    Our grieving is not enough! CDD will continue its fight for data justice in support of racial and social justice.

    June 5, 2020
  • Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100); David Monahan, CCFC (david@commercialfreechildhood.org)

    Advocates Say TikTok In Contempt of Court Order

    More kids than ever use the site due to the COVID-19 quarantine, but TikTok flouts its settlement agreement with the FTC.

    WASHINGTON, DC and BOSTON, MA—May 14, 2020—Today, a coalition of leading U.S. child advocacy, consumer, and privacy groups filed a complaint urging the Federal Trade Commission (FTC) to investigate and sanction TikTok for putting kids at risk by continuing to violate the Children’s Online Privacy Protection Act (COPPA). In February 2019, TikTok paid a $5.7 million fine for violating COPPA, including illegally collecting personal information from children. But more than a year later, with quarantined kids and families flocking to the site in record numbers, TikTok has failed to delete personal information previously collected from children and is still collecting kids’ personal information without notice to and consent of parents.

    Campaign for a Commercial-Free Childhood (CCFC), the Center for Digital Democracy (CDD), and a total of 20 organizations demonstrated in their FTC filing that TikTok continues to violate COPPA by: failing to delete personal information related to children under 13 that it obtained prior to the 2019 settlement order; failing to give direct notice to parents and to obtain parents’ consent before collecting kids’ personal information; and failing to give parents the right to review or delete their children’s personal information collected by TikTok.

    TikTok makes it easy for children to evade parental consent. When a child under 13 tries to register using their actual birthdate, they will be signed up for a “younger users account” with limited functions, and no ability to share their videos.
If a child is frustrated by this limited functionality, they can immediately register again with a fake birthdate from the same device for an account with full privileges, thereby putting them at risk for both TikTok’s commercial data uses and inappropriate contact from adults. In either case, TikTok makes no attempt to notify parents or obtain their consent. And TikTok doesn’t even comply with the law for those children who stick with limited “younger users accounts.” For these accounts, TikTok collects detailed information about how the child uses the app and uses artificial intelligence to determine what to show next, to keep the child engaged online as long as possible.

The advocates, represented by the Communications & Technology Law Clinic in the Institute for Public Representation at Georgetown Law, asked the FTC to identify and hold responsible those individuals who made or ratified decisions to violate the settlement agreement. They also asked the FTC to prevent TikTok from registering any new accounts for persons in the US until it adopts a reliable method of determining the ages of its users and comes into full compliance with the children’s privacy rules. In light of TikTok’s vast financial resources, the number and severity of the violations, and the large number of US children that use TikTok, they asked the FTC to seek the maximum monetary penalties allowed by law.

Josh Golin, Executive Director of Campaign for a Commercial-Free Childhood, said, “For years, TikTok has ignored COPPA, thereby ensnaring perhaps millions of underage children in its marketing apparatus and putting children at risk of sexual predation. Now, even after being caught red-handed by the FTC, TikTok continues to flout the law.
We urge the Commission to take swift action and sanction TikTok again – this time with a fine and injunctive relief commensurate with the seriousness of TikTok’s serial violations.”

Jeff Chester, Executive Director of the Center for Digital Democracy, said, “Congress empowered the FTC to ensure that kids have online protections, yet here is another case of a digital giant deliberately violating the law. The failure of the FTC to ensure that TikTok protects the privacy of millions of children, including through its use of predictive AI applications, is another reason to question whether the agency can be trusted to effectively oversee the kids’ data law.”

Michael Rosenbloom, Staff Attorney and Teaching Fellow at the Institute for Public Representation, Georgetown Law, said, “The FTC ordered TikTok to delete all personal information of children under 13 years old from its servers, but TikTok has clearly failed to do so. We easily found that many accounts featuring children were still present on TikTok. Many of these accounts have tens of thousands to millions of followers, and have been around since before the order. We urge the FTC to hold TikTok to account for continuing to violate both COPPA and its consent decree.”

Katie McInnis, Policy Counsel at Consumer Reports, said, "During the pandemic, families and children are turning to digital tools like TikTok to share videos with loved ones. Now more than ever, effective protection of children's personal information requires robust enforcement in order to incentivize companies, including TikTok, to comply with COPPA and any relevant consent decrees.
We urge the FTC to investigate the matters raised in this complaint."

Groups signing on to the complaint to the FTC are: Campaign for a Commercial-Free Childhood, the Center for Digital Democracy, Badass Teachers Association, Berkeley Media Studies Group, Children and Screens: Institute of Digital Media and Child Development, Consumer Action, Consumer Federation of America, Consumer Reports, Defending the Early Years, Electronic Privacy Information Center, Media Education Foundation, Obligation, Inc., Parent Coalition for Student Privacy, Parents Across America, ParentsTogether Foundation, Privacy Rights Clearinghouse, Public Citizen, The Story of Stuff, United Church of Christ, and USPIRG. ###
  • Press Release

    Groups Say White House Must Show Efficacy, Protect Privacy, and Ensure Equity When Deploying Technology to Fight Virus

    Fifteen leading consumer, privacy, civil and digital rights organizations called on the federal government to set guidelines to protect individuals’ privacy, ensure equity in the treatment of individuals and communities, and communicate clearly about public health objectives in responding to the COVID-19 pandemic. There must be consensus among all relevant stakeholders on the most efficacious solution before relying on a technological fix to respond to the pandemic.

    FOR IMMEDIATE RELEASE May 5, 2020

    Contacts: Susan Grant, CFA, 202-939-1003; Katharina Kopp, CDD, 202-836-4621

    White House Must Act to Protect Privacy and Ensure Equity in Responding to COVID-19 Pandemic

    Groups Tell Pence to Set Standards to Guide Government and Public-Private Partnership Data Practices and Technology Use

    Washington, D.C. – Today, 15 leading consumer, privacy, civil and digital rights organizations called on the federal government to set guidelines to protect individuals’ privacy, ensure equity in the treatment of individuals and communities, and communicate clearly about public health objectives in responding to the COVID-19 pandemic. In a letter to Vice President Michael R. Pence, who leads the Coronavirus Task Force, the groups said that the proper use of technology and data has the potential to provide important public health benefits, but must incorporate privacy and security, as well as safeguards against discrimination and violations of civil and other rights. Developing a process to assess how effective technology and other tools will be in achieving the desired public health objectives is also vitally important, the groups said.

    The letter was signed by the Campaign for a Commercial-Free Childhood, Center for Democracy & Technology, Center for Digital Democracy, Constitutional Alliance, Consumer Action, Consumer Federation of America, Electronic Privacy Information Center (EPIC), Media Alliance, MediaJustice, Oakland Privacy, Parent Coalition for Student Privacy, Privacy Rights Clearinghouse, Public Citizen, Public Knowledge, and Rights x Tech.

    “A headlong rush into technological solutions without carefully considering how well they work and whether they could undermine fundamental American values such as privacy, equity, and fairness would be a mistake,” said Susan Grant, Director of Consumer Protection and Privacy at the Consumer Federation of America.
“Fostering public trust and confidence in the programs that are implemented to combat COVID-19 is crucial to their overall success.”

“Measures to contain the deadly spread of COVID-19 must be effective and protect those most exposed. History has taught us that the deployment of technologies is often driven by forces that tend to risk privacy, undermine fairness and equity, and place our civil rights in peril. The White House Task Force must work with privacy, consumer and civil rights groups, and other experts, to ensure that the efforts to limit the spread of the virus truly protect our interests,” said Katharina Kopp, Director of Policy, Center for Digital Democracy.

In addition to concerns about government plans that are being developed to address the pandemic, such as using technology for contact tracing, the groups noted the need to ensure that private-sector partnerships incorporate comprehensive privacy and security standards. The letter outlines 11 principles that should form the basis for standards that government agencies and the private sector can follow:

  • Set science-based, public health objectives to address the pandemic. Then design the programs and consider what tools, including technology, might be most efficacious and helpful to meet those objectives.

  • Assess how technology and other tools meet key criteria. This should be done before deployment when possible and consistent with public health demands, and on an ongoing basis. Questions should include: Can they be shown to be effective for their intended purposes? Can they be used without infringing on privacy? Can they be used without unfairly disadvantaging individuals or communities? Are there other alternatives that would help meet the objectives well without potentially negative consequences? Use of technologies and tools that are ineffective or raise privacy or other societal concerns should be discontinued promptly.

  • Protect against bias and address inequities in technology access. In many cases, communities already disproportionately impacted by COVID-19 may lack access to technology, or not be fairly represented in data sets. Any use of digital tools must ensure that nobody is left behind.

  • Set clear guidelines for how technology and other tools will be used. These should be aimed at ensuring that they will serve the public health objective while safeguarding privacy and other societal values. Public and private partners should be required to adhere to those guidelines, and the guidelines should be readily available to the public.

  • Ensure that programs such as technology-assisted contact tracing are voluntary. Individual participation should be based on informed, affirmative consent, not coercion.

  • Only collect individuals’ personal information needed for the public health objective. No other personal information should be collected in testing, contact tracing, and public information portals.

  • Do not use or share individuals’ personal information for any other purposes. It is important to avoid “mission creep” and to prevent use for purposes unrelated to the pandemic, such as advertising, law enforcement, or reputation management in non-public-health settings.

  • Secure individuals’ personal information from unauthorized access and use. Information collected from testing, contact tracing and information portals may be very revealing, even if it is not “health” information, and security breaches would severely damage public trust.

  • Retain individuals’ personal information only for as long as it is needed. When it is no longer required for the public health objective, the information should be safely disposed of.

  • Be transparent about data collection and use. Before their personal information is collected, individuals should be informed about what data is needed, the specific purposes for which the data will be used, and what rights they have over what’s been collected about them.

  • Provide accountability. There must be systems in place to ensure that these principles are followed and to hold responsible parties accountable. In addition, individuals should have clear means to ask questions, make complaints, and seek recourse in connection with the handling of their personal information.

The groups asked Vice President Pence for a meeting to discuss their concerns and suggested that the Coronavirus Task Force immediately create an interdisciplinary advisory committee comprised of experts from public health, data security, privacy, social science, and civil society to help develop effective standards.

The Consumer Federation of America is a nonprofit association of more than 250 consumer groups that was founded in 1968 to advance the consumer interest through research, advocacy, and education.

The Center for Digital Democracy (CDD) is recognized as one of the leading NGOs promoting privacy and consumer protection, fairness, and data justice in the digital age. Since its founding in 2001 (and prior to that through its predecessor organization, the Center for Media Education), CDD has been at the forefront of research, public education, and advocacy.