CDD

Press Releases

  • Press Release

    Groups Tell FTC to Investigate TikTok’s Failure to Protect Children’s Privacy

    TikTok gathers data from children despite promise made to commission

    Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100); David Monahan, CCFC (david@commercialfreechildhood.org)

    Advocates Say TikTok In Contempt of Court Order

    More kids than ever use the site due to COVID-19 quarantine, but TikTok flouts settlement agreement with the FTC

    WASHINGTON, DC and BOSTON, MA—May 14, 2020—Today, a coalition of leading U.S. child advocacy, consumer, and privacy groups filed a complaint urging the Federal Trade Commission (FTC) to investigate and sanction TikTok for putting kids at risk by continuing to violate the Children’s Online Privacy Protection Act (COPPA). In February 2019, TikTok paid a $5.7 million fine for violating COPPA, including illegally collecting personal information from children. But more than a year later, with quarantined kids and families flocking to the site in record numbers, TikTok has failed to delete personal information previously collected from children and is still collecting kids’ personal information without notice to, or consent of, parents.

    Campaign for a Commercial-Free Childhood (CCFC), the Center for Digital Democracy (CDD), and a total of 20 organizations demonstrated in their FTC filing that TikTok continues to violate COPPA by: failing to delete personal information related to children under 13 that it obtained prior to the 2019 settlement order; failing to give direct notice to parents and to obtain parents’ consent before collecting kids’ personal information; and failing to give parents the right to review or delete their children’s personal information collected by TikTok.

    TikTok makes it easy for children to avoid obtaining parental consent. When a child under 13 tries to register using their actual birthdate, they are signed up for a “younger users account” with limited functions and no ability to share their videos. If a child is frustrated by this limited functionality, they can immediately register again with a fake birthdate from the same device for an account with full privileges, thereby putting them at risk for both TikTok’s commercial data uses and inappropriate contact from adults. In either case, TikTok makes no attempt to notify parents or obtain their consent. And TikTok doesn’t even comply with the law for those children who stick with the limited “younger users accounts.” For these accounts, TikTok collects detailed information about how the child uses the app and uses artificial intelligence to determine what to show next, to keep the child engaged online as long as possible.

    The advocates, represented by the Communications & Technology Law Clinic in the Institute for Public Representation at Georgetown Law, asked the FTC to identify and hold responsible those individuals who made or ratified decisions to violate the settlement agreement. They also asked the FTC to prevent TikTok from registering any new accounts for persons in the U.S. until it adopts a reliable method of determining the ages of its users and comes into full compliance with the children’s privacy rules. In light of TikTok’s vast financial resources, the number and severity of the violations, and the large number of U.S. children who use TikTok, they asked the FTC to seek the maximum monetary penalties allowed by law.

    Josh Golin, Executive Director of Campaign for a Commercial-Free Childhood, said: “For years, TikTok has ignored COPPA, thereby ensnaring perhaps millions of underage children in its marketing apparatus, and putting children at risk of sexual predation. Now, even after being caught red-handed by the FTC, TikTok continues to flout the law. We urge the Commission to take swift action and sanction TikTok again – this time with a fine and injunctive relief commensurate with the seriousness of TikTok’s serial violations.”

    Jeff Chester, Executive Director of the Center for Digital Democracy, said: “Congress empowered the FTC to ensure that kids have online protections, yet here is another case of a digital giant deliberately violating the law. The failure of the FTC to ensure that TikTok protects the privacy of millions of children, including through its use of predictive AI applications, is another reason why there are questions about whether the agency can be trusted to effectively oversee the kids’ data law.”

    Michael Rosenbloom, Staff Attorney and Teaching Fellow at the Institute for Public Representation, Georgetown Law, said: “The FTC ordered TikTok to delete all personal information of children under 13 years old from its servers, but TikTok has clearly failed to do so. We easily found that many accounts featuring children were still present on TikTok. Many of these accounts have tens of thousands to millions of followers, and have been around since before the order. We urge the FTC to hold TikTok to account for continuing to violate both COPPA and its consent decree.”

    Katie McInnis, Policy Counsel at Consumer Reports, said: "During the pandemic, families and children are turning to digital tools like TikTok to share videos with loved ones. Now more than ever, effective protection of children's personal information requires robust enforcement in order to incentivize companies, including TikTok, to comply with COPPA and any relevant consent decrees. We urge the FTC to investigate the matters raised in this complaint."

    Groups signing on to the complaint to the FTC are: Campaign for a Commercial-Free Childhood, the Center for Digital Democracy, Badass Teachers Association, Berkeley Media Studies Group, Children and Screens: Institute of Digital Media and Child Development, Consumer Action, Consumer Federation of America, Consumer Reports, Defending the Early Years, Electronic Privacy Information Center, Media Education Foundation, Obligation, Inc., Parent Coalition for Student Privacy, Parents Across America, ParentsTogether Foundation, Privacy Rights Clearinghouse, Public Citizen, The Story of Stuff, United Church of Christ, and USPIRG.

    ###
  • Press Release

    Groups Say White House Must Show Efficacy, Protect Privacy, and Ensure Equity When Deploying Technology to Fight Virus

    Fifteen leading consumer, privacy, civil and digital rights organizations called on the federal government to set guidelines to protect individuals’ privacy, ensure equity in the treatment of individuals and communities, and communicate clearly about public health objectives in responding to the COVID-19 pandemic. The groups said there must be consensus among all relevant stakeholders on the most efficacious solution before relying on a technological fix to respond to the pandemic.

    FOR IMMEDIATE RELEASE
    May 5, 2020

    Contacts: Susan Grant, CFA, 202-939-1003; Katharina Kopp, CDD, 202-836-4621

    White House Must Act to Protect Privacy and Ensure Equity in Responding to COVID-19 Pandemic

    Groups Tell Pence to Set Standards to Guide Government and Public-Private Partnership Data Practices and Technology Use

    Washington, D.C. – Today, 15 leading consumer, privacy, civil and digital rights organizations called on the federal government to set guidelines to protect individuals’ privacy, ensure equity in the treatment of individuals and communities, and communicate clearly about public health objectives in responding to the COVID-19 pandemic.

    In a letter to Vice President Michael R. Pence, who leads the Coronavirus Task Force, the groups said that the proper use of technology and data has the potential to provide important public health benefits, but must incorporate privacy and security, as well as safeguards against discrimination and violations of civil and other rights. Developing a process to assess how effective technology and other tools will be in achieving the desired public health objectives is also vitally important, the groups said.

    The letter was signed by the Campaign for a Commercial Free Childhood, Center for Democracy & Technology, Center for Digital Democracy, Constitutional Alliance, Consumer Action, Consumer Federation of America, Electronic Privacy Information Center (EPIC), Media Alliance, MediaJustice, Oakland Privacy, Parent Coalition for Student Privacy, Privacy Rights Clearinghouse, Public Citizen, Public Knowledge, and Rights x Tech.

    “A headlong rush into technological solutions without carefully considering how well they work and whether they could undermine fundamental American values such as privacy, equity, and fairness would be a mistake,” said Susan Grant, Director of Consumer Protection and Privacy at the Consumer Federation of America. “Fostering public trust and confidence in the programs that are implemented to combat COVID-19 is crucial to their overall success.”

    “Measures to contain the deadly spread of COVID-19 must be effective and protect those most exposed. History has taught us that the deployment of technologies is often driven by forces that tend to risk privacy, undermine fairness and equity, and place our civil rights in peril. The White House Task Force must work with privacy, consumer and civil rights groups, and other experts, to ensure that the efforts to limit the spread of the virus truly protect our interests,” said Katharina Kopp, Director of Policy, Center for Digital Democracy.

    In addition to concerns about government plans being developed to address the pandemic, such as using technology for contact tracing, the groups noted the need to ensure that private-sector partnerships incorporate comprehensive privacy and security standards. The letter outlines 11 principles that should form the basis for standards that government agencies and the private sector can follow:

    1. Set science-based, public health objectives to address the pandemic. Then design the programs and consider what tools, including technology, might be most efficacious and helpful to meet those objectives.

    2. Assess how technology and other tools meet key criteria. This should be done before deployment when possible and consistent with public health demands, and on an ongoing basis. Questions should include: Can they be shown to be effective for their intended purposes? Can they be used without infringing on privacy? Can they be used without unfairly disadvantaging individuals or communities? Are there other alternatives that would help meet the objectives well without potentially negative consequences? Use of technologies and tools that are ineffective or raise privacy or other societal concerns should be discontinued promptly.

    3. Protect against bias and address inequities in technology access. In many cases, communities already disproportionately impacted by COVID-19 may lack access to technology, or not be fairly represented in data sets. Any use of digital tools must ensure that nobody is left behind.

    4. Set clear guidelines for how technology and other tools will be used. These should be aimed at ensuring that they will serve the public health objective while safeguarding privacy and other societal values. Public and private partners should be required to adhere to those guidelines, and the guidelines should be readily available to the public.

    5. Ensure that programs such as technology-assisted contact tracing are voluntary. Individual participation should be based on informed, affirmative consent, not coercion.

    6. Only collect individuals’ personal information needed for the public health objective. No other personal information should be collected in testing, contact tracing, and public information portals.

    7. Do not use or share individuals’ personal information for any other purposes. It is important to avoid “mission creep” and to prevent use for purposes unrelated to the pandemic, such as advertising, law enforcement, or reputation management in non-public health settings.

    8. Secure individuals’ personal information from unauthorized access and use. Information collected from testing, contact tracing and information portals may be very revealing, even if it is not “health” information, and security breaches would severely damage public trust.

    9. Retain individuals’ personal information only for as long as it is needed. When it is no longer required for the public health objective, the information should be safely disposed of.

    10. Be transparent about data collection and use. Before their personal information is collected, individuals should be informed about what data is needed, the specific purposes for which the data will be used, and what rights they have over what’s been collected about them.

    11. Provide accountability. There must be systems in place to ensure that these principles are followed and to hold responsible parties accountable. In addition, individuals should have clear means to ask questions, make complaints, and seek recourse in connection with the handling of their personal information.

    The groups asked Vice President Pence for a meeting to discuss their concerns and suggested that the Coronavirus Task Force immediately create an interdisciplinary advisory committee comprised of experts from public health, data security, privacy, social science, and civil society to help develop effective standards.

    The Consumer Federation of America is a nonprofit association of more than 250 consumer groups that was founded in 1968 to advance the consumer interest through research, advocacy, and education.

    The Center for Digital Democracy (CDD) is recognized as one of the leading NGOs promoting privacy and consumer protection, fairness, and data justice in the digital age. Since its founding in 2001 (and prior to that through its predecessor organization, the Center for Media Education), CDD has been at the forefront of research, public education, and advocacy.
  • Press Release

    Children’s privacy advocates call on FTC to require Google, Disney, AT&T and other leading companies to disclose how they gather and use data to target kids and families

    Threats to young people from digital marketing and data collection are heightened by home schooling and increased video and mobile streaming in response to COVID-19

    Contact: Jeffrey Chester, CDD, jeff@democraticmedia.org, 202-494-7100; Josh Golin, CCFC, josh@commercialfreechildhood.org, 339-970-4240

    WASHINGTON, DC and BOSTON, MA – March 26, 2020 – With children and families even more dependent on digital media during the COVID-19 crisis, the Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) called on the Federal Trade Commission (FTC) to require leading digital media companies to turn over information on how they target kids, including the data they collect. In a letter to the FTC, the advocates proposed a series of questions to shed light on the array of opaque data collection and digital marketing practices that tech companies employ to target kids. The letter includes a proposed list of numerous digital media, marketing, and edtech companies that should be the targets of the FTC’s investigation—among them Google, Zoom, Disney, Comcast, AT&T, Viacom, and the edtech companies Edmodo and Prodigy. The letter—sent by the Institute for Public Representation at Georgetown Law, attorneys for the advocates—is in response to the FTC’s early review of the rules protecting children under the Children’s Online Privacy Protection Act (COPPA).

    The groups said “children’s privacy is under siege more than ever,” and urged the FTC “not to take steps that could undermine strong protections for children’s privacy without full information about a complex data collection ecosystem.” The groups ask the Commission to request vital information from two key sectors that greatly impact the privacy of children: the edtech industry, which provides information and technology applications in the K-12 school setting, and the commercial digital data and marketing industry, which provides the majority of online content and communications for children, including apps, video streaming, and gaming. The letter suggests numerous questions for the FTC to get to the core of how digital companies conduct business today, including contemporary Big Data practices that capture, analyze, track, and target children across platforms.

    “With schools closed across the country, American families are more dependent than ever on digital media to educate and occupy their children,” said CCFC’s Executive Director, Josh Golin. “It’s now urgent that the FTC use its full authority to shed light on the business models of the edtech and children’s digital media industries so we can understand what Big Tech knows about our children and what they are doing with that information. The stakes have never been higher.”

    “Although children’s privacy is supposed to be protected by federal law and the FTC, young people remain at the epicenter of a powerful data-gathering and commercial online advertising system," said Dr. Katharina Kopp, Deputy Director of the Center for Digital Democracy. “We call on the FTC to investigate how companies use data about children, how these data practices work against children’s interests, and also how they impact low-income families and families of color. Before it proposes any changes to the COPPA rules, the FTC needs to obtain detailed insights into how contemporary digital data practices pose challenges to protecting children. Given the outsize intrusion of commercial surveillance into children’s and families’ lives via digital services for education, entertainment, and communication, the FTC must demonstrate it is placing the welfare of kids as its highest priority.”

    In December, CCFC and CDD led a coalition of 31 groups—including the American Academy of Pediatrics, Center for Science in the Public Interest, Common Sense Media, Consumer Reports, Electronic Privacy Information Center, and Public Citizen—in calling on the FTC to use its subpoena authority. The groups said the Commission must better assess the impacts on children from today’s digital data-driven advertising system, and features such as cross-device tracking, artificial intelligence, machine learning, virtual reality, and real-time measurement.

    “Childhood is more digital than ever before, and the various ways that children's data is collected, analyzed, and used have never been more complex or opaque,” said Lindsey Barrett, Staff Attorney and Teaching Fellow at IPR’s Communications and Technology Law Clinic at Georgetown Law. “The Federal Trade Commission should shed light on how children's privacy is being invaded at home, at school, and throughout their lives by investigating the companies that profit from collecting their data, and cannot undertake an informed and fact-based revision of the COPPA rules without doing so.”

    "Children today, more than ever, have an incredible opportunity to learn, play, and socialize online,” said Celia Calano, student attorney at the Institute for Public Representation. “But these modern playgrounds and classrooms come with new safety concerns, including highly technical and obscure industry practices. The first step to improving the COPPA Rule and protecting children online is understanding the current landscape—something the FTC can achieve with a 6(b) investigation."

    ###
  • Press Release

    Popular Dating, Health Apps Violate Privacy

    Leading Consumer and Privacy Groups Urge Congress, the FTC, State AGs in California, Texas, Oregon to Investigate

    For Immediate Release: Jan. 14, 2020

    Contact: David Rosen, drosen@citizen.org, (202) 588-7742; Angela Bradbery, abradbery@citizen.org, (202) 588-7741

    WASHINGTON, D.C. – Nine consumer groups today asked the Federal Trade Commission (FTC), congressional lawmakers and the state attorneys general of California, Texas and Oregon to investigate several popular apps available in the Google Play Store. A report released today by the Norwegian Consumer Council (NCC) alleges that the apps are systematically violating users’ privacy. The report found that 10 well-known apps – Grindr, Tinder, OkCupid, Happn, Clue, MyDays, Perfect365, Qibla Finder, My Talking Tom 2 and Wave Keyboard – are sharing information they collect on users with third-party advertisers without users’ knowledge or consent. The European Union’s General Data Protection Regulation forbids sharing information with third parties without users’ knowledge or consent.

    When it comes to drafting a new federal privacy law, American lawmakers cannot trust input from companies that do not respect user privacy, the groups maintain. Congress should use the findings of the report as a roadmap for a new law that ensures that the flagrant violations of privacy found in the EU are not acceptable in the U.S.

    The new report alleges that these apps (and likely a great many others) are allowing commercial third parties to collect, use and share sensitive consumer data in a way that is hidden from the user and involves parties that the consumer neither knows about nor would be familiar with. Although consumers can limit some tracking on desktop computers through browser settings and extensions, the same cannot be said for smartphones and tablets. As consumers use their smartphones throughout the day, the devices record information about sensitive topics such as their health, behavior, religion, interests and sexuality.

    “Consumers cannot avoid being tracked by these apps and their advertising partners because they are not provided with the necessary information to make informed choices when launching the apps for the first time. In addition, consumers are unable to make an informed choice because the extent of tracking, data sharing, and the overall complexity of the adtech ecosystem is hidden and incomprehensible to average consumers,” the letters sent to lawmakers and regulators warn.

    The nine groups are the American Civil Liberties Union of California, Campaign for a Commercial-Free Childhood, the Center for Digital Democracy, Consumer Action, Consumer Federation of America, Consumer Reports, the Electronic Privacy Information Center (EPIC), Public Citizen and U.S. PIRG. In addition to calling for an investigation, the groups are calling for a strong federal digital privacy law that includes a new data protection agency, a private right of action and strong enforcement mechanisms.

    Below are quotes from groups that signed the letters:

    “Every day, millions of Americans share their most intimate personal details on these apps, upload personal photos, track their periods and reveal their sexual and religious identities. But these apps and online services spy on people, collect vast amounts of personal data and share it with third parties without people’s knowledge. Industry calls it adtech. We call it surveillance. We need to regulate it now, before it’s too late.” – Burcu Kilic, digital rights program director, Public Citizen

    “The NCC’s report makes clear that any state or federal privacy law must provide sufficient resources for enforcement in order for the law to effectively protect consumers and their privacy. We applaud the NCC’s groundbreaking research on the adtech ecosystem underlying popular apps and urge lawmakers to prioritize enforcement in their privacy proposals.” – Katie McInnis, policy counsel, Consumer Reports

    “U.S. PIRG is not surprised that U.S. firms are not complying with laws giving European consumers and citizens privacy rights. After all, the phalanx of industry lobbyists besieging Washington, D.C., has been very clear that its goal is simply to perpetuate a 24/7/365 surveillance capitalism business model, while denying states the right to protect their citizens better and denying consumers any real rights at all.” – Ed Mierzwinski, senior director for consumer programs, U.S. PIRG

    “This report reveals how the failure of the U.S. to enact effective privacy safeguards has unleashed an out-of-control and unaccountable monster that swallows up personal information in the EU and elsewhere. The long-unregulated business practices of digital media companies have shredded the rights of people and communities to use the internet without fear of surveillance and manipulation. U.S. policymakers have been given a much-needed wake-up call by Norway that it’s overdue for the enactment of laws that bring meaningful change to the now-lawless digital marketplace.” – Jeff Chester, executive director, Center for Digital Democracy

    “For those of us in the U.S., this research by our colleagues at the Norwegian Consumer Council completely debunks the argument that we can protect consumers’ privacy in the 21st century with the old notice-and-opt-out approach, which some companies appear to be clinging to in violation of European law. Business practices have to change, and the first step to accomplishing that is to enact strong privacy rights that government and individuals can enforce.” – Susan Grant, director of consumer protection and privacy, Consumer Federation of America

    “The illuminating report by our EU ally the Norwegian Consumer Council highlights just how impossible it is for consumers to have any meaningful control over how apps and advertising technology players track and profile them. That’s why Consumer Action is pressing for comprehensive U.S. federal privacy legislation and subsequent strong enforcement efforts. Enough is enough already! Congress must protect us from ever-encroaching privacy intrusions.” – Linda Sherry, director of national priorities, Consumer Action

    “For families who wonder what they’re trading off for the convenience of apps like these, this report makes the answer clear. These companies are exploiting us – surreptitiously collecting sensitive information and using it to target us with marketing. It’s urgent that Congress pass comprehensive legislation which puts the privacy interests of families ahead of the profits of businesses. Thanks to our friends at the Norwegian Consumer Council for this eye-opening research.” – David Monahan, campaign manager, Campaign for a Commercial-Free Childhood

    “This report highlights the pervasiveness of corporate surveillance and the failures of the FTC notice-and-choice model for privacy protection. Congress should pass comprehensive data protection legislation and establish a U.S. Data Protection Agency to protect consumers from the privacy violations of the adtech industry.” – Christine Bannan, consumer protection counsel, EPIC
  • Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100); David Monahan, CCFC (david@commercialfreechildhood.org; 617-896-9397)

    Groups Praise Sen. Markey and Google for Ensuring Children on YouTube Receive Key Safeguards

    BOSTON, MA & WASHINGTON, DC—December 18, 2019—The organizations that spurred the landmark FTC settlement with Google over COPPA violations applauded the announcement of additional advertising safeguards for children on YouTube today. The Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) commended Google for announcing it would apply most of the robust marketing protections from YouTube Kids, including no advertising of food, beverages, or harmful products, to all child-directed content on its main YouTube platform. The groups also lauded Senator Markey for securing a public commitment from Google to implement these long-overdue safeguards. The advocates expressed disappointment, however, that Google did not agree to prohibit paid influencer marketing and product placement to children on YouTube as it does on YouTube Kids.

    “Sen. Ed Markey has long been and remains the champion for kids,” said Jeff Chester, CDD’s executive director. “Through the intervention of Sen. Markey, Google has finally committed to protecting children whether they are on the main YouTube platform or using the YouTube Kids app. Google has acted responsibly in announcing that its advertising policies now prohibit any food and beverage marketing on YouTube Kids, as well as ads involving ‘sexually suggestive, violent or dangerous content.’ However, we remain concerned that Google may try to weaken these important child- and family-friendly policies in the near future. Thus we call on Google to commit to keeping these rules in place, and to implement other needed safeguards that children deserve,” added Chester.

    Josh Golin, Executive Director of CCFC, said, “We are so grateful to Senator Markey for his leadership on one of the most crucial issues faced by children and families today. And we commend Google for implementing a robust set of advertising safeguards on the most popular online destination for children. We urge Google to take another critical step and prohibit child-directed influencer content on YouTube; if this manipulative marketing isn’t allowed on children’s TV or YouTube Kids, it shouldn’t be targeted to children on the main YouTube platform either.”

    ###
  • Washington, December 11, 2019 – In comments filed today in response to the Federal Trade Commission’s review of COPPA, the Center for Digital Democracy, the Campaign for a Commercial-Free Childhood, the American Academy of Pediatrics, and a total of 19 advocacy groups faulted the FTC for failing to engage in sufficient enforcement and oversight of the children’s privacy law. The groups suggested how COPPA can better protect children’s privacy, and urged the Commission not to weaken the law to satisfy industry’s thirst for more data about kids. The advocates also urged the FTC to first investigate the children’s “kid tech” market before it proposes any changes in how to implement its rules.

    The following can be attributed to Jeff Chester, Executive Director, Center for Digital Democracy: “Children are at greater risk today of losing their digital privacy because the FTC has failed to enforce COPPA. For years, the Commission has allowed Google and many others to ignore the landmark bipartisan law designed to protect children under 13. It’s time for the FTC to stand up to the big data companies and put the interests of young people and families first.”

    The following can be attributed to Josh Golin, Executive Director, Campaign for a Commercial-Free Childhood: “This is a critical moment for the future of children’s online privacy. The ink is barely dry on the FTC’s first major COPPA enforcement, and already industry is mobilizing to weaken the rules. The FTC should not make any changes to COPPA until it uses its authority to learn exactly how Big Tech is collecting and monetizing our children’s data.”

    The following can be attributed to Kyle Yasuda, MD, FAAP, President, American Academy of Pediatrics: “Keeping children safe and healthy where they learn and grow is core to what pediatricians do every day, and today more than ever before that extends to the digital spaces that children inhabit. The Children’s Online Privacy Protection Act is a foundational law that helps hold companies accountable to basic standards of safety when it comes to children’s digital privacy, but it’s only as effective as its enforcement by the Federal Trade Commission. Before any major changes are made to COPPA, we must ensure that the FTC is doing its part to keep children safe wherever they engage online.”

    The following can be attributed to Laura Moy, Associate Professor of Law, Director of the Communications and Technology Law Clinic, Institute for Public Representation at Georgetown University Law Center: “A recent survey showed that the majority of Americans feel that ‘the threat to personal privacy online is a crisis.’ We are at a critical point in our nation’s history right now—when we are deciding whether or not to allow companies to track, profile, and target us to an extent that compromises our ability to be and make decisions for ourselves. At the forefront of that discussion are children. We must protect the next generation from inappropriate tracking so that they can grow up with privacy and dignity. To make good on that, the FTC must thoroughly investigate how companies are collecting and using children’s data, and must enforce and strengthen COPPA.”
  • Press Release

    Leading child advocacy, health, and privacy groups call on FTC to Investigate Children’s Digital Media Marketplace Before Proposing any Changes to Privacy Protections for Children

    Threats to young people from digital marketing and data collection must be analyzed to ensure meaningful safeguards under the Children’s Online Privacy Protection Act (COPPA).

    EMBARGOED UNTIL DECEMBER 5, 2019 AT 12:01 AM

    Contact: Jeffrey Chester, CDD, jeff@democraticmedia.org, (202) 494-7100; Josh Golin, CCFC, josh@commercialfreechildhood.org, 617-896-9369

    WASHINGTON, DC and BOSTON, MA – December 5, 2019 – A coalition of 31 advocacy groups is urging the Federal Trade Commission to use its subpoena authority to obtain information from leading digital media companies that target children online. In comments filed today by the Institute for Public Representation at Georgetown Law and organized by the Center for Digital Democracy (CDD) and the Campaign for a Commercial-Free Childhood (CCFC), the coalition detailed the opaque data and digital marketing practices targeting kids. The comments were filed with the FTC as part of its early review of the rules protecting children under the Children’s Online Privacy Protection Act (COPPA). The advocates’ call was supported by Sesame Workshop, the leading producer of children’s educational programming, in a separate filing.

    To better assess the impacts on children of today’s data-driven advertising system, and of features such as cross-device tracking, artificial intelligence, machine learning, virtual reality, and real-time measurement, the advocates urge the Commission to gather and analyze data from leading companies that target children. Any proposed changes to COPPA must be based on empirical data, consistent with calls by Commissioners Wilson and Phillips and Chairman Simons that rulemaking be evidence-based.
    In their comments, the organizations ask the FTC to use its authority under Section 6(b) of the FTC Act to:

    - Examine today’s methods of advertising to children and their impact, including their discriminatory effects
    - Examine practices concerning data collection and retention
    - Illuminate children’s presence on “general audience” platforms and those platforms’ awareness of children’s presence
    - Identify how children’s data is being used by contemporary data platforms, including “marketing clouds,” “identity management” systems, in-house data management platforms, and data brokers
    - Illuminate the efficacy—or lack thereof—of safe harbors

    Groups that signed the comments are: Campaign for a Commercial-Free Childhood; Center for Digital Democracy; American Academy of Pediatrics; Badass Teachers Association; Berkeley Media Studies Group; Center for Science in the Public Interest; Children and Screens; Color of Change; Common Sense Media; Consumer Action; Consumer Federation of America; Consumer Federation of California; Consumer Reports; Consumer Watchdog; Corporate Accountability; Defending the Early Years; Electronic Frontier Foundation; Electronic Privacy Information Center; Obligation, Inc.; Parent Coalition for Student Privacy; Parents Across America; Parents Television Council; P.E.A.C.E. (Peace Educators Allied For Children Everywhere); Privacy Rights Clearinghouse; Public Citizen; Public Knowledge; The Story of Stuff; TRUCE (Teachers Resisting Unhealthy Childhood Entertainment); UnidosUS; United Church of Christ; and U.S. Public Interest Research Group (U.S. PIRG).

    The following can be attributed to Kyle Yasuda, MD, FAAP, President, American Academy of Pediatrics: “As children become more digitally connected, it becomes even more important for parents, pediatricians and others who care for young children to understand how digital media impacts their health and development.
Since digital technology evolves rapidly, so must our understanding of how data companies are engaging with children’s information online. As we pursue the promise of digital media for children’s development, we must design robust protections to keep them safe based on an up-to-date understanding of the digital spaces they navigate.”

    The following can be attributed to Josh Golin, Executive Director of Campaign for a Commercial-Free Childhood: “As kids are spending more time than ever on digital devices, we need the full power of the law to protect them from predatory data collection -- but we can’t protect children from Big Tech business models if we don’t know how those models truly work. The FTC must use its full authority to investigate opaque data and marketing practices before making any changes to COPPA. We need to know what Big Tech knows about our kids.”

    The following can be attributed to Katharina Kopp, Director of Policy, Center for Digital Democracy (CDD): “Children are being subjected to a purposefully opaque ‘Big Data’ digital marketing system that continually gathers their information when they are online. The FTC must use its authority to understand how new and evolving advertising practices targeting kids really work, and whether these data practices are having a discriminatory or other harmful impact on their lives.”

    The following can be attributed to James P. Steyer, CEO and Founder of Common Sense: “Kids and families have to be the priority in any changes to COPPA, and in order to do that, we must fully understand what the industry is and isn’t doing when it comes to tracking and targeting kids. Tech companies are never going to be transparent about their business practices, which is why it is critical that the FTC use its authority to look behind the curtain and shed light on what they are doing when it comes to kids, so that if any new rules are needed, they can be smart and well-informed.”
    The following can be attributed to Katie McInnis, Policy Counsel, Consumer Reports: “We’re glad the FTC is asking for comments on the implementation of COPPA through the 2013 COPPA rule. But the Commission should have the fullest possible picture of how children’s personal information is being collected and used before it considers any changes. It’s well-documented that compliance with COPPA is uneven among apps, connected toys, and online services. The FTC must fully understand how kids’ personal information is treated before the 2013 rule can be modified, in order to ensure that children and their data are protected.”

    The following can be attributed to Marc Rotenberg, President, Electronic Privacy Information Center (EPIC): “The FTC should complete its homework before it proposes changes to the regulations that safeguard children’s privacy. Without a clear understanding of current industry practices, the agency’s proposal will be ill-informed and counterproductive.”

    The following can be attributed to Lindsey Barrett, Staff Attorney and Teaching Fellow, Institute for Public Representation, Georgetown Law: “The FTC should conduct 6(b) studies to shed light on the complex and evolving profiling practices that violate children’s privacy. Children are being monitored, quantified, and analyzed more than ever before, and the Commission cannot make informed decisions about the rules that protect them online based on limited or skewed information about the online ecosystem.”

    The following can be attributed to Robert Weissman, President, Public Citizen: “The online corporate predators are miles ahead of the FTC, employing surveillance and targeting tactics against children that flout the protections enshrined in COPPA. The first thing the FTC should do is invoke its investigative powers to get a firm grasp on how Big Tech is systematically invading children’s privacy.”

    The following can be attributed to Cheryl A.
Leanza, Policy Advisor, UCC OC Inc.: “In the modern era, our data are our lives, and our children’s lives are monitored and tracked in more detail than in any previous generation, to unknown effect. Parents seek to pass on their own values and priorities to their children, but feel subverted at every turn by unknown algorithms and marketing efforts directed at their children. At a minimum, the FTC must collect basic facts and trends about children and their data privacy.”

    The following can be attributed to Eric Rodriguez, Senior Vice President, UnidosUS: “All children should have the right to privacy and to live free from discrimination, including in digital spaces. Latino children are targeted by digital marketing efforts, with real consequences for their health and wellbeing. UnidosUS urges the Commission to use its authority and study how children of color operate in the digital space, what happens to their personal data, and how well they are protected by COPPA. Only then can the Commission take effective and objective action to strengthen COPPA to protect an increasingly diverse youth population.”
    Jeff Chester
  • SUBJECT: CCFC and CDD statement on today’s YouTube inquiry by Senator Markey

    Campaign for a Commercial-Free Childhood and the Center for Digital Democracy, whose complaint led to the FTC settlement requiring YouTube to change its practices to comply with federal children’s privacy law, applaud Senator Ed Markey for writing to Google to inquire about YouTube’s child-directed advertising practices.

    “To its credit, Google has developed a robust set of safeguards and policies on YouTube Kids to protect children from advertising for harmful products and exploitative influencer marketing. Now that Google has been forced to abandon the fiction that the main YouTube platform is exclusively for ages 13 and up, it should apply the same protections to all child-directed content, regardless of which YouTube platform kids are using.” – Josh Golin, Campaign for a Commercial-Free Childhood

    “Google should treat all children fairly on YouTube and apply the same set of advertising and content safeguards it has specially developed for YouTube Kids. When young people view child-directed programming on YouTube, they should also be protected from harmful and unfair practices such as ‘influencer’ marketing, exposure to ‘dangerous’ material and violent content, and exposure to food and beverage marketing.” – Jeff Chester, Center for Digital Democracy
  • Press Release

    Grading Digital Privacy Proposals in Congress

    Which digital privacy proposals in Congress make the grade?

    Subject: Which digital privacy proposals in Congress make the grade?

    Nov. 21, 2019

    Contact: David Rosen, drosen@citizen.org, (202) 588-7742; Susan Grant, sgrant@consumerfed.org, (202) 387-6121; Caitriona Fitzgerald, fitzgerald@epic.org, (617) 945-8409; Katharina Kopp, kkopp@democraticmedia.org, (202) 836-4621

    Campaign for a Commercial-Free Childhood · Center for Digital Democracy · Color of Change · Consumer Federation of America · Consumer Action · Electronic Privacy Information Center · Parent Coalition for Student Privacy · Privacy Rights Clearinghouse · Public Citizen · U.S. PIRG

    NOTE TO REPORTERS

    When it comes to digital privacy, we’re facing an unprecedented crisis. Tech giants are spying on our families and selling the most intimate details about our lives for profit. Bad actors, both foreign and domestic, are targeting personal data gathered by U.S. companies – including our bank details, email messages and Social Security numbers. Algorithms used to determine eligibility for jobs, housing, credit, insurance and other life necessities are having disparate, discriminatory impacts on disadvantaged groups. We need a new approach. Consumer, privacy and civil rights groups are encouraged by some of the bills that have recently been introduced in Congress, many of which follow recommendations in the groups’ Framework for Comprehensive Privacy Protection and Digital Rights in the United States.
    The framework calls for baseline federal privacy legislation that:

    - Has a clear and comprehensive definition of personal data;
    - Establishes an independent data protection agency;
    - Establishes a private right of action allowing individuals to enforce their rights;
    - Establishes individual rights to access, control and delete data;
    - Puts meaningful privacy obligations on companies that collect personal data;
    - Requires the establishment of algorithmic governance to advance fair and just data practices;
    - Requires companies to minimize privacy risks and minimize data collection;
    - Prohibits take-it-or-leave-it or pay-for-privacy terms;
    - Limits government access to personal data; and
    - Does not preempt stronger state laws.

    Three bills attained the highest marks in the recent Privacy Legislation Scorecard compiled by the Electronic Privacy Information Center (EPIC):

    - The Online Privacy Act (H.R. 4978), introduced by U.S. Reps. Anna Eshoo (D-Calif.) and Zoe Lofgren (D-Calif.), takes a comprehensive approach and is the only bill that calls for a U.S. Data Protection Agency. The bill establishes meaningful rights for individuals and clear obligations for companies. It does not preempt state law, but it lacks explicit anti-preemption language, which would make it more effective.
    - The Mind Your Own Business Act (S. 2637), introduced by U.S. Sen. Ron Wyden (D-Ore.), requires companies to assess the impact of the automated systems they use to make decisions about consumers and how well their data protection mechanisms are working. It has explicit anti-preemption language and holds companies accountable when they fail to protect privacy. The private right of action should be broader, and the bill needs clear limits on data uses.
    - The Privacy Rights for All Act (S. 1214), introduced by U.S. Sen.
Ed Markey (D-Mass.), has important provisions minimizing data collection and delinking user identities from collected data, and prohibits bias and discrimination in automated decision-making. It also includes a strong private right of action and bans forced arbitration for violations. It does not preempt state law, but it lacks explicit anti-preemption language, which would make it more effective.

    Two bills are plainly anti-privacy. The Information Transparency & Personal Data Control Act (H.R. 2013), introduced by U.S. Rep. Suzan DelBene (D-Wash.), falls woefully short: it provides few protections for individuals, contains overly broad exemptions and preempts stronger state laws. The Balancing the Rights Of Web Surfers Equally and Responsibly (BROWSER) Act (S. 1116), introduced by U.S. Sen. Marsha Blackburn (R-Tenn.), is based on the old, ineffective take-it-or-leave-it terms-of-use model, does not allow agency rulemaking, is weak on enforcement and preempts state laws.

    Future federal privacy bills must make the grade. Additional privacy bills are expected to be introduced by U.S. Sen. Maria Cantwell (D-Wash.) and U.S. Rep. Jan Schakowsky (D-Ill.). Separately, U.S. Sens. Richard Blumenthal (D-Conn.), Roger Wicker (R-Miss.) and Josh Hawley (R-Mo.) may release their own bills. These leaders should strive to meet the standards that the framework lays out.

    Baseline privacy legislation must not preempt stronger state protections and laws – such as the California Consumer Privacy Act, which takes effect in 2020; biometric data protection laws such as those in Illinois and Texas; and data breach notification laws that exist in every state. States must be allowed to continue serving as “laboratories of democracy,” pioneering innovative new protections to keep up with rapidly changing technologies.
    In addition, federal privacy legislation must include a strong private right of action – a crucial tool consumers need to enforce their rights and change the behavior of powerful corporations – and establish safeguards against data practices that lead to unjust, unfair, manipulative and discriminatory outcomes. For more information, see these fact sheets. Please contact any of the individuals listed above to speak with an expert. ###
  • Press Release

    Will the FTC Weaken Children’s Privacy Rules?

    Invited Advocates Raise Concerns About Upcoming COPPA Workshop and Plans to Undermine Federal Protections for Kids; October 7 D.C. lineup dominated by tech industry supporters

    Contact: David Monahan, CCFC (david@commercialfreechildhood.org; 617-896-9397); Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100)

    WHAT: The Future of the Children’s Online Privacy Protection Act Rule (COPPA): An FTC Workshop
    WHEN: October 7, 2019, 9:00 am ET
    WHERE: Constitution Center, 400 7th St SW, Washington, DC

    WORKSHOP PRESENTERS FOR CAMPAIGN FOR A COMMERCIAL-FREE CHILDHOOD (CCFC) AND CENTER FOR DIGITAL DEMOCRACY (CDD):

    THE CHALLENGE: In 2012, the FTC approved new safeguards to protect children’s privacy in the digital era, heeding the advice of child advocates, consumer groups, privacy experts and health professionals. But now the Commission has called for comments on COPPA three years before a new review is mandated by statute. The questions posed by the Commission, as well as public comments made by FTC staff, make privacy advocates wary that the FTC’s goal is to roll back COPPA safeguards rather than strengthen protections for children. Concerns about the FTC creating new loopholes or supporting industry calls to weaken the rules are heightened by the FTC’s speaker list for this workshop, replete with tech and marketing companies and their lawyers and lobbyists, with just a few privacy and children’s advocates at the table. The advocates are also concerned that the FTC is contemplating this action just weeks after its most significant COPPA enforcement action to date—requiring major changes to Google’s data collection practices on YouTube—a move that could result in rules being changed before those new practices have even been implemented. Children and families need increased COPPA enforcement, not weaker rules.
    The key problems, the advocates note, are the lack of enforcement of the law by the FTC; the failure of the agency to protect children from unfair marketing practices, such as influencer marketing; and the need to maintain the strongest possible safeguards—whether in the home, at school or on mobile devices.

    Speakers at the workshop include:

    - Josh Golin, Executive Director, CCFC, will participate in a panel entitled Scope of the COPPA Rule.
    - Katharina Kopp, Ph.D., Deputy Director, Director of Policy, CDD, will participate in a panel entitled Uses and Misuses of Persistent Identifiers.
    - Laura M. Moy, Associate Professor of Law, Director, Communications & Technology Law Clinic, Georgetown University Law Center, will participate in a panel entitled State of the World in Children’s Privacy.

    Josh, Katharina, and Laura are available for questions in advance of the workshop, and will also be available to speak with press on site.

    See video of the Future of COPPA Workshop here: https://www.ftc.gov/news-events/audio-video/video/future-coppa-rule-ftc-...

    ###
  • Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100); David Monahan, CCFC (david@commercialfreechildhood.org; 617-896-9397)

    Advocates Who Filed the Privacy Complaint Against Google/YouTube Laud Improvements, But Say FTC Settlement Falls Far Short

    BOSTON, MA & WASHINGTON, DC—September 4, 2019—The advocates who triggered the Federal Trade Commission’s (FTC) investigation into YouTube’s violations of the Children’s Online Privacy Protection Act (COPPA) say the FTC’s settlement with Google will likely significantly reduce behavioral marketing to children on YouTube, but doesn’t do nearly enough to ensure children will be protected or to hold Google accountable.

    In April 2018, Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD), through their attorneys at Georgetown Law’s Institute for Public Representation (IPR), filed an FTC complaint detailing YouTube’s COPPA violations. Twenty-one other privacy and consumer groups signed on to CCFC and CDD’s complaint, which detailed how Google profits by collecting personal information from kids on YouTube, without first providing direct notice to parents and obtaining their consent as required by law. Google uses this information to target advertisements to children across the internet and across devices, in clear violation of COPPA.

    Today, the FTC and the New York Attorney General announced a settlement with Google, fining the company $170 million. The settlement also “requires Google and YouTube to develop, implement, and maintain a system that permits channel owners to identify their child-directed content on the YouTube platform so that YouTube can ensure it is complying with COPPA.” Content creators will be asked to disclose if they consider their videos to be child-directed; if they do, no behavioral advertising will be served to viewers of those videos.
    “We are pleased that our advocacy has compelled the FTC to finally address YouTube’s longstanding COPPA violations and that there will be considerably less behavioral advertising targeted to children on the number one kids’ site in the world,” said CCFC’s Executive Director Josh Golin. “But it’s extremely disappointing that the FTC isn’t requiring more substantive changes or doing more to hold Google accountable for harming children through years of illegal data collection. A plethora of parental concerns about YouTube – from inappropriate content and recommendations to excessive screen time – can all be traced to Google’s business model of using data to maximize watch time and ad revenue.”

    In a July 3, 2019 letter to the FTC, the advocates specifically warned that shifting the burden of COPPA compliance from Google and YouTube to content creators would be ineffective. The letter noted many children’s channels were unlikely to become COPPA compliant by turning off behavioral advertising, since Google warns that turning off these ads “may significantly reduce your channel’s revenue.” The letter also detailed Google’s terrible track record of ensuring COPPA compliance on its platforms: a 2018 study found that 57% of apps in the Google Play Store’s Designed for Families program were violating COPPA despite Google’s policy that apps in the program must be COPPA compliant. And as Commissioner Rebecca Slaughter wrote in her dissent, many children’s content creators are not U.S.-based and therefore are unlikely to be concerned about FTC enforcement.
    “We are gratified that the FTC has finally forced Google to confront its longstanding lie that it wasn’t targeting children on YouTube,” said CDD’s executive director Jeff Chester, who helped spearhead the campaign that led to the 1998 passage of COPPA. “However, we are very disappointed that the Commission failed to penalize Google sufficiently for its ongoing violations of COPPA and failed to hold Google executives personally responsible for the roles they played. A paltry financial penalty of $170 million—from a company that earned nearly $137 billion in 2018 alone—sends a signal that if you are a politically powerful corporation, you do not have to fear any serious financial consequences when you break the law. Google made billions off the backs of children, developing a host of intrusive and manipulative marketing practices that take advantage of their developmental vulnerabilities. More fundamental changes will be required to ensure that YouTube is a safe and fair platform for young people.”

    Echoing Commissioner Rohit Chopra’s dissent, the advocates noted that unlike smaller companies sanctioned by the FTC, Google was not forced to pay a penalty larger than its “ill-gotten gains.” In fact, with YouTube earning a reported $750 million annually from children’s content alone, the $170 million fine amounts to less than three months of advertising revenue from kids’ videos. With a maximum fine of $41,484 per violation, the FTC easily could have sought a fine in the tens of billions of dollars.

    “I am pleased that the FTC has made clear that companies may no longer avoid complying with COPPA by claiming their online services are not intended for use by children when they know that many children in fact use their services,” said Angela Campbell, Director Emeritus of IPR’s Communications and Technology Clinic at Georgetown Law, which researched and drafted the complaint.
    Campbell, currently chair of CCFC’s Board, served as lead counsel to CCFC and CDD on the YouTube and other complaints alleging COPPA violations. She, along with Chester, was responsible for filing an FTC complaint in 1996 against a child-directed website that led to Congress’s passage of COPPA in 1998. COPPA gave the FTC expanded authority to implement and enforce the law, for example by including civil penalties. About the proposed settlement, Campbell noted: “It’s disappointing that the FTC has not fully used its existing authority to hold Google and YouTube executives personally liable for adopting and continuing to utilize a business model premised on ignoring children’s privacy protection, to adopt a civil penalty substantial enough to deter future wrongdoing, or to require Google to take responsibility for ensuring that children’s content on YouTube platforms complies with COPPA.”

    On the heels of a sweetheart settlement with Facebook, the advocates said the deal with Google was further proof the FTC wasn’t up to the task of protecting consumers’ privacy. Said Campbell, “I support Commissioner Slaughter’s call for state attorneys general to step up and hold Google accountable.”
    Added Chester, “The commission’s inability to stop Google’s cynically calculated defiance of COPPA underscores why Congress must create a new consumer watchdog that will truly protect Americans’ privacy.”

    Organizations that signed on to the CCFC/CDD 2018 FTC complaint were: Berkeley Media Studies Group; Center for Media Justice; Common Sense; Consumer Action; Consumer Federation of America; Consumer Federation of California; Consumers Union, the advocacy division of Consumer Reports; Consumer Watchdog; Corporate Accountability; Defending the Early Years; Electronic Privacy Information Center (EPIC); New Dream; Obligation, Inc.; Parent Coalition for Student Privacy; Parents Across America; Parents Television Council; Privacy Rights Clearinghouse; Public Citizen; The Story of Stuff Project; TRUCE (Teachers Resisting Unhealthy Childhood Entertainment); and USPIRG. ###
  • Press Statement

    Google YouTube FTC COPPA Settlement

    Statement of Katharina Kopp, Ph.D., Deputy Director, Center for Digital Democracy, August 30, 2019

    It has been reported that Google has agreed to pay between $150 million and $200 million to resolve an FTC investigation into YouTube over alleged violations of a children’s privacy law. A settlement amount of $150-200 million would be woefully low, considering the egregious nature of the violation, how much Google profited from violating the law, and Google’s size and revenue. Google’s unprecedented violation requires an unprecedented FTC response. A small amount like this would effectively reward Google for engaging in massive and illegal data collection without any regard for children’s safety. In addition to assessing substantial civil penalties, the FTC must enjoin Google from committing further violations of COPPA and impose effective means for monitoring compliance; the FTC must impose a 20-year consent decree to ensure Alphabet Inc. acts responsibly when it comes to serving children and parents.

    In April 2018, the Center for Digital Democracy (CDD) and the Campaign for a Commercial-Free Childhood (CCFC), through their attorneys at Georgetown Law’s Institute for Public Representation (IPR), filed an FTC complaint detailing YouTube’s COPPA violations. Twenty-one other privacy and consumer groups signed on to CCFC and CDD’s complaint, which detailed how Google profits by collecting personal information from kids on YouTube, without first providing direct notice to parents and obtaining their consent as required by law. Google uses this information to target advertisements to children across the internet and across devices, in clear violation of COPPA.
  • Press Release

    FTC Fails to Protect Privacy in Facebook Decision

    Instead of serious structural and behavioral change, the 3-2 deal is a huge giveaway. By dismissing all other claims, Simons’ FTC does a disservice to the public.

    Statement of Jeff Chester, executive director, Center for Digital Democracy (CDD helped bring the 2009 FTC complaint that is the subject of today’s decision on the Consent Order)

    Once again, the Federal Trade Commission has shown itself incapable of protecting the privacy of the public and preventing ongoing consumer harms. Today’s announcement of a fine and--yet again!--an improved system of internal compliance and other auditing controls doesn’t address the fundamental problems.

    First, the FTC should have required Facebook to divest both its Instagram and WhatsApp platforms. By doing so, the commission would have prevented the tremendous expansion of Facebook’s ability to continually extend its data-gathering activities. By failing to require this corporate break-up, the FTC has set the stage for “Groundhog Day” violations of privacy for years to come.

    The FTC should also have insisted that an independent panel of experts--consumer groups, data scientists, civil rights groups, etc.--be empaneled to review all the company’s data-related products and decide which ones are to be modified, eliminated, or allowed to continue (such as lookalike modeling, the role of influencers, cross-device tracking, etc.). This group should have been given the authority to review all new products proposed by the company for a period of at least five years. What was needed here was a serious change in the corporate culture, along with serious structural remedies, if the FTC really wanted to ensure that Facebook would act more responsibly in the future.

    The dissents by Commissioners Chopra and Slaughter illustrate that the FTC majority could have taken another path, instead of supporting a decision that will ultimately enable the problems to continue. Today’s decision also dismisses all other complaints and requests for investigation related to Facebook’s consent decree failures--a huge giveaway.
The FTC should be replaced by a new data protection agency to protect privacy. The commission has repeatedly demonstrated that--regardless of who is in charge--it is incapable of confronting the most powerful forces that undermine our privacy--and digital rights.
  • Press Release

    FTC must impose maximum fine and ensure Google’s YouTube business practices obey children’s privacy law, say groups

    Google’s unprecedented violation requires an unprecedented FTC response and a 20-year consent decree to ensure Alphabet Inc. acts responsibly when it comes to serving children and parents; Google executives should also be held accountable.

    June 25, 2019

    The Honorable Joseph Simons, Chairman
    The Honorable Noah Phillips, Commissioner
    The Honorable Rohit Chopra, Commissioner
    The Honorable Rebecca Slaughter, Commissioner
    The Honorable Christine Wilson, Commissioner
    Federal Trade Commission
    600 Pennsylvania Avenue, NW
    Washington, DC 20580

    Dear Chairman Simons, Commissioner Phillips, Commissioner Chopra, Commissioner Slaughter, and Commissioner Wilson:

    The Campaign for a Commercial-Free Childhood (CCFC) and Center for Digital Democracy (CDD) have been encouraged by recent media reports that the Federal Trade Commission is preparing to take action against Google and YouTube for violating the Children’s Online Privacy Protection Act (COPPA). CCFC and CDD, represented by the Institute for Public Representation at Georgetown Law (IPR), are the organizations responsible for drafting the Request to Investigate Google and YouTube filed with the FTC on April 9, 2018. Previously, IPR filed on behalf of CCFC and CDD Requests to Investigate YouTube’s promotion of unfair and deceptive influencer marketing to children (October 21, 2016) and unfair and deceptive marketing practices on YouTube Kids (April 7 and November 24, 2015).

    As you are aware, YouTube has profited enormously by hosting channels and videos of nursery rhymes, unboxing videos, popular cartoons, and other content specifically designed for children on the main YouTube platform. But instead of getting the verifiable parental consent required before collecting children’s personal information, Google claims that YouTube is not for children under thirteen and that, therefore, no consent is required. 
This defense is outlandish given that YouTube is the number one online destination for kids. In short, Google has profited by violating the law and the privacy of tens of millions of children. For this reason, the FTC must sanction Google at a scale commensurate with the company’s unprecedented and unparalleled violations of COPPA. As we pointed out in our Request to Investigate, the maximum civil penalties should be imposed because: Google had actual knowledge of both the large number of child-directed channels on YouTube and the large numbers of children using YouTube. Yet Google collected personal information from nearly 25 million children in the U.S. over a period of years, and used this data to engage in very sophisticated digital marketing techniques. Google’s wrongdoing allowed it to profit in two different ways: Google has not only made a vast amount of money by using children’s personal information as part of its ad networks to target advertising, but has also profited from advertising revenues from ads on its YouTube channels that are watched by children. [April 9, 2018 Request to Investigate at 26-27 (footnotes omitted)].

Moreover, any consent order must mandate meaningful changes to YouTube’s business practices. For example, all child-directed content should be placed on a separate platform where targeted advertising, commercial data collection, links to other sites or content, and autoplay are prohibited. Google must also live up to its Terms of Service – which stipulate YouTube is only for persons thirteen and older – by removing all kids’ content from the main YouTube platform. By ensuring such changes, the Commission will do a tremendous service to America’s families seeking to provide a healthy media environment for their children, while sending a clear message to all online and mobile operators that no one is above the law. 
Google’s disregard of children’s welfare is demonstrated not only by the evidence in our complaints, but by numerous reports of violent, sexual, and other inappropriate content available to children on both YouTube Kids and the main YouTube platform. Moreover, the company refused to turn off recommendations on videos featuring young children in leotards and bathing suits even after researchers demonstrated YouTube’s algorithm was recommending these videos to pedophiles. These ongoing and serious issues require that the FTC take strong action. We believe that Google should repay America’s families by creating a truly safe space for kids and fostering the production of quality noncommercial children’s programming. Attached you will find a list of recommended penalties and conditions to be included in a final consent order. We would be happy to meet with you to discuss our proposed remedies in greater detail. Thank you.

Sincerely,

Jeffrey Chester, Executive Director, Center for Digital Democracy
Josh Golin, Executive Director, Campaign for a Commercial-Free Childhood
Angela J. Campbell, Director, Institute for Public Representation, Georgetown Law

Encl.: Proposed Consent Order Penalties and Conditions

Proposed Consent Order Penalties and Conditions

The FTC should seek a 20-year consent decree which includes the following forms of relief:

Injunctive relief

Destroy all data collected from children under 13, in all forms in the possession, custody, or control of YouTube and all of Alphabet’s subsidiaries engaged in online data collection or commercial uses (e.g., advertising), including, but not limited to, Google Ads, Google Marketing Platform, and their predecessors, along with any inferences drawn from this data. 
Immediately stop collecting data from any user known to be under age 13, and any user that a reasonable person would likely believe to be under age 13, including, but not limited to, persons viewing any channel or video primarily directed to children and persons who have been identified for targeted ads based on being under 13 or any proxy for being under 13 (e.g., grade in school, interest in toys, etc.).

Identify, as of the date of this consent order, as well as on an ongoing basis, any users under age 13, and prohibit them from accessing content on YouTube. Prohibit users under age 13 from accessing content on YouTube Kids unless and until YouTube has provided detailed notice to parents, obtained parental consent, and complied with all of the other requirements of COPPA and this consent order.

Remove all channels in the Parenting and Family lineup, as well as any other YouTube channels and videos directed at children, from YouTube. YouTube may make such channels and videos available on a platform specifically intended for children (e.g., YouTube Kids) only after qualified human reviewers have reviewed the content and determined that the programming complies with all of the policies for YouTube’s child-directed platform, which must include, but are not limited to: no data collection for commercial purposes; any data collected for “internal purposes” must be clearly identified as to what is being collected, for what purpose, and who has access to the data, and may not be sold to any third parties; no links out to other sites or online services; no recommendations or autoplay; no targeted marketing; and no product or brand integration, including influencer marketing.

Consumer education

Require Google to fund independent organizations to undertake educational campaigns to help children and parents understand the true nature of Google’s data-driven digital marketing systems and their potential impacts on children’s wellbeing and privacy. 
Require Google to publicly admit (in advertising and in other ways) that it has violated the law, and to warn parents that no one under 13 should use YouTube.

Record keeping and monitoring provisions

Google must submit to an annual audit by a qualified, independent auditor to ensure that Google is complying with all aspects of the consent decree. The auditor must submit their report to the FTC, and the FTC shall provide reports to Congress about the findings. All of the annual audits must be made publicly available, without redaction, on the Commission’s website within 30 days of receipt. Google may not launch any new child-directed service until the new service has been reviewed and approved by an independent panel of experts – including child development and privacy experts – to be appointed by the FTC. Google must retain, and make available to the FTC on request, documentation of its compliance with the consent decree.

Civil penalties and other monetary relief

Google must pay the maximum possible civil penalties – $42,530 per violation. Whether violations are counted per child or per day, the total amount of the penalty must be sufficiently high to deter Google and YouTube from any further violations of COPPA. Google must also establish a $100 million fund to be used to support the production of noncommercial, high-quality, and diverse content for children; decisions about who receives this money must be insulated from influence by Google. In addition, we ask the FTC to consider using its authority under Section 13(b) of the FTC Act to require Google and YouTube to disgorge ill-gotten gains, and to impose separate civil penalties on the management personnel at Google and YouTube who knowingly allowed these COPPA violations to occur.
  • Contact: Josh Golin, CCFC: josh@commercialfreechildhood.org (link sends e-mail); (617) 896-9369; Jeff Chester, CDD: jeff@democraticmedia.org (link sends e-mail); (202) 494-7100

    Advocates Demand FTC Investigation of Echo Dot Kids Edition

    Amazon violates COPPA in many ways, including keeping data that parents believe they deleted

    BOSTON, MA — May 9, 2019 — Today, a coalition of 19 consumer and public health advocates led by the Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) called on the Federal Trade Commission (FTC) (link is external) to investigate and sanction Amazon for infringing on children’s privacy through its Amazon Echo Dot Kids Edition. An investigation by CCFC and the Institute for Public Representation (IPR) at Georgetown Law revealed that Echo Dot Kids, a candy-colored version of Amazon’s home assistant with Alexa voice technology, violates the Children’s Online Privacy Protection Act (COPPA) in many ways. Amazon collects sensitive personal information from kids, including their voice recordings and data gleaned from kids’ viewing, reading, listening, and purchasing habits, and retains it indefinitely. Most shockingly, Amazon retains children’s data even after parents believe they have deleted it. CCFC and IPR have produced a video (link is external) demonstrating how Amazon ignores the request to delete or “forget” a child’s information it has remembered. The advocates’ FTC complaint also says Amazon offers parents a maze of multiple privacy policies, which violate COPPA because they are confusing, misleading, and even contradictory.

    “Amazon markets Echo Dot Kids as a device to educate and entertain kids, but the real purpose is to amass a treasure trove of sensitive data that it refuses to relinquish even when directed to by parents,” said Josh Golin, CCFC’s Executive Director. “COPPA makes clear that parents are the ones with the final say about what happens to their children’s data, not Jeff Bezos. 
The FTC must hold Amazon accountable for blatantly violating children’s privacy law and putting kids at risk.”

Amazon Echo Dot Kids Edition comes with a one-year subscription to FreeTime Unlimited, which connects children with entertainment like movies, music, audiobooks, and video games. The always-on listening device is often placed in the child’s bedroom, and kids are encouraged to interact with it as if Alexa were a close friend. Kids can download “skills,” similar to apps, to add functionality. In clear violation of COPPA, Amazon disavows responsibility for the data collection practices of Alexa skills for kids and tells parents to check the skill developers’ privacy policies. To make matters worse, 85% of skills for kids have no privacy policy posted. Amazon does not verify that the person consenting to data collection is an adult, let alone the child’s parent. The advocates also say the Echo Dot has a “playdate problem”: a child whose parents have not consented will have their conversations recorded and sensitive information retained when visiting a friend who owns the device.

“We spent months analyzing the Echo Dot Kids and the device’s myriad privacy policies, and we still don’t have a clear picture of what data is collected by Amazon and who has access to it,” said Angela Campbell, a CCFC Board Member and Director of IPR’s Communications and Technology Clinic at Georgetown Law, which researched and drafted the complaint. “If privacy experts can’t make heads or tails of Amazon’s privacy policy labyrinth, how can a parent meaningfully consent to the collection of their children’s data?”

“By providing misleading tools that don’t actually allow parents to delete their children’s data, Amazon has made a farce of parents’ difficult task of protecting their children’s privacy,” said Lindsey Barrett, Staff Attorney and Teaching Fellow at IPR. 
“COPPA requires companies to allow parents to delete their children’s personal information, and Amazon is breaking the law – not to mention breaking parents’ trust.”

“It’s shameful that Amazon is ensnaring children and their valuable data in its race to market dominance,” said Jeff Chester of CDD. "COPPA was enacted to empower parents to have control over their children’s data, but at every turn Echo Dot Kids thwarts parents who want to limit what Amazon knows about their child. The FTC must hold Amazon accountable to make clear that voice-activated, always-on devices must respect children’s privacy."

Organizations which signed today’s complaint were the Campaign for a Commercial-Free Childhood, Center for Digital Democracy, Berkeley Media Studies Group, Color of Change, Consumer Action, Consumer Federation of America, Defending the Early Years, Electronic Privacy Information Center, New Dream, Open MIC (Open Media and Information Companies Initiative), Parents Across America, Parent Coalition for Student Privacy, Parents Television Council, Peace Educators Allied for Children Everywhere (P.E.A.C.E.), Public Citizen, Raffi Foundation for Child Honouring, Story of Stuff, TRUCE (Teachers Resisting Unhealthy Childhood Entertainment), and U.S. PIRG.

In May 2018, CCFC and CDD issued a warning (link is external), supported by experts like Drs. Sherry Turkle, Jenny Radesky, and Dipesh Navsaria, that parents should steer clear of Echo Dot Kids. The advocates cautioned that Echo Dot endangers children’s privacy and, by encouraging young children to spend more time with and form “faux relationships” with digital devices, threatens their healthy development.

Added Josh Golin: “Echo Dot Kids interferes with children’s healthy development and relationships and threatens their privacy. 
Parents should resist Amazon’s efforts to indoctrinate children into a culture of surveillance, and say ‘no’ to Echo Dot Kids.” The investigation by CCFC and IPR was made possible by a generous grant from the Rose Foundation for Communities and the Environment (link is external).
  • Press Release

    Press Briefing: Senators, Coalition Call for Sweeping National Digital Privacy Legislation

    Experts to Demand Federal Action to Combat Growing Digital Consumer and Civil Rights Threats

    For Immediate Release: Feb. 26, 2019

    Contact: Mike Stankiewicz, mstankiewicz@citizen.org (link sends e-mail), (202) 588-7779; Jeff Chester, jeff@democraticmedia.org (link sends e-mail), (202) 494-7100

    MEDIA ADVISORY

    WHAT: Staff briefing, open to the press and sponsored by U.S. Sens. Ed Markey (D-Mass.) and Tom Udall (D-N.M.), at which consumer protection and civil rights advocates will explain the urgent need for sweeping federal digital privacy legislation with a new approach. That approach includes baseline legislation that doesn’t pre-empt state privacy laws, the creation of a new federal data protection agency, and safeguards against data practices that lead to unjust, unfair, manipulative, or discriminatory outcomes. The briefing comes as Americans increasingly demand protection of their digital privacy in light of unethical privacy sharing and harvesting by tech giants. Without any constraints, Big Tech companies and their partners are collecting sensitive information on American citizens, ranging from financial and health information to data that tracks our Internet activity across all our devices, our location throughout the day, and much more. Both the federal government and the states must be empowered to protect the public from this ever-growing online threat to their privacy, welfare, and civil rights. Members of the Privacy and Digital Rights for All coalition, which has announced a Framework for Comprehensive Privacy Protection And Digital Rights (link is external) in the U.S., also will speak about the need for enduring privacy innovation and limiting government access to personal data.

    WHEN: 2 p.m. EST, Mon., March 4

    WHO: Ed Mierzwinski, consumer program director, U.S. PIRG; Jeffrey Chester, executive director, Center for Digital Democracy; Brandi Collins-Dexter, senior campaign director, Color of Change; Josh Golin, executive director, Campaign for a Commercial-Free Childhood; Burcu Kilic, research director, Public Citizen’s Access to Medicines program; Christine Bannan, administrative law and policy fellow, Electronic Privacy Information Center (EPIC)

    WHERE: Hart Senate Office Building, Room 216, 120 Constitution Ave NE, Washington, DC 20002

    ###
  • Curbing Companies’ Bad Behavior Will Require Stronger Data Privacy Laws and a New Federal Data Privacy Agency

    Federal Privacy Laws Are Antiquated and Need Updating; New Data Privacy Legislation Must Include Civil Rights Protections and Enhanced Punishments for Violations

    Jan. 17, 2019

    Contact: Don Owens, dowens@citizen.org (link sends e-mail), (202) 588-7767; Jeffrey Chester, jeff@democraticmedia.org (link sends e-mail), (202) 494-7100

    WASHINGTON, D.C. – U.S. data privacy laws must be overhauled without pre-empting state laws, and a new data privacy agency should be created to confront 21st century threats and address emerging concerns for digital consumers, consumer and privacy organizations said today as they released a framework (link is external) for comprehensive privacy protection and digital rights for members of Congress.

    “Big Tech is coming to Washington looking for a deal that affords inadequate protections for privacy and other consumer rights but pre-empts states from defending their citizens against the tech companies’ surveillance and misuse of data,” said Robert Weissman, president of Public Citizen. “But here’s the bad news for the tech giants: That deal isn’t going to fly. Instead, the American people are demanding – and intend to win – meaningful federal restraints on tech company abuses of power that also ensure the right of states to craft their own consumer protections.”

    From the Equifax data breach to foreign election interference and targeted digital ads based on race, health, and income, it’s clear that U.S. consumers face a crisis of confidence born of federal data privacy laws that are decades out of date and a lack of basic protections afforded them by digital conglomerates. These corporations, many of which dominate online spaces, are far more interested in monetizing every keystroke or click than in protecting consumers from data breaches. For that reason, federal and state authorities must act, the groups maintain. 
The groups will push for federal legislation based on a familiar privacy framework, such as the original U.S. Code of Fair Information Practices and the widely followed Organization for Economic Cooperation and Development Privacy Guidelines. These frameworks should establish obligations for companies that collect personal data and rights for individuals. Specifically, legislation should: establish limits on the collection, use, and disclosure of sensitive personal data; establish enhanced limits on the collection, use, and disclosure of data of children and teens; regulate consumer scoring and other business practices that diminish people’s physical health, education, financial, and work prospects; and prohibit or prevent manipulative marketing practices.

The groups are calling for federal baseline legislation and oppose the pre-emption of state digital privacy laws. States have long acted as the “laboratories of democracy” and must continue to have the power to enact appropriate protections for their citizens as technology develops, the groups say.

“Black communities should not have to choose between accessing the Internet and the right to control our data,” said Brandi Collins-Dexter, senior campaign director at Color Of Change. “We need privacy legislation that holds powerful corporations accountable for their impacts. Burdening our communities with the need to discern how complex terms of service and algorithms could harm us will only serve to reinforce discriminatory corporate practices. The privacy protection and digital rights principles released today create an important baseline for proactive data protections for our communities.”

“For years now, Big Tech has used our sensitive information as a cash cow,” said Josh Golin, executive director of Campaign for a Commercial-Free Childhood. “Each innovation – whether it’s talking home assistants, new social media tools or software for schools – is designed to spy on families and children. 
We desperately need both 21st century legislation and a new federal agency with broad enforcement powers to ensure that children have a chance to grow up without their every move, keystroke, swipe, and utterance tracked and monetized.”

The United States is woefully behind other nations in providing these modern data protections for its consumers, relying instead solely on the Federal Trade Commission (FTC) to safeguard consumers and promote competition. But corporations understand that the FTC lacks rulemaking authority and that the agency often fails to enforce rules it has established. “The FTC has failed to act,” said Caitriona Fitzgerald, policy director at the Electronic Privacy Information Center. “The U.S. needs a dedicated data protection agency.” Meanwhile, many democratic nations like Canada, Mexico, the U.K., Ireland, and Japan already have dedicated data protection agencies with independent authority and enforcement capabilities.

Groups that have signed on to the framework include Berkeley Media Studies Group, Campaign for a Commercial-Free Childhood, Center for Digital Democracy, Center for Media Justice, Color of Change, Consumer Action, Consumer Federation of America, Defending Rights & Dissent, Electronic Privacy Information Center, Media Alliance, Parent Coalition for Student Privacy, Privacy Rights Clearinghouse, Privacy Times, Public Citizen, Stop Online Violence Against Women, and U.S. PIRG.

Read the groups’ proposal below.

###
  • Contact: David Monahan, CCFC (david@commercialfreechildhood.org (link sends e-mail); 617-896-9397); Jeff Chester, CDD (jeff@democraticmedia.org (link sends e-mail); 202-494-7100)

    Apps which Google rates as safe for kids violate their privacy and expose them to other harms

    Advocates, lawmakers call on FTC to address how Google’s Play Store promotes children’s games which violate kids’ privacy law, feature inappropriate content, and lure kids to watch ads and make in-app purchases

    BOSTON, MA and WASHINGTON, DC — December 19, 2018 — Today, a coalition of 22 consumer and public health advocacy groups led by Campaign for a Commercial-Free Childhood (CCFC) and Center for Digital Democracy (CDD) called on the Federal Trade Commission (link is external) (“FTC”) to investigate and sanction Google for the deceptive marketing of apps for young children. Google represents that the apps in the “Family” section of the Google Play Store are safe for children, but the apps often violate federal children’s privacy law, expose children to inappropriate content, and disregard Google’s own policies by manipulating children into watching ads and making in-app purchases.

    The Play Store is Google’s one-stop shop for Android apps, games, and entertainment. Apps in the “Family” section are promoted with a green star and, in some cases, a recommended age, like “Ages 5 & Under” or “Ages 6-8.” Google is aware from several recent academic studies that many of the apps in this section are a threat to children’s privacy and wellbeing, yet it continues to promote them with these kid-friendly ratings.

    “The business model for the Play Store’s Family section benefits advertisers, developers, and Google at the expense of children and parents,” said CCFC’s Executive Director Josh Golin. “Google puts its seal of approval on apps that break the law, manipulate kids into watching ads and making purchases, and feature content like kids cleaning their eyes with sharp objects. 
Given Google’s long history of targeting children with unfair marketing and inappropriate content, including on YouTube, it is imperative that the FTC take swift action.”

Lawmakers echoed the call for FTC action. “We’re repeatedly confronted with examples of tech companies that are just not doing enough to protect consumer privacy – and I’m particularly concerned about what this failure means for our children,” said U.S. Senator Tom Udall (D-NM) regarding today’s action by the advocates. “When real-world products are dangerous or violate the law, we expect retailers to pull them off the shelves. Google’s refusal to take responsibility for privacy issues in their Play Store allows for app developers to violate COPPA, all while Google cashes in on our children’s activity. It is past time for the Federal Trade Commission to crack down to protect children’s privacy.”

“Google’s dominance in the app market cannot come at the expense of its clear legal obligations to protect kids that use its products,” said David N. Cicilline (RI-01), the top Democrat on the House Antitrust Subcommittee, who raised his concerns about this issue when the Chairman of the FTC testified last week. “I am pleased that this coalition of consumer and children’s advocacy groups are urging the FTC to scrutinize whether Google is improperly tracking children and selling their data.”

Google policies require apps in the Kids and Family section of its Play Store to be compliant with the Children’s Online Privacy Protection Act (COPPA). But Google doesn’t verify compliance, so Play Store apps for children consistently violate COPPA. Many apps send children’s data unencrypted, while others access children’s locations or transmit persistent identifiers without notice or verifiable parental consent. 
Google has known about these COPPA violations since at least July 2017, when they were publicly reported by Serge Egelman, a researcher at the University of California, Berkeley Center for Long-Term Cybersecurity. Yet Google continues to promote such apps as COPPA-compliant.

“Our research revealed a surprising number of privacy violations on Android apps for children, including sharing geolocation with third parties,” said Serge Egelman, a researcher at the University of California, Berkeley. “Given Google’s assertion that Designed for Families apps must be COPPA compliant, it’s disappointing these violations still abound, even after Google was alerted to the scale of the problem.”

Google’s policies also require apps for children to avoid “overly aggressive” commercial tactics, but the advocates’ FTC complaint reveals that many popular apps feature ads that interrupt gameplay, are difficult to click out of, or must be watched in order to advance in a game. In addition, games represented to parents as free often pressure children to make in-app purchases, sometimes going so far as to show characters crying if kids don’t buy locked items. The complaint also offers examples of multiple children’s apps that serve ads for alcohol and gambling, despite those ads being barred by Google’s Ad Policy.

Other apps designated as appropriate for children are clearly not. Some contain graphic, sexualized images, like TutoTOONS Sweet Baby Girl Daycare 4 – Babysitting Fun, which has over 10 million downloads. Others model actively harmful behavior, like TabTale’s Crazy Eye Clinic, which teaches children to clean their eyes with a sharp instrument and has over one million downloads.

"Parents who download apps recommended for ages 8 and under don’t expect their child to see ads which promote gambling, alcoholic beverages, or violent video games,” said Angela Campbell, Director of the Communications and Technology Clinic at Georgetown Law, which drafted the complaint. 
“But Google falsely claims that apps listed in the Family section only have ads which are appropriate for children. It’s important for the FTC to act quickly to protect children, especially in light of Google's dominance in the app market."

The coalition has previously asked the FTC to investigate developers of children’s apps, citing research from the University of Michigan that revealed manipulative advertising is rampant in apps popular with preschoolers. Today’s complaint focuses on Google, whose misrepresentation and promotion of those apps has led to hundreds of millions of downloads.

“Google (Alphabet, Inc.) has long engaged in unethical and harmful business practices, especially when it comes to children,” explained Jeff Chester, executive director of the Center for Digital Democracy (CDD). "And the Federal Trade Commission has for too long ignored this problem, placing both children and their parents at risk over their loss of privacy, and exposing them to a powerful and manipulative marketing apparatus. As one of the world’s leading providers of content for kids online, Google continues to put the enormous profits they make from kids ahead of any concern for their welfare," Chester noted. 
“It’s time federal and state regulators acted to control Google’s 'wild west' Play Store app activities.”

Joining the Campaign for a Commercial-Free Childhood and the Center for Digital Democracy in signing today’s complaint to the FTC are Badass Teachers Association, Berkeley Media Studies Group, Color of Change, Consumer Action, Consumer Federation of America, Consumer Watchdog, Defending the Early Years, Electronic Privacy Information Center, Media Education Foundation, New Dream, Open MIC (Open Media and Information Companies Initiative), Parents Across America, Parent Coalition for Student Privacy, Parents Television Council, Peace Educators Allied for Children Everywhere (P.E.A.C.E.), Privacy Rights Clearinghouse, Public Citizen, the Story of Stuff, TRUCE (Teachers Resisting Unhealthy Childhood Entertainment), and U.S. PIRG.

In addition to filing an FTC complaint, CCFC has launched a petition (link is external) asking Google to adopt the Kids’ Safer App Store Standards, which would bar advertising in apps for kids under 5, limit ads in apps for kids 6-12, bar in-app purchases, and require apps to be reviewed by a human before being included in the Kids and Family section of the Play Store.

###
  • 34 Civil Rights, Consumer, and Privacy Organizations Unite to Release Principles for Privacy Legislation

    Contact: Katharina Kopp (kkopp@democraticmedia.org (link sends e-mail)); 202-836-4621

    Washington, DC — Today, 34 civil rights, consumer, and privacy organizations join in releasing public interest principles for privacy legislation (link is external), because the public needs and deserves strong and comprehensive federal legislation to protect their privacy and afford meaningful redress. Irresponsible data practices lead to a broad range of harms, including discrimination in employment, housing, healthcare, and advertising. They also lead to data breaches and loss of individuals’ control over personal information. Existing enforcement mechanisms fail to hold data processors accountable and provide little-to-no relief for privacy violations.

    The privacy principles outline four concepts that any meaningful data protection legislation should incorporate at a minimum: privacy protections must be strong, meaningful, and comprehensive; data practices must protect civil rights, prevent unlawful discrimination, and advance equal opportunity; governments at all levels should play a role in protecting and enforcing privacy rights; and legislation should provide redress for privacy violations. These public interest privacy principles include a framework providing guidelines for policymakers considering how to protect the privacy of all Americans effectively while also offering meaningful redress. 
They follow three days of Federal Trade Commission hearings about big data, competition, and privacy, as well as the comment deadline on “Developing the Administration’s Approach to Privacy,” a request for comment from the National Telecommunications and Information Administration as the agency works to develop privacy policy recommendations for the Trump Administration, and ongoing work at the National Institute of Standards and Technology to develop a privacy risk framework.

The groups urge members of Congress to pass privacy legislation that ensures fairness, prevents discrimination, advances equal opportunity, protects free expression, and facilitates trust between the public and the companies that collect their personal data.

The principles were signed by New America’s Open Technology Institute, Public Knowledge, Access Humboldt, Access Now, Berkeley Media Studies Group, Campaign for a Commercial-Free Childhood, Center for Democracy & Technology, Center for Digital Democracy, Center for Media Justice, Center on Privacy & Technology at Georgetown Law, Color of Change, Common Cause, Common Sense Kids Action, Consumer Action, Consumer Federation of America, Consumers Union, Customer Commons, Demand Progress, Free Press Action Fund, Human Rights Watch, Lawyers’ Committee for Civil Rights Under Law, Media Alliance, Media Mobilizing Project, National Association of Consumer Advocates, National Consumer Law Center, National Consumers League, National Digital Inclusion Alliance, National Hispanic Media Coalition, Oakland Privacy, Open MIC (Open Media and Information Companies Initiative), Privacy Rights Clearinghouse, Public Citizen, U.S. PIRG, and United Church of Christ, OC Inc. Additional local and national privacy advocates are encouraged to sign on.
The following can be attributed to Eric Null, Senior Policy Counsel at New America’s Open Technology Institute: “For decades, privacy regulation has favored the company over the user -- companies set their own rules and users are left to fend for themselves. Worse, companies have even discriminated based on protected classes through algorithmic decision-making. Comprehensive privacy legislation must disrupt this status quo. Legislation that follows the public interest privacy principles will better protect users and give users more control over their data.”

The following can be attributed to Allie Bohm, Policy Counsel at Public Knowledge: “It is imperative that any comprehensive privacy legislation reflect the concerns, interests, and priorities of actual human beings. Today, consumer protection, privacy, and civil rights groups come together to articulate those interests, priorities, and concerns. Importantly, these principles address the many harms people can experience from privacy violations and misuse of personal data, including enabling unfair price discrimination; limiting awareness of opportunities; and contributing to employment, housing, health care, and other forms of discrimination.”

The following can be attributed to Amie Stepanovich, U.S. Policy Manager at Access Now: “From Europe to India to Brazil, data privacy legislation is becoming the norm around the world, and people in the United States are getting left behind. It is long past time that our legislators acted to protect people across the country from opaque data practices that can result in its misuse and abuse, and any acceptable package must start with these principles.”

The following can be attributed to Josh Golin, Executive Director at Campaign for a Commercial-Free Childhood: “What big tech offers for ‘free’ actually comes at a high cost -- our privacy. Worst of all is how vulnerable kids are tracked online and then targeted with manipulative marketing. This has to stop. We need laws that will empower parents to protect their children’s privacy.”

The following can be attributed to Joseph Jerome, Policy Counsel at Center for Democracy & Technology: “Debates about national privacy laws focus on how companies should implement Fair Information Practices. The operative word is ‘fair.’ When it comes to how companies collect, use, and share our data, too many business practices are simply unfair. Federal law must go beyond giving consumers more notices and choices about their privacy, and we think it is time for legislators in Congress to flip the privacy presumption and declare some data practices unfair.”

The following can be attributed to Katharina Kopp, Director of Policy at Center for Digital Democracy: “To this day, U.S. citizens have had to live without effective privacy safeguards. Commercial data practices have grown ever more intrusive, ubiquitous and harmful. It is high time to provide Americans with effective safeguards against commercial surveillance. Any legislation must not only effectively protect individual privacy, it must advance equitable, just and fair data uses, and must protect the most vulnerable among us, including children. In other words, they must bring about real changes in corporate practices. We have waited long enough; the time is now.”

The following can be attributed to Laura Moy, Executive Director at Center on Privacy & Technology at Georgetown Law: “Americans want their data to be respected, protected, and used in ways that are consistent with their expectations. Any new legislation governing commercial data practices must advance these goals, and also protect us from data-driven activities that are harmful to society. We need privacy to protect us from uses of data that exclude communities from important opportunities, enable faceless brokers to secretly build ever-more-detailed profiles of us, and amplify political polarization and hate speech.”

The following can be attributed to Yosef Getachew, Director of Media and Democracy Program at Common Cause: “An overwhelming majority of Americans believe they have lost control over how their personal information is collected and used across the internet ecosystem. Numerous data breaches and abuses in data sharing practices, which have jeopardized the personal information of millions of Americans, have validated these fears. Our current privacy framework no longer works, and the lack of meaningful privacy protections poses a serious threat to our democracy. Companies can easily manipulate data to politically influence voters or engage in discriminatory practices. These principles should serve as a baseline for any comprehensive privacy legislation that guarantees all Americans control over their data.”

The following can be attributed to James P. Steyer, CEO and Founder at Common Sense: “Any federal legislation should provide for strong baseline protections, particularly for the most surveilled and vulnerable generation ever -- our kids. These principles reflect that as privacy, consumer, and civil rights advocates, we only want federal legislation that will move the ball forward in terms of protecting kids, families, and all of us.”

The following can be attributed to Linda Sherry, director of national priorities at Consumer Action: “Our country has floundered far too long without strong federal regulations governing data collection, retention, use and sharing. These privacy principles, developed by a coalition of leading consumer, civil rights and privacy organizations, are offered as a framework to guide Congress in protecting consumers from the many harms that can befall them when they are given little or no choice in safeguarding their data, and companies have few, if any, restrictions on how they use that information.”

The following can be attributed to Susan Grant, Director of Consumer Protection and Privacy at Consumer Federation of America: “We need to move forward on data protection in the United States, from a default that allows companies to do what they want with Americans’ personal information as long as they don’t lie about it, to one in which their business practices are aligned with respect for privacy rights and the responsibility to keep people’s data secure.”

The following can be attributed to Katie McInnis, Policy Counsel for Consumers Union, the advocacy division of Consumer Reports: “As new data breaches are announced at an alarming rate, now is the time to protect consumers with strong privacy laws. We need laws that do more than just address broad transparency and access rights. Consumers deserve practical controls and robust enforcement to ensure all of their personal information is sufficiently protected.”

The following can be attributed to Gaurav Laroia, Policy Counsel at Free Press Action Fund: “The public has lost faith in technology companies’ interest and ability to police their own privacy and data usage practices. It’s past time for Congress to pass a strong law that empowers people to make meaningful choices about their data, protects them from discrimination and undue manipulation, and holds companies accountable for those practices.”

The following can be attributed to David Brody, Counsel & Senior Fellow for Privacy and Technology at the Lawyers’ Committee for Civil Rights Under Law: “Protecting the right to privacy is essential to protecting civil rights and advancing racial equity in a modern, Internet-focused society. Privacy rights are civil rights. Invasive profiling of online activity enables discrimination in employment, housing, credit, and education; helps bad actors target voter suppression and misinformation; assists biased law enforcement surveillance; chills the free association of advocates; and creates connections between hateful extremists, exacerbating racial tensions.”

The following can be attributed to Tracy Rosenberg, Executive Director at Media Alliance: “After a flood of data breaches and privacy violations, Americans overwhelmingly support meaningful protections for their personal information that are not written by, for and in the interests of the data collection industry. These principles start to define what that looks like.”

The following can be attributed to Francella Ochillo, Vice President of Policy & General Counsel at National Hispanic Media Coalition: “For years, tech platforms have been allowed to monetize personal data without oversight or consequence, losing sight of the fact that personal data belongs to the user. Meanwhile, Latinos and other marginalized communities continue to be exposed to the greatest risk of harm and have the fewest opportunities for redress. The National Hispanic Media Coalition joins the chorus of advocates calling for a comprehensive regulatory framework that protects a user’s right to privacy and access as well as the right to be forgotten.”

The following can be attributed to JP Massar, Organizer at Oakland Privacy: “We must not only watch the watchers, and regulate the sellers of our information. We must begin to unravel the information panopticon that has already formed. This is a start.”

The following can be attributed to Robert Weissman, President at Public Citizen: “Internet privacy means control. Either we get to control our own lives as lived through the Internet, or the Big Tech companies do. That's what is at stake in whether the U.S. adopts real privacy protections.”

The following can be attributed to Ed Mierzwinski, Senior Director for Consumer Programs at U.S. PIRG: “The big banks and the big tech companies all say that they want a federal privacy law, but the law that their phalanx of lobbyists seeks isn’t designed to protect consumers. Instead, it’s designed to protect their business models that treat consumers as commodities for sale; it fails to guarantee that their secret sauce big data algorithms don’t discriminate; it eliminates stronger and innovative state laws forever; and it denies consumers any real, enforceable rights when harmed. We can’t allow that.”

You may view the privacy principles for more information.
  • Press Release

    Advocates ask FTC to investigate apps that manipulate kids

    Popular games for kids 5 and under lure them to watch ads and make in-app purchases

    A coalition of 22 consumer and public health advocacy groups called on the Federal Trade Commission (“FTC”) to investigate the preschool app market. The advocates’ letter urges the FTC to hold app makers accountable for unfair and deceptive practices, including falsely marketing apps that require in-app purchases as “free” and manipulating children to watch ads and make purchases. The complaint was filed in conjunction with a major new study that details a host of concerning practices in apps targeted to young children. The study (paywall), “Advertising in Young Children’s Apps,” was led by researchers at the University of Michigan C.S. Mott Children’s Hospital and examined the type and content of advertising in 135 children’s apps.