CDD


  • A new report on how political-marketing insiders and platforms such as Facebook view the “ethical” issues raised by the role of digital marketing in elections illustrates why advocates and others concerned about election integrity should make this issue a public-policy priority. We cannot afford to leave it in the hands of “Politech” firms and political campaign professionals, who appear unable to acknowledge the consequences to democracy of their unfettered use of powerful data-driven online-marketing applications. “Digital Political Ethics: Aligning Principles with Practice” reports on a series of conversations and a two-day meeting last October that included representatives of firms that work for either Democrats or Republicans (such as Blue State, Targeted Victory, WPA Intelligence, and Revolution Messaging), as well as officials from both Facebook and Twitter. The goal of the project was to “identify areas of agreement among key stakeholders concerning ethical principles and best practices in the conduct of digital campaigning in the U.S.” Perhaps it should not be a surprise that this group appears incapable of critically examining (or even candidly assessing) all of the problems connected with the role of digital marketing in political campaigns. Missing from the report is any real concern about how today’s electoral process takes advantage of the absence of meaningful privacy safeguards in the U.S. A vast, unbounded commercial surveillance apparatus has been established. The same system that is used to market goods and services—driven by data brokers, marketing clouds, real-time ad-decision engines, geolocation identification, and other AI-based technologies, along with the clout of leading platforms and publishers—is now also used for political purposes.
All of us are tracked and profiled 24/7, including where we go and what we do—with little location privacy left. Political insiders and data-driven ad companies such as Facebook, however, are unwilling to confront this loss of privacy, given how valuable all that personal data is to their business models and political goals. Another concern is that these insiders now view digital marketing as a normative, business-as-usual process—nothing out of the ordinary. But anyone who knows how the system operates should be deeply concerned about the nontransparent and often far-reaching ways digital marketing is constructed to influence our decision-making and behaviors, including at emotional and subconscious levels. The report demonstrates that campaign officials have largely accepted as reasonable the invasive and manipulative technologies and techniques that the ad-tech industry has developed over the past decade. Perhaps these officials are simply being pragmatic. But society cannot afford such a cynical position. Today’s political advertising is not yesterday’s TV commercial—nor is it purely an effort to “microtarget” sympathetic market segments. Today’s digital marketing apparatus follows all of us continuously—Democrats, Republicans, and independents alike. The marketing ecosystem is finely tuned to learn how we react, transforming itself in response to those reactions and making decisions about us in milliseconds in order to use—and refine—various tactics to influence us, including entirely new ad formats, each tested and measured to make us think and behave one way or another. And this process is largely invisible to voters, regulators, and the news media. But for the insiders, microtargeting helps get out the vote and encourages participation. Nothing much is said about what happened in the 2016 U.S.
election, when some political marketers sought to suppress the vote among communities of color, while others engaged in disinformation. Some of these officials now propose that political campaigns be awarded a digital “right of way” that would guarantee them unfettered access to Facebook, Google, and other sites, as well as ensure favorable terms and support. This is partly in response to the recent and much-needed reforms adopted by Twitter and Google that either eliminate or restrict how political campaigns can use their platforms—reforms that many in the politech industry dislike. Some campaign officials see FCC rules regulating political TV advertising as an appropriate model on which to build policies for digital campaigning. That notion should alarm anyone who cares about the role money plays in politics, let alone the nature of today’s politics (as well as anyone who knows the myriad failures of the FCC over the decades). The U.S. needs to develop a public policy for digital data and advertising that places the interests of the voter and of democracy before those of political campaigns. Such a policy should include protecting the personal information of voters; limiting deceptive and manipulative ad practices (such as lookalike modeling); and prohibiting those contemporary ad-tech practices (e.g., algorithm-based real-time programmatic ad systems) that can unfairly influence election outcomes. Also missing from the discussion is the impact of the never-ending expansion of “deep-personalization” digital marketing applications designed to influence and shift consumer behavior ever more effectively.
The use of biodata, emotion recognition, and other forms of what’s being called “precision data”—combined with a vast expansion of always-on sensors operating in an Internet of Things world—will provide political groups with even more ways to transform electoral outcomes. If civil society doesn’t take the lead in reforming this system, powerful insiders with their own conflicts of interest will be able to shape the future of democratic decision-making in the U.S. We cannot afford to leave it to the insiders to decide what is best for our democracy.
  • CDD, EPIC, USPIRG Opposition to Google/Doubleclick "Big Data" Merger

    2007 FTC filings: an example of groups calling for antitrust, privacy, and other safeguards for the digital marketplace

    Working closely with the Electronic Privacy Information Center (epic.org) and US PIRG, CDD led a campaign to oppose the acquisition of Doubleclick by Google. CDD opposed the deal on privacy, consumer protection, and competition grounds. We all foresaw what would happen if Google was allowed to swallow a leading digital marketing giant: more data collection, industry consolidation, and a weakening of consumer and privacy rights. It all happened, of course, in part because the FTC has never been able to deal effectively with this marketplace. Here are two of the filings from this case.
    Jeff Chester
  • I played a key role in helping get the Children’s Online Privacy Protection Act (COPPA) passed by Congress in 1998 (when I was executive director of the Center for Media Education). Since then, I have tried to ensure that the country’s only federal law addressing commercial privacy online was taken seriously. That’s why it has been especially egregious to witness Google violating COPPA for many years as it deliberately developed YouTube into the leading site for children. Google disingenuously claimed in its terms of service that YouTube was only meant for those 13 and older, while it simultaneously unleashed programming and marketing strategies designed to appeal directly to kids. Google’s behavior sent a message that any powerful and well-connected corporation could ignore U.S. privacy law, even when that law was specifically designed to protect young people. In collaboration with our colleagues at the Campaign for Commercial-Free Childhood (CCFC), our attorneys at the Institute for Public Representation (IPR) at Georgetown University Law Center, and a broad coalition of consumer, privacy, public health, and child rights groups, we began filing complaints at the FTC in 2015 concerning Google’s child-directed practices (on YouTube, its YouTube Kids app, and elsewhere). We also told top officials at the commission that Google was not abiding by COPPA, and we repeatedly provided them documentation of Google’s child-directed business operations. CCFC, CDD, and IPR kept up the pressure on the FTC, in Congress, and with the news media (see attached, for example). For a variety of reasons, the FTC, under the leadership of Chairman Joe Simons, finally decided to take action. The result was last week’s decision—which in many ways is both historic and highly positive.
Google was fined $170 million for its violations of children’s privacy, a record amount compared with previous COPPA-connected financial sanctions. The FTC’s action also implemented important new policies protecting children. Children will no longer be targeted with data-driven marketing and advertising on YouTube programming made for kids—this is the most important safeguard. Google announced that starting around January 2020, no form of personalized “behavioral” marketing will be permitted on YouTube programming that targets children. The “Official” YouTube blog post explained that Google “will limit data collection and use on videos made for kids only to what is needed to support the operation of the service. We will also stop serving personalized ads on this content entirely….” Google will require video producers and distributors to self-identify that their content is aimed at kids; it also committed to “use machine learning to find videos that clearly target young audiences, for example those that have an emphasis on kids characters, themes, toys, or games.” Google also explained that child-directed programming on YouTube will receive an additional safeguard: it won’t permit any personalized targeting on such content. Google committed to make substantial investments in its YouTube Kids service: Google launched the YouTube Kids “app” in 2015, claiming it was “the first Google product built from the ground up with little ones in mind.” But the app never rivaled the main YouTube platform’s hold on children, and it was plagued with a number of problems (such as harmful content).
Now, as a result of the FTC investigation, Google announced that it will bring “the YouTube Kids experience to the desktop,” increase its promotion of the service to parents, and more effectively curate programming that will appeal to more young people—with new tiers of content suitable for “Preschool (ages 4 & under); Younger (ages 5-7); and Older (ages 8-12).” Google created a $100 million fund for “quality kids, family and educational content.” This is another proposal CCFC and CDD made, and we are gratified that Google acknowledged it bears responsibility to support programming that enriches the lives of children. This is to be a three-year program designed for “the creation of thoughtful, original children’s content on YouTube and YouTube globally.” Google has also made changes to make YouTube a “safer platform for children”: the company is proactively promoting “quality” children’s programming by revising the algorithm used to make recommendations, and it is no longer permitting comments and notifications on its child-directed content. There are questions that still need to be answered about how Google will implement these new policies. For example, will the company prohibit the data targeting of children on YouTube worldwide? (It should.) How will it treat programming classified as “family viewing”—will that be exempt from the new data-targeting safeguards? (It should not be.) Will the new $100 million production fund commit to supporting child-directed noncommercial content (instead of serving as a venture investment strategy for Google to expand its marketing-to-kids plans)? Will Google ensure that its other child-directed commercial activities—such as its Play Store—also reflect the new safeguards the company has adopted for YouTube?
Google also targets young people via so-called “influencers,” including videos where toys and other products are “unboxed.” Google needs to declare such content child-directed (and should refrain from these practices as well). CCFC, CDD, and our allies intend to play a proactive role in holding Google, its programmers, its advertisers, and the FTC accountable for implementing these new policies effectively. Seeing these FTC-forced changes through is part of our commitment to ensuring that young people around the world grow up in a media environment that respects and promotes their health, privacy, and well-being.
    Jeff Chester
  • Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100); David Monahan, CCFC (david@commercialfreechildhood.org; 617-896-9397) Advocates Who Filed the Privacy Complaint Against Google/YouTube Laud Improvements, But Say FTC Settlement Falls Far Short BOSTON, MA & WASHINGTON, DC—September 4, 2019—The advocates who triggered the Federal Trade Commission’s (FTC) investigation into YouTube’s violations of the Children’s Online Privacy Protection Act (COPPA) say the FTC’s settlement with Google will likely significantly reduce behavioral marketing to children on YouTube, but doesn’t do nearly enough to ensure children will be protected or to hold Google accountable. In April 2018, the Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD), through their attorneys at Georgetown Law’s Institute for Public Representation (IPR), filed an FTC complaint detailing YouTube’s COPPA violations. Twenty-one other privacy and consumer groups signed on to CCFC and CDD’s complaint, which detailed how Google profits by collecting personal information from kids on YouTube without first providing direct notice to parents and obtaining their consent as required by law. Google uses this information to target advertisements to children across the internet and across devices, in clear violation of COPPA. Today, the FTC and the New York Attorney General announced a settlement with Google, fining the company $170 million. The settlement also “requires Google and YouTube to develop, implement, and maintain a system that permits channel owners to identify their child-directed content on the YouTube platform so that YouTube can ensure it is complying with COPPA.” Content creators will be asked to disclose whether they consider their videos to be child-directed; if they do, no behavioral advertising will be served to viewers of those videos.
“We are pleased that our advocacy has compelled the FTC to finally address YouTube’s longstanding COPPA violations and that there will be considerably less behavioral advertising targeted to children on the number one kids’ site in the world,” said CCFC’s Executive Director Josh Golin. “But it’s extremely disappointing that the FTC isn’t requiring more substantive changes or doing more to hold Google accountable for harming children through years of illegal data collection. A plethora of parental concerns about YouTube – from inappropriate content and recommendations to excessive screen time – can all be traced to Google’s business model of using data to maximize watch time and ad revenue.” In a July 3, 2019 letter to the FTC, the advocates specifically warned that shifting the burden of COPPA compliance from Google and YouTube to content creators would be ineffective. The letter noted that many children’s channels were unlikely to become COPPA-compliant by turning off behavioral advertising, since Google warns that turning off these ads “may significantly reduce your channel’s revenue.” The letter also detailed Google’s terrible track record of ensuring COPPA compliance on its platforms; a 2018 study found that 57% of apps in the Google Play Store’s Designed for Families program were violating COPPA despite Google’s policy that apps in the program must be COPPA-compliant. And as Commissioner Rebecca Slaughter wrote in her dissent, many children’s content creators are not U.S.-based and therefore are unlikely to be concerned about FTC enforcement.
“We are gratified that the FTC has finally forced Google to confront its longstanding lie that it wasn’t targeting children on YouTube,” said CDD’s executive director Jeff Chester, who helped spearhead the campaign that led to the 1998 passage of COPPA. “However, we are very disappointed that the Commission failed to penalize Google sufficiently for its ongoing violations of COPPA and failed to hold Google executives personally responsible for the roles they played. A paltry financial penalty of $170 million—from a company that earned nearly $137 billion in 2018 alone—sends a signal that if you are a politically powerful corporation, you do not have to fear any serious financial consequences when you break the law. Google made billions off the backs of children, developing a host of intrusive and manipulative marketing practices that take advantage of their developmental vulnerabilities. More fundamental changes will be required to ensure that YouTube is a safe and fair platform for young people.” Echoing Commissioner Rohit Chopra’s dissent, the advocates noted that unlike smaller companies sanctioned by the FTC, Google was not forced to pay a penalty larger than its “ill-gotten gains.” In fact, with YouTube earning a reported $750 million annually from children’s content alone, the $170 million fine amounts to less than three months of advertising revenue from kids’ videos. With a maximum fine of $41,484 per violation, the FTC easily could have sought a fine in the tens of billions of dollars. “I am pleased that the FTC has made clear that companies may no longer avoid complying with COPPA by claiming their online services are not intended for use by children when they know that many children in fact use their services,” said Angela Campbell, Director Emeritus of IPR’s Communications and Technology Clinic at Georgetown Law, which researched and drafted the complaint.
Campbell, currently chair of CCFC’s Board, served as lead counsel to CCFC and CDD on the YouTube and other complaints alleging COPPA violations. She, along with Chester, was responsible for filing an FTC complaint in 1996 against a child-directed website that led to Congress’s passage of COPPA in 1998. COPPA gave the FTC expanded authority to implement and enforce the law, for example by including civil penalties. About the proposed settlement, Campbell noted: “It’s disappointing that the FTC has not fully used its existing authority to hold Google and YouTube executives personally liable for adopting and continuing to utilize a business model premised on ignoring children’s privacy protection, to adopt a civil penalty substantial enough to deter future wrongdoing, or to require Google to take responsibility for ensuring that children’s content on YouTube platforms complies with COPPA.” On the heels of a sweetheart settlement with Facebook, the advocates said the deal with Google was further proof the FTC wasn’t up to the task of protecting consumers’ privacy. Said Campbell, “I support Commissioner Slaughter’s call for state attorneys general to step up and hold Google accountable.”
Added Chester, “The commission’s inability to stop Google’s cynically calculated defiance of COPPA underscores why Congress must create a new consumer watchdog that will truly protect Americans’ privacy.” Organizations that signed on to the CCFC/CDD 2018 FTC complaint were Berkeley Media Studies Group; Center for Media Justice; Common Sense; Consumer Action; Consumer Federation of America; Consumer Federation of California; Consumers Union, the advocacy division of Consumer Reports; Consumer Watchdog; Corporate Accountability; Defending the Early Years; Electronic Privacy Information Center (“EPIC”); New Dream; Obligation, Inc.; Parent Coalition for Student Privacy; Parents Across America; Parents Television Council; Privacy Rights Clearinghouse; Public Citizen; The Story of Stuff Project; TRUCE (Teachers Resisting Unhealthy Childhood Entertainment); and USPIRG. ###
  • Blog

    CDD Memo to FTC on Facebook Consent Decree Violations--2013

    The FTC has long ignored how the market operates--it still does in 2019

  • News

    Groups Join Legal Battle to Fight Ineffective FTC Privacy Decision on Facebook

    Statements from Campaign for Commercial-Free Childhood, CDD, Color of Change, Common Sense Media, Consumer Action, Consumer Federation of America, Open Markets, Public Citizen, USPIRG

    FOR RELEASE: July 26, 2019 Consumer Privacy Organizations to Challenge Facebook Settlement Statement from Groups: “The Settlement Fails to Provide Meaningful Relief to Facebook Users” WASHINGTON, DC – Many of the nation’s leading consumer privacy organizations are urging a federal court in Washington, DC to consider public comments before finalizing a proposed settlement between the Federal Trade Commission and Facebook. “The Facebook settlement is both historic and controversial. Many believe the FTC failed to establish meaningful safeguards for consumer privacy. We believe the court overseeing the case should consider the views of interested parties,” said Marc Rotenberg, President of the Electronic Privacy Information Center. Under the terms of the settlement, Facebook will pay a record-breaking $5 billion fine to the United States Treasury, but there will be no significant changes in Facebook’s business practices, and the FTC will release all pending complaints against the company. Typically, in a proposed FTC settlement, the public would be given an opportunity to comment to the agency before the deal is finalized. But no such opportunity was provided in the Facebook settlement. Many of the organizations joining the effort have also filed detailed complaints with the Federal Trade Commission alleging that Facebook has violated privacy laws, including the Children’s Online Privacy Protection Act. A Freedom of Information Act case revealed that more than 26,000 complaints against Facebook are currently pending at the Commission. In a similar case in 2012, the privacy group Consumer Watchdog challenged the FTC settlement with Google regarding the Safari hack. In other consumer privacy cases, courts have created opportunities for interested parties to file papers and be heard prior to a final determination on a proposed settlement. The case is In the Matter of Facebook, No. 19-cv-2184 (D.D.C.
Filed July 24, 2019). EPIC filed with the court today: https://epic.org/2019/07/epic-challenges-ftc-facebook-s.html Statements of Support: Brandi Collins-Dexter, Senior Campaign Director, Color of Change: “Despite the large price tag, the FTC settlement provides no meaningful changes to Facebook’s structure or financial incentives. It allows Facebook to continue to set its own limits on how much user data it can collect, and it gives Facebook immunity for unspecified violations. The public has a right to know what laws Facebook violated. Corporations should face consequences for violating the public trust, not be given a rubber stamp to carry out business as usual. This settlement limits the ability of Black users to challenge Facebook’s misuse of their data and force real accountability, which is why the courts must review the fairness of this settlement.” Susan Grant, Director of Consumer Protection and Privacy, Consumer Federation of America: “The FTC’s settlement with Facebook sells consumers short by failing to change the company’s mass surveillance practices and wiping away other complaints that deserved to be addressed. It needs to be stronger to truly protect our privacy.” Linda Sherry, Director of National Priorities, Consumer Action: “The FTC’s pending Facebook settlement does not take adequate measures to limit the collection and sharing of consumers’ personal information, but appears to provide the company with extensive protections from even future violations. Consumer Action respectfully urges the court to consider positions from interested parties who have related complaints filed with the FTC to ensure that the most fair and comprehensive agreement is approved.” Sally Hubbard, Director of Enforcement Strategy, Open Markets: “The FTC’s settlement is woefully insufficient in light of Facebook’s persistent privacy violations. The fine is a mere cost of doing business that makes breaking the law worth it for Facebook.
Remedies must curb Facebook’s widespread data collection and promote competition. Otherwise, Facebook will continue to fortify its monopoly power by surveilling users both on Facebook and off, and users can’t vote with their feet when Facebook violates their privacy. The public must have the opportunity to be heard on this negligent settlement.” Robert Weissman, President, Public Citizen: “The FTC’s settlement amounts to Facebook promising yet again to adhere to its own privacy policy, while reserving the right to change that policy at any time. That approach will fail to protect users’ privacy. The court should reject the settlement and order the FTC to try again and do better.” Josh Golin, Executive Director, Campaign for Commercial-Free Childhood: “Facebook has been exploiting kids for years, and this proposed settlement is essentially a get-out-of-jail-free card. It potentially extinguishes our children’s privacy complaints against Facebook, but offers absolutely no protections for kids’ privacy moving forward. It also sweeps under the rug a complaint detailing how Facebook knowingly and intentionally tricked kids into spending money on mobile games over several years, sometimes to the tune of thousands of dollars per child.” James P. Steyer, CEO and Founder of Common Sense Media: “On behalf of families across the country, Common Sense fully stands behind EPIC’s motion. The proposed settlement is a ‘get out of jail free’ card for Facebook, purporting to absolve Facebook not only of liability for privacy abuses but also for other—completely unaddressed and unexplored—Section 5 abuses. One such abuse, which the FTC is aware of and which court documents confirm, includes tricking kids into making in-app purchases that have put families out hundreds and even thousands of dollars—something the company has yet to meaningfully change its policies on to this day. Such a broad release is unprecedented, unjustified, and unacceptable.”
Edmund Mierzwinski, Senior Director for Federal Consumer Programs, U.S. PIRG: “This laughable $5 billion settlement with the category-killer social media giant Facebook makes the much smaller Equifax settlement for sloppy security look harsh. Facebook intentionally collects and shares an ever-growing matrix of information about consumers, their friends, and their interests in a mass surveillance business model. It routinely changes its previous privacy promises without consent. It doesn’t adequately audit its myriad business partners. The FTC essentially said to Facebook: ‘Pay your parking ticket but don’t ever change. Your fast-and-loose practices are okay with 3 of the 5 of us.’ Not changing those practices will come back to haunt the FTC, consumers, and the world.” Jeff Chester, Executive Director, Center for Digital Democracy: “The 3-2 Facebook decision by the FTC leaves millions of Americans vulnerable to all the problems unleashed by the Cambridge Analytica scandal. The commission adopted a woefully inadequate remedy that does nothing to stem the fundamental loss of user privacy that led to our original 2009 complaint.”
    Jeff Chester
  • Press Release

    FTC Fails to Protect Privacy in Facebook Decision

    Instead of serious structural and behavioral change, the 3-2 deal is a huge giveaway. By dismissing all other claims, Simons' FTC does a disservice to the public

    Statement of Jeff Chester, executive director, Center for Digital Democracy. CDD helped bring the 2009 FTC complaint that is the subject of today's decision on the consent order.

    Once again, the Federal Trade Commission has shown itself incapable of protecting the privacy of the public and preventing ongoing consumer harms. Today's announcement of a fine and--yet again!--an improved system of internal compliance and other auditing controls doesn't address the fundamental problems. First, the FTC should have required Facebook to divest both its Instagram and WhatsApp platforms. By doing so, the commission would have prevented the tremendous expansion of Facebook's data-gathering activities that is sure to come. By failing to require this corporate break-up, the FTC has set the stage for "Groundhog Day" violations of privacy for years to come. Second, the FTC should have insisted that an independent panel of experts--consumer groups, data scientists, civil rights groups, etc.--be convened to review all of the company's data-related products and decide which ones should be modified, eliminated, or allowed to continue (lookalike modeling, the role of influencers, cross-device tracking, and so on). This group should have been given the authority to review all new products proposed by the company for a period of at least five years. What was needed here was a serious change in the corporate culture, along with serious structural remedies, if the FTC really wanted to ensure that Facebook would act more responsibly in the future. The dissents by Commissioners Chopra and Slaughter illustrate that the FTC majority could have taken another path, instead of supporting a decision that will ultimately enable the problems to continue. Today's decision also dismisses all other complaints and requests for investigation related to Facebook's consent decree failures--a huge giveaway.
The FTC should be replaced by a new data protection agency to protect privacy. The commission has repeatedly demonstrated that--regardless of who is in charge--it is incapable of confronting the most powerful forces that undermine our privacy--and digital rights.
  • For years, consumer and privacy advocates attempted to get the Federal Trade Commission to act responsibly when it came to ensuring that the digital giants treated the public fairly, including respecting their privacy. Since the mid-1990s, when I first started pressing the commission (often working with EPIC) to be more responsive to the threats to autonomy and fairness triggered by the unrelenting and stealthy gathering of all of our personal information, I was confronted by an agency so cautious that it blinded itself to the problems. The agency has never been able to address the role that digital marketing plays, for example, in manipulating people and collecting ever more data on individuals. It refused to stop or curtail "Big Data" mergers and acquisitions, even though these deals further eroded our privacy. Overall, regardless of political party, the FTC has too often been timid, fearful, weak-kneed toward industry, and uninformed. Indeed, I believe that the massive global erosion of privacy and the growth of universal commercial surveillance are due, in large part, to the failure of the FTC to stop Google, Facebook, and others from constantly expanding their control over our personal details and using them any way they desire. The FTC is an unindicted co-conspirator in any privacy case. Cambridge Analytica was merely emblematic of the way the digital data marketing industry operates daily throughout the world. It wasn't an aberration, and there were and are many more like it. During the nearly 25 years I worked to pressure the FTC to do "the right thing," I and my many colleagues attempted to be a voice of information, conscience, and political pressure. It helped, no doubt. But I don't think we can save the agency at this point. We need a new digital watchdog that is set up from the get-go with a clear mission to protect and empower the public--including ensuring their civil rights.
Here's a memo, by the way, that we sent to Jim Kohm and other FTC officials working on the Facebook consent decree in 2013. We also organized a briefing for them and sent them trade stories documenting the many ways we believed Facebook was violating its 2011 agreement. We gave them similar information on Google and its own consent decree failings. The FTC staff didn't see the problem. We can discuss why at some point, but I gather it's because they don't really want to tackle the forces that shape contemporary digital marketing. This is the sad story of a consumer agency with a "don't ask, don't tell" attitude toward the powerful companies shaping our digital lives.
  • Press Release

    FTC must impose maximum fine and ensure Google’s YouTube business practices obey children’s privacy law, say groups

    Google’s unprecedented violation requires an unprecedented FTC response and a 20-year consent decree to ensure Alphabet Inc. acts responsibly when it comes to serving children and parents; Google executives should also be held accountable.

    June 25, 2019

    The Honorable Joseph Simons, Chairman
    The Honorable Noah Phillips, Commissioner
    The Honorable Rohit Chopra, Commissioner
    The Honorable Rebecca Slaughter, Commissioner
    The Honorable Christine Wilson, Commissioner
    Federal Trade Commission
    600 Pennsylvania Avenue, NW
    Washington, DC 20580

    Dear Chairman Simons, Commissioner Phillips, Commissioner Chopra, Commissioner Slaughter, and Commissioner Wilson:

    The Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) have been encouraged by recent media reports that the Federal Trade Commission is preparing to take action against Google and YouTube for violating the Children’s Online Privacy Protection Act (COPPA). CCFC and CDD, represented by the Institute for Public Representation at Georgetown Law (IPR), are the organizations responsible for drafting the Request to Investigate Google and YouTube filed with the FTC on April 9, 2018. Previously, IPR filed on behalf of CCFC and CDD Requests to Investigate YouTube’s promotion of unfair and deceptive influencer marketing to children (October 21, 2016) and unfair and deceptive marketing practices on YouTube Kids (April 7 and November 24, 2015). As you are aware, YouTube has profited enormously by hosting channels and videos of nursery rhymes, unboxing videos, popular cartoons, and other content specifically designed for children on the main YouTube platform. But instead of obtaining the verifiable parental consent required before collecting children’s personal information, Google claims that YouTube is not for children under thirteen and that, therefore, no consent is required. 
This defense is outlandish given that YouTube is the number one online destination for kids. In short, Google has profited by violating the law and the privacy of tens of millions of children. For this reason, the FTC must sanction Google at a scale commensurate with the company’s unprecedented and unparalleled violations of COPPA. As we pointed out in our Request to Investigate, the maximum civil penalties should be imposed because Google had actual knowledge of both the large number of child-directed channels on YouTube and the large number of children using YouTube. Yet Google collected personal information from nearly 25 million children in the U.S. over a period of years and used this data to engage in highly sophisticated digital marketing techniques. Google’s wrongdoing allowed it to profit in two ways: it has not only made a vast amount of money by using children’s personal information in its ad networks to target advertising, but has also profited from advertising revenues from ads on YouTube channels watched by children. [April 9, 2018 Request to Investigate at 26-27 (footnotes omitted).] Moreover, any consent order must mandate meaningful changes to YouTube’s business practices. For example, all child-directed content should be placed on a separate platform where targeted advertising, commercial data collection, links to other sites or content, and autoplay are prohibited. Google must also live up to its Terms of Service – which stipulate that YouTube is only for persons thirteen and older – by removing all kids’ content from the main YouTube platform. By ensuring such changes, the Commission will do a tremendous service to America’s families seeking to provide a healthy media environment for their children, while sending a clear message to all online and mobile operators that no one is above the law. 
Google’s disregard of children’s welfare is demonstrated not only by the evidence in our complaints, but by numerous reports of violent, sexual, and other inappropriate content available to children on both YouTube Kids and the main YouTube platform. Moreover, the company refused to turn off recommendations on videos featuring young children in leotards and bathing suits even after researchers demonstrated that YouTube’s algorithm was recommending these videos to pedophiles. These ongoing and serious issues require that the FTC take strong action. We believe that Google should repay America’s families by creating a truly safe space for kids and fostering the production of quality noncommercial children’s programming. Attached you will find a list of recommended penalties and conditions to be included in a final consent order. We would be happy to meet with you to discuss our proposed remedies in greater detail. Thank you.

Sincerely,

Jeffrey Chester, Executive Director, Center for Digital Democracy
Josh Golin, Executive Director, Campaign for a Commercial-Free Childhood
Angela J. Campbell, Director, Institute for Public Representation, Georgetown Law

Encl.: Proposed Consent Order Penalties and Conditions

Proposed Consent Order Penalties and Conditions

The FTC should seek a 20-year consent decree which includes the following forms of relief:

Injunctive relief

  • Destroy all data collected from children under 13 – in all forms in the possession, custody, or control of Google, YouTube, and all of Alphabet’s subsidiaries engaged in online data collection or commercial uses (e.g., advertising), including, but not limited to, Google Ads, Google Marketing Platform, and their predecessors – as well as any inferences drawn from this data. 
  • Immediately stop collecting data from any user known to be under age 13, and from any user a reasonable person would likely believe to be under age 13, including, but not limited to, persons viewing any channel or video primarily directed to children and persons who have been identified for targeted ads based on being under 13 or any proxy for being under 13 (e.g., grade in school, interest in toys, etc.).
  • Identify, as of the date of this consent order and on an ongoing basis, any users under age 13, and prohibit them from accessing content on YouTube.
  • Prohibit users under age 13 from accessing content on YouTube Kids unless and until YouTube has provided detailed notice to parents, obtained parental consent, and complied with all of the other requirements of COPPA and this consent order.
  • Remove all channels in the Parenting and Family lineup, as well as any other YouTube channels and videos directed at children, from YouTube. YouTube may make such channels and videos available on a platform specifically intended for children (e.g., YouTube Kids) only after qualified human reviewers have reviewed the content and determined that the programming complies with all of the policies for YouTube’s child-directed platform, which must include, but are not limited to: no data collection for commercial purposes (any data collected for “internal purposes” must be clearly identified as to what is being collected, for what purpose, and who has access to it, and may not be sold to any third parties); no links out to other sites or online services; no recommendations or autoplay; no targeted marketing; and no product or brand integration, including influencer marketing.

Consumer education

  • Require Google to fund independent organizations to undertake educational campaigns to help children and parents understand the true nature of Google’s data-driven digital marketing systems and their potential impacts on children’s wellbeing and privacy. 
  • Require Google to publicly admit (in advertising and in other ways) that it has violated the law, and to warn parents that no one under 13 should use YouTube.

Record keeping and monitoring provisions

  • Google must submit to an annual audit by a qualified, independent auditor to ensure that it is complying with all aspects of the consent decree. The auditor must submit its report to the FTC, and the FTC shall report the findings to Congress. All annual audits must be made publicly available, without redaction, on the Commission’s website within 30 days of receipt.
  • Google may not launch any new child-directed service until that service has been reviewed and approved by an independent panel of experts – including child development and privacy experts – appointed by the FTC.
  • Google must retain, and make available to the FTC on request, documentation of its compliance with the consent decree.

Civil penalties and other monetary relief

  • Google must pay the maximum possible civil penalties – $42,530 per violation. Whether violations are counted per child or per day, the total penalty must be sufficiently high to deter Google and YouTube from any further violations of COPPA.
  • Google must establish a $100 million fund to support the production of noncommercial, high-quality, and diverse content for children. Decisions about who receives this money must be insulated from Google’s influence.
  • In addition, we ask the FTC to consider using its authority under Section 13(b) of the FTC Act to require Google and YouTube to disgorge ill-gotten gains, and to impose separate civil penalties on the management personnel at Google and YouTube who knowingly allowed these COPPA violations to occur.