Program Areas: Digital Citizen
-
The Center for Digital Democracy and 23 other leading civil society groups sent a letter to President Biden today asking his Administration to ensure that any new transatlantic data transfer deal is coupled with the enactment of U.S. laws that reform government surveillance practices and provide comprehensive privacy protections.
-
Blog
The Whole World will Still be Watching You: Google & Digital Marketing Industry “Death-of-the-Cookie” Privacy Initiatives Require Scrutiny from Public Policymakers
Jeff Chester

One would think, listening to the language used by Google, Facebook, and other ad and data companies to discuss the construction and future of privacy protection, that they are playing some kind of word game. We hear terms such as “TURTLEDOVE,” “FLEDGE,” “SPARROW,” and “FLoC.” The privacy claims behind these initiatives should be viewed with skepticism, however. Although some reports make it appear that Google and its online marketing compatriots propose to reduce data gathering and tracking, we believe that their primary goal is still to perfect the vast surveillance system they have long since established.

A major data-marketing industry effort is now underway to eliminate, or at least diminish, the role of the tracking software known as “third-party” cookies. Cookies were developed in the earliest days of the commercial World Wide Web, and have served as the foundational digital tether connecting us to a sprawling and sophisticated data-mining complex. Through cookies, and later mobile device IDs and other “persistent” identifiers, Google, Facebook, Amazon, Coca-Cola, and practically everyone else have been able to surveil and target us, and our communities. Tracking cookies have literally helped engineer a “sweet spot” for online marketers, enabling them to embed spies into our web browsers that help them understand our digital behaviors and activities and then act on that knowledge. Some of these trackers, placed and used by a myriad of data-marketing companies on various websites, are referred to as “third-party” cookies, to distinguish them from what online marketers claim, with a straight face, are more acceptable forms of tracking software, known as “first-party” cookies.
According to the tortured online-advertiser explanation, “first-party” trackers are placed by websites on which you have affirmatively given permission to be tracked while you are on that site. These “we-have-your-permission-to-use” first-party cookies would increasingly become the foundation for advances in digital tracking and targeting. Please raise your hand if you believe you have informed Google or Amazon, to cite the two most egregious examples, that they can surveil what you do via these first-party cookies, including analyzing your actions, background, interests, and more.

What the online ad business has developed behind its digital curtain, such as various ways to trigger your response, measure your emotions, knit together information on device use, and employ machine learning to predict your behaviors (to name just a few of the methods currently in use), has played a fundamental role in personal data gathering. Yet these and other practices, which have an enormous impact on privacy, autonomy, fairness, and so many other aspects of our lives, will not be affected by the “death-of-the-cookie” transition currently underway. On the contrary, we believe a case can be made that the opposite is true. Rather than strengthening data safeguards, we are seeing unaccountable platforms such as Google become even more dominant, as so-called “privacy preserving” systems actually enable enhanced data profiling.

In a moment, we will briefly discuss some of the leading online-marketing industry work underway to redefine privacy. But the motivation for this post is to sound the alarm that we should not, once again, allow powerful commercial interests to determine the evolving structure of our online lives. The digital data industry has no serious track record of protecting the public.
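The first-party/third-party distinction described above turns entirely on which domain a cookie belongs to relative to the page being viewed. The following minimal sketch (hypothetical domain names, with real browser behavior heavily simplified) illustrates why a single third-party cookie lets one tracker link a person's visits across otherwise unrelated sites:

```python
# Sketch: how a "third-party" cookie follows a user across sites.
# Domains are hypothetical; the logic is a simplification of how
# browsers attach stored cookies to requests for embedded resources.

class Browser:
    def __init__(self):
        self.cookie_jar = {}  # domain -> cookie value

    def visit(self, page_domain, embedded_domains):
        # First-party cookie: set by the site the user actually chose to visit.
        self.cookie_jar.setdefault(page_domain, f"id-for-{page_domain}")
        # Third-party cookies: every embedded tracker (ad pixel, widget)
        # also receives a request -- and its cookie, once set, is the SAME
        # identifier on every site that embeds it.
        requests_seen_by_trackers = []
        for tracker in embedded_domains:
            cookie = self.cookie_jar.setdefault(tracker, "tracker-uid-123")
            # The tracker learns: (who you are, which page you were on).
            requests_seen_by_trackers.append((tracker, cookie, page_domain))
        return requests_seen_by_trackers

b = Browser()
log = b.visit("news.example", ["ads.tracker.example"])
log += b.visit("shop.example", ["ads.tracker.example"])
# The tracker receives the identical "tracker-uid-123" on both sites,
# so it can stitch the two visits into one cross-site browsing profile.
```

Blocking third-party cookies breaks only the cross-site linking step; the first-party identifiers the post goes on to discuss remain untouched.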
Indeed, it was the failure of regulators to rein in this industry over the years that led to the current crisis. In the process, the growth of hate speech, the explosion of disinformation, and the highly concentrated control over online communications and commerce, to name only a few consequences, now pose serious challenges to the fate of democracies worldwide. Google, Facebook, and the others should never be relied on to defer their principal pursuit of monetization out of respect for any democratic ideal, let alone consumer protection and privacy.

One clue to the likely end result of the current industry effort is how they frame it. It isn’t about democracy, the end of commercial surveillance, or strengthening human rights. It’s about how best to preserve what they call the “Open Internet.” Some leading data marketers believe we have all consented to a trade-off: that in exchange for “free” content we’ve agreed to a pact enabling them to eavesdrop on everything we do, and then make all that information available to anyone who can pay for it, primarily advertisers. Despite its rhetoric about curbing tracking cookies, the online marketing business intends to continue to colonize our devices and monitor our online experiences.

This debate, then, is really about who can decide, and under what terms, the fate of the Internet’s architecture, including how it operationalizes privacy, at least in the U.S. It raises questions that deserve a better answer than the “industry-knows-best” approach we have tolerated thus far. That’s why we call on the Biden Administration, the Federal Trade Commission (FTC), and Congress to investigate these proposed new approaches to data use, and to ensure that the result is truly privacy-protective, supports democratic governance, and incorporates mechanisms of oversight and accountability.
Here’s a brief review of some of the key developments, which illustrate the digital “tug-of-war” ensuing over the several industry proposals involving cookies and tracking. In 2019, Google announced that it would end the role of what are known as “third-party cookies.” Google has created a “privacy sandbox” in which it has researched various methods it claims will protect privacy, especially for people who rely on its Chrome browser. It is exploring “ways in which a browser can group together people with similar browsing habits, so that ad tech companies can observe the habits of large groups instead of the activity of individuals. Ad targeting could then be partly based on what group the person falls into.” This is its “Federated Learning of Cohorts” (FLoC) approach, in which people are placed into “clusters” based on the use of “machine learning algorithms” that analyze the data generated from the sites a person visited and their content. Google says these clusters would “each represent thousands of people,” and that the “input features” used to generate the targeting algorithm, such as our “web history,” would be stored on our browsers. Other techniques would be deployed as well, adding “noise” to the data sets and applying various “anonymization methods” so that the exposure of a person’s individual information is limited. Its TURTLEDOVE initiative is designed to enable more personalized targeting, with web browsers used to help ensure our data is available for the real-time auctions that sell us to advertisers. The theory is that by allowing the data to remain within our devices, as well as by targeting clusters of people rather than individuals, our privacy is protected. But the goal, to have sufficient data and effective digital marketing techniques, is still at the heart of the process.
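Google's published proof-of-concept for FLoC computed the cohort ID locally in the browser using a SimHash-style fingerprint of the user's browsing history. The toy sketch below (hypothetical domain names, a deliberately tiny cohort space, and a simplified hash in place of the real algorithm) shows the core idea: similar histories tend to collapse to the same small cohort number, and only that number, never the raw history, would leave the device.

```python
import hashlib

COHORT_BITS = 8  # toy size; the real proposal sized cohorts to hold "thousands" of people

def simhash_cohort(visited_domains, bits=COHORT_BITS):
    """Collapse a browsing history into one small cohort ID (SimHash-style)."""
    # Each visited domain "votes" on every bit of the cohort ID.
    votes = [0] * bits
    for domain in visited_domains:
        h = int(hashlib.sha256(domain.encode()).hexdigest(), 16)
        for i in range(bits):
            votes[i] += 1 if (h >> i) & 1 else -1
    # Majority vote per bit: overlapping histories produce overlapping votes,
    # so similar users tend to land in the same (or a nearby) cohort.
    return sum(1 << i for i, v in enumerate(votes) if v > 0)

a = simhash_cohort(["news.example", "sports.example", "cars.example"])
b = simhash_cohort(["news.example", "sports.example", "bikes.example"])
c = simhash_cohort(["knitting.example", "gardening.example", "baking.example"])
# Advertisers would see only the cohort numbers a, b, c -- a stable
# behavioral signal derived from, but not directly exposing, each history.
```

Note the tension the post describes: the cohort ID is itself a durable interest signal that advertisers can target, which is why the actual proposal layered k-anonymity thresholds and added noise on top of this bare mechanism.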
Google recently reported that “FLoC can provide an effective replacement signal for third-party cookies. Our tests of FLoC to reach in-market and affinity Google Audiences show that advertisers can expect to see at least 95% of the conversions per dollar spent when compared to cookie-based advertising.” Google’s 2019 announcement caused an uproar in the digital marketing business. It was also perceived (correctly, in my view) as a Google power grab. Google operates basically as a “Walled Garden” and has so much data that it doesn’t really need third-party cookies to home in on its targets. The potential “death of the cookie” ignited a number of initiatives from the Interactive Advertising Bureau, as well as from competitors and major advertisers, who feared that Google’s plan would undermine their lucrative business model. These include such groups as the Partnership for Responsible Addressable Media (PRAM), whose 400 members include Mastercard, Comcast/NBCU, P&G, the Association of National Advertisers, the IAB, and other ad and data companies. PRAM issued a request to review proposals that would ensure the data-marketing industry continues to thrive while becoming less reliant on third-party cookies. The leading online marketing company The Trade Desk is playing a key role here. It submitted its “Unified ID 2.0” plan to PRAM, saying that it “represents an alternative to third party cookies that improves consumer transparency, privacy and control, while preserving the value exchange of relevant advertising across channels and devices.” There are also a number of other proposals on offer that claim both to protect privacy and to take advantage of our identity, such as various collaborative data-sharing efforts.
The Internet standards group the World Wide Web Consortium (W3C) has created a sort of neutral meeting ground where the industry can discuss proposals and potentially seek some sort of unified approach. The rationale for its Improving Web Advertising Business Group [get ready for this statement] is “to provide monetization opportunities that support the open web while balancing the needs of publishers and the advertisers that fund them, even when their interests do not align, with improvements to protect people from the individual and societal impacts of tracking content consumption over time.” Its participants are another “Who’s Who” of data-driven marketing, including Google, AT&T, Verizon, the NYT, the IAB, Apple, GroupM, Axel Springer, Facebook, Amazon, the Washington Post, and Criteo. DuckDuckGo is also a member (and both Google and Facebook have multiple representatives in the group). The sole NGO listed as a member is the Center for Democracy and Technology. The W3C’s ad business group has a number of documents about the digital marketing business that illustrate why the future of privacy, data collection, and targeting should be a public, and not just a data-industry, concern. In an explainer on digital advertising, they make the paradigm so many are working to defend very clear: “Marketing’s goal can be boiled down to the ‘5 Rights’: Right Message to the Right Person at the Right Time in the Right Channel and for the Right Reason. Achieving this goal in the context of traditional marketing (print, live television, billboards, et al.) is impossible. In [the] digital realm, however, not only can marketers achieve this goal, they can prove it happened.
This proof is what enables marketing activities to continue, and is important for modern marketers to justify their advertising dollars, which ultimately finance the publishers sponsoring the underlying content being monetized.” Nothing I’ve read says it better. Through a quarter century of work to perfect harvesting our identity for profit, the digital ad industry has created a formidable complex of data clouds, real-time ad auctions, cross-device tracking tools, and advertising techniques that further commodify our lives, shred our privacy, and transform the Internet into a hall of mirrors that can amplify our fears and splinter democratic norms. It’s people, of course, who decide how the Internet operates, especially those from companies such as Google, Facebook, and Amazon, and those working for trade groups such as the IAB. We must not let the most powerful corporate interests on the planet decide how cookies may or may not be used, or what new data standard should be adopted, in order to profit from our “identity.” It’s time for action by the FTC and Congress.

Part 1.

Notes: (1) For the uninitiated, TURTLEDOVE stands for “Two Uncorrelated Requests, Then Locally-Executed Decision On Victory”; FLEDGE is short for “First Locally-Executed Decision over Groups Experiment”; SPARROW is “Secure Private Advertising Remotely Run On Webserver”; and FLoC is “Federated Learning of Cohorts.” (2) In January 2021, the UK’s Competition and Markets Authority (CMA) opened an investigation into Google’s privacy sandbox and cookie plans.

-
CONSUMER AND CITIZEN GROUPS CONTINUE TO HAVE SERIOUS CONCERNS ABOUT GOOGLE FITBIT TAKEOVER: Joint Statement on Possible Remedies
-
The Campaign for Commercial-Free Childhood (CCFC) and CDD filed comments with the UN’s Special Rapporteur on privacy, as part of a consultation designed to propose global safeguards for young people online. Both CCFC and CDD, along with allies in the U.S. and throughout the world, are working to advance stronger international protections for young people, especially related to their privacy and the impacts that digital marketing has on their development.
-
For Immediate Release: September 24, 2020
Contact: Jeff Chester (202-494-7100), jeff@democraticmedia.org

A Step Backwards for Consumer Privacy: Why Californians Should Vote No on Proposition 24

Ventura, CA, and Washington, DC: The Center for Digital Democracy (CDD) announced today its opposition to the California Privacy Rights Act (CPRA), also known as Proposition 24, which will appear on the November 2020 California general election ballot. Prop 24 does not sufficiently strengthen Californians’ privacy and may, in fact, set a new, lower, and thus more dangerous standard for privacy protection in the U.S., according to CDD’s analysis.

“We need strong and bold privacy legislation, not weaker standards and tinkering at the margins,” declared CDD Policy Director Katharina Kopp. “Prop 24 fails to significantly limit data uses that undermine our privacy, increase corporate manipulation and exploitation, and exacerbate racial and economic inequality. This initiative allows the much more powerful companies to set unfair terms by default. It also condones pay-for-privacy schemes, where corporations would be allowed to charge a premium (or eliminate a discount) in exchange for privacy. These schemes tend to hurt the already disadvantaged the most,” she explained.

CDD intends to work with allies from the consumer and privacy communities to inform voters about Prop 24 and how best to protect their privacy. The Center for Digital Democracy is a leading nonprofit organization focused on empowering and protecting the rights of the public in the digital era.
-
The Center for Digital Democracy (CDD) announced today its opposition to the California Privacy Rights Act (CPRA), also known as Proposition 24, which will appear on the November 2020 California general election ballot. CDD has concluded that Prop 24 does not sufficiently strengthen Californians’ privacy and may, in fact, set a new, low, and thus dangerous standard for privacy protection in the U.S.

We need strong and bold privacy legislation, not weaker standards and tinkering at the margins. We need digital privacy safeguards that address the fundamental drivers of our eroding privacy and autonomy, and that redress the growing levels of racial and social inequity. We need rules that go to the heart of the data-driven business model and curtail the market incentives that have created the deplorable state of affairs we currently face. What we need are protections that significantly limit data uses that undermine our privacy, increase corporate manipulation and exploitation, and exacerbate racial and economic inequality. We need default privacy settings that limit the sharing and selling of personal information, and the use of data for targeted advertising, personalized content, and other manipulative practices. We need to ensure privacy for all and limit any pay-for-privacy schemes that entice the most vulnerable to give up their privacy. In other words, we need to limit harmful data-use practices by default, and place the interests of consumers above market imperatives by allowing only those data practices that are not harmful to individuals, groups, and society at large.

Prop 24 does none of that. Specifically, Prop 24 continues on the path of a failed notice-and-choice regime, allowing the much more powerful companies to set unfair terms. Instead, privacy legislation should focus on strong default settings, defining the data-use practices that are allowable (“permissible uses”) and prohibiting all others.
These safeguards should be in place by default, rather than forcing consumers to opt out of invasive advertising. Prop 24, in contrast, does not provide effective data-use limitations; instead it continues to limit data sharing and selling via an opt-out, rather than declaring them impermissible uses or, at minimum, requiring an opt-in for such practices. Even “sensitive data” under Prop 24 is protected only via a consumer-initiated opt-out, rather than by prohibiting the use of sensitive personal data altogether.

Equally concerning, Prop 24 would expand rather than limit pay-for-privacy schemes. Under its terms, corporations are still allowed to charge a premium (or eliminate a discount) in exchange for privacy. Consumers shouldn’t be charged higher prices or be discriminated against simply for exercising their privacy rights. This provision of Prop 24 is particularly objectionable, as it tends to harm vulnerable populations, people of color, and the elderly by creating privacy “haves” and “have-nots,” further entrenching existing inequities as companies would be able to use personal data to profile, segment, and discriminate in a variety of areas.

There are many other reasons that CDD objects to Prop 24, chief among them that this flawed measure:
- employs an outdated concept of “sensitive data” instead of focusing on sensitive data uses;
- fails to rein in the growing power of data brokers that collect and analyze personal data from a variety of sources, including public data sets, for sale to marketers;
- does not employ strong enough data-minimization provisions to limit data collection, use, and disclosure to only what is necessary to provide the service requested by the consumer;
- undermines consumer efforts to seek enforcement of privacy rights by neglecting to provide full private-right-of-action provisions; and
- unnecessarily delays its protection of employee privacy.
-
Supporting the Call for Racial Justice

The Center for Digital Democracy supports the call for racial justice and the fight against police violence, and against the systemic injustices that exist in all parts of our society: inferior educational opportunities; lack of affordable, equitable health care; an unjust justice system; housing and employment discrimination; and discriminatory marketing practices.

We grieve for the lives lost and the opportunities denied! We grieve for the everyday injustices people of color have to endure, and have had to endure for centuries. We grieve for an America that could be so much more!

Our grieving is not enough! CDD will continue its fight for data justice in support of racial and social justice.

June 5, 2020
-
Press Release
Groups Tell FTC to Investigate TikTok’s Failure to Protect Children’s Privacy
TikTok gathers data from children despite promises made to the commission
Contact:
Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100)
David Monahan, CCFC (david@commercialfreechildhood.org)

Advocates Say TikTok In Contempt of Court Order
More kids than ever use the site due to COVID-19 quarantine, but TikTok flouts settlement agreement with the FTC

WASHINGTON, DC, and BOSTON, MA, May 14, 2020: Today, a coalition of leading U.S. child advocacy, consumer, and privacy groups filed a complaint urging the Federal Trade Commission (FTC) to investigate and sanction TikTok for putting kids at risk by continuing to violate the Children’s Online Privacy Protection Act (COPPA). In February 2019, TikTok paid a $5.7 million fine for violating COPPA, including by illegally collecting personal information from children. But more than a year later, with quarantined kids and families flocking to the site in record numbers, TikTok has failed to delete personal information previously collected from children and is still collecting kids’ personal information without notice to, and consent of, parents.

Campaign for a Commercial-Free Childhood (CCFC), the Center for Digital Democracy (CDD), and a total of 20 organizations demonstrated in their FTC filing that TikTok continues to violate COPPA by:
- failing to delete personal information related to children under 13 that it obtained prior to the 2019 settlement order;
- failing to give direct notice to parents and to obtain parents’ consent before collecting kids’ personal information; and
- failing to give parents the right to review or delete their children’s personal information collected by TikTok.

TikTok makes it easy for children to avoid obtaining parental consent. When a child under 13 tries to register using their actual birthdate, they will be signed up for a “younger users account” with limited functions and no ability to share their videos.
If a child is frustrated by this limited functionality, they can immediately register again with a fake birthdate from the same device for an account with full privileges, thereby putting themselves at risk of both TikTok’s commercial data uses and inappropriate contact from adults. In either case, TikTok makes no attempt to notify parents or obtain their consent. And TikTok doesn’t even comply with the law for those children who stick with limited “younger users accounts.” For these accounts, TikTok collects detailed information about how the child uses the app and uses artificial intelligence to determine what to show next, to keep the child engaged online as long as possible.

The advocates, represented by the Communications & Technology Law Clinic in the Institute for Public Representation at Georgetown Law, asked the FTC to identify and hold responsible those individuals who made or ratified decisions to violate the settlement agreement. They also asked the FTC to prevent TikTok from registering any new accounts for persons in the U.S. until it adopts a reliable method of determining the ages of its users and comes into full compliance with the children’s privacy rules. In light of TikTok’s vast financial resources, the number and severity of the violations, and the large number of U.S. children who use TikTok, they asked the FTC to seek the maximum monetary penalties allowed by law.

Josh Golin, Executive Director of Campaign for a Commercial-Free Childhood, said, “For years, TikTok has ignored COPPA, thereby ensnaring perhaps millions of underage children in its marketing apparatus, and putting children at risk of sexual predation. Now, even after being caught red-handed by the FTC, TikTok continues to flout the law.
We urge the Commission to take swift action and sanction TikTok again – this time with a fine and injunctive relief commensurate with the seriousness of TikTok’s serial violations.”

Jeff Chester, Executive Director of the Center for Digital Democracy, said, “Congress empowered the FTC to ensure that kids have online protections, yet here is another case of a digital giant deliberately violating the law. The failure of the FTC to ensure that TikTok protects the privacy of millions of children, including through its use of predictive AI applications, is another reason there are questions about whether the agency can be trusted to effectively oversee the kids’ data law.”

Michael Rosenbloom, Staff Attorney and Teaching Fellow at the Institute for Public Representation, Georgetown Law, said, “The FTC ordered TikTok to delete all personal information of children under 13 years old from its servers, but TikTok has clearly failed to do so. We easily found that many accounts featuring children were still present on TikTok. Many of these accounts have tens of thousands to millions of followers, and have been around since before the order. We urge the FTC to hold TikTok to account for continuing to violate both COPPA and its consent decree.”

Katie McInnis, Policy Counsel at Consumer Reports, said, “During the pandemic, families and children are turning to digital tools like TikTok to share videos with loved ones. Now more than ever, effective protection of children’s personal information requires robust enforcement in order to incentivize companies, including TikTok, to comply with COPPA and any relevant consent decrees.
We urge the FTC to investigate the matters raised in this complaint.”

Groups signing on to the complaint to the FTC are: Campaign for a Commercial-Free Childhood, the Center for Digital Democracy, Badass Teachers Association, Berkeley Media Studies Group, Children and Screens: Institute of Digital Media and Child Development, Consumer Action, Consumer Federation of America, Consumer Reports, Defending the Early Years, Electronic Privacy Information Center, Media Education Foundation, Obligation, Inc., Parent Coalition for Student Privacy, Parents Across America, ParentsTogether Foundation, Privacy Rights Clearinghouse, Public Citizen, The Story of Stuff, United Church of Christ, and USPIRG.

###