CDD

Newsroom

  • Blog

    Center for Digital Democracy’s Principles for U.S. Privacy Legislation

    PROTECT PRIVACY RIGHTS, ADVANCE FAIR AND EQUITABLE OUTCOMES, LIMIT CORPORATE PRACTICES AND ENSURE GOVERNMENT LEADERSHIP AND ENFORCEMENT

    The Center for Digital Democracy provides the following recommendations for comprehensive baseline federal privacy legislation. We are building on our expertise addressing digital marketplace developments for more than two decades, including work leading to the enactment of the 1998 Children’s Online Privacy Protection Act, the only federal online privacy law in the United States. Our recommendations are also informed by our long-standing trans-Atlantic work with consumer and privacy advocates in Europe, as well as by the General Data Protection Regulation. We are alarmed by the increasingly intrusive and pervasive nature of commercial surveillance, which has the effect of controlling consumers’ and citizens’ behaviors, thoughts, and attitudes, and which sorts and tracks us as “winners” and “losers.” Today’s commercial practices have grown over the past decades unencumbered by regulatory constraints, and they increasingly threaten the American ideals of self-determination, fairness, justice and equal opportunity. It is now time to address these developments: to grant basic rights to individuals and groups regarding data about them and how those data are used; to put limits on certain commercial data practices; and to strengthen our government to step in and protect our individual and common interests vis-à-vis powerful commercial entities. We call on legislators to consider the following principles:

    1. Privacy protections should be broad: Set the scope of baseline legislation broadly and do not preempt stronger legislation.

    Pervasive commercial surveillance practices know no limits, so legislation aiming to curtail negative practices should
    - address the full digital data life-cycle (collection, use, sharing, storage, on- and off-line) and cover all private entities’ public and private data processing, including by nonprofits;
    - include all data derived from individuals, including personal information, inferred information, and aggregate and de-identified data;
    - apply all Fair Information Practice Principles (FIPPs) as a comprehensive baseline, including the principles of collection and use limitation, purpose specification, access and correction rights, accountability, data quality, and confidentiality/security, and require fairness in all data practices;
    - allow existing stronger federal legislation to prevail and let states continue to advance innovative legislation.

    2. Individual privacy should be safeguarded: Give individuals rights to control the information about them.

    Building on the FIPPs, individuals ought to have basic rights, including the right to
    + transparency and explanation
    + access
    + object and restrict
    + use privacy-enhancing technologies, including encryption
    + redress and compensation

    3. Equitable, fair and just uses of data should be advanced: Place limits on certain data uses and safeguard equitable, fair and just outcomes.

    Relying on “privacy self-management,” with the burden of responsibility placed on individuals alone to advance and protect their autonomy and self-determination, is not sufficient. Without one’s knowledge or participation, classifying and predictive data analytics may still draw inferences about individuals, resulting in injurious privacy violations, even if those harms are not immediately apparent. Importantly, these covert practices may result in pernicious forms of profiling and discrimination, harmful not just to the individual but to groups and communities, particularly those with already diminished life chances, and to society at large. Certain data practices may also unfairly influence the behavior of online users, such as children. Legislation should therefore address the impact of data practices and the distribution of harm by
    - placing limits on collecting, using and sharing sensitive personal information (such as data about ethnic or racial origin, political opinions or union membership, data concerning health, sex life or sexual orientation, genetic data, or biometric data), or data that reveals sensitive personal information, especially when using these data for profiling;
    - otherwise limiting the use of consumer scoring and other data practices, including in advertising, that have the effect of disproportionately and negatively affecting people’s life chances, related to, for example, housing, employment, finance, education, health and healthcare;
    - placing limits on manipulative marketing practices;
    - requiring particular safeguards when processing data relating to children and teens, especially with regard to marketing and profiling.

    4. Privacy legislation should bring about real changes in corporate practices: Set limits and legal obligations for those managing data and require accountability.

    Currently, companies face very few limitations on their data practices. The presumption of “anything goes” has to end. Legislation should ensure that entities collecting, using, and sharing data
    - can only do so for specific and appropriate purposes defined in advance, subject to rules established by law and informed by data subjects’ freely given, specific, informed and unambiguous consent, for the execution of a contract, or as required by law, and without “pay-for-privacy” provisions or “take-it-or-leave-it” terms of service;
    - notify users in a timely fashion of data transfers and data breaches, and make consumers whole after a privacy violation or data breach;
    - cannot limit consumers’ right to redress with arbitration clauses;
    - are transparent and accountable, and adopt technical and organizational measures, including
      + providing for transparency, especially algorithmic transparency,
      + conducting impact assessments for high-risk processing, considering the impact on individuals, groups, communities and society at large,
      + implementing Privacy by Design and by Default,
      + assigning resources and staff, including a Data Protection Officer,
      + implementing appropriate oversight over third-party service providers/data processors,
      + conducting regular audits;
    - are only allowed to transfer data to other countries or international organizations with essentially equivalent data protections in place.

    5. Privacy protection should be consequential and aim to level the playing field: Give government at all levels significant and meaningful enforcement authority to protect privacy interests, and give individuals legal remedies.

    Without independent and flexible rulemaking authority for data protection, the Federal Trade Commission has been an ineffective agency for privacy enforcement. An agency with expertise and resources is needed to enforce company obligations. Ongoing research is required to anticipate and prepare for additional warranted interventions to ensure a fair marketplace and a public sphere that strengthens our democratic institutions. Legislation should provide
    - for a strong, dedicated privacy agency with adequate resources, rulemaking authority and the ability to sanction non-compliance with meaningful penalties;
    - for independent authority for State Attorneys General;
    - for statutory damages and a private right of action;
    - for the federal agency to establish an office of technology impact assessment that would consider the privacy, ethical, social, political, and economic impacts of high-risk data processing and other technologies, and would oversee and advise companies on their impact-assessment obligations.
  • Media Advisory – Save the Date FOR IMMEDIATE RELEASE October 3, 2018 Contact: Jeff Chester jeff@democraticmedia.org COPPA: Protecting Children’s Privacy Online for 20 Years Sen. Ed Markey, Advocates and Experts Celebrate COPPA as They Focus on Future Challenges Posed by the Digital Marketplace October 17th, Capitol Hill, Open to Public Washington, D.C. To mark the 20th anniversary of the 1998 Children’s Online Privacy Protection Act (COPPA), Senator Edward J. Markey (D-MA)—its principal congressional sponsor—will be joined by key representatives from the consumer, child advocacy, and privacy groups involved in implementing the law at a public forum on Wednesday, October 17, from 12:30-3:30 pm in Room 385 of the Russell Senate Office Building (SR-385). Senator Markey will deliver a keynote speech followed by two panels featuring representatives from the Electronic Privacy Information Center, the Campaign for a Commercial-Free Childhood, Common Sense Media, the Center for Digital Democracy, Color of Change, and the Institute for Public Representation (Georgetown University Law Center), among others. Prof. Kathryn C. Montgomery, who spearheaded the public campaign that led to COPPA, will moderate. “COPPA is the nation’s constitution for children’s communication. For 20 years it has shielded our nation’s children from invasive practices and encroaching actors on the internet,” Sen. Markey noted. “It puts children and families in control and holds violators accountable when they compromise kids’ privacy. As we celebrate the 20th anniversary of COPPA, we must look to the future.” In addition to discussing COPPA’s impact, speakers will explore the expanding interactive and data-driven world young people face today, which is being transformed by a host of powerful technologies, such as artificial intelligence, virtual reality, and internet-connected toys.
“In 2018, children grow up in an increasingly connected and digital world with ever-emerging threats to their sensitive personal information,” explained Sen. Markey. “Two decades after the passage of this bedrock law, it is time to redouble our efforts and safeguard the precious privacy of our youngest Americans.” The event is free and open to the public, but seating is limited. Lunch will be served. Please RSVP to jeff@democraticmedia.org.
  • October 1, 2018 Chairman John Thune Ranking Member Bill Nelson Senate Commerce Committee Washington, DC Dear Chairman Thune and Ranking Member Nelson, We appreciate your interest in consumer privacy and the hearing you convened recently to explore this topic. Still, our concerns remain that the hearing, with only industry representatives, was unnecessarily biased. Many of the problems consumers face, as well as the solutions we would propose, were simply never mentioned. There is little point in asking industry groups how they would like to be regulated. None of the proposals endorsed by the witnesses would have any substantial impact on the data collection practices of their firms. Such regulation will simply fortify business interests to the detriment of online users. And the absence of consumer advocates at the first hearing was also a missed opportunity for a direct exchange about points made by the industry witnesses. We understand that you are planning to hold a second hearing in early October. In keeping with the structure of the first hearing, we ask that you invite six consumer privacy experts to testify before the Committee. We would also suggest that you organize an additional panel with other experts and enforcement officials, including Dr. Jelinek, the Chair of the European Data Protection Board, as well as State Attorneys General, who are now on the front lines of consumer protection in the United States. Thank you for your consideration of our views. We look forward to working with you. Sincerely, Access Humboldt Access Now Campaign for a Commercial-Free Childhood Center for Digital Democracy Common Sense Consumer Action Consumer Federation of America Customer Commons Digital Privacy Alliance Electronic Frontier Foundation EPIC Media Alliance National Association of Consumer Advocates New America's Open Technology Institute New York Public Interest Research Group (NYPIRG) Privacy Rights Clearinghouse U.S. Public Interest Research Group (U.S. PIRG) World Privacy Forum
  • September 25, 2018 Contact: Jeff Chester 202-494-7100 David Monahan 617-896-9397 For Immediate Release Child Advocacy and Consumer Groups Tell FCC to Keep Key TV Safeguards for Children Overturning Children’s TV Act rules will harm kids and be a huge giveaway of public airwaves to broadcast and cable companies Three leading nonprofit groups working to advance the interests of children in the digital era told the Federal Communications Commission (FCC) that its plan to dismantle long-standing safeguards designed to ensure all children have access to quality TV programming will harm American kids. The proposal to jettison guidelines that require broadcast TV stations to air a minimum of three hours a week of educational programming on their primary channel and additional programming on multicast channels would significantly reduce the availability of higher-quality shows, they explained in a filing today. “The FCC seeks to strip away one of the only federal rules that helps both children and parents,” explained Jeff Chester, executive director of the Center for Digital Democracy. Chester helped lead the campaign in the 1990s that led to the current CTA rules. “It is also one of the only concrete public-interest requirements that Congress mandated in exchange for free use of the public airwaves, which allow television stations to earn vast revenues from both advertising and fees paid by cable companies. Just as the GOP FCC majority did when it killed network neutrality, the commission only seems interested in protecting the interests of the big broadcast and cable companies,” Chester said. “The Commission’s proposal would effectively eliminate children’s programming on broadcast television, where at least there are some limits on commercialism,” said Campaign for a Commercial-Free Childhood executive director Josh Golin.
“Internet and mobile platforms for children are rife with many types of unfair and deceptive marketing that aren’t allowed on kids’ TV. Rather than facilitating a race to the bottom, the FCC should work with lawmakers and the FTC to develop cross-platform rules to ensure that all children have access to quality, commercial-free media regardless of the platforms and devices their families own.” Without citing any evidence about the quality, cost and availability of children’s educational programs delivered by other means, the FCC claims that because children can watch children’s educational programs on cable, YouTube, Netflix, Amazon and Hulu, commercial television stations should not be required to air children’s educational programming. But in comments drafted by the Georgetown Law Communications and Technology Clinic, the advocates note, “To use non-broadcast services, households must have access to cable or broadband service, and be able to afford subscription fees and equipment. Children who live in rural areas, or whose families are low-income, and cannot access or afford alternative program options, will be hurt the most” if the FCC proposal is adopted. The three groups—Center for Digital Democracy, Campaign for a Commercial-Free Childhood, and the Benton Foundation—pledged to educate the public, including parents, educators and concerned citizens, so they can raise concerns with the FCC and other policy makers. --30--
  • Leading consumer privacy organizations in the United States write to express surprise and concern that not a single consumer representative was invited to testify at the September 26 Senate Commerce Committee hearing “Examining Safeguards for Consumer Data Privacy.”
  • CDD Releases E-Guide to Help Protect Voters From Online Manipulation and False News Washington, D.C.: September 12, 2018 To help fight online political misinformation and false news, which has already resurfaced in the 2018 midterm elections, CDD has produced a short e-guide to help voters understand how online media platforms can be hijacked to fan political polarization and social conflict. Enough Already! Protect Yourself from Online Political Manipulation and False News in Election 2018 describes the tactics that widely surfaced in the last presidential election, explains how they have evolved since, and deconstructs the underlying architecture of online media, especially social networks, that has fueled the rise of disinformation and false news. The e-guide tells voters what they can do to try to take themselves out of the targeted advertising systems developed by Facebook, Twitter, YouTube and other big platforms. The guide also describes the big-picture issues that must be addressed to rein in the abuses unleashed by Silicon Valley’s big data surveillance economy and advertising-driven revenue machine. The e-guide is available for free download at the CDD web site. Journalists, activists and interested voters are urged to spread the guide to friends and colleagues. Contact: Jeff Chester, jeff@democraticmedia.org 202-494-7100
  • Reports

    The Influence Industry - Contemporary Digital Politics in the United States

    researched and written by Jeff Chester and Kathryn C. Montgomery

  • The Center for Digital Democracy (CDD), Berkeley Media Studies Group, and Color of Change urge the Federal Trade Commission (FTC) to specifically acknowledge the important issues involving the privacy and welfare of young people by adding this issue to its proposed hearing agenda on competition and consumer welfare.
  • CDD today joined the Electronic Privacy Information Center (EPIC) and six other consumer groups in calling on the Federal Trade Commission to investigate the misleading and manipulative tactics of Google and Facebook in steering users to “consent” to privacy-invasive default settings. In a letter to the FTC, the eight groups complained that the technology companies deceptively nudge users to choose less privacy-friendly options. The complaint was based on the findings in a report, “Deceived by Design,” published today by the Norwegian Consumer Council. It found that Google and Facebook steer consumers into sharing vast amounts of information about themselves through cunning design, privacy-invasive defaults, and “take it or leave it” choices, according to an analysis of the companies’ privacy updates. A report by Consumer Reports investigating Facebook settings for US users found “that the design and language used in Facebook's privacy controls nudge people toward sharing the maximum amount of data with the company.” Read the Norwegian report, “Deceived by Design,” here: https://www.forbrukerradet.no/undersokelse/no-undersokelsekategori/deceived-by-design Read the letter the eight groups sent to the FTC today here: http://thepublicvoice.org/wp-content/uploads/2018/06/FTC-letter-Deceived-by-Design.pdf Read the Consumer Reports article here: https://www.consumerreports.org/privacy/cr-researchers-find-facebook-privacy-settings-maximize-data-collection
  • The Center for Digital Democracy (CDD) respectfully urges the Federal Election Commission (FEC) to adopt regulations to ensure that voters will have meaningful transparency and control over the digital data and marketing practices used in elections today. The FEC must boldly act and use its legal authority and leadership position to enact—as well as recommend—much-needed safeguards. We call on the FEC to tell campaigns that they must refrain from using digital tactics that promote “voter suppression.” It should also urge federal candidates not to use viral and other forms of stealth communications to influence voters through misinformation—including “fake news.” The FEC should go on record saying that political campaigns should not deploy digital marketing tactics that have not been publicly assessed for their impact on the integrity of the voting process—such as the use of predictive artificial intelligence products (including bots) and applications designed to bypass conscious decision-making (through the use of neuromarketing and emotionally based psychometrics). Read more.
  • U.S. companies should adopt the same data protection rules that are poised to go into effect in the European Union on May 25, Public Citizen, the Center for Digital Democracy and Privacy International said today.
  • Consumer advocates, digital rights, and civil rights groups are calling on U.S. companies to adopt the requirements of the General Data Protection Regulation (GDPR) as a baseline in the U.S. and worldwide. Companies processing personal data* in the U.S. and/or worldwide that are subject to the GDPR in the European Union ought to:
    - extend the same individual privacy rights to their customers in the U.S. and around the world;
    - implement the obligations placed on them under the GDPR;
    - demonstrate that they meet these obligations;
    - accept public and regulatory scrutiny and oversight of their personal data practices;
    - adhere to the evolving GDPR jurisprudence and regulatory guidance.
    (*Under the GDPR, processing includes collecting, storing, using, altering, generating, disclosing, and destroying personal data.)

    Specifically, at a minimum, companies ought to:

    1. Treat the right to data privacy as a fundamental human right. This includes the right to:
    + information/notice
    + access
    + rectification
    + erasure
    + restriction
    + portability
    + object
    + avoid certain automated decision-making and profiling, as well as direct marketing
    For these rights to be meaningful, give individuals effective control over the processing of their data so that they can realize their rights, including:
    + set system defaults to protect data
    + be transparent and fair in the way you use people’s data
    2. Apply these rights and obligations to all personal data, including data that can identify an individual directly or indirectly.
    3. Process data only if you have a legal basis to do so, including:
    - on the basis of freely given, specific, informed and unambiguous consent;
    - if necessary for the performance of a contract.
    4. In addition, process data only in accordance with the principles of fairness, transparency, purpose limitation, data minimization, accuracy, storage limitation, and integrity and confidentiality/security.
    5. Add extra safeguards, including explicit consent, when processing sensitive personal data (such as data about ethnic or racial origin, political opinions or union membership, data concerning health, sex life or sexual orientation, genetic data, or biometric data), or data that reveals sensitive personal data, especially when using such data for profiling.
    6. Apply extra safeguards when processing data relating to children and teens, particularly with regard to marketing and profiling.
    7. Be transparent and accountable, and adopt technical and organizational measures to meet these obligations, including:
    - provide for algorithmic transparency;
    - conduct impact assessments for high-risk processing;
    - implement Privacy by Design and by Default;
    - assign resources and staff, including a Data Protection Officer;
    - implement appropriate oversight over third-party service providers/data processors;
    - conduct regular audits;
    - document the processing.
    8. Notify consumers and regulatory authorities of a breach without undue delay.
    9. Support the adoption of similar requirements in a data protection law that will ensure appropriate and effective regulatory oversight and enforcement for data processing that does not fall under EU jurisdiction.
    10. Adopt these GDPR requirements as a baseline regardless of industry sector, in addition to any other national/federal, provincial/state or local privacy requirements that are stricter than the requirements advanced by the GDPR.
  • The European Union's updated data protection legislation comes into effect in Europe on May 25, 2018. It gives individuals new rights to better control their personal information and strengthens some of the rights that already exist. Enforcement and redress mechanisms have also been strengthened to ensure that these rights are respected. And – importantly – the definition of personal data is wider in the GDPR than in the current EU legislation, and now includes online identifiers, such as an IP address. Read the summary of the eight rights here. The right to information to access to rectify to delete (or “to be forgotten”) to restrict processing to data portability to object to avoid automated decision making and profiling.
  • The European General Data Protection Regulation (GDPR) will take effect May 25, 2018. The Transatlantic Consumer Dialogue (TACD), of which CDD is a member, published a document detailing 10 things that US citizens and companies need to know about the forthcoming regulation.
  • In an open letter to Facebook's CEO Mark Zuckerberg, members of the Transatlantic Consumer Dialogue urge the company "to confirm your company’s commitment to global compliance with the GDPR."
  • Press Release

    Advocates Say Google’s YouTube Violates Federal Children’s Privacy Law

    Consumer, privacy and children’s groups file complaint urging FTC to stop most popular kids’ online video service from gathering children’s data

WASHINGTON, DC—April 9, 2018—Today, a coalition of leading U.S. child advocacy, consumer, and privacy groups represented by the Institute for Public Representation filed a complaint urging the Federal Trade Commission (FTC) to investigate and sanction Google for violations of the Children’s Online Privacy Protection Act (COPPA) in operating YouTube. Google claims that YouTube is only for users 13 and up, even though it is the most popular online platform for children, used by 80% of American children ages 6 to 12. The site features many programs designed and promoted for children, and Google generates significant profits from kid-targeted advertising. The complaint says the FTC should subject Google to penalties, which could total in the billions of dollars. The Center for Digital Democracy (CDD), Campaign for a Commercial-Free Childhood (CCFC), and 21 other organizations demonstrated in their filing that Google, which owns YouTube, makes substantial profits collecting many types of personal information on kids on YouTube, including geolocation, unique device identifiers, mobile telephone numbers, and persistent identifiers used to recognize a user over time and across different websites or online services. Google collects this information without first providing direct notice to parents and obtaining their consent, and it uses the information to target advertisements to kids across the internet, including across devices. COPPA bars the operator of a website directed to children, or one that has knowledge of children using it, from collecting and using such information without obtaining parental consent. CCFC’s Executive Director Josh Golin said, “For years, Google has abdicated its responsibility to kids and families by disingenuously claiming YouTube—a site rife with popular cartoons, nursery rhymes, and toy ads—is not for children under thirteen. Google profits immensely by delivering ads to kids and must comply with COPPA.
It’s time for the FTC to hold Google accountable for its illegal data collection and advertising practices.” Child-directed channels such as ChuChuTV Nursery Rhymes & Kids Songs (15.9 million subscribers and over 10 billion channel views) and LittleBabyBum (14.6 million subscribers and over 14 billion channel views) are among the most popular channels on YouTube. Major advertisers pay Google a premium to place their ads in a platform known as “Google Preferred,” which includes a “Parenting and Family” lineup composed mostly of popular channels targeted to children. “Google has acted duplicitously by falsely claiming in its terms of service that YouTube is only for those who are age 13 or older, while it deliberately lured young people into an ad-filled digital playground,” said Jeff Chester of the Center for Digital Democracy. “Just like Facebook, Google has focused its huge resources on generating profits instead of protecting privacy.” Angela J. Campbell, counsel for CCFC and CDD, said: “Given the large number of children affected and the extent of YouTube’s COPPA violations, the FTC needs to impose large civil penalties to show it is serious about protecting children’s privacy online.” James P. Steyer, CEO of Common Sense, said: "Kids have been watching videos on YouTube for years, something the company has known, and profited off of, by targeting content and ads at children under 13. It is time for Google to be completely transparent with all the facts and institute fundamentally responsible new policies moving forward to protect the privacy of kids.
We fully expect Google to work closely with advocates and reach out to parents with information about parental controls, content, and collection practices on YouTube so parents can make informed choices about what content they allow their kids to access and how to protect their privacy.” Katie McInnis, policy counsel for Consumers Union, said: “YouTube knows children are watching content on their site, and has created content channels specifically aimed at them, but does not appear to obtain the required parental consent before collecting information about them. Google has the responsibility to be COPPA-compliant and ensure that children can safely watch the programs designed and promoted for kids. These practices present serious concerns that warrant the FTC’s attention.” Groups signing on to the complaint to the FTC along with CDD and CCFC are: Berkeley Media Studies Group; Center for Media Justice; Common Sense; Consumer Action; Consumer Federation of America; Consumer Federation of California; Consumers Union, the advocacy division of Consumer Reports; Consumer Watchdog; Corporate Accountability; Defending the Early Years; Electronic Privacy Information Center (“EPIC”); New Dream; Obligation, Inc.; Parent Coalition for Student Privacy; Parents Across America; Parents Television Council; Privacy Rights Clearinghouse; Public Citizen; The Story of Stuff Project; TRUCE (Teachers Resisting Unhealthy Childhood Entertainment); and USPIRG. The complaint was drafted by the Communications & Technology Law Clinic in the Institute for Public Representation at Georgetown University Law Center. ###