CDD

Program Areas: Digital Citizen

  • Privacy Rights Are Civil Rights

    Over 40 Civil Rights, Civil Liberties, and Consumer Groups Call on Congress to Address Data-Driven Discrimination

  • Curbing Companies’ Bad Behavior Will Require Stronger Data Privacy Laws and a New Federal Data Privacy Agency

    Federal Privacy Laws Are Antiquated and Need Updating; New Data Privacy Legislation Must Include Civil Rights Protections and Enhanced Punishments for Violations

    Jan. 17, 2019

    Contact: Don Owens, dowens@citizen.org, (202) 588-7767; Jeffrey Chester, jeff@democraticmedia.org, (202) 494-7100

    WASHINGTON, D.C. – U.S. data privacy laws must be overhauled without pre-empting state laws, and a new data privacy agency should be created to confront 21st-century threats and address emerging concerns for digital consumers, consumer and privacy organizations said today as they released a framework for comprehensive privacy protection and digital rights for members of Congress.

    “Big Tech is coming to Washington looking for a deal that affords inadequate protections for privacy and other consumer rights but pre-empts states from defending their citizens against the tech companies’ surveillance and misuse of data,” said Robert Weissman, president of Public Citizen. “But here’s the bad news for the tech giants: That deal isn’t going to fly. Instead, the American people are demanding – and intend to win – meaningful federal restraints on tech company abuses of power that also ensure the right of states to craft their own consumer protections.”

    From the Equifax data breach to foreign election interference and targeted digital ads based on race, health and income, it’s clear that U.S. consumers face a crisis of confidence, born of federal data privacy laws that are decades out of date and a lack of basic protections from the digital conglomerates. These corporations, many of which dominate online spaces, are far more interested in monetizing every keystroke or click than in protecting consumers from data breaches. For that reason, federal and state authorities must act, the groups maintain.

    The groups will push for federal legislation based on a familiar privacy framework, such as the original U.S. Code of Fair Information Practices and the widely followed Organization for Economic Cooperation and Development Privacy Guidelines. Building on these frameworks, legislation should establish obligations for companies that collect personal data and rights for individuals. Specifically, it should:

    - Establish limits on the collection, use and disclosure of sensitive personal data;
    - Establish enhanced limits on the collection, use and disclosure of data of children and teens;
    - Regulate consumer scoring and other business practices that diminish people’s health, education, financial and work prospects; and
    - Prohibit or prevent manipulative marketing practices.

    The groups are calling for federal baseline legislation and oppose the pre-emption of state digital privacy laws. States have long acted as the “laboratories of democracy” and must continue to have the power to enact appropriate protections for their citizens as technology develops, the groups say.

    “Black communities should not have to choose between accessing the Internet and the right to control our data,” said Brandi Collins-Dexter, senior campaign director at Color Of Change. “We need privacy legislation that holds powerful corporations accountable for their impacts. Burdening our communities with the need to discern how complex terms of service and algorithms could harm us will only serve to reinforce discriminatory corporate practices. The privacy protection and digital rights principles released today create an important baseline for proactive data protections for our communities.”

    “For years now, Big Tech has used our sensitive information as a cash cow,” said Josh Golin, executive director of Campaign for a Commercial-Free Childhood. “Each innovation – whether it’s talking home assistants, new social media tools or software for schools – is designed to spy on families and children. We desperately need both 21st-century legislation and a new federal agency with broad enforcement powers to ensure that children have a chance to grow up without their every move, keystroke, swipe and utterance tracked and monetized.”

    The United States is woefully behind other nations in providing modern data protections for its consumers, relying solely on the Federal Trade Commission (FTC) to safeguard consumers and promote competition. Corporations understand that the FTC lacks rulemaking authority and that the agency often fails to enforce the rules it has established. “The FTC has failed to act,” said Caitriona Fitzgerald, policy director at the Electronic Privacy Information Center. “The U.S. needs a dedicated data protection agency.” By contrast, many democratic nations, including Canada, Mexico, the U.K., Ireland and Japan, already have dedicated data protection agencies with independent authority and enforcement capabilities.

    Groups that have signed on to the framework include Berkeley Media Studies Group, Campaign for a Commercial-Free Childhood, Center for Digital Democracy, Center for Media Justice, Color of Change, Consumer Action, Consumer Federation of America, Defending Rights & Dissent, Electronic Privacy Information Center, Media Alliance, Parent Coalition for Student Privacy, Privacy Rights Clearinghouse, Privacy Times, Public Citizen, Stop Online Violence Against Women and U.S. PIRG.

    Read the groups’ proposal below.

    ###
  • CDD submits comments to the National Telecommunications and Information Administration on “Developing the Administration’s Approach to Consumer Privacy.” CDD argues that:

    - A focus on “outcomes” is good, but outcomes as defined by the NTIA are too narrow and must include a broader discussion of privacy harms. They must include:
      + identification harms (risks of identity theft, re-identification and sensitive inferences);
      + discrimination harms (inequities in the distribution of benefits and risks of exclusion); and
      + exploitation harms (personal data as commodity and risks to the vulnerable).
    - Legislation must not only achieve a reduction in privacy harms but must also ensure that “privacy benefits are fairly allocated.” Policy remedies must consider and be effective in addressing the inequities in the distribution of privacy benefits and harms.
    - The NTIA’s list of desired outcomes (transparency, control, reasonable minimization, security, access and correction, risk management, and accountability) is a restatement of the all-too-familiar privacy self-management paradigm. Privacy self-management alone is not enough as a policy solution.
    - Privacy is not an individual, commodified good that can and should be traded for other goods.
    - Legislation should focus less on data and more on the outputs of data processing. Instead of narrowing the scope of legislation to “personal data,” legislation must focus on inferences, decisions and other data uses.
    - A risk-management approach must define risks broadly. The NTIA should develop methodologies to assess the human rights, social, economic and ethical impacts of the use of algorithms in modern data processing.
  • Blog

    Center for Digital Democracy’s Principles for U.S. Privacy Legislation

    PROTECT PRIVACY RIGHTS, ADVANCE FAIR AND EQUITABLE OUTCOMES, LIMIT CORPORATE PRACTICES AND ENSURE GOVERNMENT LEADERSHIP AND ENFORCEMENT

    The Center for Digital Democracy provides the following recommendations for comprehensive baseline federal privacy legislation. We are building on more than two decades of expertise addressing digital marketplace developments, including work leading to the enactment of the 1998 Children’s Online Privacy Protection Act, the only federal online privacy law in the United States. Our recommendations are also informed by our long-standing trans-Atlantic work with consumer and privacy advocates in Europe, as well as by the General Data Protection Regulation.

    We are alarmed by the increasingly intrusive and pervasive nature of commercial surveillance, which has the effect of controlling consumers’ and citizens’ behaviors, thoughts, and attitudes, and which sorts and tracks us as “winners” and “losers.” Today’s commercial practices have grown over the past decades unencumbered by regulatory constraints, and they increasingly threaten the American ideals of self-determination, fairness, justice and equal opportunity. It is now time to address these developments: to grant basic rights to individuals and groups regarding data about them and how those data are used; to put limits on certain commercial data practices; and to strengthen our government to step in and protect our individual and common interests vis-à-vis powerful commercial entities.

    We call on legislators to consider the following principles:

    1. Privacy protections should be broad: Set the scope of baseline legislation broadly and do not preempt stronger legislation.

    Pervasive commercial surveillance practices know no limits, so legislation aiming to curtail negative practices should:
    - address the full digital data life-cycle (collection, use, sharing, storage, on- and off-line) and cover all private entities’ public and private data processing, including by nonprofits;
    - include all data derived from individuals, including personal information, inferred information, and aggregate and de-identified data;
    - apply all Fair Information Practice Principles (FIPPs) as a comprehensive baseline, including the principles of collection and use limitation, purpose specification, access and correction rights, accountability, data quality, and confidentiality/security, and require fairness in all data practices; and
    - allow existing stronger federal legislation to prevail and let states continue to advance innovative legislation.

    2. Individual privacy should be safeguarded: Give individuals rights to control the information about them.

    Building on the FIPPs, individuals ought to have basic rights, including the right to:
    + transparency and explanation;
    + access;
    + object and restrict;
    + use privacy-enhancing technologies, including encryption; and
    + redress and compensation.

    3. Equitable, fair and just uses of data should be advanced: Place limits on certain data uses and safeguard equitable, fair and just outcomes.

    Relying on “privacy self-management,” with the burden of responsibility placed on individuals alone to advance and protect their autonomy and self-determination, is not sufficient. Without one’s knowledge or participation, classifying and predictive data analytics may still draw inferences about individuals, resulting in injurious privacy violations, even if those harms are not immediately apparent. Importantly, these covert practices may result in pernicious forms of profiling and discrimination, harmful not just to the individual but to groups and communities, particularly those with already diminished life chances, and to society at large. Certain data practices may also unfairly influence the behavior of online users, such as children.

    Legislation should therefore address the impact of data practices and the distribution of harm by:
    - placing limits on collecting, using and sharing sensitive personal information (such as data about ethnic or racial origin, political opinions or union membership, data concerning health, sex life or sexual orientation, genetic data, or biometric data) or data that reveals sensitive personal information, especially when using these data for profiling;
    - otherwise limiting the use of consumer scoring and other data practices, including in advertising, that have the effect of disproportionately and negatively affecting people’s life chances, related, for example, to housing, employment, finance, education, health and healthcare;
    - placing limits on manipulative marketing practices; and
    - requiring particular safeguards when processing data relating to children and teens, especially with regard to marketing and profiling.

    4. Privacy legislation should bring about real changes in corporate practices: Set limits and legal obligations for those managing data and require accountability.

    Currently, companies face very few limitations on their data practices. The presumption of “anything goes” has to end. Legislation should ensure that entities collecting, using or sharing data:
    - can do so only for specific and appropriate purposes defined in advance, subject to rules established by law and informed by data subjects’ freely given, specific, informed and unambiguous consent, for the execution of a contract, or as required by law, and without “pay-for-privacy” provisions or “take-it-or-leave-it” terms of service;
    - notify users in a timely fashion of data transfers and data breaches, and make consumers whole after a privacy violation or data breach;
    - cannot limit consumers’ right to redress with arbitration clauses;
    - are transparent and accountable, and adopt technical and organizational measures, including:
      + providing for transparency, especially algorithmic transparency;
      + conducting impact assessments for high-risk processing, considering the impact on individuals, groups, communities and society at large;
      + implementing Privacy by Design and by Default;
      + assigning resources and staff, including a Data Protection Officer;
      + implementing appropriate oversight over third-party service providers/data processors; and
      + conducting regular audits; and
    - are allowed to transfer data only to countries or international organizations with essentially equivalent data protections in place.

    5. Privacy protection should be consequential and aim to level the playing field: Give government at all levels significant and meaningful enforcement authority to protect privacy interests, and give individuals legal remedies.

    Without independent and flexible rulemaking authority for data protection, the Federal Trade Commission has been an ineffective agency for data protection. An agency with expertise and resources is needed to enforce company obligations. Ongoing research is required to anticipate and prepare for additionally warranted interventions to ensure a fair marketplace and a public sphere that strengthens our democratic institutions.

    Legislation should provide for:
    - a strong, dedicated privacy agency with adequate resources, rulemaking authority and the ability to sanction non-compliance with meaningful penalties;
    - independent enforcement authority for state attorneys general;
    - statutory damages and a private right of action; and
    - an office of technology impact assessment within the federal agency that would consider the privacy, ethical, social, political, and economic impacts of high-risk data processing and other technologies, and would oversee and advise companies on their impact-assessment obligations.
  • Leading consumer privacy organizations in the United States write to express surprise and concern that not a single consumer representative was invited to testify at the September 26 Senate Commerce Committee hearing “Examining Safeguards for Consumer Data Privacy.”
  • Online political misinformation and false news have already resurfaced in the 2018 midterm elections. CDD has produced a short e-guide to help voters understand how online media platforms can be hijacked to fan political polarization and social conflict. Enough Already! Protect Yourself from Online Political Manipulation and False News in Election 2018 describes the tactics that surfaced widely in the last presidential election and how they have evolved since, and it deconstructs the underlying architecture of online media, especially social networks, that has fueled the rise of disinformation and false news. The e-guide tells readers what they can do to try to take themselves out of the targeted advertising systems developed by Facebook, Twitter, YouTube and other big platforms. It also describes the big-picture issues that must be addressed to rein in the abuses unleashed by Silicon Valley’s big-data surveillance economy and advertising-driven revenue machine.
  • Reports

    The Influence Industry: Contemporary Digital Politics in the United States

    Researched and written by Jeff Chester and Kathryn C. Montgomery

  • CDD today joined the Electronic Privacy Information Center (EPIC) and six other consumer groups in calling on the Federal Trade Commission to investigate the misleading and manipulative tactics of Google and Facebook in steering users to “consent” to privacy-invasive default settings. In a letter to the FTC, the eight groups complained that the technology companies deceptively nudge users to choose less privacy-friendly options. The complaint was based on the findings of a report, “Deceived by Design,” published today by the Norwegian Consumer Council. Based on an analysis of the companies’ privacy updates, the report found that Google and Facebook steer consumers into sharing vast amounts of information about themselves through cunning design, privacy-invasive defaults, and “take-it-or-leave-it” choices. A separate investigation of Facebook settings for U.S. users by Consumer Reports found “that the design and language used in Facebook's privacy controls nudge people toward sharing the maximum amount of data with the company.”

    Read the Norwegian Consumer Council report, “Deceived by Design”: https://www.forbrukerradet.no/undersokelse/no-undersokelsekategori/deceived-by-design

    Read the letter the eight groups sent to the FTC today: http://thepublicvoice.org/wp-content/uploads/2018/06/FTC-letter-Deceived-by-Design.pdf

    Read the Consumer Reports investigation: https://www.consumerreports.org/privacy/cr-researchers-find-facebook-privacy-settings-maximize-data-collection