Press Release
Groups Call for New Privacy Agency to Replace FTC, Ensure Role for States’ Protections
Curbing Companies’ Bad Behavior Will Require Stronger Data Privacy Laws and a New Federal Data Privacy Agency

Federal Privacy Laws Are Antiquated and Need Updating; New Data Privacy Legislation Must Include Civil Rights Protections and Enhanced Punishments for Violations

Jan. 17, 2019

Contact: Don Owens, dowens@citizen.org, (202) 588-7767; Jeffrey Chester, jeff@democraticmedia.org, (202) 494-7100

WASHINGTON, D.C. – U.S. data privacy laws must be overhauled without pre-empting state laws, and a new data privacy agency should be created to confront 21st century threats and address emerging concerns for digital consumers, consumer and privacy organizations said today as they released a framework for comprehensive privacy protection and digital rights for members of Congress.

“Big Tech is coming to Washington looking for a deal that affords inadequate protections for privacy and other consumer rights but pre-empts states from defending their citizens against the tech companies’ surveillance and misuse of data,” said Robert Weissman, president of Public Citizen. “But here’s the bad news for the tech giants: That deal isn’t going to fly. Instead, the American people are demanding – and intend to win – meaningful federal restraints on tech company abuses of power that also ensure the right of states to craft their own consumer protections.”

From the Equifax data breach to foreign election interference and targeted digital ads based on race, health and income, it’s clear that U.S. consumers face a crisis of confidence, born of federal data privacy laws that are decades out of date and of a lack of basic protections from the digital conglomerates. These corporations, many of which dominate online spaces, are far more interested in monetizing every keystroke or click than in protecting consumers from data breaches. For that reason, federal and state authorities must act, the groups maintain.

The groups will push for federal legislation based on a familiar privacy framework, such as the original U.S. Code of Fair Information Practices and the widely followed Organization for Economic Cooperation and Development Privacy Guidelines. These frameworks establish rights for individuals and obligations for companies that collect personal data; legislation based on them should:

- Establish limits on the collection, use and disclosure of sensitive personal data;
- Establish enhanced limits on the collection, use and disclosure of data of children and teens;
- Regulate consumer scoring and other business practices that diminish people’s physical health, education, financial and work prospects; and
- Prohibit or prevent manipulative marketing practices.

The groups are calling for federal baseline legislation and oppose the pre-emption of state digital privacy laws. States have long acted as the “laboratories of democracy” and must continue to have the power to enact appropriate protections for their citizens as technology develops, the groups say.

“Black communities should not have to choose between accessing the Internet and the right to control our data,” said Brandi Collins-Dexter, senior campaign director at Color Of Change. “We need privacy legislation that holds powerful corporations accountable for their impacts. Burdening our communities with the need to discern how complex terms of service and algorithms could harm us will only serve to reinforce discriminatory corporate practices.
The privacy protection and digital rights principles released today create an important baseline for proactive data protections for our communities.”

“For years now, Big Tech has used our sensitive information as a cash cow,” said Josh Golin, executive director of Campaign for a Commercial-Free Childhood. “Each innovation – whether it’s talking home assistants, new social media tools or software for schools – is designed to spy on families and children. We desperately need both 21st century legislation and a new federal agency with broad enforcement powers to ensure that children have a chance to grow up without their every move, keystroke, swipe and utterance tracked and monetized.”

The United States is woefully behind other nations in providing modern data protections for its consumers, relying instead solely on the Federal Trade Commission (FTC) to safeguard consumers and promote competition. But corporations understand that the FTC lacks rulemaking authority and that the agency often fails to enforce the rules it has established.

“The FTC has failed to act,” said Caitriona Fitzgerald, policy director at the Electronic Privacy Information Center. “The U.S. needs a dedicated data protection agency.”

By contrast, many democratic nations, including Canada, Mexico, the U.K., Ireland and Japan, already have dedicated data protection agencies with independent authority and enforcement capabilities.

Groups that have signed on to the framework include Berkeley Media Studies Group, Campaign for a Commercial-Free Childhood, Center for Digital Democracy, Center for Media Justice, Color of Change, Consumer Action, Consumer Federation of America, Defending Rights & Dissent, Electronic Privacy Information Center, Media Alliance, Parent Coalition for Student Privacy, Privacy Rights Clearinghouse, Privacy Times, Public Citizen, Stop Online Violence Against Women and U.S. PIRG.

Read the groups’ proposal below.

###

-
CDD submits comments to the National Telecommunications and Information Administration (NTIA) on “Developing the Administration’s Approach to Consumer Privacy.” CDD argues that:
- A focus on “outcomes” is good, but the outcomes as defined by NTIA are too narrow and must include a broader discussion of privacy harms. They must include:
  + identification harms (risks of identity theft, re-identification and sensitive inferences; see the illustrative sketch after this list);
  + discrimination harms (inequities in the distribution of benefits and risks of exclusion); and
  + exploitation harms (personal data as commodity and risks to the vulnerable).
- Legislation must not only achieve a reduction in privacy harms but must also ensure that “privacy benefits are fairly allocated.” Policy remedies must consider and effectively address the inequities in the distribution of privacy benefits and harms.
- NTIA’s list of desired outcomes (transparency, control, reasonable minimization, security, access and corrections, risk management, and accountability) is a restatement of the all-too-familiar privacy self-management paradigm. Privacy self-management alone is not enough as a policy solution.
- Privacy is not an individual, commodified good that can and should be traded for other goods.
- Legislation should focus less on data and more on the outputs of data processing. Instead of narrowing the scope of legislation to “personal data,” legislation must focus on inferences, decisions and other data uses.
- A risk-management approach must define risks broadly. NTIA should develop methodologies to assess the human rights, social, economic and ethical impacts of the use of algorithms in modern data processing.
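Because re-identification may be unfamiliar outside technical circles, here is a minimal, purely illustrative sketch of the linkage attack behind such identification harms. All records and field names are invented for demonstration and are not drawn from CDD’s comments; the point is only that removing names does not anonymize data when quasi-identifiers remain.

```python
# Illustrative only: how "de-identified" records can be re-identified by
# linking quasi-identifiers (ZIP code, birth date, sex) to an identified
# dataset. All records below are fabricated for demonstration.
import pandas as pd

# A "de-identified" dataset: names removed, quasi-identifiers retained.
health_records = pd.DataFrame({
    "zip": ["20009", "20002"],
    "birth_date": ["1975-03-02", "1988-11-19"],
    "sex": ["F", "M"],
    "diagnosis": ["diabetes", "asthma"],  # sensitive attribute
})

# A separate, identified dataset (for example, a public roster).
public_roster = pd.DataFrame({
    "name": ["Alice Example", "Bob Example"],
    "zip": ["20009", "20002"],
    "birth_date": ["1975-03-02", "1988-11-19"],
    "sex": ["F", "M"],
})

# Joining on the shared quasi-identifiers re-attaches names to diagnoses.
reidentified = health_records.merge(public_roster, on=["zip", "birth_date", "sex"])
print(reidentified[["name", "diagnosis"]])
```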
-
Blog
Center for Digital Democracy’s Principles for U.S. Privacy Legislation
PROTECT PRIVACY RIGHTS, ADVANCE FAIR AND EQUITABLE OUTCOMES, LIMIT CORPORATE PRACTICES AND ENSURE GOVERNMENT LEADERSHIP AND ENFORCEMENT
The Center for Digital Democracy provides the following recommendations for comprehensive baseline federal privacy legislation. We are building on more than two decades of expertise addressing digital marketplace developments, including work leading to the enactment of the 1998 Children’s Online Privacy Protection Act, the only federal online privacy law in the United States. Our recommendations are also informed by our long-standing trans-Atlantic work with consumer and privacy advocates in Europe, as well as by the General Data Protection Regulation.

We are alarmed by the increasingly intrusive and pervasive nature of commercial surveillance, which has the effect of controlling consumers’ and citizens’ behaviors, thoughts, and attitudes, and which sorts and tracks us as “winners” and “losers.” Today’s commercial practices have grown over the past decades unencumbered by regulatory constraints and increasingly threaten the American ideals of self-determination, fairness, justice and equal opportunity. It is now time to address these developments: to grant basic rights to individuals and groups regarding data about them and how those data are used; to put limits on certain commercial data practices; and to strengthen our government to step in and protect our individual and common interests vis-à-vis powerful commercial entities.

We call on legislators to consider the following principles:

1. Privacy protections should be broad: Set the scope of baseline legislation broadly and do not preempt stronger legislation

Pervasive commercial surveillance practices know no limits, so legislation aiming to curtail negative practices should:
- address the full digital data life-cycle (collection, use, sharing, storage, on- and off-line) and cover all private entities’ public and private data processing, including by nonprofits;
- include all data derived from individuals, including personal information, inferred information, as well as aggregate and de-identified data;
- apply all Fair Information Practice Principles (FIPPs) as a comprehensive baseline, including the principles of collection and use limitation, purpose specification, access and correction rights, accountability, data quality, and confidentiality/security, and require fairness in all data practices; and
- allow existing stronger federal legislation to prevail and let states continue to advance innovative legislation.

2. Individual privacy should be safeguarded: Give individuals rights to control the information about them

Building on the FIPPs, individuals ought to have basic rights, including the right to:
  + transparency and explanation
  + access
  + object and restrict
  + use privacy-enhancing technologies, including encryption
  + redress and compensation

3. Equitable, fair and just uses of data should be advanced: Place limits on certain data uses and safeguard equitable, fair and just outcomes

Relying on “privacy self-management,” with the burden of responsibility placed solely on individuals to advance and protect their autonomy and self-determination, is not sufficient. Without one’s knowledge or participation, classifying and predictive data analytics may still draw inferences about individuals, resulting in injurious privacy violations, even if those harms are not immediately apparent. Importantly, these covert practices may result in pernicious forms of profiling and discrimination, harmful not just to the individual but to groups and communities, particularly those with already diminished life chances, and to society at large.
Certain data practices may also unfairly influence the behavior of online users, such as children. Legislation should therefore address the impact of data practices and the distribution of harm by:
- placing limits on collecting, using and sharing sensitive personal information (such as data about ethnic or racial origin, political opinions or union membership, data concerning health, sex life or sexual orientation, genetic data, or biometric data) or data that reveals sensitive personal information, especially when using these data for profiling;
- otherwise limiting the use of consumer scoring and other data practices, including in advertising, that have the effect of disproportionately and negatively affecting people’s life chances related, for example, to housing, employment, finance, education, health and healthcare;
- placing limits on manipulative marketing practices; and
- requiring particular safeguards when processing data relating to children and teens, especially with regard to marketing and profiling.

4. Privacy legislation should bring about real changes in corporate practices: Set limits and legal obligations for those managing data and require accountability

Currently, companies face very few limitations on their data practices. The presumption of “anything goes” has to end. Legislation should ensure that entities collecting, using or sharing data:
- can only do so for specific and appropriate purposes defined in advance, subject to rules established by law and informed by data subjects’ freely given, specific, informed and unambiguous consent, for the execution of a contract, or as required by law, and without “pay-for-privacy” provisions or “take-it-or-leave-it” terms of service;
- notify users in a timely fashion of data transfers and data breaches, and make consumers whole after a privacy violation or data breach;
- cannot limit consumers’ right to redress with arbitration clauses;
- are transparent and accountable, and adopt technical and organizational measures, including:
  + provide for transparency, especially algorithmic transparency;
  + conduct impact assessments for high-risk processing, considering the impact on individuals, groups, communities and society at large;
  + implement Privacy by Design and by Default;
  + assign resources and staff, including a Data Protection Officer;
  + implement appropriate oversight over third-party service providers/data processors; and
  + conduct regular audits;
- are only allowed to transfer data to other countries or international organizations with essentially equivalent data protections in place.

5. Privacy protection should be consequential and aim to level the playing field: Give government at all levels significant and meaningful enforcement authority to protect privacy interests and give individuals legal remedies

Without independent and flexible rulemaking authority for data protection, the Federal Trade Commission has been an ineffective agency for data protection. An agency with expertise and resources is needed to enforce company obligations. Ongoing research is required to anticipate and prepare for additionally warranted interventions to ensure a fair marketplace and a public sphere that strengthens our democratic institutions.
Legislation should provide:
- for a strong, dedicated privacy agency with adequate resources, rulemaking authority and the ability to sanction non-compliance with meaningful penalties;
- for independent authority for State Attorneys General;
- for statutory damages and a private right of action; and
- for the federal agency to establish an office of technology impact assessment that would consider the privacy, ethical, social, political, and economic impacts of high-risk data processing and other technologies; it would oversee and advise companies on their impact-assessment obligations.

-
Leading consumer privacy organizations in the United States write to express surprise and concern that not a single consumer representative was invited to testify at the September 26 Senate Commerce Committee hearing “Examining Safeguards for Consumer Data Privacy.”
-
Consumer advocates, digital rights, and civil rights groups are calling on U.S. companies to adopt the requirements of the General Data Protection Regulation (GDPR) as a baseline in the U.S. and worldwide. Companies processing personal data* in the U.S. and/or worldwide, and which are subject to the GDPR in the European Union, ought to:
- extend the same individual privacy rights to their customers in the U.S. and around the world;
- implement the obligations placed on them under the GDPR;
- demonstrate that they meet these obligations;
- accept public and regulatory scrutiny and oversight of their personal data practices; and
- adhere to the evolving GDPR jurisprudence and regulatory guidance.

(*Under the GDPR, processing includes collecting, storing, using, altering, generating, disclosing, and destroying personal data.)

Specifically, at a minimum, companies ought to:

1. Treat the right to data privacy as a fundamental human right.
- This right includes the right to:
  + information/notice
  + access
  + rectification
  + erasure
  + restriction
  + portability
  + object
  + avoid certain automated decision-making and profiling, as well as direct marketing
- For these rights to be meaningful, give individuals effective control over the processing of their data so that they can realize their rights, including:
  + set system defaults to protect data (a minimal illustration follows this list)
  + be transparent and fair in the way you use people’s data

2. Apply these rights and obligations to all personal data, including data that can identify an individual directly or indirectly.

3. Process data only if you have a legal basis to do so, including:
- on the basis of freely given, specific, informed and unambiguous consent; or
- if necessary for the performance of a contract.

4. In addition, process data only in accordance with the principles of fairness, transparency, purpose limitation, data minimization, accuracy, storage limitation, integrity and confidentiality/security.

5. Add extra safeguards, including explicit consent, when processing sensitive personal data (such as data about ethnic or racial origin, political opinions or union membership, data concerning health, sex life or sexual orientation, genetic data, or biometric data) or data that reveals sensitive personal data, especially when using this data for profiling.

6. Apply extra safeguards when processing data relating to children and teens, particularly with regard to marketing and profiling.

7. Be transparent and accountable, and adopt technical and organizational measures to meet these obligations, including:
- provide for algorithmic transparency;
- conduct impact assessments for high-risk processing;
- implement Privacy by Design and by Default;
- assign resources and staff, including a Data Protection Officer;
- implement appropriate oversight over third-party service providers/data processors;
- conduct regular audits; and
- document the processing.

8. Notify consumers and regulatory authorities in case of a breach without undue delay.

9. Support the adoption of similar requirements in a data protection law that will ensure appropriate and effective regulatory oversight and enforcement for data processing that does not fall under EU jurisdiction.

10. Adopt these GDPR requirements as a baseline regardless of industry sector, in addition to any other national/federal, provincial/state or local privacy requirements that are stricter than the requirements advanced by the GDPR.
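For the item above on protective system defaults, and for the data-minimization principle in point 4, the following is a minimal, purely illustrative sketch. The class, field and setting names are hypothetical rather than taken from the GDPR text or any company’s API; it simply shows account defaults that do not share or profile, and sign-up code that refuses data it does not need.

```python
# Illustrative only: "privacy by default" settings and data minimization.
# All class, field and setting names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    # Protective defaults: nothing is shared or used for profiling
    # unless the user later opts in explicitly.
    share_with_third_parties: bool = False
    personalized_ads: bool = False
    location_tracking: bool = False

@dataclass
class Account:
    # Only the data needed to provide the service is collected at sign-up.
    email: str
    settings: PrivacySettings = field(default_factory=PrivacySettings)

def create_account(email: str, **extra) -> Account:
    # Data minimization: refuse attributes the service does not need,
    # rather than storing them "just in case."
    if extra:
        raise ValueError(f"Not collected by design: {sorted(extra)}")
    return Account(email=email)

account = create_account("user@example.org")
print(account.settings)  # all sharing and profiling options start disabled
```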
-
The European Union's updated data protection legislation comes into effect in Europe on May 25, 2018. It gives individuals new rights to better control their personal information and strengthens some of the rights that already exist. Enforcement and redress mechanisms have also been strengthened to ensure that these rights are respected. And – importantly – the definition of personal data is wider in the GDPR than in the current EU legislation, and now includes online identifiers, such as an IP address. Read the summary of the eight rights here. They are the right:
- to information
- to access
- to rectify
- to delete (or “to be forgotten”)
- to restrict processing
- to data portability
- to object
- to avoid automated decision-making and profiling
-
The European General Data Protection Regulation (GDPR) will take effect May 25, 2018. The Transatlantic Consumer Dialogue (TACD), of which CDD is a member, published a document detailing 10 things that U.S. citizens and companies need to know about the forthcoming regulation.
-
Press Release
Consumer groups in the U.S. and EU urge Facebook to adopt the General Data Protection Regulation as a global baseline standard
In an open letter to Facebook CEO Mark Zuckerberg, members of the Transatlantic Consumer Dialogue urge the company “to confirm your company’s commitment to global compliance with the GDPR.”

-
In a statement issued today, CDD, EPIC and a coalition of consumer groups have called on the Federal Trade Commission to determine whether Facebook violated a 2011 Consent Order when it facilitated the transfer of personal data of 50 million Facebook users to the data-mining firm Cambridge Analytica. The groups had repeatedly urged the FTC to enforce its own legal judgments. “The FTC's failure to act imperils not only privacy but democracy as well,” the groups warned.