CDD

Program Areas: Digital Citizen

  • Online political misinformation and false news have already resurfaced in the 2018 midterm elections. CDD has produced a short e-guide to help voters understand how online media platforms can be hijacked to fan political polarization and social conflict. Enough Already! Protect Yourself from Online Political Manipulation and False News in Election 2018 describes the tactics that surfaced widely in the last presidential election, explains how they have evolved since, and deconstructs the underlying architecture of online media, especially social networks, that has fueled the rise of disinformation and false news. The e-guide tells readers what they can do to remove themselves from the targeted advertising systems developed by Facebook, Twitter, YouTube, and other big platforms. It also describes the big-picture issues that must be addressed to rein in the abuses unleashed by Silicon Valley's big data surveillance economy and advertising-driven revenue machine.
  • Reports

    The Influence Industry: Contemporary Digital Politics in the United States

    Researched and written by Jeff Chester and Kathryn C. Montgomery

  • CDD today joined the Electronic Privacy Information Center (EPIC) and six other consumer groups in calling on the Federal Trade Commission to investigate the misleading and manipulative tactics Google and Facebook use to steer users to "consent" to privacy-invasive default settings. In a letter to the FTC, the eight groups complained that the technology companies deceptively nudge users toward less privacy-friendly options. The complaint was based on the findings of "Deceived by Design," a report published today by the Norwegian Consumer Council. It found that Google and Facebook steer consumers into sharing vast amounts of information about themselves through cunning design, privacy-invasive defaults, and "take it or leave it" choices, according to an analysis of the companies' privacy updates. A Consumer Reports investigation of Facebook settings for US users found "that the design and language used in Facebook's privacy controls nudge people toward sharing the maximum amount of data with the company." Read the Norwegian Consumer Council report, "Deceived by Design": https://www.forbrukerradet.no/undersokelse/no-undersokelsekategori/deceived-by-design Read the letter the eight groups sent to the FTC today: http://thepublicvoice.org/wp-content/uploads/2018/06/FTC-letter-Deceived-by-Design.pdf Read the Consumer Reports article: https://www.consumerreports.org/privacy/cr-researchers-find-facebook-privacy-settings-maximize-data-collection
  • The Center for Digital Democracy (CDD) respectfully urges the Federal Election Commission (FEC) to adopt regulations that ensure voters have meaningful transparency and control over the digital data and marketing practices used in elections today. The FEC must act boldly, using its legal authority and leadership position to enact, as well as recommend, much-needed safeguards. We call on the FEC to tell campaigns that they must refrain from digital tactics that promote "voter suppression." It should also urge federal candidates not to use viral and other forms of stealth communications to influence voters through misinformation, including "fake news." The FEC should go on record that political campaigns should not deploy digital marketing tactics whose impact on the integrity of the voting process has not been publicly assessed, such as predictive artificial intelligence products (including bots) and applications designed to bypass conscious decision-making (through neuromarketing and emotionally based psychometrics).
  • Consumer advocates, digital rights, and civil rights groups are calling on U.S. companies to adopt the requirements of the General Data Protection Regulation (GDPR) as a baseline in the U.S. and worldwide. Companies processing personal data* in the U.S. and/or worldwide that are subject to the GDPR in the European Union ought to:

    - extend the same individual privacy rights to their customers in the U.S. and around the world;
    - implement the obligations placed on them under the GDPR;
    - demonstrate that they meet these obligations;
    - accept public and regulatory scrutiny and oversight of their personal data practices;
    - adhere to the evolving GDPR jurisprudence and regulatory guidance.

    (*Under the GDPR, processing includes collecting, storing, using, altering, generating, disclosing, and destroying personal data.)

    Specifically, at a minimum, companies ought to:

    1. Treat the right to data privacy as a fundamental human right. This includes the rights to information/notice, access, rectification, erasure, restriction, portability, and objection, as well as the right to avoid certain automated decision-making and profiling, including direct marketing. For these rights to be meaningful, give individuals effective control over the processing of their data so that they can realize these rights, including setting system defaults to protect data and being transparent and fair in the way people's data is used.
    2. Apply these rights and obligations to all personal data, including data that can identify an individual directly or indirectly.
    3. Process data only with a legal basis to do so, such as freely given, specific, informed, and unambiguous consent, or where necessary for the performance of a contract.
    4. In addition, process data only in accordance with the principles of fairness, transparency, purpose limitation, data minimization, accuracy, storage limitation, and integrity and confidentiality/security.
    5. Add extra safeguards, including explicit consent, when processing sensitive personal data (such as data about ethnic or racial origin, political opinions or union membership, health, sex life or sexual orientation, genetic data, or biometric data) or data that reveals sensitive personal data, especially when using this data for profiling.
    6. Apply extra safeguards when processing data relating to children and teens, particularly with regard to marketing and profiling.
    7. Be transparent and accountable, and adopt technical and organizational measures to meet these obligations, including providing for algorithmic transparency; conducting impact assessments for high-risk processing; implementing Privacy by Design and by Default; assigning resources and staff, including a Data Protection Officer; implementing appropriate oversight of third-party service providers/data processors; conducting regular audits; and documenting the processing.
    8. Notify consumers and regulatory authorities of a breach without undue delay.
    9. Support the adoption of similar requirements in a data protection law that will ensure appropriate and effective regulatory oversight and enforcement for data processing that does not fall under EU jurisdiction.
    10. Adopt these GDPR requirements as a baseline regardless of industry sector, in addition to any other national/federal, provincial/state, or local privacy requirements that are stricter than those of the GDPR.
  • The European Union's updated data protection legislation comes into effect in Europe on May 25, 2018. It gives individuals new rights to better control their personal information and strengthens some of the rights that already exist. Enforcement and redress mechanisms have also been strengthened to ensure that these rights are respected. And, importantly, the definition of personal data is wider in the GDPR than in the current EU legislation, and now includes online identifiers such as an IP address. The eight rights are: the right to information; to access; to rectification; to erasure (or "to be forgotten"); to restrict processing; to data portability; to object; and to avoid automated decision-making and profiling.
  • The European General Data Protection Regulation (GDPR) will take effect May 25, 2018. The Transatlantic Consumer Dialogue (TACD), of which CDD is a member, published a document detailing ten things that US citizens and companies need to know about the forthcoming regulation.
  • In an open letter to Facebook CEO Mark Zuckerberg, members of the Transatlantic Consumer Dialogue urge the company "to confirm your company's commitment to global compliance with the GDPR."
  • In a statement issued today, CDD, EPIC, and a coalition of consumer groups called on the Federal Trade Commission to determine whether Facebook violated a 2011 Consent Order when it facilitated the transfer of personal data of 50 million Facebook users to the data mining firm Cambridge Analytica. The groups had repeatedly urged the FTC to enforce its own legal judgments. "The FTC's failure to act imperils not only privacy but democracy as well," the groups warned.
  • Can Democracy Survive Big Data & Micro-Profiling in Elections? (CPDP 2018 Video)

    Organized by Center for Digital Democracy & Transatlantic Consumer Dialogue

    Today's political candidates and issue campaigns are fully integrated into the growing Big Data marketing infrastructure, with more and more companies in this sphere accelerating the pace of research and innovation and promising to transform how political campaigns and elections are conducted. Data management platforms, marketing clouds, and other new data services enable information about one's finances, health, race, ethnicity, shopping behavior, and geolocation to be combined with political interests, reading habits, and voting records. Social media and digital platforms facilitate many of these techniques, monetizing and normalizing "fake news," "dark posts," and other practices, and challenging fundamental principles such as privacy, data protection, and individual autonomy. It has been widely reported that political Big Data micro-targeting played a role in the election of President Trump as well as the Brexit vote in the UK, and it is now subject to growing scrutiny by regulatory authorities. Is the use of such technologies likely to cause harm and undermine the democratic process? What is the link between these technologies and fake news? How do policy frameworks in western democracies compare in terms of controlling political election campaign practices? What is the role of data protection legislation in protecting the privacy of voters? And what are the challenges for data protection authorities in addressing how commercial data can be sold or shared with political groups?

    Chair: Paul-Olivier Dehaye, PersonalDataIO (CH)
    Moderator: Anna Fielder, Transatlantic Consumer Dialogue (UK)
    Speakers: Michael McEvoy, Office of the Information and Privacy Commissioner of British Columbia (CA); Irina Vasiliu, DG Justice, European Commission (EU); Jeffrey Chester, Center for Digital Democracy (US); Juhi Kulshrestha, Hans Bredow Institute for Media Research (DE)