CDD

Program Areas: Digital Consumer

  • October 1, 2018

    Chairman John Thune
    Ranking Member Bill Nelson
    Senate Commerce Committee
    Washington, DC

    Dear Chairman Thune and Ranking Member Nelson,

    We appreciate your interest in consumer privacy and the hearing you convened recently to explore this topic. Still, we remain concerned that the hearing, with only industry representatives, was unnecessarily biased. Many of the problems consumers face, as well as the solutions we would propose, were simply never mentioned.

    There is little point in asking industry groups how they would like to be regulated. None of the proposals endorsed by the witnesses would have any substantial impact on the data collection practices of their firms. Such regulation will simply fortify business interests to the detriment of online users. The absence of consumer advocates at the first hearing was also a missed opportunity for a direct exchange about points made by the industry witnesses.

    We understand that you are planning to hold a second hearing in early October. In keeping with the structure of the first hearing, we ask that you invite six consumer privacy experts to testify before the Committee. We would also suggest that you organize an additional panel with other experts and enforcement officials, including Dr. Jelinek, the Chair of the European Data Protection Board, as well as State Attorneys General, who are now on the front lines of consumer protection in the United States.

    Thank you for your consideration of our views. We look forward to working with you.

    Sincerely,

    Access Humboldt, Access Now, Campaign for a Commercial-Free Childhood, Center for Digital Democracy, Common Sense, Consumer Action, Consumer Federation of America, Customer Commons, Digital Privacy Alliance, Electronic Frontier Foundation, EPIC, Media Alliance, National Association of Consumer Advocates, New America's Open Technology Institute, New York Public Interest Research Group (NYPIRG), Privacy Rights Clearinghouse, U.S. Public Interest Research Group (U.S. PIRG), World Privacy Forum
  • September 25, 2018
    Contact: Jeff Chester, 202-494-7100; David Monahan, 617-896-9397
    For Immediate Release

    Child Advocacy and Consumer Groups Tell FCC to Keep Key TV Safeguards for Children
    Overturning Children’s TV Act rules will harm kids and be a huge giveaway of public airwaves to broadcast and cable companies

    Three leading nonprofit groups working to advance the interests of children in the digital era told the Federal Communications Commission (FCC) that its plan to dismantle long-standing safeguards designed to ensure all children have access to quality TV programming will harm American kids. The proposal to jettison guidelines that require broadcast TV stations to air a minimum of three hours a week of educational programming on their primary channel, and additional programming on multicast channels, would significantly reduce the availability of higher-quality shows, they explained in a filing today.

    “The FCC seeks to strip away one of the only federal rules that helps both children and parents,” explained Jeff Chester, executive director of the Center for Digital Democracy. Chester helped lead the campaign in the 1990s that led to the current CTA rules. “It is also one of the only concrete public-interest requirements that Congress mandated in exchange for free use of the public airwaves, which allow television stations to earn vast revenues from both advertising and fees paid by cable companies. Just as the GOP FCC majority did when it killed network neutrality, the commission only seems interested in protecting the interests of the big broadcast and cable companies,” Chester said.

    “The Commission’s proposal would effectively eliminate children’s programming on broadcast television, where at least there are some limits on commercialism,” said Campaign for a Commercial-Free Childhood executive director Josh Golin. “Internet and mobile platforms for children are rife with many types of unfair and deceptive marketing that aren’t allowed on kids’ TV. Rather than facilitating a race to the bottom, the FCC should work with lawmakers and the FTC to develop cross-platform rules to ensure all children have access to quality, commercial-free media regardless of the platforms and devices their families own.”

    Without citing any evidence about the quality, cost, and availability of children’s educational programs delivered by other means, the FCC claims that because children can watch such programs on cable, YouTube, Netflix, Amazon, and Hulu, commercial television stations should not be required to air children’s educational programming. But in comments drafted by the Georgetown Law Communications and Technology Clinic, the advocates note, “To use non-broadcast services, households must have access to cable or broadband service, and be able to afford subscription fees and equipment. Children who live in rural areas, or whose families are low-income, and cannot access or afford alternative program options, will be hurt the most” if the FCC proposal is adopted.

    The three groups—Center for Digital Democracy, Campaign for a Commercial-Free Childhood, and the Benton Foundation—pledged to educate the public, including parents, educators, and concerned citizens, so they can raise concerns with the FCC and other policy makers.

    --30--
  • Leading consumer privacy organizations in the United States write to express surprise and concern that not a single consumer representative was invited to testify at the September 26 Senate Commerce Committee hearing “Examining Safeguards for Consumer Data Privacy.”
  • CDD today joined the Electronic Privacy Information Center (EPIC) and six other consumer groups in calling on the Federal Trade Commission to investigate the misleading and manipulative tactics Google and Facebook use to steer users to “consent” to privacy-invasive default settings. In a letter to the FTC, the eight groups complained that the technology companies deceptively nudge users to choose less privacy-friendly options. The complaint was based on the findings of a report, “Deceived by Design,” published today by the Norwegian Consumer Council. It found that Google and Facebook steer consumers into sharing vast amounts of information about themselves through cunning design, privacy-invasive defaults, and “take it or leave it” choices, according to an analysis of the companies’ privacy updates. A Consumer Reports investigation of Facebook settings for US users found “that the design and language used in Facebook's privacy controls nudge people toward sharing the maximum amount of data with the company.”

    Read the Norwegian Consumer Council report, “Deceived by Design”: https://www.forbrukerradet.no/undersokelse/no-undersokelsekategori/deceived-by-design
    Read the letter the eight groups sent to the FTC today: http://thepublicvoice.org/wp-content/uploads/2018/06/FTC-letter-Deceived-by-Design.pdf
    Read the Consumer Reports report: https://www.consumerreports.org/privacy/cr-researchers-find-facebook-privacy-settings-maximize-data-collection
  • U.S. companies should adopt the same data protection rules that are poised to go into effect in the European Union on May 25, Public Citizen, the Center for Digital Democracy and Privacy International said today.
  • Consumer advocates, digital rights, and civil rights groups are calling on U.S. companies to adopt the requirements of the General Data Protection Regulation (GDPR) as a baseline in the U.S. and worldwide. Companies processing personal data* in the U.S. and/or worldwide, and which are subject to the GDPR in the European Union, ought to:
    - extend the same individual privacy rights to their customers in the U.S. and around the world;
    - implement the obligations placed on them under the GDPR;
    - demonstrate that they meet these obligations;
    - accept public and regulatory scrutiny and oversight of their personal data practices;
    - adhere to the evolving GDPR jurisprudence and regulatory guidance.
    (*Under the GDPR, processing includes collecting, storing, using, altering, generating, disclosing, and destroying personal data.)

    Specifically, at a minimum, companies ought to:
    1. Treat the right to data privacy as a fundamental human right. This includes the rights to:
       + information/notice
       + access
       + rectification
       + erasure
       + restriction
       + portability
       + object
       + avoid certain automated decision-making and profiling, as well as direct marketing.
       For these rights to be meaningful, give individuals effective control over the processing of their data so that they can realize their rights, including:
       + set system defaults to protect data
       + be transparent and fair in the way you use people’s data
    2. Apply these rights and obligations to all personal data, including data that can identify an individual directly or indirectly.
    3. Process data only if you have a legal basis to do so, including:
       - on the basis of freely given, specific, informed, and unambiguous consent;
       - if necessary for the performance of a contract.
    4. In addition, process data only in accordance with the principles of fairness, transparency, purpose limitation, data minimization, accuracy, storage limitation, and integrity and confidentiality/security.
    5. Add extra safeguards, including explicit consent, when processing sensitive personal data (such as data about ethnic or racial origin, political opinions or union membership, data concerning health, sex life, or sexual orientation, genetic data, or biometric data) or data that reveals sensitive personal data, especially when using this data for profiling.
    6. Apply extra safeguards when processing data relating to children and teens, particularly with regard to marketing and profiling.
    7. Be transparent and accountable, and adopt technical and organizational measures to meet these obligations, including:
       - provide for algorithmic transparency;
       - conduct impact assessments for high-risk processing;
       - implement Privacy by Design and by Default;
       - assign resources and staff, including a Data Protection Officer;
       - implement appropriate oversight over third-party service providers/data processors;
       - conduct regular audits;
       - document the processing.
    8. Notify consumers and regulatory authorities of a breach without undue delay.
    9. Support the adoption of similar requirements in a data protection law that will ensure appropriate and effective regulatory oversight and enforcement for data processing that does not fall under EU jurisdiction.
    10. Adopt these GDPR requirements as a baseline regardless of industry sector, in addition to any other national/federal, provincial/state, or local privacy requirements that are stricter than those advanced by the GDPR.
  • The European Union's updated data protection legislation comes into effect in Europe on May 25, 2018. It gives individuals new rights to better control their personal information and strengthens some of the rights that already exist. Enforcement and redress mechanisms have also been strengthened to ensure that these rights are respected. And, importantly, the definition of personal data is wider in the GDPR than in the current EU legislation, and now includes online identifiers, such as an IP address. The eight rights are:
    - the right to information
    - the right of access
    - the right to rectification
    - the right to erasure (or “to be forgotten”)
    - the right to restrict processing
    - the right to data portability
    - the right to object
    - the right to avoid automated decision-making and profiling
  • The European General Data Protection Regulation (GDPR) will take effect May 25, 2018. The Trans Atlantic Consumer Dialogue (TACD), of which CDD is a member, published a document detailing 10 things that US citizens and companies need to know about the forthcoming regulation.
  • In an open letter to Facebook CEO Mark Zuckerberg, members of the Transatlantic Consumer Dialogue urge the company “to confirm your company’s commitment to global compliance with the GDPR.”