CDD

Newsroom

  • Press Release

    Press Briefing: Senators, Coalition Call for Sweeping National Digital Privacy Legislation

    Experts to Demand Federal Action to Combat Growing Digital Consumer and Civil Rights Threats

    For Immediate Release: Feb. 26, 2019

    Contact: Mike Stankiewicz, mstankiewicz@citizen.org, (202) 588-7779; Jeff Chester, jeff@democraticmedia.org, (202) 494-7100

    MEDIA ADVISORY

    WHAT: Staff briefing, open to the press and sponsored by U.S. Sens. Ed Markey (D-Mass.) and Tom Udall (D-N.M.), at which consumer protection and civil rights advocates will explain the urgent need for sweeping federal digital privacy legislation with a new approach. It includes baseline legislation that doesn’t pre-empt state privacy laws, the creation of a new federal data protection agency, and safeguards against data practices that lead to unjust, unfair, manipulative or discriminatory outcomes.

    The briefing comes as Americans increasingly demand protection of their digital privacy in light of unethical data sharing and harvesting by tech giants. Without any constraints, Big Tech companies and their partners are collecting sensitive information on American citizens, ranging from financial and health information to data that tracks our Internet activity across all our devices, our location throughout the day and much more. Both the federal government and the states must be empowered to protect the public from this ever-growing online threat to their privacy, welfare and civil rights. Members of the Privacy and Digital Rights for All coalition, which has announced a Framework for Comprehensive Privacy Protection and Digital Rights in the U.S., also will speak about the need for enduring privacy innovation and limiting government access to personal data.

    WHEN: 2 p.m. EST, Mon., March 4

    WHO: Ed Mierzwinski, consumer program director, U.S. PIRG; Jeffrey Chester, executive director, Center for Digital Democracy; Brandi Collins-Dexter, senior campaign director, Color of Change; Josh Golin, executive director, Campaign for a Commercial-Free Childhood; Burcu Kilic, research director, Public Citizen’s Access to Medicines program; Christine Bannan, administrative law and policy fellow, Electronic Privacy Information Center (EPIC)

    WHERE: Hart Senate Office Building, Room 216, 120 Constitution Ave. NE, Washington, DC 20002

    ###
  • Privacy Rights Are Civil Rights

    Over 40 Civil Rights, Civil Liberties, and Consumer Groups Call on Congress to Address Data-Driven Discrimination

  • Curbing Companies’ Bad Behavior Will Require Stronger Data Privacy Laws and a New Federal Data Privacy Agency

    Federal Privacy Laws Are Antiquated and Need Updating; New Data Privacy Legislation Must Include Civil Rights Protections and Enhanced Punishments for Violations

    Jan. 17, 2019

    Contact: Don Owens, dowens@citizen.org, (202) 588-7767; Jeffrey Chester, jeff@democraticmedia.org, (202) 494-7100

    WASHINGTON, D.C. – U.S. data privacy laws must be overhauled without pre-empting state laws, and a new data privacy agency should be created to confront 21st century threats and address emerging concerns for digital consumers, consumer and privacy organizations said today as they released a framework for comprehensive privacy protection and digital rights for members of Congress.

    “Big Tech is coming to Washington looking for a deal that affords inadequate protections for privacy and other consumer rights but pre-empts states from defending their citizens against the tech companies’ surveillance and misuse of data,” said Robert Weissman, president of Public Citizen. “But here’s the bad news for the tech giants: That deal isn’t going to fly. Instead, the American people are demanding – and intend to win – meaningful federal restraints on tech company abuses of power that also ensure the right of states to craft their own consumer protections.”

    From the Equifax data breach to foreign election interference and targeted digital ads based on race, health and income, it’s clear that U.S. consumers face a crisis of confidence born from federal data privacy laws that are decades out of date and a lack of basic protections afforded them by digital conglomerates. These corporations, many of which dominate online spaces, are far more interested in monetizing every keystroke or click than protecting consumers from data breaches. For that reason, federal and state authorities must act, the groups maintain.

    The groups will push for federal legislation based on a familiar privacy framework, such as the original U.S. Code of Fair Information Practices and the widely followed Organization for Economic Cooperation and Development Privacy Guidelines. Building on these frameworks, legislation should establish obligations for companies that collect personal data and rights for individuals, including to:

    - Establish limits on the collection, use and disclosure of sensitive personal data;
    - Establish enhanced limits on the collection, use and disclosure of data of children and teens;
    - Regulate consumer scoring and other business practices that diminish people’s physical health, education, financial and work prospects; and
    - Prohibit or prevent manipulative marketing practices.

    The groups are calling for federal baseline legislation and oppose the pre-emption of state digital privacy laws. States have long acted as the “laboratories of democracy” and must continue to have the power to enact appropriate protections for their citizens as technology develops, the groups say.

    “Black communities should not have to choose between accessing the Internet and the right to control our data,” said Brandi Collins-Dexter, senior campaign director at Color Of Change. “We need privacy legislation that holds powerful corporations accountable for their impacts. Burdening our communities with the need to discern how complex terms of service and algorithms could harm us will only serve to reinforce discriminatory corporate practices. The privacy protection and digital rights principles released today create an important baseline for proactive data protections for our communities.”

    “For years now, Big Tech has used our sensitive information as a cash cow,” said Josh Golin, executive director of Campaign for a Commercial-Free Childhood. “Each innovation – whether it’s talking home assistants, new social media tools or software for schools – is designed to spy on families and children. We desperately need both 21st century legislation and a new federal agency with broad enforcement powers to ensure that children have a chance to grow up without their every move, keystroke, swipe and utterance tracked and monetized.”

    The United States is woefully behind other nations worldwide in providing these modern data protections for its consumers, instead relying solely on the Federal Trade Commission (FTC) to safeguard consumers and promote competition. But corporations understand that the FTC lacks rulemaking authority and that the agency often fails to enforce rules it has established. “The FTC has failed to act,” said Caitriona Fitzgerald, policy director at the Electronic Privacy Information Center. “The U.S. needs a dedicated data protection agency.” By contrast, many democratic nations, including Canada, Mexico, the U.K., Ireland and Japan, already have dedicated data protection agencies with independent authority and enforcement capabilities.

    Groups that have signed on to the framework include Berkeley Media Studies Group, Campaign for a Commercial-Free Childhood, Center for Digital Democracy, Center for Media Justice, Color of Change, Consumer Action, Consumer Federation of America, Defending Rights & Dissent, Electronic Privacy Information Center, Media Alliance, Parent Coalition for Student Privacy, Privacy Rights Clearinghouse, Privacy Times, Public Citizen, Stop Online Violence Against Women and U.S. PIRG.

    Read the groups’ proposal below.

    ###
  • Contact: David Monahan, CCFC (david@commercialfreechildhood.org; 617-896-9397); Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100)

    Apps which Google rates as safe for kids violate their privacy and expose them to other harms

    Advocates, lawmakers call on FTC to address how Google’s Play Store promotes children’s games which violate kids’ privacy law, feature inappropriate content, and lure kids to watch ads and make in-app purchases

    BOSTON, MA and WASHINGTON, DC — December 19, 2018 — Today, a coalition of 22 consumer and public health advocacy groups led by Campaign for a Commercial-Free Childhood (CCFC) and Center for Digital Democracy (CDD) called on the Federal Trade Commission (“FTC”) to investigate and sanction Google for the deceptive marketing of apps for young children. Google represents that the apps in the “Family” section of the Google Play Store are safe for children, but the apps often violate federal children’s privacy law, expose children to inappropriate content, and disregard Google’s own policies by manipulating children into watching ads and making in-app purchases.

    The Play Store is Google’s one-stop shop for Android apps, games, and entertainment. Apps in the “Family” section are promoted with a green star and, in some cases, a recommended age, like “Ages 5 & Under” or “Ages 6-8.” Google is aware from several recent academic studies that many of the apps in this section are a threat to children’s privacy and wellbeing, yet it continues to promote them with these kid-friendly ratings.

    “The business model for the Play Store’s Family section benefits advertisers, developers, and Google at the expense of children and parents,” said CCFC’s Executive Director Josh Golin. “Google puts its seal of approval on apps that break the law, manipulate kids into watching ads and making purchases, and feature content like kids cleaning their eyes with sharp objects. Given Google’s long history of targeting children with unfair marketing and inappropriate content, including on YouTube, it is imperative that the FTC take swift action.”

    Lawmakers echoed the call for FTC action. “We’re repeatedly confronted with examples of tech companies that are just not doing enough to protect consumer privacy – and I’m particularly concerned about what this failure means for our children,” said U.S. Senator Tom Udall (D-NM) regarding today’s action by the advocates. “When real-world products are dangerous or violate the law, we expect retailers to pull them off the shelves. Google’s refusal to take responsibility for privacy issues in their Play Store allows for app developers to violate COPPA, all while Google cashes in on our children’s activity. It is past time for the Federal Trade Commission to crack down to protect children’s privacy.”

    “Google’s dominance in the app market cannot come at the expense of its clear legal obligations to protect kids that use its products,” said David N. Cicilline (RI-01), the top Democrat on the House Antitrust Subcommittee, who raised his concerns about this issue when the Chairman of the FTC testified last week. “I am pleased that this coalition of consumer and children’s advocacy groups are urging the FTC to scrutinize whether Google is improperly tracking children and selling their data.”

    Google’s policies require apps in the Kids and Family section of its Play Store to be compliant with the Children’s Online Privacy Protection Act (COPPA). But Google doesn’t verify compliance, so Play Store apps for children consistently violate COPPA. Many apps send children’s data unencrypted, while others access children’s locations or transmit persistent identifiers without notice or verifiable parental consent. Google has known about these COPPA violations since at least July 2017, when they were publicly reported by Serge Egelman, a researcher at the University of California, Berkeley Center for Long-Term Cybersecurity. Yet Google continues to promote such apps as COPPA-compliant.

    “Our research revealed a surprising number of privacy violations on Android apps for children, including sharing geolocation with third parties,” said Serge Egelman, a researcher at the University of California, Berkeley. “Given Google’s assertion that Designed for Families apps must be COPPA compliant, it’s disappointing these violations still abound, even after Google was alerted to the scale of the problem.”

    Google’s policies also require apps for children to avoid “overly aggressive” commercial tactics, but the advocates’ FTC complaint reveals that many popular apps feature ads that interrupt gameplay, are difficult to click out of, or must be watched in order to advance in a game. In addition, games represented to parents as free often pressure children to make in-app purchases, sometimes going so far as to show characters crying if kids don’t buy locked items. The complaint also offers examples of multiple children’s apps that serve ads for alcohol and gambling, despite those ads being barred by Google’s Ad Policy.

    Other apps designated as appropriate for children are clearly not. Some contain graphic, sexualized images, like TutoTOONS Sweet Baby Girl Daycare 4 – Babysitting Fun, which has over 10 million downloads. Others model actively harmful behavior, like TabTale’s Crazy Eye Clinic, which teaches children to clean their eyes with a sharp instrument, and has over one million downloads.

    “Parents who download apps recommended for ages 8 and under don’t expect their child to see ads which promote gambling, alcoholic beverages, or violent video games,” said Angela Campbell, Director of the Communications and Technology Clinic at Georgetown Law, which drafted the complaint. “But Google falsely claims that apps listed in the Family section only have ads which are appropriate for children. It’s important for the FTC to act quickly to protect children, especially in light of Google’s dominance in the app market.”

    The coalition has previously asked the FTC to investigate developers of children’s apps, citing research from the University of Michigan that revealed manipulative advertising is rampant in apps popular with preschoolers. Today’s complaint focuses on Google, whose misrepresentation and promotion of those apps has led to hundreds of millions of downloads.

    “Google (Alphabet, Inc.) has long engaged in unethical and harmful business practices, especially when it comes to children,” explained Jeff Chester, executive director of the Center for Digital Democracy (CDD). “And the Federal Trade Commission has for too long ignored this problem, placing both children and their parents at risk over their loss of privacy, and exposing them to a powerful and manipulative marketing apparatus. As one of the world’s leading providers of content for kids online, Google continues to put the enormous profits they make from kids ahead of any concern for their welfare,” Chester noted. “It’s time federal and state regulators acted to control Google’s ‘wild west’ Play Store app activities.”

    Joining the Campaign for a Commercial-Free Childhood and the Center for Digital Democracy in signing today’s complaint to the FTC are Badass Teachers Association, Berkeley Media Studies Group, Color of Change, Consumer Action, Consumer Federation of America, Consumer Watchdog, Defending the Early Years, Electronic Privacy Information Center, Media Education Foundation, New Dream, Open MIC (Open Media and Information Companies Initiative), Parents Across America, Parent Coalition for Student Privacy, Parents Television Council, Peace Educators Allied for Children Everywhere (P.E.A.C.E.), Privacy Rights Clearinghouse, Public Citizen, the Story of Stuff, TRUCE (Teachers Resisting Unhealthy Childhood Entertainment), and USPIRG.

    In addition to filing an FTC complaint, CCFC has launched a petition asking Google to adopt the Kids’ Safer App Store Standards, which would bar advertising in apps for kids under 5, limit ads in apps for kids 6-12, bar in-app purchases, and require apps to be reviewed by a human before being included in the Kids and Family section of the Play Store.

    ###
  • Kathryn Montgomery, PhD

    Research Director and Senior Strategist for the Center for Digital Democracy

    Kathryn Montgomery, PhD, is Research Director and Senior Strategist for the Center for Digital Democracy (CDD). In the early 1990s, she and Jeff Chester co-founded the Center for Media Education (CME), where she served as President until 2003, and which was the predecessor organization to CDD. CME spearheaded the national campaign that led to passage of the 1998 Children's Online Privacy Protection Act (COPPA), the first federal legislation to protect children's privacy on the Internet. From 2003 until 2018, Dr. Montgomery was Professor of Communication at American University in Washington, D.C., where she founded and directed the 3-year interdisciplinary PhD program in Communication. She served as a consultant to CDD for a number of years and joined the full-time staff in July 2018. Throughout her career, Dr. Montgomery has written and published extensively about the role of media in society, addressing a variety of topics, including the politics of entertainment television; youth engagement with digital media; and contemporary advertising and marketing practices. Montgomery's research, writing, and testimony have helped frame the national public policy debate on a range of critical media issues. In addition to numerous journal articles, chapters, and reports, she is author of two books: Target: Prime Time – Advocacy Groups and the Struggle over Entertainment Television (Oxford University Press, 1989); and Generation Digital: Politics, Commerce, and Childhood in the Age of the Internet (MIT Press, 2007). Montgomery’s current research focuses on the major technology, economic, and policy trends shaping the future of digital media in the Big Data era. She earned her doctorate in Film and Television from the University of California, Los Angeles.
  • The law that lets Europeans take back their data from big tech companies, November 11, 2018, CBS 60 Minutes.
  • CDD’s Executive Director, Jeff Chester, on CBS 60 Minutes

    The law that lets Europeans take back their data from big tech companies

    "> " type="application/x-shockwave-flash">
  • 34 Civil Rights, Consumer, and Privacy Organizations Unite to Release Principles for Privacy Legislation

    Contact: Katharina Kopp (kkopp@democraticmedia.org); 202-836-4621

    Washington, DC – Today, 34 civil rights, consumer, and privacy organizations join in releasing public interest principles for privacy legislation, because the public needs and deserves strong and comprehensive federal legislation to protect their privacy and afford meaningful redress. Irresponsible data practices lead to a broad range of harms, including discrimination in employment, housing, healthcare, and advertising. They also lead to data breaches and loss of individuals’ control over personal information. Existing enforcement mechanisms fail to hold data processors accountable and provide little-to-no relief for privacy violations.

    The privacy principles outline four concepts that any meaningful data protection legislation should incorporate at a minimum:

    - Privacy protections must be strong, meaningful, and comprehensive.
    - Data practices must protect civil rights, prevent unlawful discrimination, and advance equal opportunity.
    - Governments at all levels should play a role in protecting and enforcing privacy rights.
    - Legislation should provide redress for privacy violations.

    These public interest privacy principles include a framework providing guidelines for policymakers considering how to protect the privacy of all Americans effectively while also offering meaningful redress. They follow three days of Federal Trade Commission hearings about big data, competition, and privacy, as well as the comment deadline on “Developing the Administration’s Approach to Privacy,” a request for comment from the National Telecommunications and Information Administration as the agency works to develop privacy policy recommendations for the Trump Administration, and ongoing work at the National Institute of Standards and Technology to develop a privacy risk framework.

    The groups urge members of Congress to pass privacy legislation that ensures fairness, prevents discrimination, advances equal opportunity, protects free expression, and facilitates trust between the public and companies that collect their personal data.

    New America’s Open Technology Institute, Public Knowledge, Access Humboldt, Access Now, Berkeley Media Studies Group, Campaign for a Commercial-Free Childhood, Center for Democracy & Technology, Center for Digital Democracy, Center for Media Justice, Center on Privacy & Technology at Georgetown Law, Color of Change, Common Cause, Common Sense Kids Action, Consumer Action, Consumer Federation of America, Consumers Union, Customer Commons, Demand Progress, Free Press Action Fund, Human Rights Watch, Lawyers’ Committee for Civil Rights Under Law, Media Alliance, Media Mobilizing Project, National Association of Consumer Advocates, National Consumer Law Center, National Consumers League, National Digital Inclusion Alliance, National Hispanic Media Coalition, Oakland Privacy, Open MIC (Open Media and Information Companies Initiative), Privacy Rights Clearinghouse, Public Citizen, U.S. PIRG, and United Church of Christ, OC Inc. signed the principles. Additional local and national privacy advocates are encouraged to sign on.
The following can be attributed to Eric Null, Senior Policy Counsel at New America’s Open Technology Institute: “For decades, privacy regulation has favored the company over the user -- companies set their own rules and users are left to fend for themselves. Worse, companies have even discriminated based on protected classes through algorithmic decision-making. Comprehensive privacy legislation must disrupt this status quo. Legislation that follows the public interest privacy principles will better protect users and give users more control over their data.” The following can be attributed to Allie Bohm, Policy Counsel at Public Knowledge: “It is imperative that any comprehensive privacy legislation reflect the concerns, interests, and priorities of actual human beings. Today, consumer protection, privacy, and civil rights groups come together to articulate those interests, priorities, and concerns. Importantly, these principles address the many harms people can experience from privacy violations and misuse of personal data, including enabling unfair price discrimination; limiting awareness of opportunities; and contributing to employment, housing, health care, and other forms of discrimination.” The following can be attributed to Amie Stepanovich, U.S. Policy Manager at Access Now: “From Europe to India to Brazil, data privacy legislation is becoming the norm around the world, and people in the United States are getting left behind. It is long past time that our legislators acted to protect people across the country from opaque data practices that can result in its misuse and abuse, and any acceptable package must start with these principles.” The following can be attributed to Josh Golin, Executive Director at Campaign for a Commercial-Free Childhood: “What big tech offers for ‘free’ actually comes at a high cost -- our privacy. Worst of all is how vulnerable kids are tracked online and then targeted with manipulative marketing. This has to stop. We need laws that will empower parents to protect their children’s privacy.” The following can be attributed to Joseph Jerome, Policy Counsel at Center for Democracy & Technology: “Debates about national privacy laws focus on how companies should implement Fair Information Practices. The operative word is ‘fair.’ When it comes to how companies collect, use, and share our data, too many business practices are simply unfair. Federal law must go beyond giving consumers more notices and choices about their privacy, and we think it is time for legislators in Congress to flip the privacy presumption and declare some data practices unfair.” The following can be attributed to Katharina Kopp, Director of Policy at Center for Digital Democracy: “To this day, U.S. citizens have had to live without effective privacy safeguards. Commercial data practices have grown ever more intrusive, ubiquitous and harmful. It is high time to provide Americans with effective safeguards against commercial surveillance. Any legislation must not only effectively protect individual privacy, it must advance equitable, just and fair data uses, and must protect the most vulnerable among us, including children. In other words, they must bring about real changes in corporate practices. We have waited long enough; the time is now.” The following can be attributed to Laura Moy, Executive Director at Center on Privacy & Technology at Georgetown Law: “Americans want their data to be respected, protected, and used in ways that are consistent with their expectations. 
Any new legislation governing commercial data practices must advance these goals, and also protect us from data-driven activities that are harmful to society. We need privacy to protect us from uses of data that exclude communities from important opportunities, enable faceless brokers to secretly build ever-more-detailed profiles of us, and amplify political polarization and hate speech.” The following can be attributed to Yosef Getachew, Director of Media and Democracy Program at Common Cause: “An overwhelming majority of Americans believe they have lost control over how their personal information is collected and used across the internet ecosystem. Numerous data breaches and abuses in data sharing practices, which have jeopardized the personal information of millions of Americans, have validated these fears. Our current privacy framework no longer works, and the lack of meaningful privacy protections poses a serious threat to our democracy. Companies can easily manipulate data to politically influence voters or engage in discriminatory practices. These principles should serve as a baseline for any comprehensive privacy legislation that guarantees all Americans control over their data.” The following can be attributed to James P. Steyer, CEO and Founder, at Common Sense: “Any federal legislation should provide for strong baseline protections, particularly for the most surveilled and vulnerable generation ever -- our kids. These principles reflect that as privacy, consumer, and civil rights advocates, we only want federal legislation that will move the ball forward in terms of protecting kids, families, and all of us.” The following can be attributed to Linda Sherry, director of national priorities at Consumer Action: “Our country has floundered far too long without strong federal regulations governing data collection, retention, use and sharing. These privacy principles, developed by a coalition of leading consumer, civil rights and privacy organizations, are offered as a framework to guide Congress in protecting consumers from the many harms that can befall them when they are given little or no choice in safeguarding their data, and companies have few, if any, restrictions on how they use that information.” The following can be attributed to Susan Grant, Director of Consumer Protection and Privacy at Consumer Federation of America: “We need to move forward on data protection in the United States, from a default that allows companies to do what they want with Americans’ personal information as long as they don’t lie about it, to one in which their business practices are aligned with respect for privacy rights and the responsibility to keep people’s data secure.” The following can be attributed to Katie McInnis, Policy Counsel for Consumers Union, the advocacy division of Consumer Reports: “As new data breaches are announced at an alarming rate, now is the time to protect consumers with strong privacy laws. We need laws that do more than just address broad transparency and access rights. Consumers deserve practical controls and robust enforcement to ensure all of their personal information is sufficiently protected.” The following can be attributed to Gaurav Laroia, Policy Counsel at Free Press Action Fund: “The public has lost faith in technology companies' interest and ability to police their own privacy and data usage practices. 
It’s past time for Congress to pass a strong law that empowers people to make meaningful choices about their data, protects them from discrimination and undue manipulation, and holds companies accountable for those practices.” The following can be attributed to David Brody, Counsel & Senior Fellow for Privacy and Technology at the Lawyers’ Committee for Civil Rights Under Law: “Protecting the right to privacy is essential to protecting civil rights and advancing racial equity in a modern, Internet-focused society. Privacy rights are civil rights. Invasive profiling of online activity enables discrimination in employment, housing, credit, and education; helps bad actors target voter suppression and misinformation; assists biased law enforcement surveillance; chills the free association of advocates; and creates connections between hateful extremists exacerbating racial tensions.” The following can be attributed to Tracy Rosenberg, Executive Director at Media Alliance: “After a flood of data breaches and privacy violations, Americans overwhelmingly support meaningful protections for their personal information that are not written by, for and in the interests of the data collection industry. These principles start to define what that looks like.” The following can be attributed to Francella Ochillo, Vice President of Policy & General Counsel at National Hispanic Media Coalition: “For years, tech platforms have been allowed to monetize personal data without oversight or consequence, losing sight of the fact that personal data belongs to the user. Meanwhile, Latinos and other marginalized communities continue to be exposed to the greatest risk of harm and have the fewest opportunities for redress. The National Hispanic Media Coalition joins the chorus of advocates calling for a comprehensive regulatory framework that protects a user’s right to privacy and access as well as the right to be forgotten.” The following can be attributed to JP Massar, Organizer at Oakland Privacy: “We must not only watch the watchers, and regulate the sellers of our information. We must begin to unravel the information panopticon that has already formed. This is a start.” The following can be attributed to Robert Weissman, President at Public Citizen: “Internet privacy means control. Either we get to control our own lives as lived through the Internet, or the Big Tech companies do. That's what is at stake in whether the U.S. adopts real privacy protections.” The following can be attributed to Ed Mierzwinski, Senior Director for Consumer Programs at U.S. PIRG: “The big banks and the big tech companies all say that they want a federal privacy law, but the law that their phalanx of lobbyists seeks isn’t designed to protect consumers. Instead, it’s designed to protect their business models that treat consumers as commodities for sale; it fails to guarantee that their secret sauce big data algorithms don’t discriminate; it eliminates stronger and innovative state laws forever and it denies consumers any real, enforceable rights when harmed. We can’t allow that.” You may view the privacy principles for more information.
  • CDD submits comments to the National Telecommunications and Information Administration on “Developing the Administration’s Approach to Consumer Privacy.” CDD argues that:

    - A focus on “outcomes” is good, but the outcomes as defined by NTIA are too narrow and must include a broader discussion of privacy harms. They must include:
      + identification harms (risks of identity theft, re-identification and sensitive inferences),
      + discrimination harms (inequities in the distribution of benefits and risks of exclusion), and
      + exploitation harms (personal data as commodity and risks to the vulnerable).
    - Legislation must not only achieve a reduction in privacy harms but must also ensure that “privacy benefits are fairly allocated.” Policy remedies must consider and be effective in addressing the inequities in the distribution of privacy benefits and harms.
    - NTIA’s list of desired outcomes (transparency, control, reasonable minimization, security, access and corrections, risk management, and accountability) is a restatement of the all-too-familiar privacy self-management paradigm. Privacy self-management alone is not enough as a policy solution.
    - Privacy is not an individual, commodified good that can and should be traded for other goods.
    - Legislation should focus less on data and more on the outputs of data processing. Instead of narrowing the scope of legislation to “personal data,” legislation must focus on inferences, decisions and other data uses.
    - A risk-management approach must define risks broadly. NTIA should develop methodologies to assess the human rights, social, economic and ethical impacts of the use of algorithms in modern data processing.
  • Press Release

    Advocates ask FTC to investigate apps which manipulate kids

    Popular games for kids 5 and under lure them to watch ads and make in-app purchases

    A coalition of 22 consumer and public health advocacy groups called on the Federal Trade Commission (“FTC”) to investigate the preschool app market. The advocates’ letter urges the FTC to hold app makers accountable for unfair and deceptive practices, including falsely marketing apps that require in-app purchases as “free” and manipulating children to watch ads and make purchases. The complaint was filed in conjunction with a major new study that details a host of concerning practices in apps targeted to young children. The study (paywall), “Advertising in Young Children’s Apps,” was led by researchers at the University of Michigan C.S. Mott Children’s Hospital, and examined the type and content of advertising in 135 children’s apps.
  • Blog

    Center for Digital Democracy’s Principles for U.S. Privacy Legislation

    PROTECT PRIVACY RIGHTS, ADVANCE FAIR AND EQUITABLE OUTCOMES, LIMIT CORPORATE PRACTICES AND ENSURE GOVERNMENT LEADERSHIP AND ENFORCEMENT

    The Center for Digital Democracy provides the following recommendations for comprehensive baseline Federal privacy legislation. We are building on our expertise addressing digital marketplace developments for more than two decades, including work leading to the enactment of the 1998 Children’s Online Privacy Protection Act--the only federal online privacy law in the United States. Our recommendations are also informed by our long-standing trans-Atlantic work with consumer and privacy advocates in Europe, as well as the General Data Protection Regulation.

    We are alarmed by the increasingly intrusive and pervasive nature of commercial surveillance, which has the effect of controlling consumers’ and citizens’ behaviors, thoughts, and attitudes, and which sorts and tracks us as “winners” and “losers.” Today’s commercial practices have grown over the past decades unencumbered by regulatory constraints, and increasingly threaten the American ideals of self-determination, fairness, justice and equal opportunity. It is now time to address these developments: to grant basic rights to individuals and groups regarding data about them and how those data are used; to put limits on certain commercial data practices; and to strengthen our government to step in and protect our individual and common interests vis-à-vis powerful commercial entities.

    We call on legislators to consider the following principles:

    1. Privacy protections should be broad: Set the scope of baseline legislation broadly and do not preempt stronger legislation

    Pervasive commercial surveillance practices know no limits, so legislation aiming to curtail negative practices should
    - address the full digital data life-cycle (collection, use, sharing, storage, on- and off-line) and cover all private entities’ public and private data processing, including nonprofits;
    - include all data derived from individuals, including personal information, inferred information, as well as aggregate and de-identified data;
    - apply all Fair Information Practice Principles (FIPPs) as a comprehensive baseline, including the principles of collection and use limitation, purpose specification, access and correction rights, accountability, data quality, and confidentiality/security, and require fairness in all data practices; and
    - allow existing stronger federal legislation to prevail and let states continue to advance innovative legislation.

    2. Individual privacy should be safeguarded: Give individuals rights to control the information about them

    Building on FIPPs, individuals ought to have basic rights, including the right to
    + transparency and explanation;
    + access;
    + object and restrict;
    + use privacy-enhancing technologies, including encryption; and
    + redress and compensation.

    3. Equitable, fair and just uses of data should be advanced: Place limits on certain data uses and safeguard equitable, fair and just outcomes

    Relying on “privacy self-management”—with the burden of responsibility placed solely on individuals to advance and protect their autonomy and self-determination—is not sufficient. Without one’s knowledge or participation, classifying and predictive data analytics may still draw inferences about individuals, resulting in injurious privacy violations—even if those harms are not immediately apparent. Importantly, these covert practices may result in pernicious forms of profiling and discrimination, harmful not just to the individual, but to groups and communities, particularly those with already diminished life chances, and society at large. Certain data practices may also unfairly influence the behavior of online users, such as children. Legislation should therefore address the impact of data practices and the distribution of harm by
    - placing limits on collecting, using and sharing sensitive personal information (such as data about ethnic or racial origin, political opinions/union membership, data concerning health, sex life or sexual orientation, genetic data, or biometric data) or data that reveals sensitive personal information, especially when using these data for profiling;
    - otherwise limiting the use of consumer scoring and other data practices, including in advertising, that have the effect of disproportionally and negatively affecting people’s life chances, related to, for example, housing, employment, finance, education, health and healthcare;
    - placing limits on manipulative marketing practices; and
    - requiring particular safeguards when processing data relating to children and teens, especially with regard to marketing and profiling.

    4. Privacy legislation should bring about real changes in corporate practices: Set limits and legal obligations for those managing data and require accountability

    Currently companies face very few limitations regarding their data practices. The presumption of “anything goes” has to end. Legislation should ensure that entities collecting, using, or sharing data
    - can only do so for specific and appropriate purposes defined in advance, and subject to rules established by law and informed by data subjects’ freely given, specific, informed and unambiguous consent; for the execution of a contract; or as required by law; and without “pay-for-privacy provisions” or “take-it-or-leave-it” terms of service;
    - notify users in a timely fashion of data transfers and data breaches, and make consumers whole after a privacy violation or data breach;
    - cannot limit consumers’ right to redress with arbitration clauses;
    - are transparent and accountable, and adopt technical and organizational measures, including
      + providing for transparency, especially algorithmic transparency,
      + conducting impact assessments for high-risk processing considering the impact on individuals, groups, communities and society at large,
      + implementing Privacy by Design and by Default,
      + assigning resources and staff, including a Data Protection Officer,
      + implementing appropriate oversight over third-party service providers/data processors, and
      + conducting regular audits; and
    - are only allowed to transfer data to other countries/international organizations with essentially equivalent data protections in place.

    5. Privacy protection should be consequential and aim to level the playing field: Give government at all levels significant and meaningful enforcement authority to protect privacy interests and give individuals legal remedies

    Without independent and flexible rulemaking data-protection authority, the Federal Trade Commission has been an ineffective agency for data protection. An agency with expertise and resources is needed to enforce company obligations. Ongoing research is required to anticipate and prepare for additionally warranted interventions to ensure a fair marketplace and a public sphere that strengthens our democratic institutions. Legislation should provide
    - for a strong, dedicated privacy agency with adequate resources, rulemaking authority and the ability to sanction non-compliance with meaningful penalties;
    - for independent authority for State Attorneys General;
    - for statutory damages and a private right of action; and
    - for the federal agency to establish an office of technology impact assessment that would consider privacy, ethical, social, political, and economic impacts of high-risk data processing and other technologies; it would oversee and advise companies on their impact-assessment obligations.
  • Media Advisory – Save the Date

    FOR IMMEDIATE RELEASE
    October 3, 2018
    Contact: Jeff Chester, jeff@democraticmedia.org

    COPPA--Protecting Children’s Privacy Online for 20 Years
    Sen. Ed Markey, Advocates and Experts Celebrate COPPA as They Focus on Future Challenges Posed by the Digital Marketplace
    October 17th, Capitol Hill, Open to Public

    Washington, D.C. – To mark the 20th anniversary of the 1998 Children’s Online Privacy Protection Act (COPPA), Senator Edward J. Markey (D-MA)—its principal congressional sponsor—will be joined by key representatives from the consumer, child advocacy, and privacy groups involved in implementing the law, at a public forum on Wednesday, October 17 from 12:30-3:30 pm in Room 385 of the Russell Senate Office Building (SR-385). Senator Markey will deliver a keynote speech followed by two panels featuring representatives from Electronic Privacy Information Center, Campaign for a Commercial-Free Childhood, Common Sense Media, Center for Digital Democracy, Color of Change, and Institute for Public Representation (Georgetown University Law Center), among others. Prof. Kathryn C. Montgomery, who spearheaded the public campaign that led to COPPA, will moderate.

    “COPPA is the nation’s constitution for children’s communication. For 20 years it has shielded our nation’s children from invasive practices and encroaching actors on the internet,” Sen. Markey noted. “It puts children and families in control and holds violators accountable when they compromise kids’ privacy. As we celebrate the 20th anniversary of COPPA, we must look to the future.”

    In addition to discussing COPPA’s impact, speakers will explore the expanding interactive and data-driven world young people face today, which is being transformed by a host of powerful technologies, such as artificial intelligence, virtual reality, and internet-connected toys.

    “In 2018, children grow up in an increasingly connected and digital world with ever-emerging threats to their sensitive personal information,” explained Sen. Markey. “Two decades after the passage of this bedrock law, it is time to redouble our efforts and safeguard the precious privacy of our youngest Americans.”

    The event is free and open to the public, but seating is limited. Lunch will be served. Please RSVP to jeff@democraticmedia.org.
  • October 1, 2018

    Chairman John Thune
    Ranking Member Bill Nelson
    Senate Commerce Committee
    Washington, DC

    Dear Chairman Thune and Ranking Member Nelson,

    We appreciate your interest in consumer privacy and the hearing you convened recently to explore this topic. Still, our concerns remain that the hearing, with only industry representatives, was unnecessarily biased. Many of the problems consumers face, as well as the solutions we would propose, were simply never mentioned. There is little point in asking industry groups how they would like to be regulated. None of the proposals endorsed by the witnesses would have any substantial impact on the data collection practices of their firms. Such regulation will simply fortify business interests to the detriment of online users. And the absence of consumer advocates at the first hearing was also a missed opportunity for a direct exchange about points made by the industry witnesses.

    We understand that you are planning to hold a second hearing in early October. In keeping with the structure of the first hearing, we ask that you invite six consumer privacy experts to testify before the Committee. We would also suggest that you organize an additional panel with other experts and enforcement officials, including Dr. Jelinek, the Chair of the European Data Protection Board, as well as State Attorneys General, who are now on the front lines of consumer protection in the United States.

    Thank you for your consideration of our views. We look forward to working with you.

    Sincerely,

    Access Humboldt
    Access Now
    Campaign for a Commercial-Free Childhood
    Center for Digital Democracy
    Common Sense
    Consumer Action
    Consumer Federation of America
    Customer Commons
    Digital Privacy Alliance
    Electronic Frontier Foundation
    EPIC
    Media Alliance
    National Association of Consumer Advocates
    New America's Open Technology Institute
    New York Public Interest Research Group (NYPIRG)
    Privacy Rights Clearinghouse
    U.S. Public Interest Research Group (U.S. PIRG)
    World Privacy Forum
  • September 25, 2018

    Contact: Jeff Chester, 202-494-7100; David Monahan, 617-896-9397

    For Immediate Release

    Child Advocacy and Consumer Groups Tell FCC to Keep Key TV Safeguards for Children

    Overturning Children’s TV Act rules will harm kids and be a huge giveaway of public airwaves to broadcast and cable companies

    Three leading nonprofit groups working to advance the interests of children in the digital era told the Federal Communications Commission (FCC) that its plan to dismantle long-standing safeguards designed to ensure all children have access to quality TV programming will harm American kids. The proposal to jettison guidelines which require broadcast TV stations to air a minimum of three hours a week of educational programming on their primary channel and additional programming on multicast channels would significantly reduce the availability of higher quality shows, they explained in a filing today.

    “The FCC seeks to strip away one of the only federal rules that helps both children and parents,” explained Jeff Chester, executive director of the Center for Digital Democracy. Chester helped lead the campaign in the 1990s that led to the current CTA rules. “It is also one of the only concrete public-interest requirements that Congress mandated in exchange for free use of the public airwaves, which allow television stations to earn vast revenues from both advertising and fees paid by cable companies. Just as the GOP FCC majority did when it killed network neutrality, the commission only seems interested in protecting the interests of the big broadcast and cable companies,” Chester said.

    “The Commission’s proposal would effectively eliminate children’s programming on broadcast television, where at least there are some limits on commercialism,” said Campaign for a Commercial-Free Childhood executive director Josh Golin. “Internet and mobile platforms for children are rife with many types of unfair and deceptive marketing that aren’t allowed on kids’ TV. Rather than facilitating a race to the bottom, the FCC should work with lawmakers and the FTC to develop cross-platform rules to ensure all children have access to quality, commercial-free media regardless of the platforms and devices their families own.”

    Without citing any evidence about the quality, cost and availability of children’s educational programs delivered by other means, the FCC claims that because children can watch children’s educational programs on cable, YouTube, Netflix, Amazon and Hulu, commercial television stations should not be required to air children’s educational programming. But in comments drafted by the Georgetown Law Communications and Technology Clinic, the advocates note, “To use non-broadcast services, households must have access to cable or broadband service, and be able to afford subscription fees and equipment. Children who live in rural areas, or whose families are low-income, and cannot access or afford alternative program options, will be hurt the most” if the FCC proposal is adopted.

    The three groups—Center for Digital Democracy, Campaign for a Commercial-Free Childhood, and the Benton Foundation—pledged to educate the public, including parents, educators and concerned citizens, so they can raise concerns with the FCC and other policy makers.

    --30--
  • Leading consumer privacy organizations in the United States write to express surprise and concern that not a single consumer representative was invited to testify at the September 26 Senate Commerce Committee hearing “Examining Safeguards for Consumer Data Privacy.”
  • CDD Releases E-Guide to Help Protect Voters From Online Manipulation and False News

    Washington, D.C.: September 12, 2018

    To help fight online political misinformation and false news, which has already resurfaced in the 2018 midterm elections, CDD has produced a short e-guide to help voters understand how online media platforms can be hijacked to fan political polarization and social conflict. Enough Already! Protect Yourself from Online Political Manipulation and False News in Election 2018 describes the tactics that widely surfaced in the last presidential election, how they have evolved since, and deconstructs the underlying architecture of online media, especially social networks, that have fueled the rise of disinformation and false news. The e-guide tells voters what they can do to try to take themselves out of the targeted advertising systems developed by Facebook, Twitter, YouTube and other big platforms. The guide also describes the big-picture issues that must be addressed to rein in the abuses unleashed by Silicon Valley’s big data surveillance economy and advertising-driven revenue machine. The e-guide is available for free download at the CDD website. Journalists, activists and interested voters are urged to spread the guide to friends and colleagues.

    Contact: Jeff Chester, jeff@democraticmedia.org, 202-494-7100