CDD

Newsroom

  • CONSUMER AND CITIZEN GROUPS CONTINUE TO HAVE SERIOUS CONCERNS ABOUT GOOGLE FITBIT TAKEOVER Joint Statement on Possible Remedies
  • October 9, 2020

    Susan Wojcicki
    CEO, YouTube
    901 Cherry Avenue
    San Bruno, CA 94066

    Dear Ms. Wojcicki:

    We commend Google/YouTube’s plan to create a $100 million investment fund for children’s content, announced in 2019 following the FTC settlement to address YouTube’s violations of COPPA. This fund has the potential to shape children’s online content for years to come. We ask that YouTube adopt policies to ensure this fund will operate in the best interests of children worldwide. The programming supported by the fund should:

    - Reflect the perspectives and interests of children from different countries and cultures;
    - Underwrite content makers who are diverse and independent, with at least 50% of funding dedicated to historically underrepresented communities;
    - Promote educational content and content which reflects the highest values of civil society, including diversity;
    - Not support content which promotes commercialism;
    - Facilitate union representation of creators of scripted and nonfiction content for YouTube; and
    - Be advised by a team of leading independent experts who can ensure programming is commissioned that truly serves the educational, civic, and developmental needs of young people.

    As the leading global online destination for many millions of children, as well as the most powerful digital marketing entity, Google should be at the forefront of providing financial resources for quality content that is innovative, takes creative risks, and supports emerging program makers from many different backgrounds. For example, programming supported by the fund should reflect a major commitment to diversity by commissioning producers from around the world who represent diverse cultures and perspectives. The fund is also an opportunity for Google to make a significant contribution to the development of a distinct programming vision for young people that is primarily driven to foster their wellbeing.
    We urge Google to fund only programming free of commercial content, including influencer marketing, product and brand integration, and licensed characters or products. In addition, each program or series should have a robust release window that provides access to all children without requiring them to view digital advertising and other forms of commercial marketing. The expert commissioning board we advise you to adopt will help ensure that the fund operates fairly, and help eliminate potential conflicts of interest. Operating the fund on these principles will allow YouTube to cement its place as a leader in children’s programming and, more importantly, make a world of difference, ensuring that time spent watching YouTube will enrich children. We stand ready to confer with you on these suggestions and your development of the fund, and would welcome the opportunity to meet with you in the near future to discuss these items.

    Sincerely,

    Jeffrey Chester, Executive Director, Center for Digital Democracy
    Jessica J. González, Co-CEO, Free Press
    Josh Golin, Executive Director, Campaign for a Commercial-Free Childhood
    Justin Ruben, Co-Director, ParentsTogether
    Lowell Peterson, Executive Director, Writers Guild of America, East, AFL-CIO
  • The Campaign for a Commercial-Free Childhood (CCFC) and CDD filed comments with the UN’s Special Rapporteur on privacy, as part of a consultation designed to propose global safeguards for young people online. Both CCFC and CDD, along with allies in the U.S. and throughout the world, are working to advance stronger international protections for young people, especially related to their privacy and the impacts that digital marketing has on their development.
    Jeff Chester
  • For Immediate Release
    September 24, 2020
    Contact: Jeff Chester (202-494-7100), jeff@democraticmedia.org

    A Step Backwards for Consumer Privacy: Why Californians Should Vote No on Proposition 24

    Ventura, CA, and Washington, DC: The Center for Digital Democracy (CDD) announced today its opposition to the California Privacy Rights Act (CPRA), also known as Proposition 24, which will appear on the November 2020 California general election ballot. Prop 24 does not sufficiently strengthen Californians’ privacy and may, in fact, set a new, lower, and thus more dangerous standard for privacy protection in the U.S., according to CDD’s analysis.

    “We need strong and bold privacy legislation, not weaker standards and tinkering at the margins,” declared CDD Policy Director Katharina Kopp. “Prop 24 fails to significantly limit data uses that undermine our privacy, increase corporate manipulation and exploitation, and exacerbate racial and economic inequality. This initiative allows the much more powerful companies to set unfair terms by default. It also condones pay-for-privacy schemes, where corporations would be allowed to charge a premium (or eliminate a discount) in exchange for privacy. These schemes tend to hurt the already disadvantaged the most,” she explained.

    CDD intends to work with allies from the consumer and privacy communities to inform voters about Prop 24 and how best to protect their privacy.

    The Center for Digital Democracy is a leading nonprofit organization focused on empowering and protecting the rights of the public in the digital era.
  • The Center for Digital Democracy (CDD) announced today its opposition to the California Privacy Rights Act (CPRA), also known as Proposition 24, which will appear on the November 2020 California general election ballot. CDD has concluded that Prop 24 does not sufficiently strengthen Californians’ privacy and may, in fact, set a new, low, and thus dangerous standard for privacy protection in the U.S.

    We need strong and bold privacy legislation, not weaker standards and tinkering at the margins. We need digital privacy safeguards that address the fundamental drivers of our eroding privacy and autonomy, and that redress the growing levels of racial and social inequity. We need rules that go to the heart of the data-driven business model and curtail the market incentives that have created the deplorable state of affairs we currently face. What we need are protections that significantly limit data uses that undermine our privacy, increase corporate manipulation and exploitation, and exacerbate racial and economic inequality. We need default privacy settings that limit the sharing and selling of personal information, and the use of data for targeted advertising, personalized content, and other manipulative practices. We need to ensure privacy for all and limit any pay-for-privacy schemes that entice the most vulnerable to give up their privacy. In other words, we need to limit harmful data-use practices by default, and place the interests of consumers above market imperatives by allowing only those data practices that are not harmful to individuals, groups, and society at large. Prop 24 does none of that.

    Specifically, Prop 24 continues on the path of a failed notice-and-choice regime, allowing the much more powerful companies to set unfair terms. Instead, privacy legislation should focus on strong default settings, specifying the data-use practices that are allowable (“permissible uses”) and prohibiting all others.
    These safeguards should be in place by default, rather than forcing consumers to opt out of invasive advertising. Prop 24, in contrast, does not provide effective data-use limitations; instead it continues to limit data sharing and selling via an opt-out, rather than declaring them to be impermissible uses or, at minimum, requiring an opt-in for such practices. Even “sensitive data” under Prop 24 is protected only via a consumer-initiated opt-out, rather than prohibiting the use of sensitive personal data altogether.

    Equally concerning, Prop 24 would expand rather than limit pay-for-privacy schemes. Under the terms of Prop 24, corporations are still allowed to charge a premium (or eliminate a discount) in exchange for privacy. Consumers shouldn’t be charged higher prices or be discriminated against simply for exercising their privacy rights. This provision of Prop 24 is particularly objectionable, as it tends to harm vulnerable populations, people of color, and the elderly by creating privacy “haves” and “have-nots,” further entrenching other, existing inequities, as companies would be able to use personal data to profile, segment, and discriminate in a variety of areas.

    There are many other reasons that CDD objects to Prop 24, chief among them that this flawed measure:

    - employs an outdated concept of “sensitive data” instead of focusing on sensitive data uses;
    - fails to rein in the growing power of data brokers that collect and analyze personal data from a variety of sources, including public data sets, for sale to marketers;
    - does not employ strong enough data minimization provisions to limit data collection, use, and disclosure to only what is necessary to provide the service requested by the consumer;
    - undermines consumer efforts to seek enforcement of privacy rights by neglecting to provide full private right-of-action provisions; and
    - unnecessarily delays its protection of employee privacy.
  • Reports

    Data Governance for Young People in the Commercialized Digital Environment

    A report for UNICEF's Global Governance of Children's Data Project

    TikTok (also known by its Chinese name, Dǒuyīn) has quickly captured the interest of children, adolescents, and young adults in 150 countries around the world. The mobile app enables users to create short video clips, customize them with a panoply of user-friendly special effects tools, and then share them widely through the platform’s vast social network. A recent industry survey of children’s app usage in the United States, the UK, and Spain reported that young people between the ages of 4 and 15 now spend almost as much time per day (80 minutes) on TikTok as they do on the highly popular YouTube (85 minutes). TikTok is also credited with helping to drive growth in children’s social app use by 100 percent in 2019 and 200 percent in 2020. Among the keys to its success is a sophisticated artificial intelligence (AI) system that offers a constant stream of highly tailored content and fosters continuous interaction with the platform. Using computer vision technology to reveal insights based on images, objects, texts, and natural-language processing, the app “learns” about an individual’s preferences, interests, and online behaviors so it can offer “high-quality and personalized” content and recommendations. TikTok also provides advertisers with a full spectrum of marketing and brand-promotion applications that tap into a vast store of user information, including not only age, gender, location, and interests, but also granular data sets based on constant tracking of behaviors and activities... TikTok is just one of many tech companies deploying these techniques.
    [full article attached and also here; more from series here]
  • Press Release

    Advocates Call on TikTok Suitors to Clean Up Kids’ Privacy Practices

    Groups had filed complaint at FTC documenting how TikTok flouts children’s privacy law, tracks millions of kids without parental consent.

    Contact: Katharina Kopp, CDD (kkopp@democraticmedia.org; 202-836-4621); David Monahan, CCFC (david@commercialfreechildhood.org)

    WASHINGTON, DC and BOSTON, MA—September 3, 2020—The nation’s leading children’s privacy advocates are calling on potential buyers of TikTok “to take immediate steps to comprehensively improve its privacy and data marketing practices for young people” should they purchase the platform. In separate letters to Microsoft, Walmart, and Oracle, Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) detail TikTok’s extensive history of violating the Children’s Online Privacy Protection Act (COPPA), including a recent news report that TikTok internally classified more than one-third of its 49 million US users as fourteen or under. Given the likelihood that millions of these users are also under thirteen, the advocates urged Microsoft, Walmart, and Oracle to pledge to immediately stop collecting and processing data from any account flagged as or believed to be under thirteen if they acquire TikTok’s US operations, and to restore only those accounts that can be affirmatively verified as belonging to users who are thirteen or older.

    COPPA requires apps and websites to obtain verifiable parental consent before collecting the personal information of anyone under 13, but TikTok has not done so for its millions of accounts held by children. “Whoever purchases TikTok will have access to a treasure trove of ill-gotten, sensitive children’s data,” said Josh Golin, Executive Director of CCFC. “Any new owner must demonstrate their commitment to protecting young people’s privacy by immediately deleting any data that was illegally obtained from children under thirteen.
    With the keys to one of the most popular platforms for young people on the planet must come a commitment to protect children’s privacy and wellbeing.”

    In February 2019, TikTok was fined $5.7 million by the Federal Trade Commission (FTC) for COPPA violations and agreed to delete children’s data and properly request parental consent before allowing children under 13 on the site and collecting more data from them. This May, CCFC, CDD, and a coalition of 20 advocacy groups filed an FTC complaint against TikTok for ignoring its promises to delete kids’ data and comply with the law. To this day, the groups say, TikTok plays by its own rules, luring millions of kids under the age of 13, illegally collecting their data, and using it to manipulatively target them with marketing. In addition, they wrote to the companies today that, “By ignoring the presence of millions of younger children on its app, TikTok is putting them at risk for sexual predation; news reports and law enforcement agencies have documented many cases of inappropriate adult-to-child contact on the app.”

    In August, the groups’ allegations that TikTok had actual knowledge that millions of its users were under thirteen were confirmed by the New York Times. According to internal documents obtained by the Times, TikTok assigns an age range to each user utilizing a variety of methods, including “facial recognition algorithms that scrutinize profile pictures and videos,” “comparing their activity and social connections in the app against those of users whose ages have already been estimated,” and drawing “upon information about users that is bought from other sources.” Using these methods, more than one-third of TikTok’s 49 million users in the US were estimated to be under fourteen. Among daily users, the proportion that TikTok has designated as under fourteen rises to 47%. “The new owners of TikTok in the U.S.
    must demonstrate they take protecting the privacy and well-being of young people seriously,” said Katharina Kopp, policy director of the Center for Digital Democracy. “The federal law protecting kids’ privacy must be complied with and fully enforced. In addition, the company should implement a series of safeguards that prohibit manipulative, discriminatory, and harmful data and marketing practices that target children and teens. Regulators should reject any proposed sale without ensuring that a robust set of safeguards for youth is in place,” she noted.

    ###
  • Press Release

    USDA Online Buying Program for SNAP Participants Threatens Their Privacy and Can Exacerbate Racial and Health Inequities, Says New Report

    Digital Rights, Civil Rights and Public Health Groups Call for Reforms from USDA, Amazon, Walmart, Safeway/Albertson’s and Other Grocery Retailers - Need for Safeguards Urgent During Covid-19 Crisis

    Contact: Jeff Chester (jeff@democraticmedia.org; 202-494-7100); Katharina Kopp (kkopp@democraticmedia.org); https://www.democraticmedia.org/

    Washington, DC, July 16, 2020—A pilot program designed to enable the tens of millions of Americans who participate in the USDA’s Supplemental Nutrition Assistance Program (SNAP) to buy groceries online is exposing them to a loss of their privacy through “increased data collection and surveillance,” as well as risks involving “intrusive and manipulative online marketing techniques,” according to a report from the Center for Digital Democracy (CDD). The report reveals how online grocers and retailers use an orchestrated array of digital techniques—including granular data profiling, predictive analytics, geolocation tracking, personalized online coupons, and AI and machine learning—to promote unhealthy products, trigger impulsive purchases, and increase overall spending at check-out. While these practices affect all consumers engaged in online shopping, the report explains, “they pose greater threats to individuals and families already facing hardship.” E-commerce data practices “are likely to have a disproportionate impact on SNAP participants, which include low-income communities, communities of color, the disabled, and families living in rural areas.
    The increased reliance on these services for daily food and other household purchases could expose these consumers to extensive data collection, as well as unfair and predatory techniques, exacerbating existing disparities in racial and health equity.”

    The report was funded by the Robert Wood Johnson Foundation, as part of a collaboration among four civil rights, digital rights, and health organizations: Color of Change, UnidosUS, Center for Digital Democracy, and Berkeley Media Studies Group. The groups issued a letter today to Secretary of Agriculture Sonny Perdue, urging the USDA to take immediate action to strengthen online protections for SNAP participants.

    USDA launched its e-commerce pilot last year in a handful of states, with an initial set of eight retailers approved for participation: Amazon, Dash’s Market, FreshDirect, Hy-Vee, Safeway, ShopRite, Walmart, and Wright’s Market. The program has rapidly expanded to a majority of states, in part as a result of the current Covid-19 health crisis, in order to enable SNAP participants to shop more safely from home by following “shelter-in-place” rules.

    Through an analysis of the digital marketing and grocery e-commerce practices of the eight companies, as well as an assessment of their privacy policies, CDD found that SNAP participants and other online shoppers confront an often manipulative and nontransparent online grocery marketplace, which is structured to leverage the tremendous amounts of data gathered on consumers via their mobile devices, loyalty cards, and shopping transactions. E-commerce grocers deliberately foreground the brands and products that partner with them (which include some of the most heavily advertised, processed foods and beverages), making them highly visible on store home pages and on “digital shelves,” as well as through online coupons and well-placed reminders at the point of sale.
    Grocers working with the SNAP pilot have developed an arsenal of “adtech” (advertising technology) techniques, including those that use machine learning and behavioral science to foster “frictionless shopping” and impulsive purchasing of specific foods and beverages. The AI and Big Data operations documented in the report may also lead to unfair and discriminatory data practices, such as targeting low-income communities and people of color with aggressive promotions for unhealthy food. Data collected and profiles created during online shopping may be applied in other contexts as well, leading to increased exposure to additional forms of predatory marketing, or to denial of opportunities in housing, education, employment, and financial services.

    “The SNAP program is one of our nation’s greatest success stories because it puts food on the table of hungry families and money in the communities where they live,” explained Dr. Lori Dorfman, Director of the Berkeley Media Studies Group. “Shopping for groceries should not put these families in danger of being hounded by marketers intent on selling products that harm health. Especially in the time of coronavirus, when everyone has to stay home to keep themselves and their communities safe, the USDA should put digital safeguards in place so SNAP recipients can grocery shop without being manipulated by unfair marketing practices.”

    CDD’s research also found that the USDA relied on the flawed and misleading privacy policies of the participating companies, which fail to provide sufficient data protections. According to the pilot’s requirements for participating retailers, privacy policies should clearly explain how a consumer’s data is gathered and used, and provide “optimal” protections. A review of these long, densely worded documents, however, reveals the failure of the companies to identify the extent and impact of their actual data operations, or the risks to consumers.
    The pilot’s requirements also do not adequately limit the use of SNAP participants’ data for marketing. In addition, CDD tested the companies’ data practices for tracking customers’ behavior online, and compared them to the USDA’s requirements. The research found widespread use of so-called “third-party” tracking software (such as “cookies”), which can expose an individual’s personal data to others.

    “In the absence of strong baseline privacy and e-commerce regulations in the US, the USDA’s weak safeguards are placing SNAP recipients at substantial risk,” explained Dr. Katharina Kopp, one of the report’s authors. “The kinds of e-commerce and Big Data practices we have identified through our research could pose even greater threats to communities of color, including increased commercial surveillance and further discrimination.”

    “Being on SNAP, or any other assistance program, should not give corporations free rein to use intrusive and manipulative online marketing techniques on Black communities,” said Jade Magnus Ogunnaike, Senior Campaign Director at Color of Change. “Especially in the era of COVID, where online grocery shopping is a necessity, Black people should not be further exposed to a corporate surveillance system with unfair and predatory practices that exacerbate disparities in racial and health equity just because they use SNAP. The USDA should act aggressively to protect SNAP users from unfair, predatory, and discriminatory data practices.”

    “The SNAP program helps millions of Latinos keep food on the table when times are tough, and our nation’s public health and economic crises have highlighted that critical role,” said Steven Lopez, Director of Health Policy at UnidosUS. “Providing enhanced access to healthy and nutritious foods at the expense of the privacy and health of communities of color is too high of a price. Predatory marketing practices have been linked to increased health disparities for communities of color.
    The USDA must not ignore that fact and should take strong and meaningful steps to treat all participants fairly, without discriminatory practices based on the color of their skin.”

    The report calls on the USDA to “take an aggressive role in developing meaningful and effective safeguards” before moving the SNAP online purchasing system beyond its initial trial. The agency needs to ensure that contemporary e-commerce, retail, and digital marketing applications treat SNAP participants fairly, with strong privacy protections and safeguards against manipulative and discriminatory practices. The USDA should work with SNAP participants; civil rights, consumer, and privacy groups; and retailers like Amazon and Walmart, to restructure its program to ensure the safety and well-being of the millions of people enrolled in the program.

    ###
  • Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100); David Monahan, CCFC (david@commercialfreechildhood.org)

    Statement from Campaign for a Commercial-Free Childhood and Center for Digital Democracy on Comments Filed with FTC Regarding Endorsement Guides

    WASHINGTON, DC and BOSTON, MA—June 23, 2020—Advocacy groups Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) filed comments on Monday in response to the FTC’s request for public comment on its Endorsement Guides.

    Jeff Chester, Executive Director, Center for Digital Democracy: "Influencer marketing should be declared an unfair and deceptive practice when it comes to children. The FTC is enabling so-called ‘kidfluencers,’ ‘brand ambassadors,’ and other ‘celebrity’ marketers to stealthily pitch kids junk food, toys, and other products, despite the known risks to their privacy, personal health, and security. Kids and teens are being targeted by a ‘wild west’ influencer marketing industry wherever they go online, including when they watch videos, play games, or use social media. It's time for the FTC to place the interests of America's youth before the manipulative commercial activities of influencers."

    Josh Golin, Executive Director, Campaign for a Commercial-Free Childhood: “The FTC’s failure to act has helped create an entire ecosystem of unfair and deceptive influencer marketing aimed at children. It’s past time for the Commission to send a strong message to everyone – platforms, brands, ad agencies, and the influencers themselves – that children should not be targets for this insidious and manipulative marketing.”

    Angela J.
    Campbell, Director Emeritus of the Institute for Public Representation’s Communications and Technology Clinic at Georgetown Law, currently chair of CCFC’s Board, and counsel to CCFC and CDD: "Influencer videos full of hidden promotions and sometimes blatant marketing have largely displaced actual programs for children. The FTC must act now to stop these deceptive and unfair practices."

    ###
  • Supporting the Call for Racial Justice

    The Center for Digital Democracy supports the call for racial justice and the fight against police violence, against the systemic injustices that exist in all parts of our society – inferior educational opportunities; lack of affordable, equitable health care; an unjust justice system; housing and employment discrimination; and discriminatory marketing practices.

    We grieve for the lives lost and the opportunities denied! We grieve for the everyday injustices people of color have to endure and have had to endure for centuries. We grieve for an America that could be so much more!

    Our grieving is not enough! CDD will continue its fight for data justice in support of racial and social justice.

    June 5, 2020
  • Contact: Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100); David Monahan, CCFC (david@commercialfreechildhood.org)

    Advocates Say TikTok In Contempt of Court Order

    More kids than ever use the site due to the COVID-19 quarantine, but TikTok flouts its settlement agreement with the FTC

    WASHINGTON, DC and BOSTON, MA—May 14, 2020—Today, a coalition of leading U.S. child advocacy, consumer, and privacy groups filed a complaint urging the Federal Trade Commission (FTC) to investigate and sanction TikTok for putting kids at risk by continuing to violate the Children’s Online Privacy Protection Act (COPPA). In February 2019, TikTok paid a $5.7 million fine for violating COPPA, including illegally collecting personal information from children. But more than a year later, with quarantined kids and families flocking to the site in record numbers, TikTok has failed to delete personal information previously collected from children and is still collecting kids’ personal information without notice to and consent of parents.

    Campaign for a Commercial-Free Childhood (CCFC), the Center for Digital Democracy (CDD), and a total of 20 organizations demonstrated in their FTC filing that TikTok continues to violate COPPA by:

    - failing to delete personal information related to children under 13 that it obtained prior to the 2019 settlement order;
    - failing to give direct notice to parents and to obtain parents’ consent before collecting kids’ personal information; and
    - failing to give parents the right to review or delete their children’s personal information collected by TikTok.

    TikTok makes it easy for children to avoid obtaining parental consent. When a child under 13 tries to register using their actual birthdate, they will be signed up for a “younger users account” with limited functions and no ability to share their videos.
    If a child is frustrated by this limited functionality, they can immediately register again with a fake birthdate from the same device for an account with full privileges, thereby putting them at risk for both TikTok’s commercial data uses and inappropriate contact from adults. In either case, TikTok makes no attempt to notify parents or obtain their consent. And TikTok doesn’t even comply with the law for those children who stick with limited “younger users accounts.” For these accounts, TikTok collects detailed information about how the child uses the app and uses artificial intelligence to determine what to show next, to keep the child engaged online as long as possible.

    The advocates, represented by the Communications & Technology Law Clinic in the Institute for Public Representation at Georgetown Law, asked the FTC to identify and hold responsible those individuals who made or ratified decisions to violate the settlement agreement. They also asked the FTC to prevent TikTok from registering any new accounts for persons in the US until it adopts a reliable method of determining the ages of its users and comes into full compliance with the children’s privacy rules. In light of TikTok’s vast financial resources, the number and severity of the violations, and the large number of US children that use TikTok, they asked the FTC to seek the maximum monetary penalties allowed by law.

    Josh Golin, Executive Director of Campaign for a Commercial-Free Childhood, said, “For years, TikTok has ignored COPPA, thereby ensnaring perhaps millions of underage children in its marketing apparatus and putting children at risk of sexual predation. Now, even after being caught red-handed by the FTC, TikTok continues to flout the law.
    We urge the Commission to take swift action and sanction TikTok again – this time with a fine and injunctive relief commensurate with the seriousness of TikTok’s serial violations.”

    Jeff Chester, Executive Director of the Center for Digital Democracy, said, “Congress empowered the FTC to ensure that kids have online protections, yet here is another case of a digital giant deliberately violating the law. The failure of the FTC to ensure that TikTok protects the privacy of millions of children, including through its use of predictive AI applications, is another reason why there are questions about whether the agency can be trusted to effectively oversee the kids’ data law.”

    Michael Rosenbloom, Staff Attorney and Teaching Fellow at the Institute for Public Representation, Georgetown Law, said, “The FTC ordered TikTok to delete all personal information of children under 13 years old from its servers, but TikTok has clearly failed to do so. We easily found that many accounts featuring children were still present on TikTok. Many of these accounts have tens of thousands to millions of followers, and have been around since before the order. We urge the FTC to hold TikTok to account for continuing to violate both COPPA and its consent decree.”

    Katie McInnis, Policy Counsel at Consumer Reports, said, "During the pandemic, families and children are turning to digital tools like TikTok to share videos with loved ones. Now more than ever, effective protection of children's personal information requires robust enforcement in order to incentivize companies, including TikTok, to comply with COPPA and any relevant consent decrees.
We urge the FTC to investigate the matters raised in this complaint.”

Groups signing on to the complaint to the FTC are: Campaign for a Commercial-Free Childhood, the Center for Digital Democracy, Badass Teachers Association, Berkeley Media Studies Group, Children and Screens: Institute of Digital Media and Child Development, Consumer Action, Consumer Federation of America, Consumer Reports, Defending the Early Years, Electronic Privacy Information Center, Media Education Foundation, Obligation, Inc., Parent Coalition for Student Privacy, Parents Across America, ParentsTogether Foundation, Privacy Rights Clearinghouse, Public Citizen, The Story of Stuff, United Church of Christ, and USPIRG.

###
  • Press Release

    Groups Say White House Must Show Efficacy, Protect Privacy, and Ensure Equity When Deploying Technology to Fight Virus

    Fifteen leading consumer, privacy, civil and digital rights organizations called on the federal government to set guidelines to protect individuals’ privacy, ensure equity in the treatment of individuals and communities, and communicate clearly about public health objectives in responding to the COVID-19 pandemic. There must be consensus among all relevant stakeholders on the most efficacious solution before relying on a technological fix to respond to the pandemic.

    FOR IMMEDIATE RELEASE
    May 5, 2020

    Contacts: Susan Grant, CFA, 202-939-1003; Katharina Kopp, CDD, 202-836-4621

    White House Must Act to Protect Privacy and Ensure Equity in Responding to COVID-19 Pandemic

    Groups Tell Pence to Set Standards to Guide Government and Public-Private Partnership Data Practices and Technology Use

    Washington, D.C. – Today, 15 leading consumer, privacy, civil and digital rights organizations called on the federal government to set guidelines to protect individuals’ privacy, ensure equity in the treatment of individuals and communities, and communicate clearly about public health objectives in responding to the COVID-19 pandemic.

    In a letter to Vice President Michael R. Pence, who leads the Coronavirus Task Force, the groups said that the proper use of technology and data has the potential to provide important public health benefits, but must incorporate privacy and security, as well as safeguards against discrimination and violations of civil and other rights. Developing a process to assess how effective technology and other tools will be in achieving the desired public health objectives is also vitally important, the groups said.

    The letter was signed by the Campaign for a Commercial-Free Childhood, Center for Democracy & Technology, Center for Digital Democracy, Constitutional Alliance, Consumer Action, Consumer Federation of America, Electronic Privacy Information Center (EPIC), Media Alliance, MediaJustice, Oakland Privacy, Parent Coalition for Student Privacy, Privacy Rights Clearinghouse, Public Citizen, Public Knowledge, and Rights x Tech.

    “A headlong rush into technological solutions without carefully considering how well they work and whether they could undermine fundamental American values such as privacy, equity, and fairness would be a mistake,” said Susan Grant, Director of Consumer Protection and Privacy at the Consumer Federation of America. 
“Fostering public trust and confidence in the programs that are implemented to combat COVID-19 is crucial to their overall success.”

“Measures to contain the deadly spread of COVID-19 must be effective and protect those most exposed. History has taught us that the deployment of technologies is often driven by forces that tend to risk privacy, undermine fairness and equity, and place our civil rights in peril. The White House Task Force must work with privacy, consumer and civil rights groups, and other experts, to ensure that the efforts to limit the spread of the virus truly protect our interests,” said Katharina Kopp, Director of Policy, Center for Digital Democracy.

In addition to concerns about government plans that are being developed to address the pandemic, such as using technology for contact tracing, the groups noted the need to ensure that private-sector partnerships incorporate comprehensive privacy and security standards.

The letter outlines 11 principles that should form the basis for standards that government agencies and the private sector can follow:

1. Set science-based, public health objectives to address the pandemic. Then design the programs and consider what tools, including technology, might be most efficacious and helpful to meet those objectives.

2. Assess how technology and other tools meet key criteria. This should be done before deployment when possible and consistent with public health demands, and on an ongoing basis. Questions should include: Can they be shown to be effective for their intended purposes? Can they be used without infringing on privacy? Can they be used without unfairly disadvantaging individuals or communities? Are there other alternatives that would help meet the objectives well without potentially negative consequences? Use of technologies and tools that are ineffective or raise privacy or other societal concerns should be discontinued promptly.

3. Protect against bias and address inequities in technology access. In many cases, communities already disproportionately impacted by COVID-19 may lack access to technology, or may not be fairly represented in data sets. Any use of digital tools must ensure that nobody is left behind.

4. Set clear guidelines for how technology and other tools will be used. These should be aimed at ensuring that they will serve the public health objective while safeguarding privacy and other societal values. Public and private partners should be required to adhere to those guidelines, and the guidelines should be readily available to the public.

5. Ensure that programs such as technology-assisted contact tracing are voluntary. Individual participation should be based on informed, affirmative consent, not coercion.

6. Only collect individuals’ personal information needed for the public health objective. No other personal information should be collected in testing, contact tracing, and public information portals.

7. Do not use or share individuals’ personal information for any other purposes. It is important to avoid “mission creep” and to prevent use for purposes unrelated to the pandemic, such as advertising, law enforcement, or reputation management in non-public health settings.

8. Secure individuals’ personal information from unauthorized access and use. Information collected from testing, contact tracing and information portals may be very revealing, even if it is not “health” information, and security breaches would severely damage public trust.

9. Retain individuals’ personal information only for as long as it is needed. When it is no longer required for the public health objective, the information should be safely disposed of.

10. Be transparent about data collection and use. Before their personal information is collected, individuals should be informed about what data is needed, the specific purposes for which the data will be used, and what rights they have over what’s been collected about them.

11. Provide accountability. There must be systems in place to ensure that these principles are followed and to hold responsible parties accountable. In addition, individuals should have clear means to ask questions, make complaints, and seek recourse in connection with the handling of their personal information.

The groups asked Vice President Pence for a meeting to discuss their concerns and suggested that the Coronavirus Task Force immediately create an interdisciplinary advisory committee composed of experts from public health, data security, privacy, social science, and civil society to help develop effective standards.

The Consumer Federation of America is a nonprofit association of more than 250 consumer groups that was founded in 1968 to advance the consumer interest through research, advocacy, and education.

The Center for Digital Democracy (CDD) is recognized as one of the leading NGOs promoting privacy and consumer protection, fairness and data justice in the digital age. Since its founding in 2001 (and prior to that through its predecessor organization, the Center for Media Education), CDD has been at the forefront of research, public education, and advocacy.
  • The COVID-19 pandemic is a global public health emergency that requires a coordinated and large-scale response by governments worldwide. However, States’ efforts to contain the virus must not be used as a cover to usher in a new era of greatly expanded systems of invasive digital surveillance.

We, the undersigned organizations, urge governments to show leadership in tackling the pandemic in a way that ensures that the use of digital technologies to track and monitor individuals and populations is carried out strictly in line with human rights.

Technology can and should play an important role during this effort to save lives, such as to spread public health messages and increase access to health care. However, an increase in state digital surveillance powers, such as obtaining access to mobile phone location data, threatens privacy, freedom of expression and freedom of association, in ways that could violate rights and degrade trust in public authorities – undermining the effectiveness of any public health response. Such measures also pose a risk of discrimination and may disproportionately harm already marginalized communities.

These are extraordinary times, but human rights law still applies. Indeed, the human rights framework is designed to ensure that different rights can be carefully balanced to protect individuals and wider societies. States cannot simply disregard rights such as privacy and freedom of expression in the name of tackling a public health crisis. On the contrary, protecting human rights also promotes public health. Now more than ever, governments must rigorously ensure that any restrictions to these rights are in line with long-established human rights safeguards.

This crisis offers an opportunity to demonstrate our shared humanity. We can make extraordinary efforts to fight this pandemic that are consistent with human rights standards and the rule of law. 
The decisions that governments make now to confront the pandemic will shape what the world looks like in the future.

We call on all governments not to respond to the COVID-19 pandemic with increased digital surveillance unless the following conditions are met:

Surveillance measures adopted to address the pandemic must be lawful, necessary and proportionate. They must be provided for by law and must be justified by legitimate public health objectives, as determined by the appropriate public health authorities, and be proportionate to those needs. Governments must be transparent about the measures they are taking so that they can be scrutinized and if appropriate later modified, retracted, or overturned. We cannot allow the COVID-19 pandemic to serve as an excuse for indiscriminate mass surveillance.

If governments expand monitoring and surveillance powers, then such powers must be time-bound, and only continue for as long as necessary to address the current pandemic. We cannot allow the COVID-19 pandemic to serve as an excuse for indefinite surveillance.

States must ensure that increased collection, retention, and aggregation of personal data, including health data, is only used for the purposes of responding to the COVID-19 pandemic. Data collected, retained, and aggregated to respond to the pandemic must be limited in scope, time-bound in relation to the pandemic and must not be used for commercial or any other purposes. We cannot allow the COVID-19 pandemic to serve as an excuse to gut individuals’ right to privacy.

Governments must make every effort to protect people’s data, including ensuring sufficient security of any personal data collected and of any devices, applications, networks, or services involved in collection, transmission, processing, and storage. Any claims that data is anonymous must be based on evidence and supported with sufficient information regarding how it has been anonymized. 
We cannot allow attempts to respond to this pandemic to be used as justification for compromising people’s digital safety.

Any use of digital surveillance technologies in responding to COVID-19, including big data and artificial intelligence systems, must address the risk that these tools will facilitate discrimination and other rights abuses against racial minorities, people living in poverty, and other marginalized populations, whose needs and lived realities may be obscured or misrepresented in large datasets. We cannot allow the COVID-19 pandemic to further increase the gap in the enjoyment of human rights between different groups in society.

If governments enter into data sharing agreements with other public or private sector entities, they must be based on law, and the existence of these agreements and information necessary to assess their impact on privacy and human rights must be publicly disclosed – in writing, with sunset clauses, public oversight and other safeguards by default. Businesses involved in efforts by governments to tackle COVID-19 must undertake due diligence to ensure they respect human rights, and ensure any intervention is firewalled from other business and commercial interests. We cannot allow the COVID-19 pandemic to serve as an excuse for keeping people in the dark about what information their governments are gathering and sharing with third parties.

Any response must incorporate accountability protections and safeguards against abuse. Increased surveillance efforts related to COVID-19 should not fall under the domain of security or intelligence agencies and must be subject to effective oversight by appropriate independent bodies. Further, individuals must be given the opportunity to know about and challenge any COVID-19 related measures to collect, aggregate, retain, and use data. 
Individuals who have been subjected to surveillance must have access to effective remedies.

COVID-19 related responses that include data collection efforts should include means for free, active, and meaningful participation of relevant stakeholders, in particular experts in the public health sector and the most marginalized population groups.

Signatories: 7amleh – Arab Center for Social Media Advancement; Access Now; African Declaration on Internet Rights and Freedoms Coalition; AI Now; Algorithm Watch; Alternatif Bilisim; Amnesty International; ApTI; ARTICLE 19; Asociación para una Ciudadanía Participativa, ACI Participa; Association for Progressive Communications (APC); ASUTIC, Senegal; Athan - Freedom of Expression Activist Organization; Australian Privacy Foundation; Barracón Digital; Big Brother Watch; Bits of Freedom; Center for Advancement of Rights and Democracy (CARD); Center for Digital Democracy; Center for Economic Justice; Centro De Estudios Constitucionales y de Derechos Humanos de Rosario; Chaos Computer Club - CCC; Citizen D / Državljan D; CIVICUS; Civil Liberties Union for Europe; CódigoSur; Coding Rights; Coletivo Brasil de Comunicação Social; Collaboration on International ICT Policy for East and Southern Africa (CIPESA); Comité por la Libre Expresión (C-Libre); Committee to Protect Journalists; Consumer Action; Consumer Federation of America; Cooperativa Tierra Común; Creative Commons Uruguay; D3 - Defesa dos Direitos Digitais; Data Privacy Brasil; Democratic Transition and Human Rights Support Center "DAAM"; Derechos Digitales; Digital Rights Lawyers Initiative (DRLI); Digital Rights Watch; Digital Security Lab Ukraine; Digitalcourage; EPIC; epicenter.works; European Digital Rights - EDRi; Fitug; Foundation for Information Policy Research; Foundation for Media Alternatives; Fundación Acceso (Centroamérica); Fundación Ciudadanía y Desarrollo, Ecuador; Fundación Datos Protegidos; Fundación Internet Bolivia; Fundación Taigüey, República Dominicana; Fundación Vía Libre; Hermes Center; Hiperderecho; Homo Digitalis; Human Rights Watch; Hungarian Civil Liberties Union; ImpACT International for Human Rights Policies; Index on Censorship; Initiative für Netzfreiheit; Innovation for Change - Middle East and North Africa; International Commission of Jurists; International Service for Human Rights (ISHR); Intervozes - Coletivo Brasil de Comunicação Social; Ipandetec; IPPF; Irish Council for Civil Liberties (ICCL); IT-Political Association of Denmark; Iuridicum Remedium z.s. (IURE); Karisma; La Quadrature du Net; Liberia Information Technology Student Union; Liberty; Luchadoras; Majal.org; Masaar "Community for Technology and Law"; Media Rights Agenda (Nigeria); MENA Rights Group; Metamorphosis Foundation; New America's Open Technology Institute; Observacom; Open Data Institute; Open Rights Group; OpenMedia; OutRight Action International; Pangea; Panoptykon Foundation; Paradigm Initiative (PIN); PEN International; Privacy International; Public Citizen; Public Knowledge; R3D: Red en Defensa de los Derechos Digitales; RedesAyuda; SHARE Foundation; Skyline International for Human Rights; Sursiendo; Swedish Consumers’ Association; Tahrir Institute for Middle East Policy (TIMEP); Tech Inquiry; TechHerNG; TEDIC; The Bachchao Project; Unwanted Witness, Uganda; Usuarios Digitales; WITNESS; World Wide Web Foundation