CDD

Newsroom

  • Blog

    Time to Legislate COPPA 2.0

    FTC's Proposed COPPA Review comes too late; Agency cannot be relied on to protect youth, families

    Today, the FTC announced plans to review its rules implementing the Children's Online Privacy Protection Act (COPPA). CDD and allies successfully campaigned to have the law's rules expanded in 2012 to ensure that the privacy of children 12 and under was better protected in today's "Big Data"-driven, cross-device environment. But it is too late for the FTC to make modest changes to how it implements the 1998 law. The commission has failed to effectively enforce the law for years; both Republicans and Democrats are at fault here. For example, the commission has long known that Google was deliberately ignoring COPPA in order to build its YouTube service into the leading child-directed site. But it did nothing. Under a bill proposed by Sens. Edward J. Markey and Josh Hawley, the FTC would be required to proactively ensure that children's data is better protected and that there are serious safeguards when marketing to them. It's time for the FTC to come out in support of that bill.
  • News

    FCC Commissioners Channel Scrooge and Weaken Children's TV Safeguards

    FCC fails to ensure children in the U.S. have access to free, quality video content

    "The Trump FCC just gave some of the country’s most powerful media companies a huge taxpayer-funded gift. By weakening a key safeguard requiring companies such as Comcast/NBC, News Corp./Fox, Disney/ABC, and Sinclair to air a few hours of educational programming for children on their broadcast TV stations, the FCC has placed the interests of the already rich and powerful over the needs of children and families.

    "Now these giant media companies get a huge public taxpayer hand-out in the form of free access to the airwaves (spectrum), as well as guaranteed access to cable TV systems, without any serious public-interest obligations. The FCC has given the TV industry a huge benefit without getting anything in return.

    "Congress must step in and enact a new law that requires TV stations, cable systems, and streaming video providers to offer a wide range of quality content for children. Such programming should be free, not behind paywalls. In the meantime, today’s decision by the FCC will be remembered as one in which the commission's three GOP members embodied the worst qualities of Dickens’ Ebenezer Scrooge."

    Statement of Jeff Chester, executive director, Center for Digital Democracy.

    CDD’s predecessor group, the Center for Media Education, led the campaign in the early 1990s that produced the Children’s Television Act rules weakened today. It also produced the early-1990s report revealing that TV stations claimed shows like The Jetsons and The Flintstones were educational: https://www.nytimes.com/1992/09/30/us/broadcasters-to-satisfy-law-define-cartoons-as-education.html?searchResultPosition=13
  • FTC Must Effectively Penalize Google for Violating COPPA Privacy law

    Concern that FTC actions will not protect privacy for children on Google platforms; Digital ad giant must be held accountable

    Letter sent on June 3, 2019, to Chairman Simons and the FTC Commissioners, following a call with the commission by the Campaign for a Commercial-Free Childhood and CDD, represented by our attorneys from the Institute for Public Representation (IPR) at Georgetown University Law Center.
    Jeff Chester
  • Press Release

    FTC must impose maximum fine and ensure Google’s YouTube business practices obey children’s privacy law, say groups

    Google’s unprecedented violation requires an unprecedented FTC response and a 20-year consent decree to ensure Alphabet Inc. acts responsibly when it comes to serving children and parents; Google executives should also be held accountable.

    June 25, 2019

    The Honorable Joseph Simons, Chairman
    The Honorable Noah Phillips, Commissioner
    The Honorable Rohit Chopra, Commissioner
    The Honorable Rebecca Slaughter, Commissioner
    The Honorable Christine Wilson, Commissioner
    Federal Trade Commission
    600 Pennsylvania Avenue, NW
    Washington, DC 20580

    Dear Chairman Simons, Commissioner Phillips, Commissioner Chopra, Commissioner Slaughter, and Commissioner Wilson:

    The Campaign for a Commercial-Free Childhood (CCFC) and Center for Digital Democracy (CDD) have been encouraged by recent media reports that the Federal Trade Commission is preparing to take action against Google and YouTube for violating the Children’s Online Privacy Protection Act (COPPA). CCFC and CDD, represented by the Institute for Public Representation at Georgetown Law (IPR), are the organizations responsible for drafting the Request to Investigate Google and YouTube filed with the FTC on April 9, 2018. Previously, IPR filed on behalf of CCFC and CDD Requests to Investigate YouTube’s promotion of unfair and deceptive influencer marketing to children (October 21, 2016) and unfair and deceptive marketing practices on YouTube Kids (April 7 and November 24, 2015).

    As you are aware, YouTube has profited enormously by hosting channels and videos of nursery rhymes, unboxing videos, popular cartoons, and other content specifically designed for children on the main YouTube platform. But instead of getting the verifiable parental consent required before collecting children’s personal information, Google claims that YouTube is not for children under thirteen and that, therefore, no consent is required.
    This defense is outlandish given that YouTube is the number one online destination for kids. In short, Google has profited by violating the law and the privacy of tens of millions of children. For this reason, the FTC must sanction Google at a scale commensurate with the company’s unprecedented and unparalleled violations of COPPA. As we pointed out in our Request to Investigate, the maximum civil penalties should be imposed because:

    Google had actual knowledge of both the large number of child-directed channels on YouTube and the large numbers of children using YouTube. Yet, Google collected personal information from nearly 25 million children in the U.S. over a period of years, and used this data to engage in very sophisticated digital marketing techniques. Google’s wrongdoing allowed it to profit in two different ways: Google has not only made a vast amount of money by using children’s personal information as part of its ad networks to target advertising, but has also profited from advertising revenues from ads on its YouTube channels that are watched by children. [April 9, 2018 Request to Investigate at 26-27 (footnotes omitted)]

    Moreover, any consent order must mandate meaningful changes to YouTube’s business practices. For example, all child-directed content should be placed on a separate platform where targeted advertising, commercial data collection, links to other sites or content, and autoplay are prohibited. Google must also live up to its Terms of Service, which stipulate that YouTube is only for persons thirteen and older, by removing all kids’ content from the main YouTube platform. By ensuring such changes, the Commission will do a tremendous service to America’s families seeking to provide a healthy media environment for their children, while sending a clear message to all online and mobile operators that no one is above the law.
    Google’s disregard of children’s welfare is demonstrated not only by the evidence in our complaints, but by numerous reports of violent, sexual, and other inappropriate content available to children on both YouTube Kids and the main YouTube platform. Moreover, the company refused to turn off recommendations on videos featuring young children in leotards and bathing suits even after researchers demonstrated that YouTube’s algorithm was recommending these videos to pedophiles. These ongoing and serious issues require that the FTC take strong action. We believe that Google should repay America’s families by creating a truly safe space for kids and fostering the production of quality non-commercial children’s programming. Attached you will find a list of recommended penalties and conditions to be included in a final consent order. We would be happy to meet with you to discuss our proposed remedies in greater detail. Thank you.

    Sincerely,

    Jeffrey Chester, Executive Director, Center for Digital Democracy
    Josh Golin, Executive Director, Campaign for a Commercial-Free Childhood
    Angela J. Campbell, Director, Institute for Public Representation, Georgetown Law

    Encl.: Proposed Consent Order Penalties and Conditions

    Proposed Consent Order Penalties and Conditions

    The FTC should seek a 20-year consent decree which includes the following forms of relief:

    Injunctive relief

    - Destroy all data collected from children under 13, in all forms and in the possession, custody, or control of Google, YouTube, and all of Alphabet’s subsidiaries engaged in online data collection or commercial uses (e.g., advertising), including, but not limited to, Google Ads, Google Marketing Platform, and their predecessors, as well as any inferences drawn from this data.
    - Immediately stop collecting data from any user known to be under age 13, and from any user that a reasonable person would likely believe to be under age 13, including, but not limited to, persons viewing any channel or video primarily directed to children, persons who have been identified for targeted ads based on being under 13 or any proxy for being under 13 (e.g., grade in school, interest in toys, etc.), or any other factors.
    - Identify, as of the date of this consent order and on an ongoing basis, any users under age 13, and prohibit them from accessing content on YouTube.
    - Prohibit users under age 13 from accessing content on YouTube Kids unless and until YouTube has provided detailed notice to parents, obtained parental consent, and complied with all of the other requirements of COPPA and this consent order.
    - Remove from YouTube all channels in the Parenting and Family lineup, as well as any other YouTube channels and videos directed at children. YouTube may make such channels and videos available on a platform specifically intended for children (e.g., YouTube Kids) only after qualified human reviewers have reviewed the content and determined that the programming complies with all of the policies for YouTube’s child-directed platform, which must include, but are not limited to:
      - No data collection for commercial purposes. Any data collected for “internal purposes” must be clearly identified as to what is being collected, for what purpose, and who has access to the data. It may not be sold to any third parties.
      - No links out to other sites or online services.
      - No recommendations or autoplay.
      - No targeted marketing.
      - No product or brand integration, including influencer marketing.

    Consumer education

    - Require Google to fund independent organizations to undertake educational campaigns that help children and parents understand the true nature of Google’s data-driven digital marketing systems and their potential impacts on children’s wellbeing and privacy.
    - Require Google to publicly admit (in advertising and in other ways) that it has violated the law, and warn parents that no one under 13 should use YouTube.

    Record keeping and monitoring provisions

    - Google must submit to an annual audit by a qualified, independent auditor to ensure that Google is complying with all aspects of the consent decree. The auditor must submit their report to the FTC, and the FTC shall provide reports to Congress about the findings. All of the annual audits must be publicly available without redaction on the Commission’s website within 30 days of receipt.
    - Google may not launch any new child-directed service until the new service has been reviewed and approved by an independent panel of experts, including child development and privacy experts, to be appointed by the FTC.
    - Google must retain, and make available to the FTC on request, documentation of its compliance with the consent decree.

    Civil penalties and other monetary relief

    - Google must pay the maximum possible civil penalties: $42,530 per violation. Whether violations are counted per child or per day, the total amount of the penalty must be sufficiently high to deter Google and YouTube from any further violations of COPPA.
    - Google must establish a $100 million fund to be used to support the production of noncommercial, high-quality, and diverse content for children. Decisions about who receives this money must be insulated from influence by Google.

    In addition, we ask the FTC to consider using its authority under Section 13(b) of the FTC Act to require Google and YouTube to disgorge ill-gotten gains, and to impose separate civil penalties on the management personnel at Google and YouTube who knowingly allowed these COPPA violations to occur.
  • All Complaints filed by the Institute for Public Representation on behalf of the Campaign for a Commercial-Free Childhood, the Center for Digital Democracy, and others
  • In response to a call for submissions by the UN Committee on the Rights of the Child on the topic of children’s rights in relation to the digital environment, CDD joins academics and advocates in submitting comments. The group calls on the Committee to recognize the far-reaching harms caused by digital marketing and the personal data extraction on which it is predicated. Many digital marketing practices infringe rights enshrined in the UN Convention on the Rights of the Child. The Committee ought to recognize the need to protect children from these harms so that children can fully enjoy the opportunities digital environments offer for their development and the fulfilment of their rights.
  • Contact: Josh Golin, CCFC: josh@commercialfreechildhood.org; (617) 896-9369; Jeff Chester, CDD: jeff@democraticmedia.org; (202) 494-7100

    Advocates Demand FTC Investigation of Echo Dot Kids Edition

    Amazon violates COPPA in many ways, including keeping data that parents believe they deleted

    BOSTON, MA — May 9, 2019 — Today, a coalition of 19 consumer and public health advocates led by the Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) called on the Federal Trade Commission (FTC) to investigate and sanction Amazon for infringing on children’s privacy through its Amazon Echo Dot Kids Edition. An investigation by CCFC and the Institute for Public Representation (IPR) at Georgetown Law revealed that Echo Dot Kids, a candy-colored version of Amazon’s home assistant with Alexa voice technology, violates the Children’s Online Privacy Protection Act (COPPA) in many ways. Amazon collects sensitive personal information from kids, including their voice recordings and data gleaned from kids’ viewing, reading, listening, and purchasing habits, and retains it indefinitely. Most shockingly, Amazon retains children’s data even after parents believe they have deleted it. CCFC and IPR have produced a video demonstrating how Amazon ignores the request to delete or “forget” a child’s information it has remembered. The advocates’ FTC complaint also says Amazon offers parents a maze of multiple privacy policies, which violate COPPA because they are confusing, misleading, and even contradictory.

    “Amazon markets Echo Dot Kids as a device to educate and entertain kids, but the real purpose is to amass a treasure trove of sensitive data that it refuses to relinquish even when directed to by parents,” said Josh Golin, CCFC’s Executive Director. “COPPA makes clear that parents are the ones with the final say about what happens to their children’s data, not Jeff Bezos.
    The FTC must hold Amazon accountable for blatantly violating children’s privacy law and putting kids at risk.”

    Amazon Echo Dot Kids Edition comes with a one-year subscription to FreeTime Unlimited, which connects children with entertainment like movies, music, audiobooks, and video games. The always-on listening device is often placed in the child’s bedroom, and kids are encouraged to interact with it as if Alexa were a close friend. Kids can download “skills,” similar to apps, to add functionality. In clear violation of COPPA, Amazon disavows responsibility for the data collection practices of Alexa skills for kids and tells parents to check the skill developers’ privacy policies. To make matters worse, 85% of skills for kids have no privacy policy posted. Amazon does not verify that the person consenting to data collection is an adult, let alone the child’s parent. The advocates also say the Echo Dot has a “playdate problem”: a child whose parents have not consented will have their conversations recorded and sensitive information retained when visiting a friend who owns the device.

    “We spent months analyzing the Echo Dot Kids and the device’s myriad privacy policies, and we still don’t have a clear picture of what data is collected by Amazon and who has access to it,” said Angela Campbell, a CCFC Board Member and Director of IPR’s Communications and Technology Clinic at Georgetown Law, which researched and drafted the complaint. “If privacy experts can’t make heads or tails of Amazon’s privacy policy labyrinth, how can a parent meaningfully consent to the collection of their children’s data?”

    “By providing misleading tools that don’t actually allow parents to delete their children’s data, Amazon has made a farce of parents’ difficult task of protecting their children’s privacy,” said Lindsey Barrett, Staff Attorney and Teaching Fellow at IPR.
    “COPPA requires companies to allow parents to delete their children’s personal information, and Amazon is breaking the law, not to mention breaking parents’ trust.”

    “It’s shameful that Amazon is ensnaring children and their valuable data in its race to market dominance,” said Jeff Chester of CDD. “COPPA was enacted to empower parents to have control over their children’s data, but at every turn Echo Dot Kids thwarts parents who want to limit what Amazon knows about their child. The FTC must hold Amazon accountable to make clear that voice-activated, always-on devices must respect children’s privacy.”

    Organizations signing today’s complaint were the Campaign for a Commercial-Free Childhood, Center for Digital Democracy, Berkeley Media Studies Group, Color of Change, Consumer Action, Consumer Federation of America, Defending the Early Years, Electronic Privacy Information Center, New Dream, Open MIC (Open Media and Information Companies Initiative), Parents Across America, Parent Coalition for Student Privacy, Parents Television Council, Peace Educators Allied for Children Everywhere (P.E.A.C.E.), Public Citizen, Raffi Foundation for Child Honouring, Story of Stuff, TRUCE (Teachers Resisting Unhealthy Childhood Entertainment), and U.S. PIRG.

    In May 2018, CCFC and CDD issued a warning, supported by experts like Drs. Sherry Turkle, Jenny Radesky, and Dipesh Navsaria, that parents should steer clear of Echo Dot Kids. The advocates cautioned that Echo Dot endangers children’s privacy and, by encouraging young children to spend more time with and form “faux relationships” with digital devices, threatens their healthy development. Added Josh Golin: “Echo Dot Kids interferes with children’s healthy development and relationships and threatens their privacy.
    Parents should resist Amazon’s efforts to indoctrinate children into a culture of surveillance, and say ‘no’ to Echo Dot Kids.”

    The investigation by CCFC and IPR was made possible by a generous grant from the Rose Foundation for Communities and the Environment.
  • Press Release

    Press Briefing: Senators, Coalition Call for Sweeping National Digital Privacy Legislation

    Experts to Demand Federal Action to Combat Growing Digital Consumer and Civil Rights Threats

    For Immediate Release: Feb. 26, 2019

    Contact: Mike Stankiewicz, mstankiewicz@citizen.org, (202) 588-7779; Jeff Chester, jeff@democraticmedia.org, (202) 494-7100

    MEDIA ADVISORY

    WHAT: Staff briefing, open to the press, sponsored by U.S. Sens. Ed Markey (D-Mass.) and Tom Udall (D-N.M.), at which consumer protection and civil rights advocates will explain the urgent need for sweeping federal digital privacy legislation with a new approach. It includes baseline legislation that doesn’t pre-empt state privacy laws, the creation of a new federal data protection agency, and safeguards against data practices that lead to unjust, unfair, manipulative, or discriminatory outcomes. The briefing comes as Americans increasingly demand protection of their digital privacy in light of unethical privacy sharing and harvesting by tech giants. Without any constraints, Big Tech companies and their partners are collecting sensitive information on American citizens, ranging from financial and health information to data that tracks our Internet activity across all our devices, our location throughout the day, and much more. Both the federal government and the states must be empowered to protect the public from this ever-growing online threat to their privacy, welfare, and civil rights. Members of the Privacy and Digital Rights for All coalition, which has announced a Framework for Comprehensive Privacy Protection and Digital Rights in the U.S., also will speak about the need for enduring privacy innovation and limiting government access to personal data.

    WHEN: 2 p.m. EST, Mon., March 4

    WHO: Ed Mierzwinski, consumer program director, U.S.
    PIRG; Jeffrey Chester, executive director, Center for Digital Democracy; Brandi Collins-Dexter, senior campaign director, Color of Change; Josh Golin, executive director, Campaign for a Commercial-Free Childhood; Burcu Kilic, research director, Public Citizen’s Access to Medicines program; Christine Bannan, administrative law and policy fellow, Electronic Privacy Information Center (EPIC)

    WHERE: Hart Senate Office Building, Room 216, 120 Constitution Ave. NE, Washington, DC 20002

    ###
  • Privacy Rights Are Civil Rights

    Over 40 Civil Rights, Civil Liberties, and Consumer Groups Call on Congress to Address Data-Driven Discrimination

  • Curbing Companies’ Bad Behavior Will Require Stronger Data Privacy Laws and a New Federal Data Privacy Agency

    Federal Privacy Laws Are Antiquated and Need Updating; New Data Privacy Legislation Must Include Civil Rights Protections and Enhanced Punishments for Violations

    Jan. 17, 2019

    Contact: Don Owens, dowens@citizen.org, (202) 588-7767; Jeffrey Chester, jeff@democraticmedia.org, (202) 494-7100

    WASHINGTON, D.C. – U.S. data privacy laws must be overhauled without pre-empting state laws, and a new data privacy agency should be created to confront 21st century threats and address emerging concerns for digital consumers, consumer and privacy organizations said today as they released a framework for comprehensive privacy protection and digital rights for members of Congress.

    “Big Tech is coming to Washington looking for a deal that affords inadequate protections for privacy and other consumer rights but pre-empts states from defending their citizens against the tech companies’ surveillance and misuse of data,” said Robert Weissman, president of Public Citizen. “But here’s the bad news for the tech giants: That deal isn’t going to fly. Instead, the American people are demanding – and intend to win – meaningful federal restraints on tech company abuses of power that also ensure the right of states to craft their own consumer protections.”

    From the Equifax data breach to foreign election interference and targeted digital ads based on race, health, and income, it’s clear that U.S. consumers face a crisis of confidence born from federal data privacy laws that are decades out of date and a lack of basic protections afforded them by digital conglomerates. These corporations, many of which dominate online spaces, are far more interested in monetizing every keystroke or click than in protecting consumers from data breaches. For that reason, federal and state authorities must act, the groups maintain.
    The groups will push for federal legislation based on a familiar privacy framework, such as the original U.S. Code of Fair Information Practices and the widely followed Organization for Economic Cooperation and Development Privacy Guidelines. The legislation should establish obligations for companies that collect personal data and rights for individuals, including:

    - Establish limits on the collection, use, and disclosure of sensitive personal data;
    - Establish enhanced limits on the collection, use, and disclosure of data of children and teens;
    - Regulate consumer scoring and other business practices that diminish people’s physical health, education, financial, and work prospects; and
    - Prohibit or prevent manipulative marketing practices.

    The groups are calling for federal baseline legislation and oppose the pre-emption of state digital privacy laws. States have long acted as the “laboratories of democracy” and must continue to have the power to enact appropriate protections for their citizens as technology develops, the groups say.

    “Black communities should not have to choose between accessing the Internet and the right to control our data,” said Brandi Collins-Dexter, senior campaign director at Color Of Change. “We need privacy legislation that holds powerful corporations accountable for their impacts. Burdening our communities with the need to discern how complex terms of service and algorithms could harm us will only serve to reinforce discriminatory corporate practices. The privacy protection and digital rights principles released today create an important baseline for proactive data protections for our communities.”

    “For years now, Big Tech has used our sensitive information as a cash cow,” said Josh Golin, executive director of Campaign for a Commercial-Free Childhood. “Each innovation – whether it’s talking home assistants, new social media tools or software for schools – is designed to spy on families and children.
    We desperately need both 21st century legislation and a new federal agency with broad enforcement powers to ensure that children have a chance to grow up without their every move, keystroke, swipe, and utterance tracked and monetized.”

    The United States is woefully behind other nations in providing these modern data protections for its consumers, instead relying solely on the Federal Trade Commission (FTC) to safeguard consumers and promote competition. But corporations understand that the FTC lacks rulemaking authority and that the agency often fails to enforce rules it has established.

    “The FTC has failed to act,” said Caitriona Fitzgerald, policy director at the Electronic Privacy Information Center. “The U.S. needs a dedicated data protection agency.”

    By contrast, many democratic nations, including Canada, Mexico, the U.K., Ireland, and Japan, already have dedicated data protection agencies with independent authority and enforcement capabilities.

    Groups that have signed on to the framework include Berkeley Media Studies Group, Campaign for a Commercial-Free Childhood, Center for Digital Democracy, Center for Media Justice, Color of Change, Consumer Action, Consumer Federation of America, Defending Rights & Dissent, Electronic Privacy Information Center, Media Alliance, Parent Coalition for Student Privacy, Privacy Rights Clearinghouse, Privacy Times, Public Citizen, Stop Online Violence Against Women, and U.S. PIRG.

    Read the groups’ proposal below.

    ###
  • Contact: David Monahan, CCFC (david@commercialfreechildhood.org; 617-896-9397); Jeff Chester, CDD (jeff@democraticmedia.org; 202-494-7100)

    Apps which Google rates as safe for kids violate their privacy and expose them to other harms

    Advocates, lawmakers call on FTC to address how Google’s Play Store promotes children’s games which violate kids’ privacy law, feature inappropriate content, and lure kids to watch ads and make in-app purchases

    BOSTON, MA and WASHINGTON, DC — December 19, 2018 — Today, a coalition of 22 consumer and public health advocacy groups led by Campaign for a Commercial-Free Childhood (CCFC) and Center for Digital Democracy (CDD) called on the Federal Trade Commission (“FTC”) to investigate and sanction Google for the deceptive marketing of apps for young children. Google represents that the apps in the “Family” section of the Google Play Store are safe for children, but the apps often violate federal children’s privacy law, expose children to inappropriate content, and disregard Google’s own policies by manipulating children into watching ads and making in-app purchases.

    The Play Store is Google’s one-stop shop for Android apps, games, and entertainment. Apps in the “Family” section are promoted with a green star and, in some cases, a recommended age, like “Ages 5 & Under” or “Ages 6-8.” Google is aware from several recent academic studies that many of the apps in this section are a threat to children’s privacy and wellbeing, yet it continues to promote them with these kid-friendly ratings.

    “The business model for the Play Store’s Family section benefits advertisers, developers, and Google at the expense of children and parents,” said CCFC’s Executive Director Josh Golin. “Google puts its seal of approval on apps that break the law, manipulate kids into watching ads and making purchases, and feature content like kids cleaning their eyes with sharp objects.
    Given Google’s long history of targeting children with unfair marketing and inappropriate content, including on YouTube, it is imperative that the FTC take swift action.”

    Lawmakers echoed the call for FTC action. “We’re repeatedly confronted with examples of tech companies that are just not doing enough to protect consumer privacy – and I’m particularly concerned about what this failure means for our children,” said U.S. Senator Tom Udall (D-NM) regarding today’s action by the advocates. “When real-world products are dangerous or violate the law, we expect retailers to pull them off the shelves. Google’s refusal to take responsibility for privacy issues in their Play Store allows for app developers to violate COPPA, all while Google cashes in on our children’s activity. It is past time for the Federal Trade Commission to crack down to protect children’s privacy.”

    “Google’s dominance in the app market cannot come at the expense of its clear legal obligations to protect kids that use its products,” said David N. Cicilline (RI-01), the top Democrat on the House Antitrust Subcommittee, who raised his concerns about this issue when the Chairman of the FTC testified last week. “I am pleased that this coalition of consumer and children’s advocacy groups are urging the FTC to scrutinize whether Google is improperly tracking children and selling their data.”

    Google policies require apps in the Kids and Family section of its Play Store to be compliant with the Children’s Online Privacy Protection Act (COPPA). But Google doesn’t verify compliance, so Play Store apps for children consistently violate COPPA. Many apps send children’s data unencrypted, while others access children’s locations or transmit persistent identifiers without notice or verifiable parental consent.
    Google has known about these COPPA violations since at least July 2017, when they were publicly reported by Serge Egelman, a researcher at the University of California, Berkeley Center for Long-Term Cybersecurity. Yet Google continues to promote such apps as COPPA-compliant.

    “Our research revealed a surprising number of privacy violations on Android apps for children, including sharing geolocation with third parties,” said Egelman. “Given Google’s assertion that Designed for Families apps must be COPPA compliant, it’s disappointing these violations still abound, even after Google was alerted to the scale of the problem.”

    Google’s policies also require apps for children to avoid “overly aggressive” commercial tactics, but the advocates’ FTC complaint reveals that many popular apps feature ads that interrupt gameplay, are difficult to click out of, or must be watched in order to advance in a game. In addition, games represented to parents as free often pressure children to make in-app purchases, sometimes going so far as to show characters crying if kids don’t buy locked items. The complaint also offers examples of multiple children’s apps that serve ads for alcohol and gambling, despite those ads being barred by Google’s Ad Policy.

    Other apps designated as appropriate for children are clearly not. Some contain graphic, sexualized images, like TutoTOONS Sweet Baby Girl Daycare 4 – Babysitting Fun, which has over 10 million downloads. Others model actively harmful behavior, like TabTale’s Crazy Eye Clinic, which teaches children to clean their eyes with a sharp instrument and has over one million downloads.

    “Parents who download apps recommended for ages 8 and under don’t expect their child to see ads which promote gambling, alcoholic beverages, or violent video games,” said Angela Campbell, Director of the Communications and Technology Clinic at Georgetown Law, which drafted the complaint.
“But Google falsely claims that apps listed in the Family section only have ads which are appropriate for children. It’s important for the FTC to act quickly to protect children, especially in light of Google's dominance in the app market."The coalition has previously asked the FTC to investigate developers of children’s apps, citing research from the University of Michigan that revealed manipulative advertising is rampant in apps popular with preschoolers. Today’s complaint focuses on Google, whose misrepresentation and promotion of those apps has led to hundreds of millions of downloads.“Google (Alphabet, Inc.) has long engaged in unethical and harmful business practices, especially when it comes to children,” explained Jeff Chester, executive director of the Center for Digital Democracy (CDD). "And the Federal Trade Commission has for too long ignored this problem, placing both children and their parents at risk over their loss of privacy, and exposing them to a powerful and manipulative marketing apparatus. As one of the world’s leading providers of content for kids online, Google continues to put the enormous profits they make from kids ahead of any concern for their welfare," Chester noted. 
“It’s time federal and state regulators acted to control Google’s 'wild west' Play Store App activities.”Joining the Campaign for a Commercial-Free Childhood and the Center for Digital Democracy in signing today’s complaint to the FTC are Badass Teachers Association, Berkeley Media Studies Group, Color of Change, Consumer Action, Consumer Federation of America, Consumer Watchdog, Defending the Early Years, Electronic Privacy Information Center, Media Education Foundation, New Dream, Open MIC (Open Media and Information Companies Initiative), Parents Across America, Parent Coalition for Student Privacy, Parents Television Council, Peace Educators Allied for Children Everywhere (P.E.A.C.E.), Privacy Rights Clearinghouse, Public Citizen, the Story of Stuff, TRUCE (Teachers Resisting Unhealthy Childhood Entertainment), and USPIRG.In addition to filing an FTC complaint, CCFC has launched a petition (link is external) asking Google to adopt the Kids’ Safer App Store Standards, which would bar advertising in apps for kids under 5, limit ads in apps for kids 6 -12, bar in-app purchases, and require apps to be reviewed by a human before being included in the Kids and Family section of the Play Store.###
  • Kathryn Montgomery, PhD

    Research Director and Senior Strategist for the Center for Digital Democracy

    Kathryn Montgomery, PhD, is Research Director and Senior Strategist for the Center for Digital Democracy (CDD). In the early 90s, she and Jeff Chester co-founded the Center for Media Education (CME), where she served as President until 2003, and which was the predecessor organization to CDD. CME spearheaded the national campaign that led to passage of the 1998 Children's Online Privacy Protection Act (COPPA), the first federal legislation to protect children's privacy on the Internet. From 2003 until 2018, Dr. Montgomery was Professor of Communication at American University in Washington, D.C., where she founded and directed the 3-year interdisciplinary PhD program in Communication. She served as a consultant to CDD for a number of years and joined the full-time staff in July 2018. Throughout her career, Dr. Montgomery has written and published extensively about the role of media in society, addressing a variety of topics, including the politics of entertainment television; youth engagement with digital media; and contemporary advertising and marketing practices. Montgomery's research, writing, and testimony have helped frame the national public policy debate on a range of critical media issues. In addition to numerous journal articles, chapters, and reports, she is the author of two books: Target: Prime Time – Advocacy Groups and the Struggle over Entertainment Television (Oxford University Press, 1989) and Generation Digital: Politics, Commerce, and Childhood in the Age of the Internet (MIT Press, 2007). Montgomery’s current research focuses on the major technology, economic, and policy trends shaping the future of digital media in the Big Data era. She earned her doctorate in Film and Television from the University of California, Los Angeles.
  • The law that lets Europeans take back their data from big tech companies, November 11, 2018, CBS 60 Minutes.
  • CDD’s Executive Director, Jeff Chester, on CBS 60 Minutes

    The law that lets Europeans take back their data from big tech companies

  • 34 Civil Rights, Consumer, and Privacy Organizations Unite to Release Principles for Privacy Legislation

    Contact: Katharina Kopp (kkopp@democraticmedia.org); 202-836-4621

    Washington, DC ----- Today, 34 civil rights, consumer, and privacy organizations join in releasing public interest principles for privacy legislation, because the public needs and deserves strong and comprehensive federal legislation to protect their privacy and afford meaningful redress. Irresponsible data practices lead to a broad range of harms, including discrimination in employment, housing, healthcare, and advertising. They also lead to data breaches and loss of individuals’ control over personal information. Existing enforcement mechanisms fail to hold data processors accountable and provide little-to-no relief for privacy violations.

    The privacy principles outline four concepts that any meaningful data protection legislation should incorporate at a minimum:

    1. Privacy protections must be strong, meaningful, and comprehensive.
    2. Data practices must protect civil rights, prevent unlawful discrimination, and advance equal opportunity.
    3. Governments at all levels should play a role in protecting and enforcing privacy rights.
    4. Legislation should provide redress for privacy violations.

    These public interest privacy principles include a framework providing guidelines for policymakers considering how to protect the privacy of all Americans effectively while also offering meaningful redress.

    They follow three days of Federal Trade Commission hearings about big data, competition, and privacy, as well as the comment deadline on “Developing the Administration’s Approach to Privacy,” a request for comment from the National Telecommunications and Information Administration as the agency works to develop privacy policy recommendations for the Trump Administration, and ongoing work at the National Institute of Standards and Technology to develop a privacy risk framework. The groups urge members of Congress to pass privacy legislation that ensures fairness, prevents discrimination, advances equal opportunity, protects free expression, and facilitates trust between the public and companies that collect their personal data.

    New America’s Open Technology Institute, Public Knowledge, Access Humboldt, Access Now, Berkeley Media Studies Group, Campaign for a Commercial-Free Childhood, Center for Democracy & Technology, Center for Digital Democracy, Center for Media Justice, Center on Privacy & Technology at Georgetown Law, Color of Change, Common Cause, Common Sense Kids Action, Consumer Action, Consumer Federation of America, Consumers Union, Customer Commons, Demand Progress, Free Press Action Fund, Human Rights Watch, Lawyers’ Committee for Civil Rights Under Law, Media Alliance, Media Mobilizing Project, National Association of Consumer Advocates, National Consumer Law Center, National Consumers League, National Digital Inclusion Alliance, National Hispanic Media Coalition, Oakland Privacy, Open MIC (Open Media and Information Companies Initiative), Privacy Rights Clearinghouse, Public Citizen, U.S. PIRG, and United Church of Christ, OC Inc. signed the principles. Additional local and national privacy advocates are encouraged to sign on.
    The following can be attributed to Eric Null, Senior Policy Counsel at New America’s Open Technology Institute: “For decades, privacy regulation has favored the company over the user -- companies set their own rules and users are left to fend for themselves. Worse, companies have even discriminated based on protected classes through algorithmic decision-making. Comprehensive privacy legislation must disrupt this status quo. Legislation that follows the public interest privacy principles will better protect users and give users more control over their data.”

    The following can be attributed to Allie Bohm, Policy Counsel at Public Knowledge: “It is imperative that any comprehensive privacy legislation reflect the concerns, interests, and priorities of actual human beings. Today, consumer protection, privacy, and civil rights groups come together to articulate those interests, priorities, and concerns. Importantly, these principles address the many harms people can experience from privacy violations and misuse of personal data, including enabling unfair price discrimination; limiting awareness of opportunities; and contributing to employment, housing, health care, and other forms of discrimination.”

    The following can be attributed to Amie Stepanovich, U.S. Policy Manager at Access Now: “From Europe to India to Brazil, data privacy legislation is becoming the norm around the world, and people in the United States are getting left behind. It is long past time that our legislators acted to protect people across the country from opaque data practices that can result in its misuse and abuse, and any acceptable package must start with these principles.”

    The following can be attributed to Josh Golin, Executive Director at Campaign for a Commercial-Free Childhood: “What big tech offers for ‘free’ actually comes at a high cost -- our privacy. Worst of all is how vulnerable kids are tracked online and then targeted with manipulative marketing. This has to stop. We need laws that will empower parents to protect their children’s privacy.”

    The following can be attributed to Joseph Jerome, Policy Counsel at Center for Democracy & Technology: “Debates about national privacy laws focus on how companies should implement Fair Information Practices. The operative word is ‘fair.’ When it comes to how companies collect, use, and share our data, too many business practices are simply unfair. Federal law must go beyond giving consumers more notices and choices about their privacy, and we think it is time for legislators in Congress to flip the privacy presumption and declare some data practices unfair.”

    The following can be attributed to Katharina Kopp, Director of Policy at Center for Digital Democracy: “To this day, U.S. citizens have had to live without effective privacy safeguards. Commercial data practices have grown ever more intrusive, ubiquitous and harmful. It is high time to provide Americans with effective safeguards against commercial surveillance. Any legislation must not only effectively protect individual privacy, it must advance equitable, just and fair data uses, and must protect the most vulnerable among us, including children. In other words, they must bring about real changes in corporate practices. We have waited long enough; the time is now.”

    The following can be attributed to Laura Moy, Executive Director at Center on Privacy & Technology at Georgetown Law: “Americans want their data to be respected, protected, and used in ways that are consistent with their expectations. Any new legislation governing commercial data practices must advance these goals, and also protect us from data-driven activities that are harmful to society. We need privacy to protect us from uses of data that exclude communities from important opportunities, enable faceless brokers to secretly build ever-more-detailed profiles of us, and amplify political polarization and hate speech.”

    The following can be attributed to Yosef Getachew, Director of Media and Democracy Program at Common Cause: “An overwhelming majority of Americans believe they have lost control over how their personal information is collected and used across the internet ecosystem. Numerous data breaches and abuses in data sharing practices, which have jeopardized the personal information of millions of Americans, have validated these fears. Our current privacy framework no longer works, and the lack of meaningful privacy protections poses a serious threat to our democracy. Companies can easily manipulate data to politically influence voters or engage in discriminatory practices. These principles should serve as a baseline for any comprehensive privacy legislation that guarantees all Americans control over their data.”

    The following can be attributed to James P. Steyer, CEO and Founder at Common Sense: “Any federal legislation should provide for strong baseline protections, particularly for the most surveilled and vulnerable generation ever -- our kids. These principles reflect that as privacy, consumer, and civil rights advocates, we only want federal legislation that will move the ball forward in terms of protecting kids, families, and all of us.”

    The following can be attributed to Linda Sherry, director of national priorities at Consumer Action: “Our country has floundered far too long without strong federal regulations governing data collection, retention, use and sharing. These privacy principles, developed by a coalition of leading consumer, civil rights and privacy organizations, are offered as a framework to guide Congress in protecting consumers from the many harms that can befall them when they are given little or no choice in safeguarding their data, and companies have few, if any, restrictions on how they use that information.”

    The following can be attributed to Susan Grant, Director of Consumer Protection and Privacy at Consumer Federation of America: “We need to move forward on data protection in the United States, from a default that allows companies to do what they want with Americans’ personal information as long as they don’t lie about it, to one in which their business practices are aligned with respect for privacy rights and the responsibility to keep people’s data secure.”

    The following can be attributed to Katie McInnis, Policy Counsel for Consumers Union, the advocacy division of Consumer Reports: “As new data breaches are announced at an alarming rate, now is the time to protect consumers with strong privacy laws. We need laws that do more than just address broad transparency and access rights. Consumers deserve practical controls and robust enforcement to ensure all of their personal information is sufficiently protected.”

    The following can be attributed to Gaurav Laroia, Policy Counsel at Free Press Action Fund: “The public has lost faith in technology companies' interest and ability to police their own privacy and data usage practices. It’s past time for Congress to pass a strong law that empowers people to make meaningful choices about their data, protects them from discrimination and undue manipulation, and holds companies accountable for those practices.”

    The following can be attributed to David Brody, Counsel & Senior Fellow for Privacy and Technology at the Lawyers’ Committee for Civil Rights Under Law: “Protecting the right to privacy is essential to protecting civil rights and advancing racial equity in a modern, Internet-focused society. Privacy rights are civil rights. Invasive profiling of online activity enables discrimination in employment, housing, credit, and education; helps bad actors target voter suppression and misinformation; assists biased law enforcement surveillance; chills the free association of advocates; and creates connections between hateful extremists exacerbating racial tensions.”

    The following can be attributed to Tracy Rosenberg, Executive Director at Media Alliance: “After a flood of data breaches and privacy violations, Americans overwhelmingly support meaningful protections for their personal information that are not written by, for and in the interests of the data collection industry. These principles start to define what that looks like.”

    The following can be attributed to Francella Ochillo, Vice President of Policy & General Counsel at National Hispanic Media Coalition: “For years, tech platforms have been allowed to monetize personal data without oversight or consequence, losing sight of the fact that personal data belongs to the user. Meanwhile, Latinos and other marginalized communities continue to be exposed to the greatest risk of harm and have the fewest opportunities for redress. The National Hispanic Media Coalition joins the chorus of advocates calling for a comprehensive regulatory framework that protects a user’s right to privacy and access as well as the right to be forgotten.”

    The following can be attributed to JP Massar, Organizer at Oakland Privacy: “We must not only watch the watchers and regulate the sellers of our information. We must begin to unravel the information panopticon that has already formed. This is a start.”

    The following can be attributed to Robert Weissman, President at Public Citizen: “Internet privacy means control. Either we get to control our own lives as lived through the Internet, or the Big Tech companies do. That's what is at stake in whether the U.S. adopts real privacy protections.”

    The following can be attributed to Ed Mierzwinski, Senior Director for Consumer Programs at U.S. PIRG: “The big banks and the big tech companies all say that they want a federal privacy law, but the law that their phalanx of lobbyists seeks isn’t designed to protect consumers. Instead, it’s designed to protect their business models that treat consumers as commodities for sale; it fails to guarantee that their secret sauce big data algorithms don’t discriminate; it eliminates stronger and innovative state laws forever and it denies consumers any real, enforceable rights when harmed. We can’t allow that.”

    You may view the privacy principles for more information.
  • CDD submits comments to the National Telecommunications and Information Administration on “Developing the Administration’s Approach to Consumer Privacy.” CDD argues that:

    - A focus on “outcomes” is good, but the outcomes as defined by NTIA are too narrow and must include a broader discussion of privacy harms. They must include:
      + identification harms (risks of identity theft, re-identification, and sensitive inferences);
      + discrimination harms (inequities in the distribution of benefits and risks of exclusion); and
      + exploitation harms (personal data as commodity and risks to the vulnerable).
    - Legislation must not only achieve a reduction in privacy harms but must also ensure that “privacy benefits are fairly allocated.” Policy remedies must consider and be effective in addressing the inequities in the distribution of privacy benefits and harms.
    - NTIA’s list of desired outcomes (transparency, control, reasonable minimization, security, access and corrections, risk management, and accountability) is a restatement of the all-too-familiar privacy self-management paradigm. Privacy self-management alone is not enough as a policy solution.
    - Privacy is not an individual, commodified good that can and should be traded for other goods.
    - Legislation should focus less on data and more on the outputs of data processing. Instead of narrowing the scope of legislation to “personal data,” legislation must focus on inferences, decisions, and other data uses.
    - A risk-management approach must define risks broadly. NTIA should develop methodologies to assess the human rights, social, economic, and ethical impacts of the use of algorithms in modern data processing.
  • Press Release

    Advocates ask FTC to investigate apps that manipulate kids

    Popular games for kids 5 and under lure them to watch ads and make in-app purchases

    A coalition of 22 consumer and public health advocacy groups called on the Federal Trade Commission (“FTC”) to investigate the preschool app market. The advocates’ letter urges the FTC to hold app makers accountable for unfair and deceptive practices, including falsely marketing apps that require in-app purchases as “free” and manipulating children to watch ads and make purchases. The complaint was filed in conjunction with a major new study that details a host of concerning practices in apps targeted to young children. The study (paywall), “Advertising in Young Children’s Apps,” was led by researchers at University of Michigan C.S. Mott Children’s Hospital, and examined the type and content of advertising in 135 children’s apps.
  • Blog

    Center for Digital Democracy’s Principles for U.S. Privacy Legislation

    PROTECT PRIVACY RIGHTS, ADVANCE FAIR AND EQUITABLE OUTCOMES, LIMIT CORPORATE PRACTICES AND ENSURE GOVERNMENT LEADERSHIP AND ENFORCEMENT

    The Center for Digital Democracy provides the following recommendations for comprehensive baseline federal privacy legislation. We are building on more than two decades of expertise addressing digital marketplace developments, including work leading to the enactment of the 1998 Children’s Online Privacy Protection Act--the only federal online privacy law in the United States. Our recommendations are also informed by our long-standing trans-Atlantic work with consumer and privacy advocates in Europe, as well as by the General Data Protection Regulation.

    We are alarmed by the increasingly intrusive and pervasive nature of commercial surveillance, which has the effect of controlling consumers’ and citizens’ behaviors, thoughts, and attitudes, and which sorts and tracks us as “winners” and “losers.” Today’s commercial practices have grown over the past decades unencumbered by regulatory constraints, and increasingly threaten the American ideals of self-determination, fairness, justice, and equal opportunity. It is now time to address these developments: to grant basic rights to individuals and groups regarding data about them and how those data are used; to put limits on certain commercial data practices; and to strengthen our government to step in and protect our individual and common interests vis-à-vis powerful commercial entities. We call on legislators to consider the following principles:

    1. Privacy protections should be broad: Set the scope of baseline legislation broadly and do not preempt stronger legislation. Pervasive commercial surveillance practices know no limits, so legislation aiming to curtail negative practices should:
       - address the full digital data life-cycle (collection, use, sharing, storage, on- and off-line) and cover all private entities’ public and private data processing, including nonprofits;
       - include all data derived from individuals, including personal information, inferred information, as well as aggregate and de-identified data;
       - apply all Fair Information Practice Principles (FIPPs) as a comprehensive baseline, including the principles of collection and use limitation, purpose specification, access and correction rights, accountability, data quality, and confidentiality/security, and require fairness in all data practices; and
       - allow existing stronger federal legislation to prevail and let states continue to advance innovative legislation.

    2. Individual privacy should be safeguarded: Give individuals rights to control the information about them. Building on FIPPs, individuals ought to have basic rights, including the right to:
       - transparency and explanation;
       - access;
       - object and restrict;
       - use privacy-enhancing technologies, including encryption; and
       - redress and compensation.

    3. Equitable, fair and just uses of data should be advanced: Place limits on certain data uses and safeguard equitable, fair and just outcomes. Relying on “privacy self-management”--with the burden of responsibility placed solely on individuals to advance and protect their autonomy and self-determination--is not sufficient. Without one’s knowledge or participation, classifying and predictive data analytics may still draw inferences about individuals, resulting in injurious privacy violations--even if those harms are not immediately apparent. Importantly, these covert practices may result in pernicious forms of profiling and discrimination, harmful not just to the individual, but to groups and communities, particularly those with already diminished life chances, and to society at large. Certain data practices may also unfairly influence the behavior of online users, such as children. Legislation should therefore address the impact of data practices and the distribution of harm by:
       - placing limits on collecting, using, and sharing sensitive personal information (such as data about ethnic or racial origin, political opinions or union membership, data concerning health, sex life or sexual orientation, genetic data, or biometric data) or data that reveals sensitive personal information, especially when using these data for profiling;
       - otherwise limiting the use of consumer scoring and other data practices, including in advertising, that have the effect of disproportionately and negatively affecting people’s life chances, related to, for example, housing, employment, finance, education, health, and healthcare;
       - placing limits on manipulative marketing practices; and
       - requiring particular safeguards when processing data relating to children and teens, especially with regard to marketing and profiling.

    4. Privacy legislation should bring about real changes in corporate practices: Set limits and legal obligations for those managing data and require accountability. Currently companies face very few limitations regarding their data practices. The presumption of “anything goes” has to end. Legislation should ensure that entities collecting, using, or sharing data:
       - can only do so for specific and appropriate purposes defined in advance, subject to rules established by law and informed by data subjects’ freely given, specific, informed, and unambiguous consent; for the execution of a contract; or as required by law; and without “pay-for-privacy” provisions or “take-it-or-leave-it” terms of service;
       - notify users in a timely fashion of data transfers and data breaches, and make consumers whole after a privacy violation or data breach;
       - cannot limit consumers’ right to redress with arbitration clauses;
       - are transparent and accountable, and adopt technical and organizational measures, including providing for transparency (especially algorithmic transparency); conducting impact assessments for high-risk processing, considering the impact on individuals, groups, communities, and society at large; implementing Privacy by Design and by Default; assigning resources and staff, including a Data Protection Officer; implementing appropriate oversight over third-party service providers/data processors; and conducting regular audits; and
       - are only allowed to transfer data to other countries or international organizations with essentially equivalent data protections in place.

    5. Privacy protection should be consequential and aim to level the playing field: Give government at all levels significant and meaningful enforcement authority to protect privacy interests, and give individuals legal remedies. Without independent and flexible rulemaking data-protection authority, the Federal Trade Commission has been an ineffective agency for data protection. An agency with expertise and resources is needed to enforce company obligations. Ongoing research is required to anticipate and prepare for additionally warranted interventions to ensure a fair marketplace and a public sphere that strengthens our democratic institutions. Legislation should provide:
       - for a strong, dedicated privacy agency with adequate resources, rulemaking authority, and the ability to sanction non-compliance with meaningful penalties;
       - for independent authority for State Attorneys General;
       - for statutory damages and a private right of action; and
       - for the federal agency to establish an office of technology impact assessment that would consider the privacy, ethical, social, political, and economic impacts of high-risk data processing and other technologies, and would oversee and advise companies on their impact-assessment obligations.
  • Media Advisory – Save the Date

    FOR IMMEDIATE RELEASE October 3, 2018
    Contact: Jeff Chester, jeff@democraticmedia.org

    COPPA--Protecting Children’s Privacy Online for 20 Years
    Sen. Ed Markey, Advocates and Experts Celebrate COPPA as they focus on future challenges posed by the digital marketplace
    October 17th, Capitol Hill, Open to Public

    Washington, D.C. -- To mark the 20th anniversary of the 1998 Children’s Online Privacy Protection Act (COPPA), Senator Edward J. Markey (D-MA)--its principal congressional sponsor--will be joined by key representatives from the consumer, child advocacy, and privacy groups involved in implementing the law, at a public forum on Wednesday, October 17 from 12:30-3:30 pm in Room 385 of the Senate Russell Office Building (SR-385). Senator Markey will deliver a keynote speech followed by two panels featuring representatives from Electronic Privacy Information Center, Campaign for a Commercial-Free Childhood, Common Sense Media, Center for Digital Democracy, Color of Change, and Institute for Public Representation (Georgetown University Law Center), among others. Prof. Kathryn C. Montgomery, who spearheaded the public campaign that led to COPPA, will moderate.

    “COPPA is the nation’s constitution for children’s communication. For 20 years it has shielded our nation’s children from invasive practices and encroaching actors on the internet,” Sen. Markey noted. “It puts children and families in control and holds violators accountable when they compromise kids’ privacy. As we celebrate the 20th anniversary of COPPA, we must look to the future.”

    In addition to discussing COPPA’s impact, speakers will explore the expanding interactive and data-driven world young people face today, which is being transformed by a host of powerful technologies, such as artificial intelligence, virtual reality, and internet-connected toys.

    “In 2018, children grow up in an increasingly connected and digital world with ever-emerging threats to their sensitive personal information,” explained Sen. Markey. “Two decades after the passage of this bedrock law, it is time to redouble our efforts and safeguard the precious privacy of our youngest Americans.”

    The event is free and open to the public, but seating is limited. Lunch will be served. Please RSVP to jeff@democraticmedia.org.