Publications
Blog
Center for Digital Democracy’s Principles for U.S. Privacy Legislation
PROTECT PRIVACY RIGHTS, ADVANCE FAIR AND EQUITABLE OUTCOMES, LIMIT CORPORATE PRACTICES AND ENSURE GOVERNMENT LEADERSHIP AND ENFORCEMENT
The Center for Digital Democracy provides the following recommendations for comprehensive baseline federal privacy legislation. We are building on more than two decades of expertise addressing digital marketplace developments, including work leading to the enactment of the 1998 Children’s Online Privacy Protection Act, the only federal online privacy law in the United States. Our recommendations are also informed by our long-standing trans-Atlantic work with consumer and privacy advocates in Europe, as well as by the General Data Protection Regulation.

We are alarmed by the increasingly intrusive and pervasive nature of commercial surveillance, which has the effect of controlling consumers’ and citizens’ behaviors, thoughts, and attitudes, and which sorts and tracks us as “winners” and “losers.” Today’s commercial practices have grown over the past decades unencumbered by regulatory constraints, and they increasingly threaten the American ideals of self-determination, fairness, justice and equal opportunity. It is now time to address these developments: to grant basic rights to individuals and groups regarding data about them and how those data are used; to put limits on certain commercial data practices; and to strengthen our government to step in and protect our individual and common interests vis-à-vis powerful commercial entities.

We call on legislators to consider the following principles:

1. Privacy protections should be broad: Set the scope of baseline legislation broadly and do not preempt stronger legislation

Pervasive commercial surveillance practices know no limits, so legislation aiming to curtail negative practices should
- address the full digital data life-cycle (collection, use, sharing, storage, on- and off-line) and cover all private entities’ public and private data processing, including nonprofits;
- include all data derived from individuals, including personal information and inferred information, as well as aggregate and de-identified data;
- apply all Fair Information Practice Principles (FIPPs) as a comprehensive baseline, including the principles of collection and use limitation, purpose specification, access and correction rights, accountability, data quality, and confidentiality/security, and require fairness in all data practices;
- allow existing stronger federal legislation to prevail and let states continue to advance innovative legislation.

2. Individual privacy should be safeguarded: Give individuals rights to control the information about them

Building on FIPPs, individuals ought to have basic rights, including the right to
+ transparency and explanation
+ access
+ object and restrict
+ use privacy-enhancing technologies, including encryption
+ redress and compensation

3. Equitable, fair and just uses of data should be advanced: Place limits on certain data uses and safeguard equitable, fair and just outcomes

Relying on “privacy self-management”—with the burden of responsibility placed solely on individuals to advance and protect their autonomy and self-determination—is not sufficient. Without one’s knowledge or participation, classifying and predictive data analytics may still draw inferences about individuals, resulting in injurious privacy violations—even if those harms are not immediately apparent. Importantly, these covert practices may result in pernicious forms of profiling and discrimination, harmful not just to the individual, but to groups and communities, particularly those with already diminished life chances, and to society at large. Certain data practices may also unfairly influence the behavior of online users, such as children.

Legislation should therefore address the impact of data practices and the distribution of harm by
- placing limits on collecting, using and sharing sensitive personal information (such as data about ethnic or racial origin, political opinions/union membership, data concerning health, sex life or sexual orientation, genetic data, or biometric data) or data that reveals sensitive personal information, especially when using these data for profiling;
- otherwise limiting the use of consumer scoring and other data practices, including in advertising, that have the effect of disproportionately and negatively affecting people’s life chances, related to, for example, housing, employment, finance, education, health and healthcare;
- placing limits on manipulative marketing practices;
- requiring particular safeguards when processing data relating to children and teens, especially with regard to marketing and profiling.

4. Privacy legislation should bring about real changes in corporate practices: Set limits and legal obligations for those managing data and require accountability

Currently companies face very few limitations on their data practices. The presumption of “anything goes” has to end. Legislation should ensure that entities collecting, using, or sharing data
- can only do so for specific and appropriate purposes defined in advance, subject to rules established by law and informed by data subjects’ freely given, specific, informed and unambiguous consent; for the execution of a contract; or as required by law; and without “pay-for-privacy” provisions or “take-it-or-leave-it” terms of service;
- notify users in a timely fashion of data transfers and data breaches, and make consumers whole after a privacy violation or data breach;
- cannot limit consumers’ right to redress with arbitration clauses;
- are transparent and accountable, and adopt technical and organizational measures, including
+ providing for transparency, especially algorithmic transparency,
+ conducting impact assessments for high-risk processing, considering the impact on individuals, groups, communities and society at large,
+ implementing Privacy by Design and by Default,
+ assigning resources and staff, including a Data Protection Officer,
+ implementing appropriate oversight over third-party service providers/data processors,
+ conducting regular audits;
- are only allowed to transfer data to other countries or international organizations with essentially equivalent data protections in place.

5. Privacy protection should be consequential and aim to level the playing field: Give government at all levels significant and meaningful enforcement authority to protect privacy interests and give individuals legal remedies

Without independent and flexible rulemaking data-protection authority, the Federal Trade Commission has been an ineffective agency for data protection. An agency with expertise and resources is needed to enforce company obligations. Ongoing research is required to anticipate and prepare for additionally warranted interventions to ensure a fair marketplace and a public sphere that strengthens our democratic institutions.

Legislation should provide
- for a strong, dedicated privacy agency with adequate resources, rulemaking authority and the ability to sanction non-compliance with meaningful penalties;
- for independent authority for State Attorneys General;
- for statutory damages and a private right of action;
- for the federal agency to establish an office of technology impact assessment that would consider the privacy, ethical, social, political, and economic impacts of high-risk data processing and other technologies, and would oversee and advise companies on their impact-assessment obligations.
-
Media Advisory – Save the Date

FOR IMMEDIATE RELEASE
October 3, 2018
Contact: Jeff Chester, jeff@democraticmedia.org

COPPA: Protecting Children’s Privacy Online for 20 Years
Sen. Ed Markey, Advocates and Experts Celebrate COPPA as They Focus on Future Challenges Posed by the Digital Marketplace
October 17, Capitol Hill, Open to the Public

Washington, D.C. To mark the 20th anniversary of the 1998 Children’s Online Privacy Protection Act (COPPA), Senator Edward J. Markey (D-MA)—its principal congressional sponsor—will be joined by key representatives from the consumer, child advocacy, and privacy groups involved in implementing the law at a public forum on Wednesday, October 17, from 12:30 to 3:30 pm in Room 385 of the Russell Senate Office Building (SR-385). Senator Markey will deliver a keynote speech, followed by two panels featuring representatives from the Electronic Privacy Information Center, Campaign for a Commercial-Free Childhood, Common Sense Media, Center for Digital Democracy, Color of Change, and the Institute for Public Representation (Georgetown University Law Center), among others. Prof. Kathryn C. Montgomery, who spearheaded the public campaign that led to COPPA, will moderate.

“COPPA is the nation’s constitution for children’s communication. For 20 years it has shielded our nation’s children from invasive practices and encroaching actors on the internet,” Sen. Markey noted. “It puts children and families in control and holds violators accountable when they compromise kids’ privacy. As we celebrate the 20th anniversary of COPPA, we must look to the future.”

In addition to discussing COPPA’s impact, speakers will explore the expanding interactive and data-driven world young people face today, which is being transformed by a host of powerful technologies, such as artificial intelligence, virtual reality, and internet-connected toys.
“In 2018, children grow up in an increasingly connected and digital world with ever-emerging threats to their sensitive personal information,” explained Sen. Markey. “Two decades after the passage of this bedrock law, it is time to redouble our efforts and safeguard the precious privacy of our youngest Americans.”

The event is free and open to the public, but seating is limited. Lunch will be served. Please RSVP to jeff@democraticmedia.org.
-
October 1, 2018

Chairman John Thune
Ranking Member Bill Nelson
Senate Commerce Committee
Washington, DC

Dear Chairman Thune and Ranking Member Nelson,

We appreciate your interest in consumer privacy and the hearing you convened recently to explore this topic. Still, our concerns remain that the hearing, with only industry representatives, was unnecessarily biased. Many of the problems consumers face, as well as the solutions we would propose, were simply never mentioned. There is little point in asking industry groups how they would like to be regulated. None of the proposals endorsed by the witnesses would have any substantial impact on the data collection practices of their firms. Such regulation would simply fortify business interests to the detriment of online users. The absence of consumer advocates at the first hearing was also a missed opportunity for a direct exchange about points made by the industry witnesses.

We understand that you are planning to hold a second hearing in early October. In keeping with the structure of the first hearing, we ask that you invite six consumer privacy experts to testify before the Committee. We would also suggest that you organize an additional panel with other experts and enforcement officials, including Dr. Jelinek, the Chair of the European Data Protection Board, as well as State Attorneys General, who are now on the front lines of consumer protection in the United States.

Thank you for your consideration of our views. We look forward to working with you.

Sincerely,

Access Humboldt
Access Now
Campaign for a Commercial-Free Childhood
Center for Digital Democracy
Common Sense
Consumer Action
Consumer Federation of America
Customer Commons
Digital Privacy Alliance
Electronic Frontier Foundation
EPIC
Media Alliance
National Association of Consumer Advocates
New America's Open Technology Institute
New York Public Interest Research Group (NYPIRG)
Privacy Rights Clearinghouse
U.S. Public Interest Research Group (U.S. PIRG)
World Privacy Forum
-
Leading consumer privacy organizations in the United States write to express surprise and concern that not a single consumer representative was invited to testify at the September 26 Senate Commerce Committee hearing “Examining Safeguards for Consumer Data Privacy.”
-
CDD Releases E-Guide to Help Protect Voters From Online Manipulation and False News

Washington, D.C.: September 12, 2018

To help fight online political misinformation and false news, which have already resurfaced in the 2018 midterm elections, CDD has produced a short e-guide to help voters understand how online media platforms can be hijacked to fan political polarization and social conflict. Enough Already! Protect Yourself from Online Political Manipulation and False News in Election 2018 describes the tactics that widely surfaced in the last presidential election and how they have evolved since, and deconstructs the underlying architecture of online media, especially social networks, that has fueled the rise of disinformation and false news.

The e-guide tells voters what they can do to try to take themselves out of the targeted advertising systems developed by Facebook, Twitter, YouTube and other big platforms. It also describes the big-picture issues that must be addressed to rein in the abuses unleashed by Silicon Valley’s big-data surveillance economy and advertising-driven revenue machine. The e-guide is available for free download at the CDD website. Journalists, activists and interested voters are urged to share the guide with friends and colleagues.

Contact: Jeff Chester, jeff@democraticmedia.org, 202-494-7100
-
The Center for Digital Democracy (CDD), Berkeley Media Studies Group, and Color of Change urge the Federal Trade Commission (FTC) to specifically acknowledge the important issues involving the privacy and welfare of young people by adding this issue to its proposed hearing agenda on competition and consumer welfare.
-
Press Release
Public Citizen and Center for Digital Democracy Release Sign-on Letter Urging Companies to Adopt Europe’s New Data Protection Rules
If Companies Can Protect User Data in Europe, They Can Protect It Everywhere
U.S. companies should adopt the same data protection rules that are poised to go into effect in the European Union on May 25, Public Citizen, the Center for Digital Democracy and Privacy International said today. -
Consumer advocates, digital rights, and civil rights groups are calling on U.S. companies to adopt the requirements of the General Data Protection Regulation (GDPR) as a baseline in the U.S. and worldwide. Companies processing personal data* in the U.S. and/or worldwide that are subject to the GDPR in the European Union ought to:
- extend the same individual privacy rights to their customers in the U.S. and around the world;
- implement the obligations placed on them under the GDPR;
- demonstrate that they meet these obligations;
- accept public and regulatory scrutiny and oversight of their personal data practices;
- adhere to the evolving GDPR jurisprudence and regulatory guidance.

(*Under the GDPR, processing includes collecting, storing, using, altering, generating, disclosing, and destroying personal data.)

Specifically, at a minimum, companies ought to:

1. Treat the right to data privacy as a fundamental human right. This includes the right to:
+ information/notice
+ access
+ rectification
+ erasure
+ restriction
+ portability
+ object
+ avoid certain automated decision-making and profiling, as well as direct marketing

For these rights to be meaningful, give individuals effective control over the processing of their data so that they can realize their rights, including:
+ set system defaults to protect data
+ be transparent and fair in the way you use people’s data

2. Apply these rights and obligations to all personal data, including data that can identify an individual directly or indirectly.

3. Process data only if you have a legal basis to do so, including:
- on the basis of freely given, specific, informed and unambiguous consent;
- if necessary for the performance of a contract.

4. In addition, process data only in accordance with the principles of fairness, transparency, purpose limitation, data minimization, accuracy, storage limitation, integrity and confidentiality/security.

5. Add extra safeguards, including explicit consent, when processing sensitive personal data (such as data about ethnic or racial origin, political opinions/union membership, data concerning health, sex life or sexual orientation, genetic data, or biometric data) or data that reveals sensitive personal data, especially when using these data for profiling.

6. Apply extra safeguards when processing data relating to children and teens, particularly with regard to marketing and profiling.

7. Be transparent and accountable, and adopt technical and organizational measures to meet these obligations, including:
- provide for algorithmic transparency
- conduct impact assessments for high-risk processing
- implement Privacy by Design and by Default
- assign resources and staff, including a Data Protection Officer
- implement appropriate oversight over third-party service providers/data processors
- conduct regular audits
- document the processing

8. Notify consumers and regulatory authorities of a breach without undue delay.

9. Support the adoption of similar requirements in a data protection law that will ensure appropriate and effective regulatory oversight and enforcement for data processing that does not fall under EU jurisdiction.

10. Adopt these GDPR requirements as a baseline regardless of industry sector, in addition to any other national/federal, provincial/state or local privacy requirements that are stricter than the requirements advanced by the GDPR.
-
The European Union’s updated data protection legislation comes into effect in Europe on May 25, 2018. It gives individuals new rights to better control their personal information and strengthens some of the rights that already exist. Enforcement and redress mechanisms have also been strengthened to ensure that these rights are respected. And – importantly – the definition of personal data is wider in the GDPR than in the current EU legislation, and now includes online identifiers, such as an IP address. The eight rights are:
- the right to information
- the right of access
- the right to rectification
- the right to erasure (or “to be forgotten”)
- the right to restrict processing
- the right to data portability
- the right to object
- the right to avoid automated decision-making and profiling
-
The European General Data Protection Regulation (GDPR) will take effect May 25, 2018. The Trans Atlantic Consumer Dialogue (TACD), of which CDD is a member, published a document detailing 10 things that U.S. citizens and companies need to know about the forthcoming regulation.
-
Press Release
Consumer groups in the U.S. and EU urge Facebook to adopt the General Data Protection Regulation as a global baseline standard
In an open letter to Facebook CEO Mark Zuckerberg, members of the Transatlantic Consumer Dialogue urge the company “to confirm your company’s commitment to global compliance with the GDPR.” -
Blog
Facebook, Cambridge Analytica, Google, the GDPR and Digital Marketers: Will Big Brands and Advertisers Pull the Plug on the System That Has Erased Privacy?
A digital “great awakening” has occurred, with unprecedented global attention given to the commercial surveillance business model at the core of our collective digital experience. Since the earliest days of the commercial Internet in the 1990s, the online medium has been deliberately shaped to serve primarily the interests of marketing. Advertisers have poured in many billions of dollars since then to make sure that our platforms, applications and devices all serve the primary need of gathering our information so it can be used for data-driven marketing. Internet industry trade groups have developed the technical standards so that data collection is embedded in new services—such as mobile geo-location applications. Marketers developed new technologies, such as programmatic advertising, that enable lightning-fast decisions about individuals based on their data.

Leading ad platforms, especially Google and Facebook, fought against privacy legislation for the U.S. Policymakers from both major parties protected them from regulation, including on privacy and antitrust. U.S. companies tried to derail the new EU privacy law that takes effect on May 25—known as the General Data Protection Regulation (GDPR)—but failed to stop it. Europeans—who understand the threat to personal and political freedom when unaccountable institutions control our information—are now on the privacy front lines. The road to privacy and digital rights for America is likely first to pass through the European Union.

The Facebook/Cambridge Analytica scandal (and kudos to The Observer newspaper for its dogged journalism on all this) is, however, not unique. It is emblematic of the way digital marketing works every day—all over the world. Huge amounts of our information are scooped up from scores of sources, quickly analyzed, and used to send us more personalized marketing and content. Powerful automated applications help marketers identify who we are and then engage us at deeper emotional and subconscious levels. Facebook, Google and others are continually pushing the boundaries of digital advertising, deploying Artificial Intelligence, Virtual Reality, Neuromarketing and other techniques. They are laying the foundation for the “Internet of Things” world that will soon be upon us, where we will be further tracked and targeted wherever we go and whatever we do.

But it’s the global Fortune-type companies that will really decide what happens with the online privacy of people all over the world. Google and Facebook basically work for the P&Gs, Coca-Colas, Hondas and Bank of Americas—the leading advertisers. It’s the advertisers who are really in charge of the Internet, and they have created for their own companies a kind of mirror image of what Google and Facebook have helped unleash. Fortune-size companies are now also in the data business, collecting information on consumers via all their devices; they have created in-house consumer data mining and targeting services; and they deploy advanced digital marketing techniques to reach us directly.

Over the last year, major advertisers have forced Facebook and Google to become more accountable to their needs and interests—rather than to the public interest. What they call the need for “brand safety” online—assurances that their ads are not undermined by being placed next to hate speech or other content harmful to their brands—is really about seizing greater control over their own digital futures. They deeply dislike the clout that both Google and Facebook have today over the digital advertising system.

We are at a critical moment in the brief history of the Internet and digital media. There is greater awareness of what is at stake—including the future of the democratic electoral process—if we don’t develop the regulations and policies that ensure privacy, promote individual autonomy, and place limits on the now-unchecked corporate power of digital marketers. It’s time to expand the focus of the debate about Facebook and Google to include those who have been paying for all of this consumer surveillance—namely advertisers and the advertising industry. They need to be held accountable if we are to see a global digital medium that puts people—not profits—first. -
In a statement issued today, CDD, EPIC and a coalition of consumer groups have called on the Federal Trade Commission to determine whether Facebook violated a 2011 Consent Order when it facilitated the transfer of personal data of 50 million Facebook users to the data-mining firm Cambridge Analytica. The groups had repeatedly urged the FTC to enforce its own legal judgments. "The FTC's failure to act imperils not only privacy but democracy as well," the groups warned.
-
Blog
AT&T, Comcast & Verizon Expand “Big Data” Tracking & Targeting of Consumers
Why the largest ISPs oppose federal and state privacy and digital marketing safeguards
Internet service provider (ISP) giants, which dominate how Americans gain access to broadband Internet, cable TV, streaming video, and other telecommunications services, are aggressively expanding their capabilities to gather and use personal data. Leading ISPs AT&T, Comcast and Verizon are taking full advantage of all the information flowing from PCs, mobile phones, set-top boxes, and other devices. ISP giants are using “Big Data” analytics, artificial intelligence, and an array of cutting-edge technologies to identify who we are, what we do and how best to target us with marketing and advertising. They are also working closely with data brokers to gain access to even more personal information. AT&T, Comcast and Verizon, for example, work with Acxiom’s LiveRamp subsidiary to build robust profiles that help them identify us regardless of what devices we may be using at the moment.

ISPs envision a future in which we are continuously connected to their vast digital media networks for nearly everything we do. Through their monopolistic control over key broadband, cable TV, and satellite networks, ISPs are able to closely monitor what we do online—both in and out of home, school or work. ISPs tell marketers that they can use their data and networks to “micro-target” an individual on all their “screens,” including for financial, health, retail and even political advertising. They are also positioning themselves to play an important role as our society is further transformed by digital technology, such as with so-called “smart homes,” “connected cars,” the Internet of Things, and high-speed “5G” networks.
Last year, the Republican majority in Congress and the Trump administration eliminated what had been the only federal consumer privacy protections afforded to all Americans—safeguards enacted in October 2016 by the Federal Communications Commission (FCC) that required ISPs to engage in responsible data practices. ISPs saw that important FCC data-protection safeguard as a formidable obstacle to their plans to “monetize” consumer data. With Network Neutrality—a critical requirement to ensure an open and democratic Internet—also ended by the same FCC, phone and cable ISPs have vanquished federal protections for fair treatment of consumers online—including of their data. Unless states weigh in with safeguards, AT&T, Comcast, Verizon and other major ISPs will pose—despite their denials—a major threat to consumer privacy. With ISPs especially touting their prowess to capitalize on a consumer’s location, states have an especially important role to play in protecting the public from “hyper-local targeting” and other geo-marketing practices. Here are just some recent developments in ISP consumer data and digital marketing practices.
AT&T: According to the Trump administration’s Department of Justice filing opposing its acquisition of Time Warner, AT&T is “the country’s largest distributor of traditional subscription television; the second largest wireless telephone company; the third largest home internet provider; one of the largest providers of landline telephone service; and also the country’s largest Multichannel Video Programming Distributor (MVPD), with more than 25 million subscribers.” Its control over broadband, satellite, and mobile telecommunications services is a key reason why AT&T tells marketers it has “More Scale, More Targeted, More Screens…advanced TV and multi-screen solutions for your brand.” Last February, the trade publication Digiday leaked AT&T’s “pitch deck,” highlighting what the company calls its “digital video advantage.” That advantage includes “the ability to access the hottest content on TV and across platforms; the ability to reach the multi-platform viewing audience in a single buy; premium and non-skippable inventory” (ads a consumer can’t avoid). AT&T’s DirecTV is positioned by the pitch deck as an effective competitor to the streaming video services offered by Amazon, Netflix and YouTube. Through several data advertising platform partners, AT&T offers real-time ad targeting of individuals who view streaming video and other online content. The pitch deck breaks down AT&T’s DirecTV audience in order to help advertisers more effectively reach Hispanics, African Americans and households with children. Its 2017 “media kit” explained how the company helps advertisers reach individuals on all their devices, giving marketers the ability to “serve ads to the same target audience on TV and digital devices across tens of billions of impressions.” AT&T especially highlights its deep relationships with consumer data providers, including Equifax, Experian, Crossix, Neustar, and Nielsen Catalina. These allies help AT&T target its subscribers with ads promoting loans and other financial services. AT&T AdWorks has recently opened a “new state-of-the-art media lab”—an “interactive space designed to inspire marketers…[that] shows the future of media consumption and how marketers can most efficiently reach their targets across any platform.” Designed to mirror “the consumer’s life,” the lab enables advertisers to “interact” by using “data visualization tools” to see how individuals can be targeted through streaming video, Internet of Things devices, mobile phones and even data-enriched outdoor advertising.

Comcast: The cable TV colossus, which also operates Universal Studios, NBC, and several digital advertising and technology firms, is committed to better leveraging “Big Data.” Comcast sees itself at the “center of the household building connections between users, devices, products, and services.” It is developing capabilities to take better advantage of the insights generated by its “systems capable of processing billions of events per day.” This includes identifying actionable data, including in “real time,” generated from its video and Internet platforms.
Through its “identity strategy,” (link is external) Comcast plans to deliver a “transformative customer experience” that will market to us online “throughout the customer lifecycle.” To help accomplish this, Comcast is building out its “Big Data” capabilities at the “enterprise level,” including for “event processing, analytics,” storing our information in the cloud, and various forms of digital “testing and optimization.” Comcast’s “Applied Artificial Intelligence (AI)” group (link is external) is working to create “intelligent applications that [can] impact millions of people on a daily basis.” Among its projects involving “machine learning” are ways to “build (link is external) virtual assistants that interact with millions of customers in natural language and automatically find solutions to their needs.” It’s part of a much larger “Technology and Product” research infrastructure at Comcast that has offices in Silicon Valley, Philadelphia, Denver, Chicago and Washington, DC. Comcast is also deploying “blockchain” insights platform technology, which it calls BlockGraph (link is external), to help develop more detailed digital dossiers on consumers so they can be targeted for advertising and other services. Comcast’s “FreeWheel” (link is external) subsidiary, with offices in Paris and other global locations, is an advertising, data management and digital rights management technology company. FreeWheel helps video and digital media companies deploy what it calls a “unified ad management platform.” That system allows clients to engage in “intelligent ad decisioning (link is external) across all devices, environments, and data sets….” FreeWheel’s customers, which for the U.S. 
market include AT&T’s DirecTV, Fox, Time Warner and Viacom, can use its technologies to “unify audiences across desktop, mobile, OTT (so-called Over-the-Top streaming video), and [cable TV] set-top boxes [to] profitably monetize their content.” Comcast’s growing expansion in data-driven marketing—operated by both its Advanced Advertising Group and Spotlight service—involves FreeWheel and other acquisitions, including Visible World and Strata. Strata, for example, is now partnering with “Choozle,” a Big Data-oriented advertising system that helps marketers target a consumer on social, mobile, video and other platforms. The Strata and Choozle alliance “will allow thousands of advertising agencies to access detailed consumer data to execute digital advertising campaigns as conveniently as they would buy local TV advertising.”

Comcast’s NBCUniversal division has also deepened its use of data-driven techniques to target its viewers. NBCU has its own “advanced advertising” platform—Audience Studio—and is promising marketers that its “Total Audience Delivery” will help target a consumer on “digital, linear, mobile and out-of-home viewing.” Its “data-based” profiling of its viewers includes information provided by “set top data from Comcast” and other sources. NBCU’s “Audience Studio Targeted Digital,” for example, enables advertisers to reach “digital audiences” who view its portfolio of “entertainment, lifestyle, news, sports and Hispanic” content. Comcast’s NBCU urges advertisers to provide their own (first-party) information on consumers so it can be merged with the TV network’s data.
The result, claims NBCU, is that marketers will reach their “objectives through precision targeting at unequaled scale.” What NBCU means is that the same kinds of sophisticated capabilities that Comcast relies on to reach its broadband consumers are also available through its TV subsidiary.

Verizon: To further its plans to harvest consumer information to bolster its “differentiated data” ad-targeting capabilities, Verizon has created a new division called “Oath.” Incorporating “50 media and technology brands that engage more than a billion people around the world,” the Oath portfolio includes “HuffPost, Yahoo Sports, AOL, Tumblr, Yahoo Finance, Yahoo Mail” and other properties. Oath operates a real-time data targeting apparatus called One. It also owns a leading digital video ad company called BrightRoll that delivers targeted marketing in real time to streaming video. The result of Verizon’s investments, it explains, is the “most advanced and open advertising technology” system that “spans across mobile, video, search, native and programmatic ads.” Verizon’s Oath promises its clients that it delivers “people-based marketing.” People-based marketing is a marketing industry euphemism for using our personal data to identify us online.
In the case of Verizon’s Oath, that includes our “location, passions and interest from social (media), purchase intent from search and advertising engagement, cross device identity from users mapped across devices, favorite content from web, app and Smart TV data, on and offline purchases and recent store visits from mobile geolocation data.” Oath explains that its “suite of advertising technology lets you activate this data to find and message consumers all along their journey.” Verizon acquired AOL after that company had made its own considerable acquisitions of data targeting companies, including those that access mobile data from apps. Consequently, Verizon can claim that “Oath has the industry’s largest mobile demand portfolio to help you monetize across every device…,” including in real time.

This is just a brief snapshot of leading ISP data and digital marketing practices today. ISPs’ unfettered control over broadband communications enables them to eavesdrop on the communications and behaviors of millions of individuals and households. Without consumer safeguards, they will mushroom into even more formidable—and unaccountable—gatekeepers over our information and privacy. -
Around the world, citizens and governments are working to limit the marketing of unhealthy foods to children in order to address the growing worldwide obesity epidemic. In the U.S., Congress and the Federal Trade Commission rely on weak self-regulatory industry standards. In Canada, by contrast, the government of Prime Minister Justin Trudeau wants restrictions placed on the marketing of food and beverages to children. This goal was written directly into the Health Minister’s mandate letter signed by Trudeau in October 2017. As a result, Health Canada, the department of the Canadian government responsible for national public health, is considering new regulations that would impose broader restrictions on food advertising targeted at those under 17. The rules could cover everything from TV, online and print advertising to product labelling and in-store displays, and could even end some sponsorships of sports teams.

Health Canada’s consultations on how it should approach restricting advertising of “unhealthy food and beverages” to kids began in June 2017 and concluded in early August of that year. Although a few contributors opposed any attempt to restrict marketing to children, the summary report states that “Overall, the proposed approach and supporting evidence for restricting marketing of unhealthy food and beverages to children were well received.” The authors of the report point out that the “issue of age was not an area of inquiry,” but most contributors supported the idea of including children between 13 and 17 years of age.
Aiming to define “unhealthy foods,” the consultation proposed to focus on restricting certain nutrients of concern (sodium, sugars, and saturated fats), with thresholds based on a percentage of daily values (% DV). Most commentators supported the stricter threshold option (5% DV) for the proposed restrictions, strongly preferring it over the weaker proposal (15% DV). Using the percentage of daily values to define which foods are “healthy” or “unhealthy” relies on the already existing mandatory food labelling for most relevant foods. In addition to the proposal to restrict certain nutrients of concern, the proposed restriction on the marketing of non-sugar sweeteners to children was also positively received.

For the consultation, Health Canada looked at the Quebec ban on advertising to children, which has been in place since 1980 and covers any advertising, not just food-related advertising. In that province, companies cannot direct commercial advertising at children under 13 years old. Quebec has the lowest obesity rate in Canada among children aged six to 11 and the highest rate of fruit and vegetable consumption. The Stop Marketing to Kids Coalition (M2K Coalition), which includes the Heart and Stroke Foundation of Canada, the Childhood Obesity Foundation, the Canadian Cancer Society, Diabetes Canada, Dietitians of Canada, and the Quebec Coalition on Weight-Related Problems, supports the so-called Ottawa Principles. These evidence-based, expert-informed and collaboratively developed principles call on governments to restrict the commercial marketing of all food and beverages to children and youth aged 16 years and younger. The restrictions would cover all forms of marketing, with the exception of non-commercial marketing for public education. The M2K Coalition has taken this stance because of the complexities of defining healthy versus unhealthy food.
The ad industry in Canada has some self-regulatory restrictions in place under the Canadian Children’s Food and Beverage Advertising Initiative. That program, in which many major food companies participate, sets out nutrition criteria for products that can be advertised in environments where kids under 12 make up 35 percent or more of the audience. The Association of Canadian Advertisers has criticized Health Canada’s proposal as “significantly overbroad,” calling it an “outright ban on most food and beverage marketing in Canada.” The Canadian advertising initiative has tightened its criteria over time and is now monitoring online advertising more closely. 2016 was the first full year in which participating companies that advertise to kids had to ensure their products met new, tighter limits on calories, sugar, sodium, and saturated and trans fats. However, in 2017, a study from the Heart and Stroke Foundation of Canada called into question how effective this effort has been. It looked at the most popular websites visited by children and teens, and found ads for products high in sugar, salt or fat.

While the Canadian government was exploring the right approach to restricting the marketing of unhealthy foods to children, Senator Nancy Greene Raine introduced a private member’s bill in the Senate in the fall of 2016, seeking to amend the Food and Drugs Act to prohibit the marketing of unhealthy foods and beverages to children (Bill S-228). The bill would put the activities of Health Canada on a legal basis. The Senator amended it to reflect the federal government’s proposed approach, raising the covered age to 16 and under and keeping the focus on “unhealthy” food and beverages. Bill S-228, the Child Health Protection Act, unanimously passed the Senate in September 2017.
Two amendments to the bill were introduced during the first hour of debate in the House of Commons in December 2017: a reduction in the age of protection to under 13 (from under 17), and the introduction of a five-year post-legislation review period. The rationale for the age amendment was to make the bill more likely to withstand a court challenge, given that the Quebec legislation restricting marketing to children under 13 survived a legal challenge in Irwin Toy v. Quebec (1989). In that case, the Supreme Court of Canada upheld limits on commercial advertising to children under 13 as constitutionally valid. The Court confirmed that “...advertising directed at young children is per se manipulative.” And so, while the Court found that the restrictions violated the freedom of expression under the Charter of Rights and Freedoms, a majority of the Court considered this violation a justifiable limitation necessary to protect children.

For now, the bill is working its way through Parliament. Hopefully, the food industry will not further water down its requirements. If all goes well, our neighbor to the north will have a law in place by September 2018 that will advance public health and put children’s health above the profits of the food industry. --- See attached infographic.
-
Press Release
Statement of Jeff Chester on plans by FCC to review 3-hour Children’s Educational TV Programming Rule
Broadcasters want to kill one of their few remaining public interest obligations: to air at least 3 hours of educational children’s programming a week. The FCC is engaged in another outrageous form of digital highway robbery—stealing from kids in order to allow TV giants to make even more profits from shows filled with commercials. Broadcasters now earn billions of dollars from their free public license to transmit television—including getting access to invaluable cable TV channels. They are supposed to serve as a “Trustee” of the airwaves—not video programming bandits. Without the 3-hour kidvid requirement, broadcasters will be able to reap the financial rewards without any real payback to the public. Millions of kids in the U.S. live in homes that can’t afford cable or broadband. Kidvid programming plays an important role in providing access to quality content for these children. The Pai FCC—as it has done by killing network neutrality—is engaged in a slash-and-burn campaign against much-needed public interest consumer protections for media. We will vigorously fight this cynical and harmful move by the FCC to place the interests of the TV lobby ahead of America’s children. CDD helped lobby for the 3-hour rule in the 1990s and plans to work with allies, such as Sen. Ed Markey, to protect the interests of parents and children. -
Can Democracy Survive Big Data & Micro-Profiling in Elections? (CPDP 2018 Video)
Organized by Center for Digital Democracy & Transatlantic Consumer Dialogue
Today’s political candidates and issue campaigns are fully integrated into the growing Big Data marketing infrastructure, with more and more companies in this sphere accelerating the pace of research and innovation and promising to transform how political campaigns and elections are conducted. Data management platforms, marketing clouds, and other new data services enable information about one’s finances, health, race, ethnicity, shopping behavior, and geo-location to be combined with political interests, reading habits, and voting records. Social media and digital platforms are facilitating many of these techniques, monetizing and normalizing “fake news,” “dark posts,” and other practices, and challenging fundamental principles such as privacy, data protection, and individual autonomy. It has been widely reported that political Big Data micro-targeting played a role in the election of President Trump as well as the Brexit vote in the UK, and it is now the subject of growing scrutiny by regulatory authorities.

Is the use of such technologies likely to cause harm and undermine the democratic process? What is the link between these technologies and fake news? How do policy frameworks in western democracies compare in controlling political campaign practices? What is the role of data protection legislation in protecting the privacy of voters? And what are the challenges for data protection authorities in addressing how commercial data can be sold or shared with political groups? --- Chair: Paul-Olivier Dehaye, PersonalDataIO (CH) Moderator: Anna Fielder, Transatlantic Consumer Dialogue (UK) Speakers: Michael McEvoy, Office of the Information and Privacy Commissioner for British Columbia (CA); Irina Vasiliu, DG Justice, European Commission (EU); Jeffrey Chester, Center for Digital Democracy (US); Juhi Kulshrestha, Hans Bredow Institute for Media Research (DE) -
Press Release
Death of Net Neutrality Will Spur Greater Loss of Digital Privacy, Further Content Consolidation & Control
Statement of Jeff Chester, CDD
The phone and cable lobby will use its new power over the Internet to further erode the privacy rights of Americans. Comcast, AT&T, and Verizon will be entirely free to tap into the data flowing from our mobile devices, PCs, gaming and streaming platforms, and set-top boxes. These ISP giants have already built up a formidable commercial data-gathering and Big Data analytics infrastructure. Now they will expand their collection of our personal information, including financial, health, and media-use data, and will also force competitors to share the data they collect. Content providers that want preferential treatment from ISPs will be forced to give up your data, so the phone and cable companies can further expand their ad revenues. Independent and small content companies—including non-commercial and diversely owned services—will be pressed to consent to terms that favor the digital gatekeepers that control our broadband highway. The FCC’s Net Neutrality decision will trigger a powerful wave of consolidation and deal making that further reduces the range of content and services we should expect in the 21st Century (including for children). We also believe that Google, Facebook and other providers will likely make their peace with the big ISPs, creating a powerful alliance that controls the U.S.’s digital destiny. CDD will be a part of the collaborative work to address this. We urge everyone to also “follow the data” as they examine the digital marketing plans of Verizon, Comcast and AT&T. There they will find plenty of opportunity to educate the public about how our digital future has been placed at great risk. -
Statement of Kathryn C. Montgomery, Ph.D.
Professor, School of Communication, American University
Senior Consultant, Center for Digital Democracy
December 4, 2017

In its first formal move to enter the children’s digital marketplace, Facebook has taken a responsible approach to this sensitive age group. It has created a “walled garden” messenger service designed exclusively for younger children; established strong parental controls; kept the service free of advertising; and restricted the use of many data collection and targeting practices that are employed routinely in its other services. The Children’s Online Privacy Protection Act (COPPA)—which we helped pass in 1998, and which was updated in 2012—has established a strong framework for protecting children 12 and under from unfair data collection and targeting. However, additional safeguards are necessary to protect young people from powerful new forms of commercial surveillance in the Big Data and Internet of Things era. By designing an ad-free and safe environment for children, Facebook is playing a leadership role in developing responsible corporate practices that could be the basis for industry-wide guidelines. But it is too early to fully understand how young people’s engagement with this new generation of digital interactive platforms will impact their psychosocial development. All stakeholders—including health professionals, educators, scholars, advocates, policymakers, and corporations—will need to monitor very closely how these services evolve. ---