• Project

    Big Data Means Big Opportunities and Big Challenges

    Promoting Financial Inclusion and Consumer Protection in the “Big Data” Financial Era

    Dramatic changes are transforming the U.S. financial marketplace. Far-reaching “Big Data” processing capabilities that gather, analyze, predict, and make instantaneous decisions about an individual; technological innovation spurring new and competitive financial products; the rapid adoption of the mobile phone as the principal online device; and advances in e-commerce and marketing that change the way we shop and buy are creating a new landscape that holds both promise and risk for economically vulnerable Americans. Using advances in data analytics specifically to promote economic inclusion and fairness during this period of transformation in the U.S. economy should be a proactive strategy embraced by all stakeholders. While not a panacea for growing financial inequality, a wise investment in strategies that harness the potential of the new digital financial system may better enable struggling Americans to navigate a difficult economic future. This work is licensed under a Creative Commons Attribution 4.0 International License.
  • When Facebook proposed changing its data use practices late last August, we wrote a number of papers to help inform the FTC. This is one of them: it examines the company's ad practices and their relationship to its privacy claims, addressing a number of Facebook data use and digital marketing strategies and their impact on user privacy.
  • A report written by Ed Mierzwinski of USPIRG and Jeff Chester of CDD.
  • The Center for Digital Democracy (CDD) closely analyzes Facebook’s privacy and marketing policies. In partnership with other child advocacy, health, and consumer organizations, we are in an ongoing discussion about Facebook's data collection and marketing policies and their impact on children and teens. As part of our public outreach work, CDD is releasing “5 Reasons Why Facebook is Not Suitable for Children Under 13." The guide lays out some of the key problematic business and marketing practices that make Facebook's own data-driven marketing a concern for children. For example, it discusses how Facebook's marketing practices take advantage of children's cognitive, social, and developmental vulnerabilities. CDD and our partners plan to expand the public conversation on children and Facebook to include issues related to the platform's extensive data collection, profiling, and marketing practices. These issues compound existing concerns about children's risks on Facebook involving cyberbullying, harmful content, and the activities of predators. The guide can be found below:
  • Washington, DC: A report released today by the Center for Digital Democracy (CDD) criticizes the Obama Administration’s recent effort to establish new privacy safeguards for the Digital Era. The more than yearlong proceeding led by the Department of Commerce’s National Telecommunications and Information Administration (NTIA) to further the Administration’s proposed “Consumer Privacy Bill of Rights” failed to ensure that the public can be protected from the array of sophisticated mobile “app” data-gathering practices. The detailed, 34-page report, “Head in the Digital Sand,” argues that the lobbyist-dominated process failed to examine the actual operations of the mobile app industry and its impact on the ability of consumers to protect their privacy effectively. Among the most disturbing revelations is the growing use of real-time tracking and surveillance of individual mobile app users. Industry practices requiring investigation by the FTC are identified, including apps that stealthily eavesdrop on consumers to ensure they spend more on virtual goods and other services—moving them up, in industry parlance, from “minnows” to “dolphins” and then to big cash-generating “whales.” The report examines other mobile and app-related data collection practices, including the ways users are being tracked from device to device; how app developers “acquire” and target users; the role of so-called “ad exchanges” that auction off mobile consumers to advertisers in milliseconds, through the use of data-rich profiles; so-called “monetization” practices relied on by developers; and industry research on the unique personal relationship users have with mobile devices and content. In 2012, the White House released a privacy “blueprint” with seven “rights” that all consumers should be guaranteed, and urged Congress to enact legislation. 
The NTIA was also tasked with bringing industry, nonprofit organizations, and others together to develop so-called voluntary but enforceable codes of conduct to implement consumer privacy rights. However, as CDD’s report describes, the so-called “stakeholder” process failed to deliver meaningful and effective privacy safeguards. “There was an assumption that consumers would be willing to dispassionately analyze how an app uses their data before they try it out,” explained CDD Executive Director Jeff Chester. “But as our report reveals, there is already a sophisticated app marketing system in place that actually uses existing data, along with a host of interactive marketing tactics, to influence consumer decisions. Before they download an app, consumers need to know more than just what data that app may collect or share with sponsors or third parties,” he added. “They need to be told how the app really operates—whether it spies on them, whether the app experience will change in order to promote the sales of goods and virtual products, and precisely how any personal data might be used for purposes related to finances, health, their race or age, for example.” Last month, the NTIA hailed the work that led to a proposed “Short Form Notice Code of Conduct to Promote Transparency in Mobile App Practices.” On Thursday, August 29, the NTIA convenes a forum to address “lessons learned” about the work that produced the mobile app code and how that process should be structured for future work. CDD called on the Administration to release its long-promised legislation on consumer privacy, and to replace the NTIA with the Federal Trade Commission as the lead agency proposing new privacy rights for Americans. “The Administration has told the European Union that it has its privacy house in order,” said Chester. 
“But this initial effort, as well as the revelations of NSA surveillance, raises questions about how well the privacy of Europeans will be protected as a new Transatlantic trade deal (TTIP) is negotiated.” A copy of CDD’s new report on mobile apps and consumer privacy, along with its Visual Appendix, is available for download. CDD works to protect the interests of consumers in the digital era, focusing on issues related to consumer privacy, public health, children and youth, and financial services.
  • Digital marketing, including data collection, profiling, tracking, and targeting, pervades the Internet experience. One of the less-discussed areas is the targeting of individuals because of their race or ethnicity. As CDD's new report discusses, multicultural groups are "In the Digital Bullseye," with online advertisers and others focused on reaching African Americans, Hispanics, Asian Americans, and others. Data collected on each of us can include our financial status, health concerns, location, spending habits--and also ethnicity and race. Multicultural groups are seen as vitally important markets for advertisers, bringing in new revenues and influencing overall cultural attitudes. But we believe there is a price--including the loss of privacy--with the incorporation of ethnicity and race into the digital marketing paradigm. The juxtaposition of sensitive information linking income, location, and race, for example, can lead to new forms of redlining or discrimination (as our report mentions, during the subprime mortgage boom, people of color were often sold harmful financial products regardless of their actual financial status). That's why CDD believes that race and ethnicity should be classified as "sensitive" information by policymakers, requiring prior affirmative consent for their use. They should be part of the list of sensitive data categories, which usually includes information about one's finances, one's health, or data collected from a child. Let individuals decide whether they want their ethnicity or race to be part of their digital profile--not Google, Facebook, or some other marketer. We acknowledge that there is a positive side to identifying multicultural online users: it helps support a more robust and diverse digital publishing system. However, safeguards for privacy and consumer protection are also required.
This is the first in a series of new reports CDD will release in 2013 on digital consumer-protection issues, including multicultural digital target marketing. PS: See also our latest reporting on targeting Latinos online.
  • An analysis of the contemporary digital marketing landscape, focusing on the promotion of food & beverage products to youth. Written by Kathryn Montgomery, Ph.D., Sonya Grier, Ph.D., Lori Dorfman, DrPH, and Jeff Chester, MSW. This report provides a brief summary of how digital marketing works and the role it plays in promoting unhealthy food and beverages to children. Detailed in the report are key concepts of digital marketing; implications for young people’s health; challenges these concepts raise for researchers; and relevant theoretical models for understanding how the new digital marketing framework acts on children and youth. Crucial gaps in knowledge and an agenda for future research are also highlighted.
  • CDD's predecessor group, the Center for Media Education, released this report in 1996. It played a key role in generating support, at the FTC and in Congress, for the enactment of the Children's Online Privacy Protection Act (COPPA) in 1998.