CDD

Program Areas: Digital Consumer

  • The Center for Digital Democracy (CDD), in its ongoing efforts to monitor the Federal Trade Commission’s enforcement of the Children’s Online Privacy Protection Act (COPPA), has filed a motion in the U.S. District Court for the District of Columbia challenging the FTC’s refusal to release important COPPA documentation. The case involves seven “safe harbor” programs, such as KidSAFE and TRUSTe, approved by the FTC to handle website compliance with COPPA regulations. CDD originally made its request in July 2014, under the Freedom of Information Act, seeking access to annual reports filed with the FTC by safe harbor organizations, as required by COPPA. In light of the commission’s failure to respond to that request within FOIA’s statutory time limit, CDD initiated the current legal proceeding in December 2014. Two months later, the FTC finally responded to CDD’s FOIA request, releasing heavily redacted annual reports amounting to less than half of CDD’s original request.

    As CDD’s court filing makes clear, the FTC has been overzealous in protecting the self-interest of the private safe harbor programs. CDD’s predecessor, the Center for Media Education, spearheaded the movement that led to the passage of COPPA in 1998. The regulation applies primarily to commercial websites that target children under 13, limiting the collection of personal information, providing a mechanism for parental involvement, and placing obligations on companies for adequate disclosure and protection of data. More recently, CDD led a coalition of child advocates, privacy groups, and health experts that successfully pressed for a revised set of regulations that update and clarify COPPA’s basic safeguards. These new regulations, which became effective in 2013, add new protections specifically designed to address a wide range of practices on social media, mobile, and other platforms. Without the diligent oversight of the FTC, however, COPPA regulations will mean little in the rapidly evolving online marketplace. As it awaits a favorable ruling from the District Court, CDD remains committed to ensuring that COPPA is fully and fairly enforced. See the filed memo attached below.
  • Advocates Charge Google with Deceiving Parents about Content on YouTube Kids

    App for preschoolers is rife with videos that are potentially harmful to children

    Washington, DC – Tuesday, May 19 – Two leading child and consumer advocacy groups have filed an important update to their Federal Trade Commission complaint against Google’s YouTube Kids app for false and deceptive marketing. In a letter sent to the Commission today, the groups charged that Google is deceiving parents by marketing YouTube Kids as a safe place for children under five to explore when, in reality, the app is rife with videos that would not meet anyone’s definition of “family friendly.”

    A review by the Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) has found a significant amount of content that would be extremely disturbing and/or potentially harmful for young children to view, including:

      • Explicit sexual language presented amidst cartoon animation
      • Videos that model unsafe behaviors such as playing with lit matches, shooting a nail gun, juggling knives, tasting battery acid, and making a noose
      • A profanity-laced parody of the film Casino featuring Bert and Ernie from Sesame Street
      • Graphic adult discussions about family violence, pornography, and child suicide
      • Jokes about pedophilia and drug use
      • Advertising for alcohol products

    CDD and CCFC provided a video to the FTC today documenting an array of inappropriate content that can be found on YouTube Kids.

    “Federal law prevents companies from making deceptive claims that mislead consumers,” said Aaron Mackey, the coalition’s attorney at Georgetown Law’s Institute for Public Representation. “Google promised parents that YouTube Kids would deliver appropriate content for children, but it has failed to fulfill its promise. Parents rightfully feel deceived by YouTube Kids.”

    Google claims that YouTube Kids was “built from the ground up with little ones in mind” and is “packed full of age-appropriate videos.” The app includes a search function that is voice-enabled for easy use by preschool children. Google says it uses “a mix of automated analysis, manual sampling, and input from our users to categorize and screen out videos and topics that may make parents nervous.” Google also assures parents that they “can rest a little easier knowing that videos in the YouTube Kids app are narrowed down to content appropriate for kids.”

    But, as the complaint explains: Google does not, in fact, “screen out the videos that make parents nervous” and its representations of YouTube Kids as a safe, child-friendly version of YouTube are deceptive. Parents who download the app are likely to expose their children to the very content they believed they would avoid by using the preschool version of YouTube. In addition to the unfair and deceptive marketing practices we identified in our initial request for an investigation, it is clear that Google is deceiving parents about the effectiveness of their screening processes and the content on YouTube Kids.

    “In the rush to expand its advertising empire to preschoolers, Google has made promises about the content on YouTube Kids that it is incapable of keeping,” said Josh Golin, Associate Director of CCFC. “As a parent, I was shocked to discover that an app that Google claims is safe for young children to explore includes so much inappropriate content from the Wild West of YouTube.”

    Today’s letter is an update to the advocates’ April 7, 2015 FTC complaint that charged Google with engaging in unfair and deceptive practices towards children and their parents. That complaint detailed how YouTube Kids featured ads and other marketing material that took advantage of children’s developmental vulnerabilities. It also noted that the “blending of children’s programming content with advertising material on television has long been prohibited because it is unfair and deceptive to children. The fact that children are viewing the videos on a tablet or smart phone screen instead of on a television screen does not make it any less unfair and deceptive.” The complaint also called on the FTC to address Google’s failure to disclose that many makers of so-called “user-generated” videos featuring toys and candy have relationships with those products’ manufacturers.

    “The same lack of responsibility Google displayed with advertising violations on YouTube Kids is also apparent in the content made available on the app,” observed Dale Kunkel, Professor of Communication at the University of Arizona. “There is a serious risk of harm for children who might see these videos. It’s clear Google simply isn’t ready to provide genuinely appropriate media products for children.”

    Added Jeff Chester, executive director of CDD, “Google gets an ‘F’ when it comes to protecting America’s youngest kids. The failure of the most powerful and technologically advanced media company to create a safe place for America’s youngest kids requires immediate action by the FTC.”

    Today’s letter to the FTC is available below. The coalition’s original FTC complaint is available at http://bit.ly/1LeQHCN. The compilation of YouTube Kids video clips can be viewed at https://vimeo.com/127837914.
  • This report summarizes how the online lead generation (or “lead gen”) business works. Companies that look as if they are offering you a loan are actually (often deceptively) collecting information about you to sell your profile (a “lead”) to the highest-bidding loan company (and often to fraudulent firms, too). At the end of the report, we offer consumer tips on what you can do to protect yourself. This work is licensed under a Creative Commons Attribution 4.0 International License.
  • Project

    Private For-Profit Colleges and Online Lead Generation

    Private Universities Use Digital Marketing to Target Prospects, Including Veterans, via the Internet

    This report summarizes how companies that specialize in recruiting students to enroll at for-profit colleges use online lead generation (or “lead gen”) and other targeting tools. Websites that look like news sites or even colleges themselves are actually (often deceptively) collecting information about you to sell your profile (a “lead”) to the highest-bidding for-profit school. Many lead generators specialize in targeting veterans, because the schools will pay a higher fee to obtain access not only to federal student loan funds but also to federal veterans’ benefits, as we explain below. Many of these schools are under investigation or have even been shut down by government agencies for fraudulent practices. At the end of the report, we offer consumer tips on what you can do to protect yourself. This work is licensed under a Creative Commons Attribution 4.0 International License.
  • Data-driven tools enable marketers and financial firms to specifically target any group, from students and veterans to ethnic groups. This report examines digital targeting and marketing to Hispanics, especially younger Hispanics, due to their growing economic clout and early adoption of mobile smartphones, which enables precision targeting based on behavior, geo-location, and language. Unfortunately, as the report explains, the outsized digital footprint of young Hispanics enables some of the worst elements of the digital economy – from predatory payday lenders to debt settlement companies – to target Hispanics through online lead generator schemes. This work is licensed under a Creative Commons Attribution 4.0 International License.
  • So-called “native advertising” – where advertiser-produced or advertiser-directed content is designed to blend in with online editorial information – is quickly becoming a dominant way American consumers receive marketing. Marketers in the U.S. spent nearly $8 billion last year on native ads (up $3 billion from 2013), a figure expected to rise to $21 billion by 2018.[1] Native ads are those where the “format and the tone match that of a publisher’s original editorial content.”[2]

    [1] http://www.businessinsider.com/spending-on-native-ads-will-soar-as-publishers-and-
    [2] “The Native-Advertising Report: Spending Trends, Format Breakdowns, and Audience Attitudes.” Mark Hoelzel, BI Intelligence, 6 Nov. 2014, personal copy.
    Jeff Chester
  • CDD's executive director Jeff Chester called on regulators representing dozens of nations to address the role that today's data collection complex plays in consumer transactions and services. Speaking at the 2015 annual meeting of the International Consumer Protection and Enforcement Network (ICPEN), Chester said that in order to protect consumers, today's regulatory agencies--such as the FTC--must understand how data issues are integrally a part of consumer services, including in the financial, health, and retail marketplace.

    A modified version of the presentation is attached, minus the videos shown that illustrated the cross-device tracking and Big Data Management Platforms that are just the latest developments in the digital targeting of individuals. There were also video presentations on how programmatic advertising works (targeting junk food to kids); the role that measurement plays (continually analyzing how we respond to a range of applications and interactions); and how the growing use of neuromarketing (fMRIs, facial coding, etc.) is shaping digital marketing and other communications so that it operates at the subconscious and emotional level of individuals.

    The "story" the slides tell is that to protect consumers in the 21st century, consumer regulatory agencies need to address how digital marketing actually operates, which is, of course, through a system that integrates data collection with a range of online advertising applications (to "immerse" users in the interactive content, through social media surveillance, neuromarketing, geo-location, etc.). Consumer agencies should tackle the "path-to-purchase" paradigm, supported by Google and others, that continually targets an individual to influence their purchasing behaviors both online and offline. Digital marketing is really a powerful system designed to promote the influence of brands and products, including through ways designed to change how an individual thinks, feels, and acts. We explained that this was a global system, with the same set of marketing and data-gathering practices being used in SE Asia, the Middle East, Latin America, the EU, the U.S., etc.

    So here's a quick rundown of the slides attached, minus the videos:

    Slide 1: 21st-century consumer protection must address the role that data collection and its use play in the marketing and provision of services, including financial and health services.
    Slide 2: Scholars, such as Prof. Frank Pasquale, are raising concerns about the role that complex data analysis plays in decision-making about individuals. They have called for regulators to address how the "Black Box" of algorithms and related predictive analytic tools is used in the marketplace.
    Slide 3: This slide from Adobe illustrates one of my points, that the "Black Box" reflects deliberately chosen business practices used to target individuals. The so-called "secret sauce" is often visible by examining how businesses use their data and marketing to sell or promote to consumers.
    Slide 4: What safeguards are required today.
    Slide 5: Our work since the early 1990s to address the role that data plays in the commercial marketplace, including our leading campaign to enact the Children's Online Privacy Protection Act (COPPA) in 1998. We explained that we fought for privacy rules that would protect everyone back in the 1990s, but the industry opposition then—as today—was too strong to get anything except for children.
    Slide 6: Explained that the basic business model for online was articulated back in the early 1990s in the book "One-to-One Future." At that time, it was about tracking an individual across a single website; today it includes omnipresent tracking across devices and applications. The picture on the right is Facebook's new data center in Sweden, the largest one it has built in the EU.
    Slide 7: Illustrates the role that online data collection, through lead generation, played in the global financial crisis. Online lead gen was used to sell subprime loans in the U.S. The message was that there are vast international consequences—to people, families, and nations—in how the online marketing system operates.
    Slide 8: Our recent FTC complaint on Google's YouTube Kids unfair and deceptive ad practices that target the youngest children.
    Slide 9: It's a global system and an international problem.
    Slide 10: What's been created is a commercial surveillance system of individuals, groups, and communities.
    Slide 11: The path-to-purchase paradigm and the need for regulators to understand and address the continual monitoring and targeting of consumers.
    Slide 12: The role that contemporary "Big Data" practices play in marketing.
    Slide 13: The mobile device's critical role in digital marketing, including how quickly it achieved mass use (compared with other media).
    Slide 14: The complex of data companies, often working closely together, that assembles profiles of an individual.
    Slide 15: It's not anonymous. It's about an individual.
    Slide 16: To address today's consumer practices, you need to analyze how both data and digital marketing applications are used.
    Slide 17: The intent is to understand and "manage" a person's identity, for commercial (and also political) purposes.
    Slide 18: Facebook sells itself to advertisers by saying it knows the "identity" of the user.
    Slides 19-20: A person is sold in real time, in milliseconds, to marketers via so-called programmatic buying (ad exchanges, etc.). Gave an example from McDonald's in Denmark.
    Slides 21-23: Features of contemporary digital marketing.
    Slides 24-25: Companies are engaged in social media surveillance, including through the monitoring and analysis of blogs, posts, etc. There are now social media "command centers" engaging in such practices 24/7.
    Slides 26-28: Examples of digital marketing of loans to low-income consumers, health products, and alcoholic beverages.
    Slide 29: Real-time data targeting and sales of a user/household coming to TV.
    Slide 30: Teens require safeguards. The role of junk food companies using digital marketing, despite the global youth obesity epidemic.
    Slide 31: Problems will grow with the Internet of Things, mobile payments, wearables, etc.
    Final slide: The need to act proactively. Regulators should be concerned that trade deals, such as TPP and TTIP, will restrict their ability to act in the future.

    PS: FTC Commissioner Julie Brill gave a terrific presentation on these issues, raising many key concerns (attached).
  • U.S. PIRG Education Fund and the Center for Digital Democracy (CDD) respectfully submit these additional comments to the Federal Trade Commission (FTC). A set of regulatory and other safeguards is urgently required to ensure that contemporary “Big Data”-driven financial services are used in an equitable, transparent, and responsible manner. All Americans, especially those who confront daily challenges to their economic security, should be assured that their lives will be enhanced—not undermined—by the new digital-data financial services marketplace. A closer critical examination of the commercial information infrastructure in the U.S. reveals a set of well-developed and interconnected data collection and use practices that few consumers are aware of—let alone have consented to. While the commission’s September 2014 workshop explored some of the key issues, it did not sufficiently examine the implications of current “Big Data” business practices. U.S. PIRG Education Fund and CDD urge the commission to issue a final report that addresses the issues we identify [see attached file].
  • The Federal Trade Commission has issued a powerful and disturbing privacy wake-up call. The report reveals the largely invisible Big Data-driven complex that regularly spies on every American, comprehensively following our activities both online and off. It delivers a critical “black eye” to the data-broker industry, which has cynically expanded its surveillance on Americans without regard to their privacy. Unlike the White House’s Big Data reports issued earlier this month, the FTC study provides a much more realistic—and chilling—analysis of an out-of-control digital data collection industry. However, the commission’s calls for greater transparency and consumer control are insufficient. The real problem is that data brokers—including Google and Facebook—have embraced a business model designed to collect and use everything about us and our friends—24/7. Legislation is required to help stem the tide of business practices purposefully designed to make a mockery of the idea of privacy for Americans.

    Here are the key findings from the FTC report that illustrate how the data industry requires major reform:

    VIII. FINDINGS AND RECOMMENDATIONS

    This report reflects the information provided in response to the Orders issued to nine data brokers, information gathered through follow-up communications and interviews, and information gathered through publicly available sources. Based primarily on these materials about a cross-section of data brokers, the Commission makes the following findings and recommendations:

    A. Findings

    1. Characteristics of the Industry

      • Data Brokers Collect Consumer Data from Numerous Sources, Largely Without Consumers’ Knowledge: Data brokers collect data from commercial, government, and other publicly available sources. Data collected could include bankruptcy information, voting registration, consumer purchase data, web browsing activities, warranty registrations, and other details of consumers’ everyday interactions. Data brokers do not obtain this data directly from consumers, and consumers are thus largely unaware that data brokers are collecting and using this information. While each data broker source may provide only a few data elements about a consumer’s activities, data brokers can put all of these data elements together to form a more detailed composite of the consumer’s life.

      • The Data Broker Industry is Complex, with Multiple Layers of Data Brokers Providing Data to Each Other: Data brokers provide data not only to end-users, but also to other data brokers. The nine data brokers studied obtain most of their data from other data brokers rather than directly from an original source. Some of those data brokers may in turn have obtained the information from other data brokers. Seven of the nine data brokers in the Commission’s study provide data to each other. Accordingly, it would be virtually impossible for a consumer to determine how a data broker obtained his or her data; the consumer would have to retrace the path of data through a series of data brokers.

      • Data Brokers Collect and Store Billions of Data Elements Covering Nearly Every U.S. Consumer: Data brokers collect and store a vast amount of data on almost every U.S. household and commercial transaction. Of the nine data brokers, one data broker’s database has information on 1.4 billion consumer transactions and over 700 billion aggregated data elements; another data broker’s database covers one trillion dollars in consumer transactions; and yet another data broker adds three billion new records each month to its databases. Most importantly, data brokers hold a vast array of information on individual consumers. For example, one of the nine data brokers has 3000 data segments for nearly every U.S. consumer.

      • Data Brokers Combine and Analyze Data About Consumers to Make Inferences About Them, Including Potentially Sensitive Inferences: Data brokers infer consumer interests from the data that they collect. They use those interests, along with other information, to place consumers in categories. Some categories may seem innocuous such as “Dog Owner,” “Winter Activity Enthusiast,” or “Mail Order Responder.” Potentially sensitive categories include those that primarily focus on ethnicity and income levels, such as “Urban Scramble” and “Mobile Mixers,” both of which include a high concentration of Latinos and African Americans with low incomes. Other potentially sensitive categories highlight a consumer’s age such as “Rural Everlasting,” which includes single men and women over the age of 66 with “low educational attainment and low net worths,” while “Married Sophisticates” includes thirty-something couples in the “upper-middle class . . . with no children.” Yet other potentially sensitive categories highlight certain health-related topics or conditions, such as “Expectant Parent,” “Diabetes Interest,” and “Cholesterol Focus.”

      • Data Brokers Combine Online and Offline Data to Market to Consumers Online: Data brokers rely on websites with registration features and cookies to find consumers online and target Internet advertisements to them based on their offline activities. Once a data broker locates a consumer online and places a cookie on the consumer’s browser, the data broker’s client can advertise to that consumer across the Internet for as long as the cookie stays on the consumer’s browser. Consumers may not be aware that data brokers are providing companies with products to allow them to advertise to consumers online based on their offline activities. Some data brokers are using similar technology to serve targeted advertisements to consumers on mobile devices.
  • Groups File Report with the White House “Big Data” Review Proceeding

    Washington, DC: U.S. PIRG Education Fund and the Center for Digital Democracy (CDD) released a comprehensive new report today focused on the realities of the new financial marketplace and the threats and opportunities its use poses to financial inclusion. The report examines the impact of digital technology, especially the unprecedented analytical and real-time actionable powers of “Big Data,” on consumer welfare. The groups immediately filed the report with the White House Big Data review headed by John Podesta, who serves as senior counselor to the President. The White House is to issue a report in April addressing the impact of “Big Data” practices on the public, including the possible need for additional consumer safeguards.

    In addition to the undeniable convenience of online and mobile banking, explains the report, the new financial environment poses a number of challenges, especially for lower-income consumers. Increasingly, the public confronts an invisible “e-scoring” system that may limit their access to credit and other financial services. “We are being placed under a powerful ‘Big Data’ lens, through which, without meaningful transparency or control, decisions about our financial futures are being decided,” the report explains.

    “Will big data tools be used to help banks and other financial firms offer lower-cost products that help the unbanked and underbanked join the insured financial system and build assets, or will big data simply make it easier for payday lenders and others seeking to extract money from consumers to win?” asked U.S. PIRG Education Fund Consumer Program Director Ed Mierzwinski. “We intend the report to stimulate a healthy debate among policymakers, industry and consumer and civil rights leaders.”

    Among the issues examined in the report, “Big Data Means Big Opportunities and Big Challenges: Promoting Financial Inclusion and Consumer Protection in the ‘Big Data’ Financial Era,” are the following:

      • the plight of “underbanked and unbanked consumers,” who face special challenges in the new financial marketplace;
      • the impact of data collection and targeted advertising on all Americans, most of whom have no idea that their personal data shape the offers they receive and the prices they pay online;
      • the use of murky “lead generation” practices, especially by payday lenders and for-profit trade schools, to target veterans and others for high-priced financial and educational products; and
      • the need for new regulatory oversight to protect consumers from potentially discriminatory and deceptive practices online.

    The report, co-authored by Ed Mierzwinski, Consumer Program Director of the U.S. PIRG Education Fund, and CDD Executive Director Jeff Chester, reflects on the role that online financial marketing played in the recent economic crisis, and provides a blueprint for how such problems can be avoided in the future. “Technological advances that collect, analyze, and make actionable consumer data,” the report concludes, “are now at the core of contemporary marketing. The public is largely unaware of these changes and there are few safeguards in this new marketplace. Economically vulnerable consumers, and especially youth, will be continually urged to spend their limited resources. Conversely, there are opportunities to use the same tools to urge consumers to budget, save and build assets.”

    “Consumers increasingly face a far-reaching system that uses data about them to predict and determine the products and services they are offered in the marketplace. Federal safeguards that protect privacy and ensure members of the public are not subject to unfair and discriminatory financial practices are long overdue,” explained CDD’s Jeff Chester. “The White House ‘Big Data’ report should call for strong measures to ensure that the changing financial services marketplace operates in a fair and equitable manner.”

    A copy of the new report is available at www.democraticmedia.org and www.uspirgedfund.org.

    The Center for Digital Democracy is a nonprofit group working to educate the public about the impact of digital marketing on financial services, public health, consumer protection, and privacy. It has played a leading role at the FTC and in Congress to help promote the development of legal safeguards against behavioral targeting and other potentially invasive online data collection practices.

    U.S. PIRG Education Fund works to protect consumers and promote good government. We investigate problems, craft solutions, educate the public, and offer Americans meaningful opportunities for civic participation.