CDD

Projects

  • Re: Exploring Special Purpose National Bank Charters for Fintech Companies

    Dear Comptroller Curry: The Center for Digital Democracy and U.S. Public Interest Research Group (U.S. PIRG) agree with the consumer, civil rights, and community groups that filed a separate group letter expressing strong opposition to the proposed new federal nonbank lending charters. U.S. PIRG also signed and concurs with the detailed comment from the National Consumer Law Center et al. The Office of the Comptroller of the Currency (OCC) must not undermine state rate caps; must not weaken states’ ability to oversee lenders and act to prevent harmful lending practices; and must not undermine efforts to provide fair and inclusive lending, particularly for people of color and low- and moderate-income consumers, in the areas where these lenders operate. Further, the OCC must not allow nonbank lenders to engage in practices that violate privacy rights or in unfair data and marketing practices. State laws often operate as the primary line of defense for consumers and small businesses. The OCC’s charter proposal inadequately protects consumers from these harmful practices, and it should not take state law enforcers off the beat of preventing them.

    The Center for Digital Democracy and U.S. PIRG file this supplemental comment to focus on the digital rights and consumer privacy concerns raised by the opaque Big Data algorithms used by fintech firms. These practices increasingly threaten consumer privacy, and the OCC must take them into account when considering nonbank special purpose charters. An ongoing and increasingly challenging issue confronting citizens and consumers is the new threat to their privacy and to their ability to control how personal and non-personal data about their online and offline behavior are collected and used by online financial services companies. The use of personal data by fintech companies is pervasive and touches every aspect of their business operations, including marketing, customer loyalty management, pricing, fraud prevention, and underwriting. Fintech companies use many new on- and offline data sources, either directly collecting data from consumers or relying on third parties for Big Data analytics to classify consumers and make predictions about them. Assigning individuals to socially constructed classifications and then making inferences about them based on group profiles is likely to have consequences that are not well understood and may further increase social inequities. Consumers’ privacy is increasingly undermined, and no adequate protections are in place. The OCC must not allow an expansion of these practices via a federal charter that does not provide adequate privacy safeguards. The OCC must proactively investigate unfair marketing practices and not grant national licenses without affirmative protections.

    Fintech companies are using Facebook, Instagram, and other digital behavioral data that combine data and interactive experiences to influence consumers and their social networks. Sophisticated data-processing capabilities allow for more precise micro-targeting, the creation of comprehensive profiles, and the ability to act instantly on the insights gained from consumer behaviors. Targeted and highly personalized marketing offers can be intrusive and foster consumer behaviors that are not in the best interest of the individual.
    Behavioral science shows that consumers are susceptible to ‘nudges,’ which raises concerns about the risk of financial institutions taking advantage of consumers’ behavioral biases and limitations. The increasing personalization that Big Data makes possible could also reduce the comparability of products, making it harder for consumers to compare one offer with another, which could in turn affect market competition. Similarly, a lack of transparency around the processing of data and automated algorithms may increase information asymmetries between the financial institution and the individual, leaving consumers with less awareness, understanding, and control over important financial decisions. These practices happen behind the scenes and can only be addressed by a vigilant regulator. The OCC should not allow fintech companies to operate under a national license without properly addressing these data practices.

    The OCC must also not allow nonbank lenders or partner depository institutions to engage in unfair and discriminatory lending practices. ‘Alternative data’ sources can introduce bias or contain errors and may lead to consumer harm or unfairness. While alternative credit scoring can be a boon for the underbanked, there need to be standards and safeguards to ensure that any new data are not biased and that their use does not lead to unintended consequences. While industry has argued that increased automation will help expand access to credit and lower costs overall, credit models that are more “accurate” may lead to a more stratified society, potentially excluding those at the bottom from credit forever. Models that judge individuals against group profiles based on past data inevitably incorporate elements of past inequality and discrimination. Communities of color are thus most vulnerable. Unless additional policies are put in place to address these consequences, inequality is likely to become more entrenched the more we rely on models for risk evaluations. Fintech platforms must comply fully with the requirements of the Fair Credit Reporting Act and the Equal Credit Opportunity Act.

    In conclusion, the OCC must not grant new federal nonbank lending charters that would give firms free rein to use unfair data and marketing practices. Instead, the OCC must proactively mitigate risks from unfair data, marketing, and lending practices that threaten to undermine privacy, consumer rights, and economic inclusion.

    Sincerely,
    Jeff Chester and Katharina Kopp, Center for Digital Democracy
    Edmund Mierzwinski, U.S. PIRG

    Recommended further reading: Big Data Means Big Opportunities and Big Challenges: Promoting Financial Inclusion and Consumer Protection in the “Big Data” Financial Era, U.S. PIRG Education Fund and Center for Digital Democracy, 27 March 2014. Available at http://www.uspirg.org/reports/usf/big-data-means-big-opportunities-and-b...

    (An illustrative sketch of the group-profile scoring concern raised above appears below.)
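    The following minimal Python sketch uses entirely hypothetical data (the zip_A/zip_B areas, approval rates, and 0.5 cutoff are invented for illustration and are not drawn from any fintech firm's actual model). It shows the dynamic described in the comment: a score built only from an area's past approval rates reproduces historical disparities even though no protected attribute is ever an input, because the geographic proxy carries it.

        # Hypothetical illustration only: a "group profile" score built from past
        # approval data. No protected attribute is used, yet the ZIP-code proxy
        # reproduces historical disparities for everyone who lives in zip_B.
        past_approval_rate = {
            "zip_A": 0.80,  # area that historically received ample credit
            "zip_B": 0.35,  # area historically subject to redlining
        }

        def score(applicant):
            """Judge the individual by the past outcomes of their group/area."""
            return past_approval_rate[applicant["zip"]]

        for applicant in ({"id": 1, "zip": "zip_A"}, {"id": 2, "zip": "zip_B"}):
            decision = "approve" if score(applicant) >= 0.5 else "deny"
            print(applicant["id"], applicant["zip"], decision)

    Running the sketch denies every applicant from zip_B regardless of individual circumstances, which is the entrenchment concern the comment raises.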
  • This report describes and provides examples of the types of digital marketing research utilized by the food and beverage industry and the potential effects this research has on the health of children and adolescents. Researchers found that the food and beverage industry, together with the companies it contracts, is conducting three major types of research: 1) testing and deploying new marketing platforms, 2) creating new research methods to probe consumers’ responses to marketing, and 3) developing new means to assess the impact of new digital research on marketers’ profits. Researchers also found that industry puts this research into action, specifically through its efforts to target communities of color and youth.
    Jeff Chester
  • This report summarizes how the online lead generation (or “lead gen”) business works. Companies that look as if they are offering you a loan are actually (often deceptively) collecting information about you to sell your profile (a “lead”) to the highest-bidding loan company (and often to fraudulent firms, too). At the end of the report, we offer consumer tips on what you can do to protect yourself. This work is licensed under a Creative Commons Attribution 4.0 International License.
  • Project

    Private For-Profit Colleges and Online Lead Generation

    Private Universities Use Digital Marketing to Target Prospects, Including Veterans, via the Internet

    This report summarizes how companies that specialize in recruiting students to enroll at for-profit colleges use online lead generation (or “lead gen”) and other targeting tools. Websites that look like news sites or even colleges themselves are actually (often deceptively) collecting information about you to sell your profile (a “lead”) to the highest-bidding for-profit school. Many lead generators specialize in targeting veterans, because the schools will pay a higher fee to obtain access not only to federal student loan funds but also to federal veterans’ benefits, as we explain below. Many of these schools are under investigation or have even been shut down by government agencies for fraudulent practices. At the end of the report, we offer consumer tips on what you can do to protect yourself. This work is licensed under a Creative Commons Attribution 4.0 International License.
  • Data-driven tools enable marketers and financial firms to specifically target any group, from students and veterans to ethnic groups. This report examines digital targeting and marketing to Hispanics, especially younger Hispanics, due to their growing economic clout and early adoption of mobile smartphones, which enables precision targeting based on behavior, geo-location, and language. Unfortunately, as the report explains, the outsized digital footprint of young Hispanics enables some of the worst elements of the digital economy – from predatory payday lenders to debt settlement companies – to target Hispanics through online lead generation schemes. This work is licensed under a Creative Commons Attribution 4.0 International License.
  • Consumer, Children’s, and Privacy Groups Challenge Federal Trade Commission’s Proposed Settlement with TRUSTe (True Ultimate Standards Everywhere, Inc.) As Too Lenient

    Stronger Sanctions Needed for TRUSTe’s Violation of the Public Trust; Consumers—Especially Parents—Materially Harmed by Years of Deception

    Washington, DC: The Center for Digital Democracy (CDD), through its counsel the Institute for Public Representation and on behalf of the American Academy of Child and Adolescent Psychiatry, Campaign for a Commercial-Free Childhood, Consumer Action, Consumer Federation of America, Consumer Watchdog, and the Rudd Center for Food Policy and Obesity, filed comments today at the Federal Trade Commission (FTC) in response to that agency’s proposed Agreement and Consent Order with True Ultimate Standards Everywhere, Inc. (“TRUSTe”). In November, after conducting an investigation, the FTC filed a complaint against TRUSTe, a company that has been issuing various “privacy seals” since 1997. The display of such a seal indicates that a website has been reviewed annually by TRUSTe to ensure it is in compliance with TRUSTe’s program requirements designed to protect consumer privacy. In fact, according to the FTC, TRUSTe deceived consumers in two important respects. First, TRUSTe failed in over one thousand instances between 2006 and 2013 to conduct the annual re-certifications that it told consumers and the FTC it was conducting. Second, the company failed to require the companies using its privacy seals to change references to TRUSTe’s nonprofit status after it became a for-profit operation in 2008.

    As CDD’s filing makes clear, these violations are especially significant coming from a company that is entrusted with verifying the self-regulatory privacy-protection efforts of thousands of companies—including some of the biggest in the world—and covering such important areas of concern as the Children’s Online Privacy Protection Act (COPPA) and the EU-US Safe Harbor framework for transatlantic data transfers. Thus, while the filing applauds the FTC’s enforcement action against TRUSTe, it finds the proposed sanctions—a $200,000 fine and additional recordkeeping and reporting requirements concerning the COPPA safe harbor program—to be far too lenient. “Safe harbors such as TRUSTe,” the filing points out, “play a pivotal role protecting children’s privacy by prohibiting the collection, use or disclosure of personal information without meaningful notice to parents and advance, verifiable parental consent, limiting the amount of data collected from children and protecting the security of data that is collected.” Unfortunately, because the FTC neither revealed the websites and services that were not properly re-certified, nor estimated the number of consumers who were affected by these violations, consumers—including parents concerned for their children’s privacy—are left wondering just how much meaningful privacy protection they have online. In addition to calling for a significant increase in the size of TRUSTe’s payment (citing individual companies that have paid as much as $1 million for their COPPA violations in the past), CDD’s filing called for all COPPA safe harbor reports (including those filed by TRUSTe) to be made available to the public on the FTC’s website in a timely manner.

    Angela Campbell, co-director of the Institute for Public Representation, emphasized that “Parents rely on seal programs such as TRUSTe when deciding whether a particular website is appropriate for their children. Misrepresentations such as these have the potential to put millions of children at risk across potentially hundreds or thousands of child-directed websites. The FTC must do more to restore public trust in the COPPA safe harbor programs.” “The commission needs to stand up for children and their parents,” explained Jeff Chester, executive director of CDD. “If the FTC had adequately engaged in oversight of these programs, such problems would have been identified earlier,” he noted. “Those companies such as TRUSTe that have pledged to truly protect the privacy of American children should be required to make public how they actually determine whether online companies targeting kids engage in fair and responsible practices.” A copy of CDD’s FTC filing is available at www.democraticmedia.org.

    --30--
  • On 3 December 2014 a coalition of privacy and consumer groups sent a Joint Submission to APEC asking for significant changes to the APEC Cross Border Privacy Rules system (CBPRs). The submission is available here. This joint submission follows a long period of opposition by civil society representatives to the first implementation of the CBPRs, which has now been operating in the US for 18 months. The submission raises concerns about the growing number of false claims of APEC certification and the absence of an official, accurate list of members. One key aspect of the submission is that the signatories oppose the appointment of TRUSTe as an Accreditation Agent for the CBPRs in the US, citing weaknesses in its program criteria, conflicts of interest, and the unacceptable use of fine-print exclusions in TRUSTe-certified privacy policies. The group calls on APEC to reform the CBPR system or close it down. The coalition includes: the Australian Privacy Foundation; the Canadian Internet Policy & Public Interest Clinic; the US Center for Digital Democracy; and the Electronic Privacy Information Center.
  • U.S. PIRG Education Fund and the Center for Digital Democracy (CDD) respectfully submit these additional comments to the Federal Trade Commission (FTC). A set of regulatory and other safeguards is urgently required to ensure that contemporary “Big Data”-driven financial services are used in an equitable, transparent, and responsible manner. All Americans, especially those who confront daily challenges to their economic security, should be assured that their lives will be enhanced—not undermined—by the new digital-data financial services marketplace. A closer critical examination of the commercial information infrastructure in the U.S. reveals a set of well-developed and interconnected data collection and use practices that few consumers are aware of—let alone have consented to. While the commission’s September 2014 workshop explored some of the key issues, it did not sufficiently examine the implications of current “Big Data” business practices. U.S. PIRG Education Fund and CDD urge the commission to issue a final report that addresses the issues we identify [see attached file].
  • Today was the deadline for Comments to be filed in the President's Big Data and privacy proceeding. CDD filed the attached comments and also joined an NGO coalition on this issue representing the civil rights, consumer, and privacy communities. CDD's filing urged the following:

    The Obama Administration should offer legislation that ensures its Consumer Privacy Bill of Rights framework actually provides individuals with control over how their personal information is collected and used. Individuals should have the ability to make meaningful decisions about their information, regardless of whether it is collected by a social network, mobile operator, app network, financial institution, etc.

    Legislation should provide regulatory rulemaking authority to the Federal Trade Commission (FTC) on consumer privacy issues to develop these new rights. Legislation should require the FTC to conduct the necessary proceedings leading to a rulemaking within one year from the enactment of legislation. The same legislation should also call on agencies that currently have rulemaking authority, including the Consumer Financial Protection Bureau (CFPB), the Federal Communications Commission (FCC), and the Food and Drug Administration (FDA), to immediately initiate proceedings on consumer financial, telecommunications, and digital health privacy, respectively. Other agencies with sectoral authority on privacy issues not covered by the FTC and others should also be mandated to develop regulations.

    The current “multistakeholder” process convened by the NTIA should be replaced by the relevant agency rulemakings.

    The legislation should acknowledge the threats that much of Big Data-related collection poses to Americans today, and strongly state that it is in the best interests of the nation that businesses refrain from their current practice of ubiquitous data collection and profiling. It should accept that self-regulation has failed.

    The FTC, CFPB, FCC, and FDA should be mandated to report to the Nation, within six months after legislation is enacted, on how commercial Big Data practices are currently being used in ways that may be harmful to the public and not in the national interest. These reports should identify how current practices can discriminate against Americans based on their race/ethnicity, sexual orientation, income status, age, residence, and other key variables. Based on these reports, the agencies would propose special regulatory safeguards as required to address sensitive data concerns.
  • Today is the deadline for Comments to be filed for the White House's forthcoming report on "Big Data." NGOs pressed the Administration to include public comments during its 90-day inquiry, which is led by Senior WH Counselor John Podesta. Our comments are attached. Here's an excerpt:

    The inability to implement basic privacy rules in the United States to address Internet data collection practices has resulted in the ubiquitous commercial surveillance landscape that today threatens the privacy of Americans—as well as those in the European Union and other countries where U.S. companies collect and transport their information... CDD believes the Big Data report must address the realities of today’s commercial data gathering and analysis landscape. While we acknowledge the many positive uses of Big Data, and its potential, the Administration should not gloss over the threats as well. We fear that missing for the most part in the White House’s review will be a fact-based assessment of actual commercial data practices conducted by Google, Facebook, Yahoo, data brokers, and many others. Such a review would reveal an out-of-control commercial data collection apparatus, with no restraints, which is leading to a commercial surveillance complex that should be antithetical in a democratic society. The report should show the consequences of such information gathering on Americans, where the data can be immediately made “actionable.” It should address the consequences when predictive analysis and other “insight” identification applications trigger real-time and future decisions about the products and services we are offered, the content we may receive, and even the online “experiences” with which we interact. The report should make clear how its Consumer Privacy Bill of Rights Principles should be interpreted when data collected from Americans are used to unfairly target them—and their families—for products and services that can be harmful to their well-being (such as the delivery of high-interest payday loans, promotion of questionable medical treatments, and the targeting of junk food ads to children, which contributes to the nation’s obesity epidemic).

    The filing covers six key issues:
    1) The Growth of Ubiquitous Cross-Platform and Across-Application Tracking of Individuals Online
    2) The Emergence of Big-Data-derived Comprehensive Data Profiles on Individuals (Data Management Platforms)
    3) The Digital Data Collection Apparatus, Including the Use of Multiple Data Sources and the Real-time Buying and Selling of American Internet Users
    4) The Growth of Commercial Digital Surveillance at the Community, Hyper-local Level
    5) The Delivery of Financial, Health, and Other Products Linked to Sensitive Data and Uses that Raise Consumer Protection Concerns
    6) The Failure of Industry Self-regulation and the Limits of the Multi-stakeholder Process
  • Project

    Big Data Means Big Opportunities and Big Challenges

    Promoting Financial Inclusion and Consumer Protection in the “Big Data” Financial Era

    Dramatic changes are transforming the U.S. financial marketplace. Far-reaching “Big Data” processing capabilities that gather, analyze, predict, and make instantaneous decisions about an individual; technological innovation spurring new and competitive financial products; the rapid adoption of the mobile phone as the principal online device; and advances in e-commerce and marketing that change the way we shop and buy are creating a new landscape that holds both potential promise and risks for economically vulnerable Americans. Using advances in data analytics specifically to promote economic inclusion and fairness during this period of transformation in the U.S. economy should be a proactive strategy embraced by all stakeholders. While not a panacea for growing financial inequality, a wise investment in strategies that harvest the potential of the new digital financial system may better enable struggling Americans to navigate a difficult economic future. This work is licensed under a Creative Commons Attribution 4.0 International License.
  • Today, CDD filed Comments in the FTC's forthcoming "Mobile Device Tracking" workshop (Feb. 19) on mobile and retail tracking. As we explain (excerpt):

    While it is important to examine the individual components of what is an increasingly pervasive and unregulated source of commercial surveillance in the “Big Data” era, such as in-store tracking of consumers, the Federal Trade Commission (FTC) must place this one use of mobile tracking in a larger context. Such tracking is but one part of a more elaborate and increasingly seamless “always-on” collection apparatus that operates across devices and user experiences. This surveillance is invisible to most consumers and connected to a range of other practices such as “hyper-local” targeting, multi-screen tracking, and data broker-driven offline and online “connected recognition” and data on-boarding services. Current self-regulatory approaches are ineffective and do a disservice to consumers by falsely claiming to provide privacy protection and user control. The FTC should issue a set of recommendations to govern cross-platform marketing that includes mobile devices. This is urgently required, as intrusive geo-locational data-gathering practices, some of which raise concerns about the potential for new forms of “digital redlining” and other discriminatory practices, will dramatically expand during the next few years. We believe it is especially important for the FTC to examine how geo-location tracking is being used to identify people by race, ethnicity, economic class, and age (such as young people and seniors). The FTC should also reiterate its call for Congress to enact meaningful omnibus privacy legislation....

    Today, consumer profiles are developed that include so-called first-, second-, and third-party data, linking our online and offline selves. This filing will not address the purposeful and disingenuous claim that such data profiles of individuals are “anonymous.” That is not the case, and the commission should reject such absurd claims. Companies say much of what they now do is “privacy compliant,” hiding behind the falsehood that cookies and all the other ways they collect and analyze data aren’t linked to an actual person. Such distortions should not be tolerated. Real people are being tracked and targeted....

    The growth of hyper-local targeting is spurring new forms of segmentation of individuals and their distinct communities. The country is being broken up into highly discrete areas that are mapped to identify unique characteristics—beyond actual location. The use of these so-called “tiles” raises profound concerns. For example, PlaceIQ explains that “What we do is map data from multiple sources onto a grid of tiles that cover every square foot of the US. Each tile is 100 meters by 100 meters, and we inject third-party demographic information about that area into the tile, as well as data on what’s physically located there—points of interest like parks and airports, tourist attractions, retailers, stadiums, and so forth. Then, we connect that data with where a mobile device is in real time, or where it has recently been, to build unique audience segments for brands to target.”... The use of geo-fencing, “geobehavioral targeting,” “geo-cookies,” and the role of location analytics, especially when integrated into broader data gathering, requires action by the FTC. As we will document for the forthcoming “Alternative Scoring Products” workshop, geo-location data are being made actionable at real-time events as well as used to make a range of critical decisions about an individual (whether they are creditworthy, seeking some product or service linked to sensitive concerns, etc.). These privacy and consumer-protection concerns extend beyond the individual to their communities and neighborhoods as well. The commission should examine the impact location-driven data gathering has on the financial health and consumer well-being of distinct communities, especially those whose residents may suffer economically or due to other factors (such as age).

    CDD will soon be filing on Alternative Scoring Products (e-scores, lifetime value predictors, etc.) for the FTC's March 19 workshop. Today's Comments are attached. (A simplified sketch of the tile-based lookup quoted above appears below.)
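    To make the quoted tile mechanism concrete, here is a minimal, hypothetical Python sketch (not PlaceIQ's actual code; the grid arithmetic, attribute fields, and values are assumptions for illustration). It maps a device's latitude/longitude to an approximate 100-meter tile index and returns whatever third-party attributes have been attached to that tile.

        # Minimal, hypothetical sketch of a 100m x 100m geographic "tile" lookup
        # (illustration only; not PlaceIQ's actual system or data).
        import math

        TILE_METERS = 100  # tile side length described in the quoted passage

        def latlon_to_tile(lat, lon):
            """Map a latitude/longitude pair to an approximate 100m x 100m tile index."""
            meters_per_deg_lat = 111_320.0                          # rough constant
            meters_per_deg_lon = 111_320.0 * math.cos(math.radians(lat))
            return (int(lat * meters_per_deg_lat // TILE_METERS),
                    int(lon * meters_per_deg_lon // TILE_METERS))

        # Hypothetical third-party attributes keyed by tile; the values are invented.
        tile_attributes = {
            latlon_to_tile(38.9072, -77.0369): {
                "median_income": "low",
                "points_of_interest": ["transit hub", "payday lender"],
            },
        }

        def segment_device(lat, lon):
            """Return the audience attributes inferred for a device seen at this location."""
            return tile_attributes.get(latlon_to_tile(lat, lon), {})

        print(segment_device(38.9072, -77.0369))  # the device is segmented by its tile's profile

    A real system would use a proper map projection and populate the attribute store from commercial data sources; the point here is simply how location alone keys a device into a pre-built demographic segment.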
  • The Center for Digital Democracy (CDD) closely analyzes Facebook’s privacy and marketing policies. In partnership with other child advocacy, health, and consumer organizations, we are in an ongoing discussion about Facebook's data collection and marketing policies and their impact on children and teens. As part of our public outreach work, CDD is releasing “5 Reasons Why Facebook is Not Suitable for Children Under 13.” The guide lays out some of the key problematic business and marketing practices that make Facebook's data-driven marketing a concern for children. For example, it discusses how Facebook's marketing practices take advantage of children's cognitive, social, and developmental vulnerabilities. CDD and our partners plan to expand the public conversation on children and Facebook to include issues related to the platform's extensive data collection, profiling, and marketing practices. These issues compound existing concerns about children's risks involving cyberbullying, harmful content, or the activities of predators while on Facebook. The guide can be found below:
  • June 1 is the deadline for filing Comments in the FTC Internet of Things inquiry. Today's mobile devices, geo-location-aware services, offline/online data, advanced marketing applications like facial recognition, mobile real-time ad exchanges, geo-fences, cross-platform tracking, and paradigmatic approaches such as Google's Zero Moment of Truth and shopper-marketing-based path-to-purchase methodologies make the Internet of Things a consumer and privacy concern today--not in some pending future. We call on the FTC to address how the Internet of Things is already a reality, and to do a better job on sensitive data--especially involving finances, health, racial/ethnic information, and youth.
  • Project

    FTC Complaint on Digital Pharma & Health Marketing

    Complaint, Request for Investigation, Public Disclosure, Injunction, and Other Relief: Google, Microsoft, QualityHealth, WebMD, Yahoo, AOL, HealthCentral, Healthline, Everyday Health, and Others Named Below

    November 23, 2010 - Washington, DC: In a complaint filed today with the Federal Trade Commission, the Center for Digital Democracy, U.S. PIRG, Consumer Watchdog, and the World Privacy Forum called on the commission to investigate unfair and deceptive advertising practices that consumers face as they seek health information and services online. Consumers now confront a sophisticated and largely stealth interactive medical marketing apparatus that has unleashed an arsenal of techniques designed to promote the use of specific brand drugs and influence consumers about treatments for health conditions. Much of the online health marketing system has been deliberately structured to collect personal information and other data on consumers, including through the use of free e-newsletters on specific medical concerns; discounts for prescription drugs and services; and the growing number of other online data profiling techniques.

    Nearly $1 billion will be spent this year by online health and medical marketers targeting the growing number of U.S. consumers who increasingly rely on the Internet for information about medical problems, treatments, and prescription drugs. The online health marketing industry has presented to the FDA and the public a fairytale version of digital marketing, in which all consumers become empowered “e-patients,” able to form powerful helping communities. But while the online medium provides medical information to those seeking access to resources and support, it has been structured to engage in aggressive tactics that threaten privacy, raise questions about the fair presentation of independent information, and advance the sales of prescription drugs and over-the-counter products. Pharma and other health online marketers are pressing the FDA for new rules that would allow them to expand digital and social media advertising. Before the FDA acts, it should await an investigation and a report by the FTC. The complaint to the Federal Trade Commission is attached.
  • Project

    CDD Asks FDA to Revise Its Proposed Research on the Digital Marketing of Drugs and Health Products

    CDD Asks Food and Drug Administration to Revise Its Proposed Research on the Digital Marketing of Drugs and Health Products

    Urges FDA to Gain Better Understanding of Impact of Digital Marketing on Patients and Health Consumers in order to Protect Public Health

    Washington, DC: The Center for Digital Democracy, in comments filed today with the Food and Drug Administration (FDA), urged the agency to significantly revise its proposed studies on the “Examination of Online Direct-to-Consumer Prescription Drug Promotion.” Citing the wide variety of techniques that pharmaceutical and health marketers use to target consumers online, CDD called for a more informed analysis that reflects how U.S. health consumers are actually marketed to on social networks, on mobile phones, and via the Web. Among the marketing techniques CDD cited as needing to be part of any FDA research are “the tracking and managing of the ‘patient journey’ online”; data collection; the use of social media analytics and related viral marketing; the role of eye tracking, multivariate testing, and other Web page optimization techniques to influence perception and behavior; and the impact of immersive multimedia content and neuromarketing designed to stealthily foster consumer decision-making through non-conscious means. Today is the deadline for comments on the FDA’s proposed new research “designed to test different ways of presenting prescription drug risk and benefit information on branded drug Web sites” (Docket No. 2011-N-0230).

    “While the FDA is to be commended for undertaking additional research before it issues further rules on digital and social media pharmaceutical marketing, we are concerned that the agency—responsible for protecting our health—still has a naïve view of how pharmaceutical digital marketing actually influences consumers,” explained CDD’s Executive Director Jeff Chester. “Online marketing is already an extensive, 360-degree juggernaut that features a wide range of techniques far beyond the scrolling text and banner ads the FDA seems to regard as state of the art. The agency should be in the forefront of ensuring U.S. health consumers have the safeguards they require as they increasingly rely on the Internet and social networks to make decisions about medications and medical treatments.”

    In its detailed 45-page filing, CDD called on the FDA “to re-conceptualize and update its understanding of digital DTC pharmaceutical marketing,” and pledged its support to assist the agency in expanding its inquiry. CDD urged the FDA to consult with leading independent academic experts and consumer organizations knowledgeable about digital marketing in order to revise its research efforts. CDD’s filing asked the FDA to investigate and develop consumer safeguards for practices used by pharmaceutical and health marketers, including data mining technologies; personalized and behavioral advertising; social media marketing; search engine optimization; rich media and online video applications; mobile and location marketing; unbranded websites; minority and youth marketing; and the use of neuromarketing and other immersive tactics designed to deliberately bypass the rational decision-making process of consumers.
  • An analysis of the contemporary digital marketing landscape, focusing on the promotion of food & beverage products to youth. Written by Kathryn Montgomery, Ph.D., Sonya Grier, Ph.D., Lori Dorfman, Dr.P.H., and Jeff Chester, MSW. This report provides a brief summary of how digital marketing works and the role it plays in promoting unhealthy food and beverages to children. Detailed in the report are key concepts of digital marketing; implications for young people’s health; challenges that digital marketing concepts raise for researchers; and relevant theoretical models for understanding how the new digital marketing framework acts on children and youth. Crucial gaps in knowledge and an agenda for future research are also highlighted.