Newsroom
-
Press Release
Statement from Children’s Advocacy Groups on New Social Media Bill by U.S. Senators Schatz and Cotton
Washington, D.C., April 26, 2023 – Several children’s advocacy groups expressed concern today with parts of a new bill intended to protect kids and teens from online harms. The bill, “The Protecting Kids on Social Media Act,” was introduced this morning by U.S. Sens. Brian Schatz (D-HI) and Tom Cotton (R-AR).

The groups, including Common Sense Media, Fairplay, and the Center for Digital Democracy, play a leading role in advancing legislation in Congress to ensure that tech companies, and social media platforms in particular, are held accountable for the serious and sometimes deadly harms related to the design and operation of these platforms. They said the new bill is well-intentioned in the face of a youth mental health crisis and has some features that should be adopted, but that other aspects of the bill take the wrong approach to a serious problem.

The groups said they support the bill’s ban on algorithmic recommendation systems for minors, which would prevent platforms from using minors’ personal data to amplify harmful content to them. However, they objected that the bill places too many new burdens on parents, creates unrealistic bans, and institutes potentially harmful parental control over minors’ access to social media. By requiring parental consent before a teen can use a social media platform, vulnerable minors, including LGBTQ+ kids and kids who live in unsupportive households, may be cut off from access to needed resources and community. At the same time, kids and teens could pressure their parents or guardians to provide consent. Once young users make it onto a platform, they will still be exposed to addictive or unsafe design features beyond algorithmic recommendation systems, such as endless scroll and autoplay. The bill’s age verification measures also introduce troubling implications for the privacy of all users, given the requirement for covered companies to verify the age of both adult and minor users. Despite its importance, there is currently no consensus on how to implement age verification without compromising users’ privacy.

The groups said they strongly support other legislation that establishes important guardrails on platforms and other tech companies to make the internet a healthier and safer place for kids and families, such as the Kids Online Safety Act (KOSA) and COPPA 2.0, bipartisan bills that were approved last year by the Senate Commerce Committee and are expected to be reintroduced this year.

“We appreciate Senators Schatz and Cotton’s effort to protect kids and teens online, and we look forward to working with them as we have with many Senators and House members over the past several years. But this is a life or death issue for families, and we have to be very careful about how to protect kids online. The truth is, some approaches to the problem of online harms to kids risk further harming kids and families,” said James P. Steyer, founder and CEO of Common Sense Media. “Congress should place the onus on companies to make the internet safer for kids and teens and avoid placing the government in the middle of the parent-child relationship.
Congress has many good policy options already under consideration and should act on them now to make the internet healthier and safer for kids.”

“We are grateful to Senators Schatz, Cotton, Britt and Murphy for their efforts to improve the online environment for young people, but are deeply concerned their bill is not the right approach,” said Josh Golin, Executive Director of Fairplay. “Young people deserve secure online spaces where they can safely and autonomously socialize, connect with peers, learn, and explore. But the Protecting Kids on Social Media Act does not get us any closer to a safer internet for kids and teens. Instead, if this legislation passes, parents will face the exact same conundrum they face today: Do they allow their kids to use social media and be exposed to serious online harms, or do they isolate their children from their peers? We need legislative solutions that put the burden on companies to make their platforms safer, less exploitative, and less addictive, instead of putting even more on parents’ plates.”

“It’s critical that social media platforms are held accountable for the harmful impacts their practices have on children and teens. However, this bill’s approach is misguided. It places too much of a burden on parents, instead of focusing on the platforms’ business practices that have produced the unprecedented public health crisis harming our children’s physical and mental well-being. Kids and teens should not be locked out of our digital worlds, but should be allowed online where they can be safe and develop in age-appropriate ways. One of the unintended consequences of this bill will likely be a two-tiered online system, where poor and otherwise disadvantaged parents and their children are excluded from digital worlds. What we need are policies that hold social media companies truly accountable, so all young people can thrive,” said Katharina Kopp, Ph.D., Deputy Director of the Center for Digital Democracy.

Attachment: schatz-cotton_bill_coalition_statement.pdf

-
Press Release
Advocates, experts urge Mark Zuckerberg to cancel plans to allow minors in Meta’s flagship Metaverse platform
Citing research that illustrates a number of serious risks to children and teens in the Metaverse, advocates say Meta must wait for more research and root out dangers before targeting youth in VR.

BOSTON, MA, WASHINGTON, DC and LONDON, UK — Friday, April 14, 2023 — Today, a coalition of over 70 leading experts and advocates for health, privacy, and children’s rights is urging Meta to abandon plans to allow minors between the ages of 13 and 17 into Horizon Worlds, Meta’s flagship virtual reality platform. Led by Fairplay, the Center for Digital Democracy (CDD), and the Center for Countering Digital Hate (CCDH), the advocates underscored the dearth of research on the impact of time spent in the Metaverse on the health and wellbeing of youth, as well as the company’s track record of putting profits ahead of children’s safety.

The advocates’ letter maintained that the Metaverse is already unsuitable for use by children and teens, citing March 2023 research from CCDH which revealed that minors already using Horizon Worlds were routinely exposed to harassment and abuse—including sexually explicit insults and racist, misogynistic, and homophobic harassment—and other offensive content. In addition to the existing risks present in Horizon Worlds, the advocates’ letter outlined a variety of potential risks facing underage users in the Metaverse, including magnified risks to privacy through the collection of biometric data, risks to youth mental health and wellbeing, and the risk of discrimination, among others.

In addition to Fairplay, CDD, and CCDH, the 36 organizations signing on include Common Sense Media, the Electronic Privacy Information Center (EPIC), Public Citizen, and the Eating Disorders Coalition. The 37 individual signatories include Richard Gephardt of the Council for Responsible Social Media, former Member of Congress and House Majority Leader; Sherry Turkle, MIT Professor and author of Alone Together and Reclaiming Conversation; and social psychologist and author Jonathan Haidt.

Josh Golin, Executive Director, Fairplay:

“It’s beyond appalling that Mark Zuckerberg wants to save his failing Horizon Worlds platform by targeting teens. Already, children are being exposed to homophobia, racism, sexism, and other reprehensible content on Horizon Worlds. The fact that Mr. Zuckerberg is even considering such an ill-formed and dangerous idea speaks to why we need Congress to pass COPPA 2.0 and the Kids Online Safety Act.”

Katharina Kopp, PhD, Deputy Director, Center for Digital Democracy:

“Meta is demonstrating once again that it doesn’t consider the best interest of young people when it develops plans to expand its business operations. Before it considers opening its Horizon Worlds metaverse operation to teens, it should first commit to fully exploring the potential consequences. That includes engaging in an independent and research-based effort addressing the impact of virtual experiences on young people’s mental and physical well-being, privacy, safety, and potential exposure to hate and other harmful content.
It should also ensure that minors don’t face forms of discrimination in the virtual world, which tends to perpetuate and exacerbate ‘real life’ inequities.”

Mark Bertin, MD, Assistant Professor of Pediatrics at New York Medical College, former Director of Developmental Behavioral Pediatrics at the Westchester Institute for Human Development, and author of The Family ADHD Solution, Mindful Parenting for ADHD, and How Children Thrive:

“This isn’t like the panic over rock and roll, where a bunch of old folks freaked out over nothing. Countless studies already describe the harmful impact of Big Tech products on young people, and it’s worsening a teen mental health crisis. We can’t afford to let profit-driven companies launch untested projects targeted at kids and teens and let families pick up the pieces afterward. It is crucial for the well-being of our children that we understand what is safe and healthy first.”

Imran Ahmed, CEO of the Center for Countering Digital Hate:

“Meta is making the same mistake with Horizon Worlds that it made with Facebook and Instagram. They have prioritized profit over safety in their design of the product, failed to provide meaningful transparency, and refused to take responsibility for ensuring worlds are safe, especially for children.

“Yet again, their aim is speed to market in order to achieve monopoly status – rather than building truly sustainable, productive and enjoyable environments in which people feel empowered and safe.

“While, to some, ‘move fast and break things’ may have appeared swashbuckling from young startup entrepreneurs, it is a brazenly irresponsible strategy coming from Meta, one of the world’s richest companies. Meta should have learned lessons from the harms its earlier products imposed on society, our democracies and our citizens.”

Attachment: horizonletter.pdf

-
Press Release
Advocates called for investigation of Amazon’s Echo Dot Kids Edition; federal regulators now poised to act
Reports indicate the FTC plans to advance a case against Amazon for violations of kids’ privacy, following the advocates’ 2019 complaint.

BOSTON, MA and WASHINGTON, DC — Friday, March 31, 2023 — Following a groundbreaking investigation of Amazon’s Echo Dot Kids by Fairplay and the Center for Digital Democracy (CDD), the Federal Trade Commission is preparing to advance a case against Amazon to the Department of Justice over the company’s violations of children’s privacy law. According to new reporting from Politico, the case centers on Amazon’s violations of the Children’s Online Privacy Protection Act (COPPA) through its Alexa voice assistant.

In 2019, privacy advocates Fairplay and CDD called for the FTC to take action against Amazon after an investigation of the company’s Echo Dot Kids smart home assistant, a candy-colored version of Amazon’s flagship home assistant with Alexa voice technology. The investigation revealed a number of shocking illegal privacy violations, including Amazon’s indefinite retention of kids’ sensitive data even after parents requested that it be deleted. Now, reports indicate that the FTC is acting on the advocates’ calls for investigation.

“We’re thrilled that the Federal Trade Commission and Department of Justice are close to taking action against Amazon for its egregious violations of children’s privacy,” said Josh Golin, Executive Director of Fairplay. “We know it’s not just social media platforms and apps that misuse children’s sensitive data. This landmark case would be the first time the FTC sanctioned the maker of a voice-enabled device for flouting COPPA. Amazon and its Big Tech peers must learn that COPPA violations are not just a cost of doing business.”

“It is time for the FTC to address the rampant commercial surveillance of children via Internet of Things (IoT) devices, such as Amazon’s Echo, and enforce existing law,” said Katharina Kopp, Director of Policy at the Center for Digital Democracy. “Children are giving away sensitive personal data on a massive scale via IoT devices, including their voice recordings and data gleaned from kids’ viewing, reading, listening, and purchasing habits. These data practices violate children’s privacy, manipulate them into being interested in harmful products, undermine their autonomy, and perpetuate discrimination and bias. Both the FTC and the Department of Justice must hold Amazon accountable.”

[See attached for additional comments]

Attachment: ftc_amazon_investigation_statement_fairplay_cdd.pdf

-
Consumer Advocates Urge Action: Walmart Deceptively Marketing to Kids on Roblox

MADISON, CONN., January 23, 2023 – A coalition of advocacy groups led by ad watchdog truthinadvertising.org (TINA.org) is urging the Children’s Advertising Review Unit (CARU) – a BBB National Program – to immediately audit the Walmart Universe of Play advergame, a recent addition to the self-regulatory group’s COPPA Safe Harbor Program and bearer of one of the Program’s certification seals. According to a letter from TINA.org, Fairplay, the Center for Digital Democracy and the National Association of Consumer Advocates, a copy of which was sent to Walmart, Roblox and the FTC, the retail giant is exposing children to deceptive marketing on Roblox, the online gaming and creation platform used by millions of kids on a daily basis.

Walmart’s first foray into the Roblox metaverse came last September, when it premiered two experiences, Walmart Universe of Play and Walmart Land, which collectively have been visited more than 12 million times. Targeted at – and accessible to – young children on Roblox, Universe of Play features virtual products and characters from L.O.L. Surprise!, Jurassic World, Paw Patrol, and more, and is advertised as allowing kids to play with the “year’s best toys” and make a “wish list” of toys that can then be purchased at Walmart.

As the consumer groups warn, Walmart completely blurs the distinction between advertising content and organic content, and simultaneously fails to provide clear or conspicuous disclosures that Universe of Play, or content within the virtual world, is advertising. In addition, as kids’ avatars walk through the game, they are manipulated into opening additional undisclosed advertisements disguised as surprise wrapped gifts.

To make matters worse, Walmart is using the CARU COPPA Safe Harbor Program seal to convey the false message that its children’s advergame complies not only with COPPA (the Children’s Online Privacy Protection Act) but also with CARU’s Advertising Guidelines and truth-in-advertising laws, and to use the seal as a shield against enforcement action.

“Walmart’s brazen use of stealth marketing directed at young children, who are developmentally unable to recognize the promotional content, is not only appalling, it’s deceptive and against truth-in-advertising laws. We urge CARU to take swift action to protect the millions of children being manipulated by Walmart on a daily basis.”
Laura Smith, TINA.org Legal Director

“Walmart’s egregious and rampant manipulation of children on Roblox – a platform visited by millions of children every day – demands immediate action. The rise of the metaverse has enabled a new category of deceptive marketing practices that are harmful to children. CARU must act now to ensure that children are not collateral damage in Walmart’s digital drive for profit.”
Josh Golin, Executive Director, Fairplay

“Walmart’s and Roblox’s practices demonstrate that self-regulation is woefully insufficient to protect children and teens online. Today, young people are targeted by a powerful set of online marketing tactics that are manipulative, unfair, and harmful to their mental and physical health. Digital advertising operates in a ‘wild west’ world where anything goes in terms of reaching and influencing the behaviors of kids and teens.
Congress and the Federal Trade Commission must enact safeguards to protect the privacy and well-being of a generation of young people.”
Katharina Kopp, Director of Policy, Center for Digital Democracy

To read more about Walmart’s deceptive marketing on Roblox, see: /articles/tina-org-urges-action-against-walmarts-undisclosed-advergame-on-roblox

About TINA.org (truthinadvertising.org): TINA.org is a nonprofit organization that uses investigative journalism, education, and advocacy to empower consumers to protect themselves against false advertising and deceptive marketing.

About Fairplay: Fairplay is the leading nonprofit organization committed to helping children thrive in an increasingly commercialized, screen-obsessed culture, and the only organization dedicated to ending marketing to children.

About the Center for Digital Democracy: The Center for Digital Democracy is a nonprofit organization using education, advocacy, and research into commercial data practices to ensure that digital technologies serve and strengthen democratic values, institutions, and processes.

About the National Association of Consumer Advocates: The National Association of Consumer Advocates is a nonprofit association of more than 1,500 attorneys and consumer advocates committed to representing consumers’ interests.

For press inquiries contact: Shana Mueller at 203.421.6210 or press@truthinadvertising.org.

Attachment: walmart_caru_press_release_final.pdf
-
Josh Golin, executive director, Fairplay:

The FTC’s landmark settlement against Epic Games is an enormous step toward creating a safer, less manipulative internet for children and teens. Not only is the Commission holding Epic accountable for violating COPPA by illegally collecting the data of millions of children under 13, but the settlement is also a shot across the bow against game makers who use unfair practices to drive in-game purchases by young people. The settlement rightly recognizes not only that unfair monetization practices harm young people financially, but that design choices used to drive purchases subject young people to a wide array of dangers, including cyberbullying and predation.

Today’s breakthrough settlement underscores why it is so critical that Congress pass the privacy protections for children and teens currently under consideration for the Omnibus bill. These provisions give teens privacy rights for the first time, address unfair monetization by prohibiting targeted advertising, and empower regulators by creating a dedicated youth division at the FTC.

Jeff Chester, executive director, Center for Digital Democracy:

Through this settlement with Epic Games, the FTC has used its vital power to regulate unfair business practices to extend long-overdue and critically important online protections for teens. This tells online marketers that, from now on, teenagers cannot be targeted using unfair and manipulative tactics designed to take advantage of their young age and other vulnerabilities. Kids should also have their data privacy rights better respected through this enforcement of the federal kids’ data privacy law (COPPA). Gaming is a “wild west” when it comes to its data gathering and online marketing tactics, placing young people, who are among the half of the U.S. population that plays video games, at especially great risk. While today’s FTC action creates new safeguards for young people, Congress has a rare opportunity to pass legislation this week ensuring all kids and teens have strong digital safeguards, regardless of what online service they use.
-
Consumer financial safeguards for online payments needed, says U.S. PIRG & CDD

Big Tech Payment Platforms
Supplemental Comments of USPIRG and the Center for Digital Democracy
CFPB-2021-0017
December 7, 2022

United States Public Interest Research Group (USPIRG) and the Center for Digital Democracy (CDD) submit these additional comments to further inform the Bureau’s inquiry. They amplify the comments USPIRG and CDD submitted last year.[1] We believe that since we filed our original comment, “Big Tech”-operated digital payment platforms have continued to evolve significantly, underscoring the need for the Bureau to institute much-needed consumer protection safeguards. We had described how online platform-based payment services seamlessly incorporate the key elements of “commerce” today—including content, promotion, marketing, sales and payment. We explained how these elements are part of the data-driven “surveillance” and personalized marketing system that operates as the central nervous system for nearly all U.S. online operations. We raised the growing role that “social media commerce” plays in contemporary payment platforms, supporting the Bureau’s examination of Big Tech platforms and consumer financial payment services. For example, U.S. retail social media commerce sales will generate $53 billion in 2022, rising to $107 billion by 2025, according to a recent report by Insider Intelligence/eMarketer. Younger Americans, so-called “Generation Z,” are helping drive this new market—an indicator of how changing consumer financial behaviors are being shaped by the business model and affordances of the Big Tech platforms, including TikTok, Meta and Google.[2]

In order to meaningfully respond to the additional questions raised by the Bureau in its re-opening of the comment period, in particular regarding how the payment platforms handle “complaints, disputes and errors” and whether they are “sufficiently staffed…to address consumer protection and provide responsible customer service,” USPIRG and CDD offer below some further analysis regarding the structural problems of contemporary platform payment systems.[3]

First, payment services such as those operated by Google, Meta, TikTok and others have inherent conflicts of interest. They are, as the Bureau knows, primarily advertising systems designed to capture the “engagement” of individuals and groups using a largely stealth array of online marketing applications (including, for example, extensive testing to identify ways to engage in subconscious “implicit” persuasion).[4] Our prior comment and those of other consumer groups have already documented the extensive use of data profiling, machine learning, cross-platform predictive analysis and “identity” capture, which are just a few of the current platform monetization tactics. The continually evolving set of tools available for digital platforms to target consumers has no limits—and raises critical questions when it comes to the financial security of U.S. consumers.

The build-out of Big Tech payment platforms, leveraging their unique capabilities to seamlessly combine social media, entertainment and commerce with sophisticated data-driven contemporary surveillance, has transformed traditional financial services concepts. Today’s social media giants are also global consumer financial banking and retail institutions. For example, J.P. Morgan has “built a real-time payments infrastructure” for TikTok’s parent company ByteDance “that can be connected to local clearing systems.
This allows users, content producers, and influencers to be paid instantaneously and directly into their bank accounts at any day or time. ByteDance has enabled this capability in the U.S. and Europe, meaning it covers approximately one-fifth of TikTok’s 1 billion active users worldwide.”[5] J.P. Morgan also assisted ByteDance in replacing its host-to-host connectivity with banks with “application programming interfaces (API) connectivity that allows real-time exchange of data” between ByteDance and Morgan. This allows ByteDance to “track and trace the end-to-end status through the SWIFT network, see and monitor payments, and allow users to check for payments via their TikTok or other ByteDance apps in real time.” Morgan also has “elevated and further future-proofed ByteDance’s cash management through a centralized account structure covering all 15 businesses” through a “virtual account management and liquidity tool.”[6]

Google’s Pay operations also illustrate how distinct digital payment platforms are from previous forms of financial services. Google explains to merchants that by integrating “with Google Wallet [they can] engage with users through location-based notifications, real-time updates” and offers, including encouraging consumers to “add offers from your webpage or app directly to Google wallet.” Google promotes the use of “geofenced notifications to drive engagement” with its Pay and Wallet services as well. Google’s ability to leverage its geolocation and other granular tracking, and to make that information available through a package of surveillance and engagement tools that merchants use to drive financial transactions in real time, is beyond the ability of a consumer to effectively address.

A further issue is the growing use of “personalization” technologies to make financial services offerings even more compelling. Google has already launched its “Spot” service to deliver “payment enabled experiences” for users, including “fully customized experiences” in Google Pay. Although currently available only in India and Singapore, Google’s Spot platform, which allows consumers with “a few simple taps…to search, review, choose and pay” for a product, is an example of how payment services online are continually advanced—and why they require independent review by consumer financial regulators. It also reflects another problem regarding protecting the financial well-being of U.S. consumers: What are the impacts on financial security when there is no distance—no time to reflect—when seamless, machine- and socially-driven marketing and payment operations are at work?[7]

A good example of the lack of meaningful protections for online financial consumers is Google Pay’s use of what’s known as “discovery,” a popular digital marketing concept meaning to give enhanced prominence to a product or service. Here’s how Google describes how that concept works in its Spot-enabled Pay application: “We understand that discovery is where it starts, but building deep connections is what matters the most - a connection that doesn’t just end with a payment, but extends to effective post sale engagement. The Spot Platform helps merchants own this relationship by providing a conversational framework, so that order updates, offers, and recommendations can easily be surfaced to the customer.
This is powered by our Order API which is specialised to surface updates and relevant actions for users' purchases, and the Messaging API which can surface relevant messages post checkout to the user.”[8]

Meta (Facebook), along with ad giant WPP, also relies on the growing use of “discovery” applications to promote sales. In a recent report, they explain that “digital loyalty is driven by seamless shopping experiences, convenience, easy discovery, consistent availability, positive community endorsement and personal connections.”[9]

Since Google and other payment platforms have relationships with dozens of financial institutions, and also have an array of different requirements for vendors and developers, USPIRG and CDD are concerned that consumers are placed at a serious disadvantage when it comes to protecting their interests and seeking redress for complaints. The chain of digital payment services relationships, including with partners that conduct their own powerful data-driven marketing systems, requires Bureau review. For example, PayPal is a partner with Google Pay, while the PayPal Commerce Platform has Salesforce as one of many partners.[10] See also PIRG’s recent comments to the FTC for an extensive discussion of retail media networks and data clean rooms:[11] “Clean rooms are data platforms that allow companies to share first party data with one another without giving the other party full access to the underlying, user-level data. This ability to set controls on who has access to granular information about consumers is the primary reason that data clean rooms are able to subvert current privacy regulations.”

Another important issue for the Bureau is the ability of the Big Tech payment platforms to collect and analyze data in ways that allow them to identify unique ways to influence consumer spending behaviors. In a recent report, Chinese ecommerce platform Alibaba explained how such a system operates: “The strength of Alibaba’s platforms allows a birds-eye view of consumer preferences, which is combined with an ecosystem of tactical solutions, to enable merchants to engage directly and co-create with consumers and source suppliers to test, adapt, develop, and launch cutting-edge products…helps merchants identify new channels and strategies to tap into the Chinese market by using precise market analysis, real-time consumer insights, and product concept testing.”[12]

Such financial insights are part of what digital payment and platform services provide. PayPal, for example, gathers data on consumers as part of their “shopping journey.” In one case study for travel, PayPal explained that its campaign for Expedia involved pulling “together data-driven destination insights, creative messaging and strategic placements throughout the travel shoppers’ journey.” This included a “social media integration that drove users to a campaign landing page” powered by “data to win.” This reflects the growing use of what’s euphemistically called “first-party data” from consumers, where permission to use it to target an individual has allegedly been obtained.
Few consumers will ever review—or have the ability to influence—the PayPal engine that is designed for merchants to “shape [their] customer journey from acquisition to retention.” This includes applications that add “flexible payment options…right on product pages or through emails;” present a “relevant Pay Later offer to customers with dynamic messaging;” “increase average order value” through “proprietary payment methods;” or “propose rewards as a payment option to help inspire loyalty.”[13]

The impact of data-driven social commerce on promoting the use of consumer payments should also be assessed. For example, Shopify’s “in-app shopping experience on TikTok” claims that the placement of its “shopping tabs” by vendors on posts, profiles and product catalogs unleashes “organic discovery.” This creates “a mini-storefront that links directly to their online store for check out.” A TikTok executive explains how the use of today’s digital payment services is distinct—“rooted in discovery, connection, and entertainment, creating unparalleled opportunities for brands to capture consumers’ attention…that drives [them] directly to the digital point of purchase.”[14] TikTok also has partnered with Stripe, helping it “become much more integrated with the world of payments and fintech.”[15]

TikTok’s Square integration enables “sellers to send fans directly from TikTok videos, ads, and shopping tabs on their profiles to products available in their existing Square Online store, providing a streamlined shopping experience that retains the look and feel of their personal brand.”[16] The Square/TikTok payment alliance illustrates the role that data-driven commercial surveillance marketing plays in payment operations, such as the use of the “TikTok pixel” and “advanced matching.”[17] In China, ByteDance’s payment services reflect its growing ability to leverage its mass customer data capture for social media-driven marketing and financial services.[18] We urge the Bureau to examine TikTok’s data and marketing practices as it transfers U.S. user information to servers in the U.S., the so-called “Project Texas,” to identify how “sensitive” data may be part of its financial services offerings.[19]

Apple’s payment services deserve further scrutiny as it reintroduces its role as a digital advertising network, leveraging its dominant position in the mobile and app markets.[20] PayPal recently announced that it will be “working with Apple to enhance offerings for PayPal and Venmo merchants and consumers.” Apple is also making its payment service available through additional vendors, including the giant Kroger grocery chain’s stores in California.[21]

Amazon announced in October 2022 that Venmo was now an official payment option: during checkout, users can choose “Select a payment method” and then “Add a Venmo account.” This redirects them to the Venmo app, where they can complete the authentication. Users can also choose Venmo to be their default payment method for Amazon purchases on that screen.[22] Amazon’s AWS partners with fintech provider Plaid, another example of the far-reaching partnerships restructuring the consumer financial services market.[23]

Conclusion

USPIRG and CDD hope that both our original comments and these additional comments help the Bureau to understand the impact of rapid changes in Big Tech’s payments network relationships and partnerships.
We believe urgent CFPB action is needed to protect consumers from the threat of Big Tech’s continued efforts to breach the important wall separating banking and commerce, and to ensure that all players in the financial marketplace follow all the rules. Please contact us with additional questions.

Sincerely yours,

Jeff Chester, Executive Director, Center for Digital Democracy
Edmund Mierzwinski, Senior Director, Federal Consumer Program, U.S. PIRG

[1] /comment/CFPB-2021-0017-0079
[2] /what-s-behind-social-commerce-surge-5-charts
[3] We also believe that the Bureau’s request for comments concerning potential abuse of terms of service and use of penalties merits discussion. We look forward to additional comments from others.
[4] /business/en-US/blog/mediascience-study-brands-memorable-tiktok; see Google, Meta, TikTok as well: https://www.neuronsinc.com/cases
[5] /content/dam/jpm/treasury-services/documents/case-study-bytedance.pdf
[6] /content/dam/jpm/treasury-services/documents/case-study-bytedance.pdf
[7] /about/business/checkout/; /pay/spot; /about/business/passes-and-rewards/
[8] /pay/spot
[9] /news/meta-publishes-new-report-on-the-importance-of-building-brand-loyalty-in-on/625603/
[10] See, for example, the numerous bank partners of Google in the US alone: /wallet/answer/12168634?hl=en. Also: /payments/apis-secure/u/0/get_legal_document?ldo=0&ldt=buyertos&ldr=us; /wallet/retail; /wallet/retail/offers/resources/terms-of-service; /us/webapps/mpp/google-pay-paypal; /products/commerce-cloud/overview/?cc=dwdcmain
[11] /wp-content/uploads/2022/11/PIRG-FTC-data-comment-no-petitions-Nov-2022.pdf
[12] /article/how-merchants-can-use-consumer-insights-from-alibaba-to-power-product-development/482374
[13] /us/brc/article/enterprise-solutions-expedia-case-study; /us/brc/article/enterprise-solutions-acquire-and-retain-customers
[14] /scaling-social-commerce-shopify-introduces-new-in-app-shopping-experiences-on-tiktok#
[15] /financial-services-finserv/tiktok-partners-fintech-firm-stripe-tips-payments
[16] /us/en/press/square-x-tiktok
[17] /help/us/en/article/7653-connect-square-online-with-tiktok; /help/article/data-sharing-tiktok-pixel-partners
[18] /video/douyin-chinas-version-tiktok-charge-093000931.html; /2021/01/19/tiktok-owner-bytedance-launches-mobile-payments-in-china-.html
[19] /a/202211/16/WS6374c81ea31049175432a1d8.html
[20] /news/newsletters/2022-08-14/apple-aapl-set-to-expand-advertising-bringing-ads-to-maps-tv-and-books-apps-l6tdqqmg?sref=QDmhoVl8
[21] /231198771/files/doc_financials/2022/q3/PYPL-Q3-22-Earnings-Release.pdf; /2022/11/08/ralphs-begins-accepting-apple-pay/
[22] /2022/10/25/amazon-now-allows-customers-to-make-payments-through-venmo/
[23] /blogs/apn/how-to-build-a-fintech-app-on-aws-using-the-plaid-api/

Attachment: pirg_cdd_cfpb_comments_7dec2022.pdf
-
Coalition of child advocacy, health, safety, privacy and consumer organizations documents how data-driven marketing undermines the privacy and welfare of young people

Children and teenagers are subject to widespread commercial surveillance practices that collect data used to target them with marketing. Targeted and personalized advertising remains the dominant business model for digital media, with the marketing and advertising industry identifying children and teens as a prime target. Minors are relentlessly pursued while, simultaneously, they are spending more time online than ever before. Children’s lives are filled with surveillance, involving the collection of vast amounts of personal data of online users. This surveillance, informed by behavioral science and maximized by evolving technologies, allows platforms and marketers to profile and manipulate children.

The prevalence of surveillance advertising and targeted marketing aimed at minors is unfair, in violation of Section 5 of the FTC Act. Specifically, data-driven marketing and targeted advertising cause substantial harm to children and teens by violating their privacy; manipulating them into being interested in harmful products; undermining their autonomy; and perpetuating discrimination and bias. Additionally, the design choices tech companies use to optimize engagement and data collection in order to target marketing to minors further harm children and teens, including by undermining their physical and mental wellbeing and increasing the risk of problematic internet use. These harms cannot reasonably be avoided by minors or their families, and there are no countervailing benefits to consumers or competition that outweigh them.

Surveillance advertising is also deceptive to children, as defined by the Federal Trade Commission. The representations made about surveillance advertising by adtech companies, social media companies, apps, and games are likely to mislead minors and their parents and guardians, and these misrepresentations and omissions are material. Many companies also mislead minors and their guardians by omission because they fail to disclose important information about their practices. These practices affect the choices of minors and their families every day as they use websites, apps, and services without an understanding of the complex system of data collection, retention, and sharing that is used to influence them online. We therefore urge the Commission to promulgate a rule that prohibits targeted marketing to children and teenagers.

Groups filing the comment included: the Center for Digital Democracy, Fairplay, #HalfTheStory, American Academy of Pediatrics, Becca Schmill Foundation, Berkeley Media Studies Group, Children and Screens: Institute of Digital Media and Child Development, Consumer Federation of America, Consumer Federation of California, CUNY Urban Food Policy Institute, Eating Disorders Coalition for Research, Policy & Action, Enough is Enough, LookUp.live, Lynn’s Warriors, National Eating Disorders Association, Parents Television and Media Council, ParentsTogether, Peace Educators Allied for Children Everywhere (P.E.A.C.E.), Public Citizen, and UConn Rudd Center for Food Policy & Health.

Fairplay’s executive director Josh Golin said: “Big Tech’s commercial surveillance business model undermines young people’s wellbeing and development. It causes kids and teens to spend excessive time online, and exposes them to harmful content and advertising targeted to their vulnerabilities.
The FTC must adopt a series of safeguards to allow vulnerable youth to play, learn, and socialize online without being manipulated or harmed. Most importantly, the Commission should prohibit data-driven advertising and marketing to children and teens, and make clear that Silicon Valley profits cannot come at the expense of young people’s wellbeing.”

CDD’s Jeff Chester underscored this, saying: “Children and teens are key commercial targets of today’s data-driven surveillance complex. Their lives are tethered to a far-reaching system that is specifically designed to influence how they spend their time and money online, and that uses artificial intelligence, virtual reality, geo-tracking, neuromarketing and more to do so. In addition to the loss of privacy, surveillance marketing threatens their well-being, health and safety. It’s time for the Federal Trade Commission to enact safeguards that protect young people.”

[Full filing attached]
-
FTC Commercial Surveillance filing from CDD focuses on how pharma and other health marketers target consumers, patients, and prescribers

“Acute Myeloid Lymphoma,” “ADHD,” “Brain Cancer,” “High Cholesterol,” “Lung Cancer,” “Overweight,” “Pregnancy,” “Rheumatoid Arthritis,” “Stroke,” and “Thyroid Cancer.” These are just a handful of the digitally targetable medical condition “audience segments” available to surveillance advertisers. While health and medical condition marketers—including pharmaceutical companies and drug store chains—may claim that such commercial data-driven marketing is “privacy-compliant,” in truth it reveals how vulnerable U.S. consumers are to having some of their most personal and sensitive data gathered, analyzed, and used for targeted digital advertising. It also shows how the latest tactics for leveraging data to track and target the public—including “identity graphs,” artificial intelligence, surveillance of connected or smart TV devices, and a focus on so-called permission-based “first-party data”—are now broadly deployed by advertisers, including pharma and medical marketers. Behind the use of these serious medical condition “segments” is a far-reaching commercial surveillance complex comprising giant platforms, retailers, “adtech” firms, data brokers, marketing and “experience” clouds, device manufacturers (e.g., streaming), neuromarketing and consumer research testing entities, “identity” curation specialists, and advertisers...

We submit the treatment of medical condition and health data as representative of today’s commercial surveillance complex. It incorporates many of the features that can help answer the questions the Commission has posed. There is widespread data gathering on individuals and communities, across their devices and applications; techniques to solicit information are intrusive, non-transparent, and beyond any meaningful consumer control; and these methods come at a cost to a person’s privacy and pocketbook, with potentially significant consequences for their welfare. There are also societal impacts here, for the country’s public health infrastructure as well as for the expenditures the government must make to cover the costs of prescription drugs and other medical services...

Health and pharma marketers have adopted the latest data-driven surveillance-marketing tactics—including targeting across all of a consumer’s devices (which today also includes streaming video delivered by smart TVs); the integration of actual consumer purchase data for more robust targeting profiles; leveraging programmatic ad platforms; working with a myriad of data marketing partners; using machine learning to generate insights for granular consumer targeting; conducting robust measurement to help refine subsequent re-targeting; and taking advantage of new ways to identify and reach individuals, such as “identity graphs,” across devices.

[Complete filing for the FTC’s Commercial Surveillance rulemaking attached]

Attachment: cddsurveillancehealthftc112122.pdf
-
Advocates to FTC: Write rules to protect kids from harmful, manipulative design online

At every turn, young people face tricks and traps to keep them online for hours and sharing sensitive data.

Contact:
David Monahan, Fairplay: david@fairplayforkids.org
Jeff Chester, Center for Digital Democracy: jeff@democraticmedia.org

BOSTON, MA and WASHINGTON, DC – November 17, 2022 – A coalition of leading health and privacy advocates filed a petition today asking the Federal Trade Commission to promulgate a rule prohibiting online platforms from using unfair design features to manipulate children and teens into spending excessive time online. Twenty-one groups, led by Fairplay and the Center for Digital Democracy, said in their petition: “When minors go online, they are bombarded by widespread design features that have been carefully crafted and refined for the purpose of maximizing the time users spend online and activities users engage in.” They urged the FTC to set rules of the road establishing when these practices cross the line into unlawful unfairness.

The advocates’ petition details how the vast majority of apps, games, and services popular among minors generate revenue primarily via advertising, and many employ sophisticated techniques to cultivate lucrative long-term relationships between minors and their brands. As a result, platforms use techniques like autoplay, endless scroll, and strategically timed advertisements to keep kids and teens online as much as possible, which is not in their best interests.

The petition also details how manipulative design features on platforms like TikTok, Twitter, YouTube, Facebook, Instagram, and Snapchat undermine young people’s wellbeing. Excessive time online displaces sleep and physical activity, harming minors’ physical and mental health, growth, and academic performance. Features designed to maximize engagement also expose minors to potential predators, online bullies, and age-inappropriate content; harm minors’ self-esteem; and aggravate risks of disordered eating and suicidality. The manipulative tactics also undermine children’s and teens’ privacy by encouraging the disclosure of massive amounts of sensitive user data.

The advocates’ petition comes just months after California passed its Age Appropriate Design Code, a law requiring digital platforms to act in the best interests of children, and as momentum grows in Congress for the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act. The petition was drafted by the Communications and Technology Law Clinic at Georgetown University Law Center.

Haley Hinkle, Policy Counsel, Fairplay:

“The manipulative tactics described in this petition that are deployed by social media platforms and apps popular with kids and teens are not only harmful to young people’s development, they’re unlawful. The FTC should exercise its authority to prohibit these unfair practices and send Big Tech a message that manipulating minors into handing over their time and data is not acceptable.”

Katharina Kopp, Deputy Director, Center for Digital Democracy:

“The hyper-personalized, data-driven advertising business model has hijacked our children’s lives. The design features of social media and games have been purposefully engineered to keep young people online longer and satisfy advertisers. It’s time for the FTC to put an end to these unfair and harmful practices.
They should adopt safeguards that ensure platforms and publishers design their online content in ways that place the well-being of young people ahead of the interests of marketers.”

Jenny Radesky, MD, Associate Professor of Pediatrics, University of Michigan, and Chair-elect, American Academy of Pediatrics Council on Communications and Media:

“As a pediatrician, helping parents and teens navigate the increasingly complex digital landscape in a healthy way has become a core aspect of my work. If the digital environment is designed in a way that supports children’s healthy relationships with media, then it will be much easier for families to create boundaries that support children’s sleep, friendships, and safe exploration. However, this petition highlights how many platforms and games are designed in ways that actually do the opposite: they encourage prolonged time on devices, more social comparisons, and more monetization of attention. Kids and teens are telling us that these types of designs actually make their experiences with platforms and apps worse, not better. So we are asking federal regulators to help put safeguards in place to protect against the manipulation of children’s behavior and to instead prioritize their developmental needs.”

Professor Laura Moy, Director, Communications & Technology Law Clinic at Georgetown Law, and counsel for the Center for Digital Democracy and Fairplay:

“As any parent or guardian can attest, games and social media apps keep driving kids and teens to spend more and more time online, in a way that neither minors nor their guardians can reasonably prevent. This is neither accidental nor innocuous—it’s engineered and it’s deeply harmful. The FTC must step in and set some boundaries to protect kids and teens. The FTC should clarify that the most harmful and widespread design features that manipulate users into maximizing time online, such as those employed widely by social media services and popular games, are unlawful when used on minors.”

Groups signing on to the petition include: Center for Digital Democracy; Fairplay; Accountable Tech; American Academy of Pediatrics; Becca Schmill Foundation, Inc.; Berkeley Media Studies Group; C. Everett Koop Institute at Dartmouth; Center for Humane Technology; Children and Screens: Institute of Digital Media and Child Development; Eating Disorders Coalition; Electronic Privacy Information Center (EPIC); LookUp.live; Lynn’s Warriors; Network for Public Education; Parent Coalition for Student Privacy; ParentsTogether Action; Protect Young Eyes; Public Citizen; Together for Girls; U.S. Public Interest Research Group; and UConn Rudd Center for Food Policy and Health.

###

Attachments: ftc_engagement_petition_pr1.pdf, unfair_design_practices_petition_for_rulemaking_final_combined_filing.pdf
-
https://www.ftc.gov/system/files/ftc_gov/pdf/R307000_RULE_MAKING_PETITION_TO_PROHIBIT_THE%20_USE_ON_CHILDREN_OF_DESIGN_FEATURES.pdf
-
Government Needs to Step Up Its Efforts to Provide Meaningful and Effective Regulation

Under intensifying pressure from Congress and the public, top social media platforms popular with young people – Instagram, Snapchat, TikTok, Twitch, and YouTube – have launched dozens of new safety features for children and teens in the last year, according to a report from the Center for Digital Democracy (CDD). Researchers at CDD conducted an analysis of tech industry strategies to head off regulation in the wake of the 2021 Facebook whistleblower revelations and the rising tide of public criticism, Congressional hearings, and pressures from abroad. These companies have introduced a spate of new tools, default navigation systems, and AI software aimed at increasing safeguards against child sexual abuse material, problematic content, and disinformation, the report found. But tech platforms have been careful not to allow any new safety systems to interfere significantly with the advertising practices and business models that target the lucrative youth demographic. As a consequence, while industry spokespersons tout their concern for children, “their efforts to establish safeguards are, at best, fragmented and conflicted,” the report concludes. “Most of the operations inside these social media companies remain hidden from public view, leaving many questions about how the various safety protocols and teen-friendly policies actually function.”

More attention should also be placed on advertisers, the report suggests, which have become a much more powerful and influential force in the tech industry in recent years. Researchers offer a detailed description of the industry’s “brand safety” system – an “expanding infrastructure of specialized companies, technological tools, software systems, and global consortia that now operate at the heart of the digital economy, creating a highly sophisticated surveillance system that can determine instantaneously which content can be monetized and which cannot.” This system, which was set up to protect advertisers from having their ads associated with problematic content, could do much more to ensure better protections for children.

“The most effective way to ensure greater accountability and more meaningful transparency by the tech industry,” the authors argue, “is through stronger public policies.” Pointing out that protection of children online remains a strong bipartisan issue, researchers identify a number of current legislative vehicles and regulatory proceedings – including bills that are likely to be reintroduced in the next Congress – which could provide more comprehensive protections for young people and rein in some of the immense power of the tech industry. “Tech policies in the U.S. have traditionally followed a narrow, piecemeal approach to addressing children’s needs in the online environment,” the authors note, “providing limited safeguards for only the youngest children, and failing to take into account the holistic nature of young people’s engagement with the digital media environment.” What is needed is a more integrated approach that protects privacy for both children and teens, along with safeguards that cover advertising, commercial surveillance, and child safety.

Finally, the report calls for a strategic campaign that brings together the diverse constituencies working on behalf of youth in online media.
“Because the impacts of digital technologies on children are so widespread, efforts should also be made to broaden the coalition of organizations that have traditionally fought for children’s interests in the digital media to include groups representing the environment, civil rights, health, education, and other key stakeholder communities.”
-
Commercial Surveillance Expands via the “Big” Screen in the Home

Televisions now view and analyze us—the programs we watch, what shows we click on to consider or save, and the content reflected on the “glass” of our screens. On “smart” or connected TVs, streaming TV applications have been engineered to fully deliver the forces of commercial surveillance. Operating stealthily inside digital television sets and streaming video devices is an array of sophisticated “adtech” software. These technologies enable programmers, advertisers and even TV set manufacturers to build profiles used to generate data-driven, tailored ads to specific individuals or households. These developments raise important questions for those concerned about the transparency and regulation of political advertising in the United States.

Also known as “OTT” (“over-the-top,” since the video signal is delivered without relying on traditional set-top cable TV boxes), the streaming TV industry incorporates the same online advertising techniques employed by other digital marketers. This includes harvesting a cornucopia of information on viewers through alliances with leading data brokers. More than 80 percent of Americans now use some form of streaming or smart TV-connected video service. Given such penetration, it is no surprise that streaming TV advertising is playing an important role in the upcoming midterm elections. And streaming TV will be an especially critical channel for campaigns vying for voters in 2024.

Unlike political advertising on broadcast television or much of cable TV, which is generally transmitted broadly to a defined geographic market area, “addressable” streaming video ads appear in programs advertisers know you actually watch (using technologies such as dynamic ad insertion). Messaging for these ads can also be fine-tuned as a campaign progresses, to make the message more relevant to the intended viewer. For example, if you watch a political ad and then sign up to receive campaign literature, the next TV commercial from a candidate or PAC can be crafted to reflect that action. Or, if your data profile says you are concerned about the costs of healthcare, you may see a different pitch than your next-door neighbor who has other interests. Given the abundance of data available on households, including demographic details such as race and ethnicity, there will also be finely tuned pitches aimed at distinct subcultures, produced in multiple languages.

An estimated $1.4 billion will be spent on streaming political ads for the midterms (part of an overall $9 billion in ad expenditures). With more people “cutting the cord” by signing up for cheaper, ad-supported streaming services, advances in TV technologies enabling personalized, data-driven ad targeting, and the integration of streaming TV as a key component of the overall online marketing apparatus, it is evident that the TV business has changed. Even what’s considered traditional broadcasting has been transformed by digital ad technologies. That’s why it’s time to enact policy safeguards to ensure integrity, fairness, transparency and privacy for political advertising on streaming TV.

Today, streaming TV political ads already combine information from voter records with online and offline consumer profile data in order to generate highly targeted messages.
By harvesting information related to a person’s race and ethnicity, finances, health concerns, behavior, geolocation, and overall digital media use, marketers can deliver ads tied to our needs and interests. In light of this unprecedented marketing power and precision, new regulations are needed to protect consumer privacy and civic discourse alike. In addition to ensuring voter privacy, so personal data can’t be as readily used as it is today, the messaging and construction of streaming political ads must also be accountable. Merely requiring the disclosure of who is buying these ads is insufficient. The U.S. should enact a set of rules to ensure that the tens of thousands of one-to-one streaming TV ads don’t promote misleading or false claims, or engage in voter suppression and other forms of manipulation. Journalists and campaign watchdogs must have the ability to review and analyze ads, and political campaigns need to identify how they were constructed—including the information provided by data brokers and how a potential voter’s viewing behaviors were analyzed (such as with increasingly sophisticated machine learning and artificial intelligence algorithms). For example, data companies such as Acxiom, Experian, Ninth Decimal, Catalina and LiveRamp help fuel the digital video advertising surveillance apparatus.

Campaign-spending reform advocates should be concerned. To make targeted streaming TV advertising as effective as possible will likely require serious amounts of money—for the data, analytics, marketing and distribution. Increasingly, key gatekeepers control much of the streaming TV landscape, and purchasing rights to target the most “desirable” people could face obstacles. For example, smart TV makers—such as LG, Roku, Vizio and Samsung—have developed their own exclusive streaming advertising marketplaces. Their smart TVs use what’s called ACR—“automated content recognition”—to collect data that enables them to analyze what appears on our screens “second by second.” An “exclusive partnership to bring premium OTT inventory to political clients” was recently announced by LG and cable giant Altice’s ad division. This partnership will enable political campaigns that qualify to access 30 million households via Smart TVs, as well as the ability to reach millions of other screens in households known to Altice.

Connected TVs also provide online marketers with what is increasingly viewed as essential for contemporary digital advertising—access to a person’s actual identity information (called “first-party” data). Streaming TV companies hope to gain permission to use subscriber information in many other ways. This practice illustrates why the Federal Trade Commission’s (FTC) current initiative designed to regulate commercial surveillance, now in its initial stage, is so important. Many of the critical issues involving streaming political advertising could be addressed through strong rules on privacy and online consumer protection. For example, there is absolutely no reason why any marketer should be able to so easily obtain all the information used to target us, such as our ethnicity, income, purchase history, and education—to name only a few of the variables available for sale. Nor should the FTC allow online marketers to engage in unfair and largely stealth tactics when creating digital ads—including the use of neuroscience to test messages to ensure they respond directly to our subconscious.
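To make the mechanics of addressable targeting more concrete, the following is a minimal, hypothetical sketch of how an ad-decisioning service might match a household profile against a campaign's targeting rules before a tailored spot is inserted. All of the profile fields, segment names, and creatives are illustrative assumptions for this essay, not any vendor's actual system or API.

```python
# Hypothetical sketch of "addressable" ad decisioning on a streaming service.
# Profile fields, segments, and creative names are illustrative only.

from dataclasses import dataclass, field

@dataclass
class HouseholdProfile:
    household_id: str
    segments: set = field(default_factory=set)        # e.g. from a data-broker match
    recent_actions: set = field(default_factory=set)  # e.g. "signed_up_for_mailer"

@dataclass
class Creative:
    name: str
    required_segments: set
    excluded_if: set = field(default_factory=set)

def choose_creative(profile: HouseholdProfile, creatives: list) -> str:
    """Return the first creative whose targeting rules match this household."""
    for creative in creatives:
        if creative.required_segments <= profile.segments and \
           not (creative.excluded_if & profile.recent_actions):
            return creative.name
    return "generic_spot"  # fallback shown to unmatched households

campaign = [
    Creative("healthcare_cost_spot", {"concerned_about_healthcare_costs"}),
    Creative("follow_up_spot", {"saw_intro_ad"}),
]

viewer = HouseholdProfile(
    household_id="hh-123",
    segments={"concerned_about_healthcare_costs", "saw_intro_ad"},
    recent_actions={"signed_up_for_mailer"},
)

print(choose_creative(viewer, campaign))  # -> "healthcare_cost_spot"
```

In a real deployment the profile would be assembled from broker matches and ACR viewing data, and the selection would run inside the dynamic-ad-insertion pipeline; the point of the sketch is only that the ad chosen is driven by personal data rather than by the program or market being watched.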
The Federal Communications Commission (FCC), which has largely failed to address 21st century video issues, should conduct its own inquiry “in the public interest.” There is also a role here for the states, reflecting their laws on campaign advertising as well as ensuring the privacy of streaming TV viewers.

This is precisely the time for policies on streaming video, as the industry becomes much more reliant on advertising and data collection. Dozens of new ad-supported streaming TV networks are emerging—known as FAST channels (Free Ad Supported TV)—which offer a slate of scheduled shows with commercials. Netflix and Disney+, as well as Amazon, have adopted or will soon adopt ad-supported viewing. There are also coordinated industry-wide efforts, involving advertisers, programmers and device companies, to perfect ways to more efficiently target and track streaming viewers. Without regulation, the U.S. streaming TV system will be a “rerun” of what we historically experienced with cable TV—dashed expectations of a medium that could be truly diverse, instead of a monopoly, and also offer both programmers and viewers greater opportunities for creative expression and public service. Only those with the economic means will be able to afford to “opt out” of the advertising and some of the data surveillance on streaming networks. And political campaigns will be allowed to reach individual voters without worrying about privacy or the honesty of their messaging. Both the FTC and FCC, and Congress if it can muster the will, have an opportunity to make streaming TV a well-regulated, important channel for democracy. Now is the time for policymakers to tune in.

This essay was originally published by Tech Policy Press. Support for the Center for Digital Democracy’s review of the streaming video market is provided by the Rose Foundation for Communities and the Environment.

Jeff Chester
-
Discussion by Jeff Chester at the Global Alcohol Policy Alliance

Alcohol marketers are now big data companies. They are also commercial surveillance marketing enterprises, which is how data-driven digital marketing is increasingly described by regulators and critics. Like many other global industries, alcohol marketing uses an ever-expanding set of diverse and sophisticated online and offline techniques designed to identify and deeply influence its target audiences. Alcoholic beverage companies have broadly adopted the business model and tactics perfected by Google, Meta/Facebook, and Amazon. This includes “omnichannel” marketing operations that identify a single person and follow them on their various devices, such as gaming, mobile, and streaming. The alcoholic beverage industry engages in cutting-edge digital marketing campaigns throughout the world.

However, the use of contemporary marketing techniques for alcoholic beverages also enables us to use various regulatory and other legal tools to protect public health and the public at large. That includes pursuing privacy complaints with state, national or regional data protection regulators (as well as class actions where possible); developing related complaints for consumer protection regulators on the kinds of unfair advertising practices that embody digital marketing, such as the use of neuromarketing to influence subconscious and emotional processes; the reliance on “immersive” ad applications involving virtual and augmented reality (such as the metaverse), whose effects also impact non-rational processes; the role of influencers used to penetrate youth culture to promote the brand; and, on the data practices themselves, the widespread adoption of machine learning and artificial intelligence systems to generate predictive and personalized marketing plans on individuals, groups and communities. Another critical aspect of data marketing, as we know, is the gathering and use of a host of data on people—their race, ethnicity, income, health concerns, geolocation, etc.—that, when assembled in today’s real-time online marketing machine, are used to reach us with a highly informed assessment of who we are and what we do. In addition to regulation and judicial recourse, there are also the public shaming aspects that can be generated through the news media and other informational campaigns.

I will summarize several of the troubling practices of the alcohol marketing industry today that could form the basis for potential regulatory interventions.

The use of Big Data operations: As leading advertisers, alcoholic beverage companies already hold a vast—and growing—array of data on their customers and targets. For example, AB InBev relies on [quote] 1000 different data sources and has more than 70.1 million unique customer records [unquote]. Its data sources include information gathered through mobile devices, social media, and ecommerce, among others. AB InBev has invested in the latest technologies to consolidate, manage and make this information actionable, including Data Management Platforms (DMPs), which integrate and analyze diverse data points and help identify and target an individual. Through state-of-the-art online campaigns, companies like AB InBev collect huge amounts of key data.
For example, the company created a platform in Colombia not long ago—[quote] “a central online store where customers could share their location and place their order which was then sent via WhatsApp to their local grocer to be fulfilled…it digitized every (convenience) store, in every corner, in every block, in every neighborhood and connected them” [unquote] to its online store.

Pervasive surveillance on social media used for insight generation: Alcohol companies deploy abundant “social listening” strategies that use sentiment mining, AI-driven computer vision and other tools to understand what is being said, by whom and where, about the brand or topics that can be better leveraged for marketing; for example, to help pinpoint who are the most influential or useful voices to reach out to. Much of this work is conducted 24/7, with real-time capabilities to take advantage of what is identified.

E-commerce: Online is increasingly an environment that seamlessly merges content, sales, marketing, and payment. Alcoholic beverage companies are taking advantage of the powerful data-driven promotion engines that operate these online sales channels, to make sure you see their product, place it in the shopping cart, and buy it. Leading grocery and retail companies have also established their own highly developed online marketing operations that work with alcoholic beverage and other brands to showcase them on their e-commerce and online marketing sites; another source of privacy concern, as data sets merge.

The use of neuroscience and other emotional technologies: These are used to identify how to trigger non-rational responses to marketing, including measuring the emotional intensity of an ad as well as assessing how well a person’s memory encodes that message. Alcohol companies (and many others) hook subjects up to EEGs and other similar tech to map their brainwave responses to ads and content. Then an ad or message is honed and deployed. These tools are also used “in flight” [during a running ad campaign] to correct errors and fine-tune their impact.

Repositioning themselves as providers of economic opportunity and social good: A recent trend is for alcohol marketers to position themselves as generating economic opportunity for small businesses, as a strategy to deepen their connections for data. For example, in Brazil last year during Carnival, one alcoholic beverage company used emails, push notifications, text messaging, an app, an ecommerce platform, personalized QR codes and social media to support nearly 11,000 street vendors working out of their homes, who ended up selling 200,000 of the brand’s products. It established a critical digital link between the vendors, the alcohol brand, and its customers.

Providers of technology: This is especially true with branded alcoholic beverage company mobile apps, which are a key source of data gathering, monitoring of consumer behaviors (including geolocation) and enrollment in loyalty programs, and which become an immediate influence and marketing channel. These apps are also used for sales and payments, creating another highly valuable data source.

Penetrating further into the community: Mobile and other digital marketing tech enables highly targeted, geo-aware campaigns. For example, in South Africa one brand—as part of a wider social media effort—used what’s known as DOOH (digital out-of-home advertising)—giving away software while encouraging its targets to [quote] create a personalized shout out to someone special and then select a digital billboard at a specific location for their message to be displayed on.
[unquote]. Finally, creating impressive online experiences—such as music events to connect to youth. In China, Jagermeister, which knew it was losing its youth demographic, created [quote] “two days-worth of performance lineups and subculture experiences” [unquote] with livestreaming music and other ways to engage and interact with its young audience. This event claimed to reach 200 million impressions. There are many more examples of such experiential virtual campaigns by alcoholic beverage companies.

Policy Options: This is an optimum time to seek safeguards regarding the marketing of alcoholic beverages to underage consumers, as well as to address public health concerns about adult consumption overall. Concern over the loss of privacy and autonomy, as well as the impact on youth development and health, is fueling greater interest by policymakers in regulating digital marketing. For example, here in the U.S. we have a new proposed rulemaking on surveillance marketing by the Federal Trade Commission, which offers multiple opportunities for the public health community to call for safeguards. In the EU, there is the GDPR, the Digital Services Act and other consumer legislation at the national and EU level that can be considered. The UK’s privacy commissioner has begun to enforce its new “Design Code” that governs how the online industry interacts with children and adolescents. There are data protection commissioners in many countries, as well as varying laws, that should be assessed.

To advance these opportunities, public health advocates will likely find support from the global community of public interest privacy and consumer protection NGOs and scholars, who could be enlisted to identify the potential remedies and develop the appropriate regulatory complaints. The WHO, of course, is at the forefront of documenting many of the practices we’ve discussed, including its recent work on digital marketing of unhealthy foods and beverages, breast milk substitutes, and alcohol. As these reports show, and as this conference reflects, the significant advances by these producers and marketers into the digital sphere, which operates now as such a key force in our lives, should be challenged. Limits and expectations for this industry should be set, along with ongoing research into the effects of such marketing and continued analysis of its marketing operations. With timely action, we might be able to set a healthier course for the role that alcoholic beverages can play in our societies. Thank you.

Jeff Chester
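As an illustration of the “social listening” practices described in the remarks above, here is a minimal, hypothetical sketch of how posts mentioning a brand might be scored for sentiment and used to flag high-reach accounts for influencer outreach. The keyword lists, thresholds, and data structures are simplifying assumptions, not any company's actual tooling.

```python
# Hypothetical sketch of brand "social listening": score posts that mention a
# brand and surface high-reach authors for outreach. Keyword lists and
# thresholds are illustrative only.

POSITIVE = {"love", "great", "refreshing", "best"}
NEGATIVE = {"hate", "awful", "hangover", "worst"}

def sentiment_score(text: str) -> int:
    words = set(text.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

def flag_influencers(posts, min_followers=10_000, min_score=1):
    """Return authors whose positive brand mentions make them outreach targets."""
    flagged = {}
    for post in posts:
        score = sentiment_score(post["text"])
        if score >= min_score and post["followers"] >= min_followers:
            flagged[post["author"]] = max(score, flagged.get(post["author"], 0))
    return flagged

posts = [
    {"author": "@fan_one", "followers": 52_000, "text": "Love this drink, best night"},
    {"author": "@critic", "followers": 80_000, "text": "Worst hangover ever"},
    {"author": "@small_acct", "followers": 300, "text": "great drink"},
]

print(flag_influencers(posts))  # -> {"@fan_one": 2}
```

Production systems rely on machine-learning sentiment models, computer vision, and 24/7 data streams rather than keyword lists, but the basic loop of monitoring, scoring, and selecting whom to leverage is the same.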
-
A coalition of more than 100 organizations is sending two letters to Congress urging action. A letter addressed to Senate Majority Leader Chuck Schumer and Minority Leader Mitch McConnell, from 145 organizations, urges them to advance KOSA and COPPA to full Senate votes. A letter addressed to House Energy and Commerce Chair Frank Pallone and Ranking Member Cathy McMorris Rodgers, from 158 organizations, urges them to introduce a House companion bill to KOSA. The advocates state in the letter to the Senate: “The enormity of the youth mental health crisis needs to be addressed as the very real harms of social media are impacting our children today. Taken together, the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act would prevent online platforms from exploiting young users’ developmental vulnerabilities and targeting them in unfair and harmful ways.” kosa_coppa_senate_leadership_letter_final_9.12.22-1.pdf, eandc_leadership_kosa_letter_final_9.12.22-1.pdf, kosa_coppa_rally_press_release_embargo_to_9_13.pdf
-
Press Release
Press Statement regarding today’s FTC Notice of Proposed Rulemaking on Commercial Surveillance and Data Security
Press Statement regarding today’s FTC Notice of Proposed Rulemaking on Commercial Surveillance and Data Security

Katharina Kopp, Deputy Director, Center for Digital Democracy:

Today, the Federal Trade Commission issued its long overdue advance notice of proposed rulemaking (ANPRM) regarding a trade regulation rule on commercial surveillance and data security. The ANPRM aims to address the prevalent and increasingly unavoidable harms of commercial surveillance. Civil society groups, including civil rights groups, privacy and digital rights organizations and children’s advocates, had previously called on the commission to initiate this trade regulation rule to address the commission’s decades-long failure to rein in predatory corporate practices online. CDD had called on the commission repeatedly over the last two decades to address the out-of-control surveillance advertising apparatus that is the root cause of increasingly unfair, manipulative, and discriminatory practices harming children, teens, and adults, and which have a particularly negative impact on equal opportunity and equity.

The Center for Digital Democracy welcomes this important initial step by the commission and looks forward to working with the FTC. CDD urges the commission to move forward expeditiously with the rulemaking and to ensure fair participation of stakeholders, particularly those that are disproportionately harmed by commercial surveillance.

press_statement_8-11fin.pdf
-
Blog
Protecting Children and Teens from Unfair and Deceptive Marketing, including Stealth Advertising
CDD Comments to FTC for "Stealth" Marketing Inquiry

The Center for Digital Democracy (CDD) urges the FTC to develop and implement a set of policies designed to protect minors under 18 from being subjected to a host of pervasive, sophisticated and data-driven digital marketing practices. Children and teens are targeted by an integrated set of online marketing operations that are manipulative, unfair, invasive and can be especially harmful to their mental and physical health. The commission should make abundantly clear at the forthcoming October workshop that it understands that the many problems generated by contemporary digital marketing to youth transcend narrow categories such as “stealth advertising” and “blurred content.” Nor should it propose “disclosures” as a serious remedy, given the ways advertising is designed using data science, biometrics, social relationships and other tactics. Much of today’s commercially supported online system is purposefully developed to operate as “stealth”—from product development, to deployment, to targeting, tracking and measurement. Age-based cognitive development capacities to deal with advertising, largely based on pre-digital (especially TV) research, simply don’t correspond to the methods used today to market to young people. CDD calls on the commission to acknowledge that children and teenagers have been swept into a far-reaching commercial surveillance apparatus.

The commission should propose a range of safeguards to protect young people from the current “wild west” of omnichannel marketing directed at them. These safeguards should address, for example, the role market research and testing of child- and teen-directed commercial applications and messaging play in the development of advertising; how neuromarketing practices designed to leverage a young person’s emotions and subconscious are used to deliver “implicit persuasion”; the integration by marketers and platforms of “immersive” applications, including augmented and virtual reality, designed to imprint brand and other commercial messages; the array of influencer-based strategies, including the extensive infrastructure used by platforms and advertisers to deliver, track and measure their impact; the integration of online marketing with Internet of Things objects, including product packaging and QR codes (experiential marketing), and digital out-of-home advertising screens; as well as contemporary data marketing operations that use machine learning and artificial intelligence to open up new ways for advertisers to reach young people online. AI services increasingly deliver personalized content online, further automating the advertising process to respond in real time.

It is also long overdue for the FTC to investigate and address how online marketing targets youth of color, who are subjected to a variety of advertising practices little examined by privacy and other regulators. The FTC should use all its authority and power to stop data-driven surveillance marketing to young people under 18; end the role sponsored influencers play; enact rules designed to protect the online privacy of teens 13-17, who are now subjected to ongoing tracking by marketers; and propose policies to redress the core methods employed by digital advertisers and online platforms to lure both children and teens. For more than 20 years, CDD and its allies have urged the FTC to address the ways digital marketing has undermined consumer protection and privacy, especially for children and adolescents.
Since the earliest years of the commercial internet, online marketers have focused on young people, both for the revenues they deliver as well as to secure loyalty from what the commercial marketing industry referred to as “native” users. The threat to their privacy, as well as to their security and well-being, led to the complaint our predecessor organization filed in 1996, which spurred the passage of the Children’s Online Privacy Protection Act (COPPA) in 1998. COPPA has played a modest role protecting some younger children from experiencing the totality of the commercial surveillance marketing system. However, persistent failures of the commission to enforce COPPA; the lack of protections for adolescents (despite decades-long calls by advocates for the agency to act on this issue); and a risk-averse approach to addressing the methods employed by the digital advertising industry, even when applied to young people, have created ongoing threats to their privacy, consumer protection and public health. In this regard, we urge the commission to closely review the comments submitted in this proceeding by our colleague Fairplay and allies. We are pleased Fairplay supports these comments.

If the FTC is to confront how the forces of commercial digital surveillance impact the general public, the building blocks to help do so can be found in this proceeding. Young people are exposed to the same unaccountable forces that are everywhere online: a largely invisible, ubiquitous, and machine-intelligence-driven system that tracks and assesses our every move, using an array of direct and indirect techniques to influence behaviors. If done correctly, this proceeding can help inform a larger policy blueprint for what policy safeguards are needed—for young people and for everyone else.

The commission should start by reviewing how digital marketing and data-gathering advertising applications are “baked in” at the earliest stages of online content and device development. These design and testing practices have a direct impact on young people. Interactive advertising standards groups assess and certify a host of approved ad formats, including for gaming, mobile, native advertising, and streaming video. Data practices for digital advertising, including ways that ads are delivered through the behavioral/programmatic surveillance engines, as well as their measurement, are developed through collaborative work involving trade organizations and leading companies. Platforms such as Meta, as well as ad agencies, adtech companies, and brands, also have their own variations of these widely adopted formats and approaches. The industry-operated standards process for identifying new methods for digital advertising, including the real-world deployment of applications such as “playable” ads or the ways advertisers can change their personalized messaging in real time, has never been seriously investigated by the commission. A review of the companies involved shows that many are engaged in digital marketing to young people.

Another critical building block of contemporary digital marketing to address when dealing with youth-directed advertising is the role of “engagement.” As far back as 2006, the Interactive Advertising Bureau (IAB) recognized that to effectively secure the involvement of individuals with marketing communications, at both the subconscious and conscious levels, it was necessary to define and measure the concept of engagement.
IAB initially defined “Engagement… [as] turning on a prospect to a brand idea enhanced by the surrounding context.” By 2012, there were more elaborate definitions identifying “three major forms of engagement… cognitive, physical and emotional.” A set of corresponding metrics, or measurement tools, was used, including those tracking “attention” (“awareness, interest, intention”); emotional and motor functioning identified through biometrics (“heart palpitations, pupil dilation, eye tracking”); and omnipresent tracking of online behaviors (“viewability and dwell time, user initiated interaction, clicks, conversions, video play rate, game play”). Today, research and corresponding implementation strategies for engagement are an ongoing feature of the surveillance-marketing economy. This includes conducting research and implementing data-driven and other ad strategies targeting children 11 and younger—known as “Generation Alpha”—and teens—“Generation Z.”

We will briefly highlight some crucial areas this proceeding should address:

Marketing and product research on children and adolescents: An extensive system designed to ensure that commercial online content, including advertising and marketing, effectively solicits the interest and participation of young people is a core feature of the surveillance economy. A host of companies are engaged in multi-dimensional market research, including panels, labs, platforms, streaming media companies, studios and networks, that have a direct impact on the methods used to advertise and market to youth. CDD believes that such product testing, which can rely on a range of measures designed to promote “implicit persuasion,” should be considered an unfair practice generally. Since CDD and U.S. PIRG first urged the commission to investigate neuromarketing more than a decade ago, this practice has expanded in ways that enable it to play a greater role in influencing how content and advertising are delivered to young people.

For example, MediaScience (which began as the Disney Media and Advertising Lab) serves major clients including Disney, Google, Warner Media, TikTok, Paramount, Fox and Mars. It conducts research for platforms and brands using such tools as neurometrics (skin conductivity and heart rate), eye tracking, facial coding, and EEGs, among others, that assess a person’s responses across devices. Research is also conducted outside of the lab setting, such as directly through a subject’s “actual Facebook feed.” It has a panel of 80,000 households in the U.S., where it can deliver digital testing applications using a “variety of experimental designs… facilitated in the comfort of people’s homes.” The company operates a “Kids” and “Teens” media research panel. Emblematic of the far-reaching research conducted by platforms, agencies and brands, in 2021 TikTok’s “Marketing Science team” commissioned MediaScience to use neuromarketing research to test “strong brand recall and positive sentiment across various view durations.” The findings indicated that “ads on TikTok see strong brand recall regardless of view duration…. Regardless of how long an ad stays on screen, TikTok draws early attention and physiological engagement in the first few seconds.”

NBCUniversal is one of the companies leveraging the growing field of “emotional analytics” to help advance advertising for streaming and other video outlets.
Comcast’s NBCU is using “facial coding and eye-tracking AI to learn an audience’s emotional response to a specific ad.” Candy company Mars just won a “Best Use of Artificial Intelligence” award for its “Agile Creative Expertise” (ACE) tool that “tracks attentional and emotional response to digital video ads.” Mars is partnering with neuromarketer Realeyes to “measure how audience’s attention levels respond as they view Mars' ads. Knowing what captures and retains attention or even what causes distraction, generated intelligence that enabled Mars to optimize the creative itself or the selection of the best performing ads across platforms including TikTok, Facebook, Instagram and YouTube.” TikTok, Meta/Facebook, and Google have all used a variety of neuromarketing measures. The Neuromarketing Science and Business Association (NMSBA) includes many of the leading companies in this field as members. There is also an “Attention Council” within the digital marketing industry to help advance these practices, involving Microsoft, Mars, Coca-Cola, AB InBev, and others.

A commercial research infrastructure provides a steady drumbeat of insights so that marketers can better target young people on digital devices. Children’s streaming video company Wildbrain, for example, partnered with Ipsos for its 2021 research report, “The Streaming Generation,” which explained that “Generation Alpha [is] the most influential digital generation yet…. They have never known a world without digital devices at their fingertips, and for Generation Alpha (Gen A), these tech-first habits are now a defining aspect of their daily lives.” More than 2,000 U.S. parents and guardians of children 2-12 were interviewed for the study, which found that “digital advertising to Gen A influences the purchasing decisions of their parents…. Their purchasing choices, for everything from toys to the family car, are heavily influenced by the content kids are watching and the ads they see.” The report explains that among the “most popular requests” are toys, digital games, clothing, tech products and “in-game currencies” for Roblox and Fortnite.

Measures of the levels of “brand love” among children and teens, such as “Kidfinity” and “Teenfinity” scores—“proprietary measures of brand awareness, popularity and love”—are regularly provided to advertisers. Other market researchers, such as Beano Studios, offer a “COPPA-compliant” “Beano Brain Omnibus” website that, through “games, quizzes, and bespoke questions” for children and teens, “allows brands to access answers to their burning questions.” These tools help marketers better identify, for example, the sites—such as TikTok—where young people spend time. Among the other services Beano provides, which reflect many other market-research companies’ capabilities, are “Real-time UX/UI and content testing—in the moment, digital experience exploration and evaluation of brands websites and apps with kids and teens in strawman, beta or live stages,” and “Beano at home—observing and speaking to kids in their own homes. Learning how and what content they watch.”

Adtech and other data marketing applications: In order to conduct any “stealth” advertising inquiry, the FTC should review the operations of contemporary “Big Data”-driven ad systems that can impact young people. For example, Disney has an extensive and cutting-edge programmatic apparatus called DRAX (Disney Real-Time Ad Exchange) that is delivering thousands of video-based campaigns.
DRAX supports “Disney Select,” a “suite of ad tech solutions, providing access to an extensive library of first-party segments that span the Disney portfolio, including streaming, entertainment and sports properties…. Continuously refined and enhanced based on the countless ways Disney connects with consumers daily. Millions of data inputs validated through data science…. Advertisers can reach their intended audiences by tapping into Disney’s proprietary Audience Graph, which unifies Disney’s first party data and audience modeling capabilities….” As of March 2022, Disney Select contained more than 1,800 “audience segments built from more than 100,000 audience attributes that fuel Disney’s audience graph.” According to Disney Advertising, its “Audience Graph” includes 100 million households, 160 million connected TV devices and 190 million device IDs, which enables modeling to target households and families. Children and teens are a core audience for Disney, and millions of their households receive its digital advertising.

Many other youth-directed leading brands have developed extensive internal adtech applications designed to deliver ongoing and personalized campaigns. For example, Pepsi, Coca-Cola, McDonald’s, and Mondelez have in-house capabilities and extensive partnerships that create targeted marketing to youth and others. The ways that “Big Data” analytics affect marketing, especially how insights can be used to target youth, should be reviewed. Marketers will say to the FTC that they are only targeting those 18 and over, but an examination of their actual targets, and asking for the child-related brand-safety data they collect, should provide the agency with a robust response to such claims. New methods to leverage a person’s informational details and then target them, especially without “cookies,” require the FTC to address how they are being used to market to children and teens. This review should also be extended to “contextual” advertising, since that method has been transformed through the use of machine learning and other advanced tactics—called “Contextual 2.0.”

Targeting youth of color: Black, Hispanic, Asian-American and other “multicultural” youth, as the ad industry has termed it, are key targets for digital advertising. An array of research, techniques, and services is focused on these young people, whose behaviors online are closely monitored by advertisers. A recent case study to consider is the McDonald’s U.S. advertising campaign designed to reverse its “decline with multicultural youth.” The goal of its campaign involving musician Travis Scott was to “drive penetration by bringing younger, multicultural customers to the brands… and drive immediate behavior too.” As a case study explains, “To attract multicultural youth, a brand… must have cultural cachet. Traditional marketing doesn’t work with them. They don’t watch cable TV; they live online and on social media, and if you are not present there you’re out of sight, out of mind.”

It’s extremely valuable to identify some of the elements involved in this case, which are emblematic of the integrated set of marketing and advertising practices that accompany so many campaigns aimed at young people.
These included working with a celebrity/influencer who is able to “galvanize youth and activate pop culture”; offering “coveted content—keepsakes and experiences to fuel the star’s fanbase, driving participation and sales”; employing digital strategies through a proprietary (and data-collecting) “app to bring fans something extra and drive digital adoption”; and focusing on “affordability”—to ensure “youth with smaller wallets” would participate. To illustrate how expenditures for paid advertising are much less relevant with digital marketing, McDonald’s explains that “Before a single dollar had been spent on paid media, purely on the strength of a few social posts by McDonald’s and Travis Scott, and reporting in the press, youth were turning up at restaurants across the country, asking for the Travis Scott meal.” This campaign was a significant financial success for McDonald’s. Its partnership with this influencer was effective as well in terms of “cultural response: hundreds of thousands of social media mentions and posts, fan-art and memes, unboxing videos of the meal…, fans selling food and stolen POS posters on eBay…, the multi merch drops that sold out in seconds, the framed receipts.” Online ads targeted to America’s diverse communities of young people, who can also be members of a group at risk (due to finances, health, and the like), have long required an FTC investigation. The commission should examine the data-privacy and marketing practices on these sites, including those that communicate via languages other than English.

Video and Video Games: Each of these applications has developed an array of targeted advertising strategies to reach young people. Streaming video is now a part of the integrated surveillance-marketing system, creating a pivotal new place to reach young people, as well as generate data for further targeting. Children and teens are viewing video content on Smart TVs, other streaming devices, mobile phones, tablets, and computers. Data on households where young people reside, amplified through the use of a growing number of “identity” tools that permit cross-device tracking, enables an array of marketing practices to flourish. The commission should review the data-gathering, ad-formatting, and other business practices that have been identified for these “OTT” services and how they impact children and teens. There are industry-approved ad-format guidelines for digital video and Connected TV. Digital video ads can use “dynamic overlays,” “shoppable and actionable video,” “voice-integrated video ads,” “sequential CTV creative,” and “creative extensions,” for example. Such ad formats and preferred practices are generally not vetted in terms of how they impact the interests of young people.

Advertisers have strategically embedded themselves within the video game system, recognizing that it’s a key vantage point to surveil and entice young people. One leading quick-service restaurant chain that used video games to “reach the next generation of fast-food fans” explained that “gaming has become the primary source of entertainment for the younger generation. Whether playing video games or watching others play games on social platforms, the gaming industry has become bigger than the sports and music industries combined. And lockdowns during the global pandemic accelerated the trend.
Gaming is a vital part of youth culture.” Illustrating that marketers understand that traditional paid advertising strategies aren’t the most effective to reach young people, the fast-food company decided to “approach gaming less like an advertising channel and more like an earned social and PR platform…. [V]ideo games are designed as social experiences.” As Insider Intelligence/eMarketer reported in June 2022, “there’s an ad format for every brand” in gaming today, including interstitial ads, rewarded ads, offerwalls, programmatic in-game ads, product placement, advergames, and “loot boxes.” There is also an “in-game advertising measurement” framework, recently released for public comment by the IAB and the Media Rating Council. This is another example where leading advertisers, including Google, Microsoft, PepsiCo and Publicis, are determining how “ads that appear within gameplay” operate. These guidelines will impact youth, as they will help determine the operations of such ad formats as “Dynamic In-Game Advertising (DIGA)—Appear inside a 3D game environment, on virtual objects such as billboards, posters, etc. and combine the customization of web banners where ads rotate throughout the play session”; and “Hardcoded In-Game Ad Objects: Ads that have not been served by an ad server and can include custom 3D objects or static banners. These ads are planned and integrated into a video game during its design and development stage.” Leading advertising platforms such as Amazon sell as a package video ads reaching both streaming TV and gaming audiences. The role of gaming and streaming should be a major focus in October, as well as in any commission follow-up report.

Influencers: What was once largely celebrity-based or word-of-mouth-style endorsement has evolved into a complex system including nano-influencers (between 1,000 and 10,000 followers); micro-influencers (between 10,000 and 100,000); macro-influencers (between 100,000 and a million); and mega or celebrity influencers (1 million-plus followers). According to a recent report in the Journal of Advertising Research, “75 percent of marketers are now including social-media influencers in their marketing plans, with a worldwide market size of $2.3 billion in 2020.” Influencer marketing is also connected to social media marketing generally, where advertisers and others have long relied on a host of surveillance-related systems to “listen,” analyze and respond to people’s social online communications.

Today, a generation of “content creators” (aka influencers) is lured into becoming part of the integrated digital sales force that sells to young people and others. From “unboxing videos” and “virtual product placement” in popular content, to “kidfluencers” like Ryan’s World and “brand ambassadors” lurking in video games, to favorite TikTok creators pushing fast food, this form of digital “payola” is endemic online.

Take Ryan’s World. Leveraging “more than one billion views” on YouTube, as well as a Nickelodeon show, has “catapulted him... to a global multi-category force,” notes his production and licensing firm. The deals include a preschool product line in multiple categories, “best in class” partnerships, and a “Tag with Ryan” app that garnered 16 million downloads.
Brands seeking help selling products, says Ryan’s media agency, “can connect with its kid fanbase of millions that leverages our world-class portfolio of kid-star partners to authentically and seamlessly connect your brand with Generation Alpha across YouTube, social media, mobile games, and OTT channels—everywhere kids tune in!... a Generation Alpha focused agency that delivers more than 8 BILLION views and 100 MILLION unique viewers every month!” (its emphasis). Also available is a “custom content and integrations” feature that can “create unique brand experiences with top-tier kid stars.” Ryan’s success is not unique, as more and more marketers create platforms and content, as well as merge companies, to deliver ads and marketing to children and teens. An array of influencer marketing platforms that offer “one-stop” shopping for brands to employ influencers, including through the use of programmatic marketing-like data practices (to hire people to place endorsements, for example), is a core feature of the influencer economy. There are also software programs so brands and marketers can automate their social influencer operations, as well as social media “dashboards” that help track and analyze social online conversations, brand mentions and other communications. The impact of influencers is being measured through a variety of services, including neuromarketing. Influencers are playing a key role in “social commerce,” where they promote the real-time sales of products and services on “shoppable media.” U.S. social commerce sales are predicted to grow to almost $80 billion in 2025 from an estimated 2022 total of $45.74 billion. Google, Meta, TikTok, Amazon/Twitch and Snapchat all have significant influencer marketing operations. As Meta/Facebook recently documented, there is also a growing role for “virtual” influencers that are unleashed to promote products and services. While there may be claims that many promotions and endorsements should be classified as “user generated content” (UGC), we believe the commission will find that the myriad influencer marketing techniques often play a role in spurring such product promotion.

The “Metaverse”: The same forces of digital marketing that have shaped today’s online experience for young people are already at work organizing the structure of the “metaverse.” There are virtual brand placements, advertisements, and industry initiatives on ad formats and marketing experiences. Building on work done for gaming and esports, this rapidly emerging marketing environment poses additional threats to young people and requires timely commission intervention.

Global Standards: Young people in the U.S. have fewer protections than they do in other countries and regions, including the European Union and the United Kingdom. In the EU, for example, protections are required for young people until they are 18 years of age. The impact of the GDPR, the UK’s Design Code, the forthcoming Digital Services Act (and even some self-regulatory EU initiatives by companies such as Google) should be assessed. In what ways do U.S.-based platforms and companies provide higher or more thorough safeguards for children when they are required to do so outside of this country? The FTC has a unique role to ensure that U.S. companies operating online are in the forefront—not in the rear—of protecting the privacy and interests of children.

The October Workshop: Our review of the youth marketing landscape is just a partial snapshot of the marketplace.
We have not discussed “apps” and mobile devices, which pose many concerns, including those related to location, for example. But CDD hopes this comment will help inform the commission about the operations of contemporary marketing and its relationship to young people. We call on the FTC to ensure that this October, we are presented with an informed and candid discussion of the nature and impact of today’s marketing system on America’s youth.
ftcyouthmarketing071822.pdf
Jeff Chester
-
Considering Privacy Legislation in the context of contemporary digital data marketing practices

Last week, the leading global advertisers, online platforms and data marketers gathered for the most important awards given by the ad industry—the “Cannes Lions.” Reviewing the winners and the “shortlist” of runners-up—competing in categories such as “Creative Data,” “Social and Influencer,” “Brand Experience & Activation,” “Creative Commerce” and “Mobile”—is essential to learn where the data-driven marketing business—and ultimately much of our digital experiences—is headed. An analysis of the entries reveals a growing role for machine learning and artificial intelligence in the creation of online marketing, along with geolocation tracking, immersive content and other “engagement” technologies. One takeaway, not surprisingly, is that the online ad industry continues to perfect techniques to secure our interest in its content so it can gather more data from us.

A U.S.-based company that also generated news during Cannes was The Trade Desk, a relatively unknown data marketing service that is playing a major role assisting advertisers and content providers in overcoming any new privacy challenges posed by emerging or future legislation. The Trade Desk announced last week a further integration of its data and ad-targeting service with Amazon’s cloud AWS division, as well as a key role assisting grocer Albertsons’ new digital ad division. The Trade Desk has brokered a series of alliances and partnerships with Walmart, the Washington Post, Los Angeles Times, Gannett, NBC Universal, and Disney—to name only a few.

There are several reasons these marketers and content publishing companies are aligning themselves with The Trade Desk. One of the most important is the company’s leadership in developing a method to collect and monetize a person’s identity for ongoing online marketing. “Unified ID 2.0” is touted to be a privacy-focused method that enables surveillance and effective ad targeting. The marketing industry refers to these identity approaches as “currencies” that enable the buying and selling of individuals for advertising. There are now dozens of identity “graph” or “identity spine” services, in addition to Unified ID 2.0, which reflect far-reaching partnerships among data brokers, publishers, adtech specialists, advertisers and marketing agencies. Many of these approaches are interoperable, such as the one involving Acxiom spin-off LiveRamp and The Trade Desk. A key goal, when you listen to what these identity brokers say, is that they would like to establish a universal identifier for each of us, to directly capture our attention, reap our data, and monetize our behavior.

For the last several years, as a result of the enactment of the GDPR in the EU, the passage of privacy legislation in California, and the potential of federal privacy legislation, Google, Apple, Firefox and others have made changes or announced plans related to their online data practices. So-called “third-party cookies,” which have long enabled commercial surveillance, are being abandoned—especially since their role has repeatedly raised concerns from data-protection regulators. Taking their place are what the surveillance marketing business believes are privacy-regulation-proof strategies. There are basically two major, but related, efforts that have been underway—here in the U.S.
and globally. The first tactic is for a platform or online publisher to secure the use of our information through an affirmative consent process—called a “first-party” data relationship in the industry. The reasoning is that an individual wants an ongoing interaction with the site—for news, videos, groceries, drugs and other services, etc. Under this rationale, we are said to understand and approve how platforms and publishers will use our information as part of the value exchange. First-party data is becoming the most valuable asset in the global digital marketing business, enabling ongoing collection, generating insights, and helping maintain the surveillance model. It is considered to have few privacy problems. All the major platforms that raise so many troubling issues—including Google, Amazon, Meta/Facebook—operate through extensive first-party data relationships. It’s informative to see how the lead digital marketing trade group—the Interactive Advertising Bureau (IAB)—explains it: “first party data is your data…presents the least privacy concerns because you have full control over its collection, ownership and use.”

The second tactic is a variation on the first, but also relies on various forms of identity-resolution strategies. It’s a response in part to the challenges posed by the dominance of the “walled garden” digital behemoths (Google, etc.) as well as the need to overcome the impact of privacy regulation. These identity services are the replacement for cookies. Some form of first-party data is captured (and streaming video services are seen as a gold mine here to secure consent), along with additional information using machine learning to crunch data from public sources and other “signals.” Multimillion-member panels of consumers who provide ongoing feedback to marketers, including information about their online behaviors, also help better determine how to effectively fashion the digital targeting elements. The Trade Desk-led Unified ID 2.0 is one such identity framework. Another is TransUnion’s “Fabrick,” which “provides marketers with a sustainable, privacy-first foundation for all their data management, marketing and measurement needs.” Such rhetoric is typical of how the adtech/data broker/digital marketing sectors are trying to reframe how they conduct surveillance.

Another related development, as part of the restructuring of the commercial surveillance economy, is the role of “data clean rooms.” Clean rooms enable data to be processed under specific rules set up by a marketer. As Advertising Age recently explained, clean rooms enable first-party and other marketers to provide “access to their troves of data.” For Comcast’s NBCU division and Disney, this treasure chest of information comes from “set-top boxes, streaming platforms, theme parks and movie studios.” Various privacy rules are supposed to be applied; in some cases where they have consent, two or more parties will exchange their first-party data. In other cases, where they may not have such open permission, they will be able to “create really interesting ad products; whether it's a certain audience slice, or audience taxonomy, or different types of ad units….” As an NBCU executive explained about its clean room activity, “we match the data, we build custom audiences…we plan, activate and we measure.
The clean room is now the safe neutral sandbox where all the parties can feel good sharing first party data without concerns of data leakage.”We currently have at least one major privacy bill in Congress that includes important protections for civil rights and restricts data targeting of children and teens, among other key provisions. It’s also important when examining these proposals to see how effective they will be in dealing with the surveillance marketing industry’s current tactics. If they don’t effectively curtail what is continuous and profound surveillance and manipulation by the major digital marketers, and also fail to rein in the power of the most dominant platforms, will such a federal privacy promise really deliver? We owe it to the public to determine whether such bills will really “clean up” the surveillance system at the core of our online lives.
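To illustrate the kind of first-party data matching that identity frameworks and “clean rooms” are built around, here is a minimal, hypothetical sketch in which two parties match their customer lists on hashed email addresses without exchanging the raw lists. The hashing scheme and field names are simplifying assumptions for this essay, not a description of Unified ID 2.0 or any vendor's actual clean-room product.

```python
# Hypothetical sketch of matching two first-party datasets on hashed emails,
# the basic operation behind identity "currencies" and data clean rooms.
# Real systems add salts, governance rules, and aggregation thresholds.

import hashlib

def hashed(email: str) -> str:
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Party A: a streaming service's subscriber list with viewing segments.
party_a = {hashed("viewer@example.com"): {"segments": ["sports", "news"]}}

# Party B: an advertiser's customer list with purchase history.
party_b = {hashed("viewer@example.com"): {"purchases": ["suv_lease"]},
           hashed("other@example.com"): {"purchases": ["cereal"]}}

# The "match": only identifiers present on both sides produce a joined record.
overlap = {
    key: {**party_a[key], **party_b[key]}
    for key in party_a.keys() & party_b.keys()
}

print(len(overlap))   # -> 1 matched household
print(overlap)        # joined profile used to build an "audience slice"
```

Whether such a joined profile is ever exposed to either party, or only used in aggregate, depends entirely on the rules the clean-room operator sets, which is why the privacy claims made for these arrangements deserve the scrutiny this essay calls for.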
-
-
Groups say FIFA: Ultimate Team preys on children’s vulnerability with loot boxes, “funny money”

Contact:
David Monahan, Fairplay, david@fairplayforkids.org
Jeff Chester, CDD, jeff@democraticmedia.org; 202-494-7100

Advocates call on FTC to investigate manipulative design abuses in popular FIFA game
Groups say FIFA: Ultimate Team preys on children’s vulnerability with loot boxes, “funny money”

BOSTON and WASHINGTON, DC – Thursday, June 2, 2022 – Today, advocacy groups Fairplay and Center for Digital Democracy (CDD) led a coalition of 15 advocacy groups in calling on the Federal Trade Commission (FTC) to investigate video game company Electronic Arts (EA) for unfairly exploiting young users in EA’s massively popular game, FIFA: Ultimate Team. In a letter sent to the FTC, the advocates described how the use of loot boxes and virtual currency in FIFA: Ultimate Team exploits the many children who play the game, especially given their undeveloped financial literacy skills and poor understanding of the odds of receiving the most desirable loot box items.

Citing the Norwegian Consumer Council’s recent report, Insert Coin: How the Gaming Industry Exploits Consumers Using Lootboxes, the advocates’ letter details how FIFA: Ultimate Team encourages gamers to engage in a constant stream of microtransactions as they play the game. Users are able to buy FIFA points, a virtual in-game currency, which can then be used to purchase loot boxes called FIFA packs containing mystery team kits, badges, and player cards for soccer players who can be added to a gamer’s team. In their letter, the advocates noted the game’s use of manipulative design abuses such as “lightning round” sales of premium packs to promote the purchase of FIFA packs, which children are particularly vulnerable to. The advocates also cite the use of virtual currency in the game, which obscures the actual cost of FIFA packs to adult users, let alone children. Additionally, the actual probability of unlocking the best loot box prizes in FIFA: Ultimate Team is practically inscrutable to anyone who is not an expert in statistics, according to the advocates and the NCC report. In order to unlock a specific desirable player in the game, users would have to pay around $14,000 or spend three years continuously playing the game.

“By relentlessly marketing pay-to-win loot boxes, EA is exploiting children’s desire to compete with their friends, despite the fact that most adults, let alone kids, could not determine their odds of receiving a highly coveted card or what cards cost in real money. The FTC must use its power to investigate these design abuses and determine just how many kids and teens are being fleeced by EA.” Josh Golin, Executive Director, Fairplay

“Lootboxes, virtual currencies, and other gaming features are often designed deceptively, aiming to exploit players’ known vulnerabilities. Due to their unique developmental needs, children and teens are particularly harmed. Their time and attention is stolen from them, they're financially exploited, and are purposely socialized to adopt gambling-like behaviors. Online gaming is a key online space where children and teens gather in millions, and regulators must act to protect them from these harmful practices.” Katharina Kopp, Deputy Director, Center for Digital Democracy

“As illustrated in our report, FIFA: Ultimate Team uses aggressive in-game marketing and exploits gamers’ cognitive biases - adults and children alike - to manipulate them into spending large sums of money.
Children especially are vulnerable to EA’s distortion of real-world value of its loot boxes and the complex, misleading probabilities given to describe the odds of receiving top prizes. We join our US partners in urging the Federal Trade Commission to investigate these troubling practices.” Finn Lützow-Holm Myrstad, Digital Policy Director, Norwegian Consumer Council

“The greed of these video game companies is a key reason why we're seeing a new epidemic of child gambling in our families. Thanks to this report, the FTC has more than enough facts to take decisive action to protect our kids from these predatory business practices.” Les Bernal, National Director of Stop Predatory Gambling and the Campaign for Gambling-Free Kids

“Exploiting consumers, especially children, by manipulating them into buying loot boxes that, in reality, rarely contain the coveted items they are seeking, is a deceptive marketing practice that causes real harm and needs to stop. TINA.org strongly urges the FTC to take action.” Laura Smith, Legal Director at TINA.org

Advocacy groups signing today's FTC complaint include Fairplay; the Center for Digital Democracy; Campaign for Accountability; Children and Screens: Institute of Digital Media and Child Development; Common Sense Media; Consumer Federation of America; Electronic Privacy Information Center (EPIC); Florida Council on Compulsive Gambling, Inc.; Massachusetts Council on Gaming and Health; National Council on Problem Gambling; Parent Coalition for Student Privacy; Public Citizen; Stop Predatory Gambling and the Campaign for Gambling-Free Kids; TINA.org (Truth in Advertising, Inc.); U.S. PIRG

###

lootboxletter_pr.pdf, lootboxletterfull.pdf
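A brief note on the arithmetic behind the roughly $14,000 figure cited in the release: the expected spend to pull one specific card is simply the pack price divided by the per-pack probability of that card. The sketch below uses made-up odds and a made-up pack price as a hedged illustration of how such a figure arises; these are not EA's published numbers.

```python
# Worked example of expected loot-box cost. The probability and pack price
# below are illustrative assumptions, not EA's actual odds or prices.

def expected_cost(pack_price: float, prob_per_pack: float) -> float:
    """Expected spend to obtain one specific item (mean of a geometric distribution)."""
    return pack_price / prob_per_pack

# If a coveted player card appeared in, say, 1 in 7,000 packs priced around $2,
# the expected outlay would already be in the thousands of dollars.
print(round(expected_cost(2.00, 1 / 7000)))  # -> 14000
```

Because the purchase is denominated in FIFA points rather than dollars, a child never sees this arithmetic at the point of sale, which is the core of the advocates' complaint about virtual currency obscuring real cost.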
-
Press Release
Press Statement regarding FTC Policy Statement on Education Technology and the Children’s Online Privacy Protection Act
Press Statement regarding today’s FTC Policy Statement on Education Technology and the Children’s Online Privacy Protection Act

Jeff Chester, Executive Director, Center for Digital Democracy:

Today, the Federal Trade Commission adopted a long overdue policy designed to protect children’s privacy. By shielding school children from the pervasive forces of commercial surveillance, which gathers their data for ads and marketing, the FTC is expressly using a critical safeguard from the bipartisan Children’s Online Privacy Protection Act (COPPA). Fairplay, the Center for Digital Democracy, and a coalition of privacy, children’s health, civil and consumer rights groups had previously called on the commission to enact policies that make this very Edtech safeguard possible. We look forward to working with the FTC to ensure that parents can be confident that their child’s online privacy and security is protected in—or out of—the classroom. However, the Commission must also ensure that adolescents receive protections from what is now an omniscient and manipulative data-driven complex that profoundly threatens their privacy and well-being.