Press Release
Advocates demand Federal Trade Commission investigate Google for continued violations of children's privacy law
Following news of Google's violations of COPPA and 2019 settlement, 4 advocates ask FTC for investigation
Contact:
Josh Golin, Fairplay: josh@fairplayforkids.org
Jeff Chester, Center for Digital Democracy: jeff@democraticmedia.org

BOSTON and WASHINGTON, DC – WEDNESDAY, August 23, 2023 – The organizations that alerted the Federal Trade Commission (FTC) to Google's violations of the Children's Online Privacy Protection Act (COPPA) are urging the Commission to investigate whether Google and YouTube are once again violating COPPA, as well as the companies' 2019 settlement agreement and the FTC Act. In a Request for Investigation filed today, Fairplay and the Center for Digital Democracy (CDD) detail new research from Adalytics, as well as Fairplay's own research, indicating that Google serves personalized ads on "made for kids" YouTube videos and tracks viewers of those videos, even though neither practice is permissible under COPPA. Common Sense Media and the Electronic Privacy Information Center (EPIC) joined Fairplay and CDD in calling on the Commission to investigate and sanction Google for its violations of children's privacy. The advocates suggest that the FTC should seek penalties upwards of tens of billions of dollars.

In 2018, Fairplay and the Center for Digital Democracy led a coalition asking the FTC to investigate YouTube for violating COPPA by collecting personal information from children on the platform without parental consent. As a result of the advocates' complaint, Google and YouTube were required to pay a then-record $170 million fine in a 2019 settlement with the FTC and to comply with COPPA going forward.
Rather than getting the required parental permission before collecting personally identifiable information from children on YouTube, Google claimed it would instead comply with COPPA by limiting data collection and eliminating personalized advertising on "made for kids" videos. But an explosive new report released by Adalytics last week called into question Google's assertions and its compliance with federal privacy law. The report detailed how Google appeared to be surreptitiously using cookies and identifiers to track viewers of "made for kids" videos. The report also documented how YouTube and Google appear to be serving personalized ads on "made for kids" videos and transmitting data about viewers to data brokers and ad tech companies.

In response to the report, Google told the New York Times that ads on children's videos are based on webpage content, not targeted to user profiles. But follow-up research conducted independently by both Fairplay and ad buyers suggests the ads are, in fact, personalized and that Google is both violating COPPA and making deceptive statements about its targeting of children. Both Fairplay and the ad buyers ran test ad campaigns on YouTube in which they selected a series of user attributes and affinities for ad targeting and instructed Google to run the ads only on "made for kids" channels. In theory, these test campaigns should have resulted in zero placements, because under Google and YouTube's stated policy, no personalized ads are supposed to run on "made for kids" videos. Yet Fairplay's targeted $10 ad campaign resulted in over 1,400 impressions on "made for kids" channels, and the ad buyers reported similar results. Additionally, the reporting Google provided to Fairplay and the ad buyers to demonstrate the efficacy of the ad buys would not be possible if the ads were contextual, as Google claims. "If Google's representations to its advertisers are accurate, it is violating COPPA," said Josh Golin, Executive Director of Fairplay.
"The FTC must launch an immediate and comprehensive investigation and use its subpoena authority to better understand Google's black box child-directed ad targeting. If Google and YouTube are violating COPPA and flouting their settlement agreement with the Commission, the FTC should seek the maximum fine for every single violation of COPPA and injunctive relief befitting a repeat offender."

The advocates' letter urges the FTC to seek robust remedies for any violations, including but not limited to:

• Civil penalties that demonstrate that continued violations of COPPA and Section 5 of the FTC Act are unacceptable. Under current law, online operators can be fined $50,120 per violation of COPPA. Given the immense popularity of many "made for kids" videos, it is likely millions of violations have occurred, suggesting the Commission should seek civil penalties upwards of tens of billions of dollars.
• An injunction requiring relinquishment of all ill-gotten gains
• An injunction requiring disgorgement of all algorithms trained on impermissibly collected data
• A prohibition on the monetization of minors' data
• An injunction requiring YouTube to move all "made for kids" videos to YouTube Kids and remove all such videos from the main YouTube platform. Given Google's repeated failures to comply with COPPA on the main YouTube platform, even when operating under a consent decree, these videos should be cabined to a platform that has not been found to violate existing privacy law
• The appointment of an independent "special master" to oversee Google's operations involving minors and provide the Commission, Congress, and the public semi-annual compliance reports for a period of at least five years

Katharina Kopp, Deputy Director of the Center for Digital Democracy, said "The FTC must fully investigate what we believe are Google's continuous violations of COPPA, its 2019 settlement with the FTC, and Section 5 of the FTC Act.
These violations place many millions of young viewers at risk. Google and its executives must be effectively sanctioned to stop its 'repeat offender' behaviors, including through a ban on monetizing the personal data of minors, other financial penalties, and algorithmic disgorgement. The Commission's investigation should also review how Google enables advertisers, data brokers, and leading online publisher partners to surreptitiously surveil the online activities of young people. The FTC should set into place a series of 'fail-safe' safeguards to ensure that these irresponsible behaviors will never happen again."

Caitriona Fitzgerald, Deputy Director of the Electronic Privacy Information Center (EPIC), said "Google committed in 2019 that it would stop serving personalized ads on 'made for kids' YouTube videos, but Adalytics' research shows that this harmful practice is still happening. The FTC should investigate this issue, and Google should be prohibited from monetizing minors' data."

Jim Steyer, President and CEO of Common Sense Media, said "The Adalytics findings are troubling but in no way surprising given YouTube's history of violating kids' privacy. Google denies doing anything wrong and the advertisers point to Google, a blame game that makes children the ultimate losers. The hard truth is, companies – whether it's Big Tech or their advertisers – basically care only about their profits, and they will not take responsibility for acting against kids' best interests. We strongly encourage the FTC to take action here to protect kids by hitting tech companies where it really hurts: their bottom line."

###

-
In comments to the Federal Trade Commission, EPIC, the Center for Digital Democracy, and Fairplay urged the FTC to center privacy and data security risks as it evaluates Yoti Inc's proposed face-scanning tool for obtaining verifiable parental consent under the Children's Online Privacy Protection Act (COPPA).

In a supplementary filing, CDD urges the Federal Trade Commission (FTC) to reject the parent-consent method proposed by the applicants, the Entertainment Software Rating Board (ESRB) and Epic Games' SuperAwesome division. Prior to any decision, the FTC must first engage in due diligence and investigate the contemporary issues involving the role and use of facial coding technology and its potential impact on children's privacy. The Commission must have a robust understanding of the data flows and insight generation produced by facial coding technologies, including the debate over their role as a key source of "attention" metrics, which are a core advertising measurement modality. Since this proposal is designed to deliver a significant expansion of children's data collection, given the constellation of brands, advertisers, and publishers involved with the applicants and their child-directed market focus, a digital "cautionary" principle on this consent method is especially required here. Moreover, one of the applicants, as well as several key affiliates of the ESRB (Epic Games, Amazon, and Microsoft), have recently been sanctioned for violating COPPA, and any approval in the absence of thorough fact-finding would be premature.
-
Press Release
Advocates that filed YouTube kids' privacy case call for FTC investigation in light of NYT report of ongoing tracking and ad placement directed at kids
New research released today by Adalytics raises serious questions about whether Google is violating the Children's Online Privacy Protection Act (COPPA) by collecting data and serving personalized ads on child-directed videos on YouTube. In 2019, in response to a Request for Investigation by Fairplay and the Center for Digital Democracy, the Federal Trade Commission fined Google $170 million for violating COPPA on YouTube and required Google to change its data-collection and advertising practices on child-directed videos. As a result of that settlement, Google agreed to stop serving personalized ads and limit data collection on child-directed videos. Today's report - and subsequent reporting by The New York Times - call into question whether Google is complying with the settlement.

STATEMENTS FROM FAIRPLAY AND CDD:

Josh Golin, Executive Director, Fairplay: This report should be a wake-up call to parents, regulators and lawmakers, and anyone who cares about children -- or the rule of law, for that matter. Even after being caught red-handed in 2019 violating COPPA, Google continues to exploit young children and mislead parents and regulators about its data collection and advertising practices on YouTube. The FTC must launch an immediate and comprehensive investigation of Google and, if it confirms this report's explosive allegations, seek penalties and injunctive relief commensurate with the systematic disregard of the law by a repeat offender. Young children should be able to watch age-appropriate content on the world's biggest video platform with their right to privacy guaranteed, full stop.

Jeff Chester, Executive Director, Center for Digital Democracy: Google operates the leading online destination for kids' video programming so it can reap enormous profits, including through commercial surveillance data and advertising tactics.
It must be held accountable by the FTC for what appear to be violations of the Children's Online Privacy Protection Act and its own commitments. Leading advertisers, ad agencies, media companies and others partnering with Google appear to have been more interested in clicks than the safety of youth. There is a massive and systemic failure across the digital marketplace when it comes to protecting children's privacy. Congress should finally stand up to the powerful "Big Data" ad lobby and enact long-overdue privacy legislation. Google's operations must also be dealt with by antitrust regulators. It operates imperiously in the digital arena with no accountability. The Adalytics study should serve as a chilling reminder that our commercial surveillance system is running amok, placing even our most vulnerable at great risk.

-
Press Release
Advocates call for FTC action to rein in Meta's abusive practices targeting kids and teens
Letter from 31 organizations in tech advocacy, children's rights, and health supports FTC action to halt Meta's profiting off of young users' sensitive data
Contact:
David Monahan, Fairplay: david@fairplayforkids.org
Katharina Kopp, Center for Digital Democracy: kkopp@democraticmedia.org

BOSTON/WASHINGTON, DC – June 13, 2023 – A coalition of leading advocacy organizations is standing up today to support the Federal Trade Commission's recent order reining in Meta's abusive practices aimed at kids and teens. Thirty-one groups, led by the Center for Digital Democracy, the Electronic Privacy Information Center (EPIC), Fairplay, and U.S. PIRG, sent a letter to the FTC saying "Meta has violated the law and its consent decrees with the Commission repeatedly and flagrantly for over a decade, putting the privacy of all users at risk. In particular, we support the proposal to prohibit Meta from profiting from the data of children and teens under 18. This measure is justified by Meta's repeated offenses involving the personal data of minors and by the unique and alarming risks its practices pose to children and teens."

Comments from advocates:

Katharina Kopp, Director of Policy, Center for Digital Democracy: "The FTC is fully justified to propose the modifications of Meta's consent decree and to require it to stop profiting from the data it gathers on children and teens. There are three key reasons why.
First, due to their developmental vulnerabilities, minors are uniquely harmed by Meta's repeated failure to comply with its 2012 and 2020 settlements with the FTC, including its non-compliance with the federal children's privacy law (COPPA); second, because Meta has failed for many years to comply with even the procedural safeguards required by the Commission, it is now time for structural remedies that will make it less likely that Meta can again disregard the terms of the consent decree; and third, the FTC must affirm its credibility and that of the rule of law and ensure that tech giants cannot evade regulation and meaningful accountability."

John Davisson, Director of Litigation, Electronic Privacy Information Center (EPIC): "Meta has had two decades to clean up its privacy practices after many FTC warnings, but consistently chose not to. That's not 'tak[ing] the problem seriously,' as Meta claims – that's lawlessness. The FTC was right to take decisive action to protect Meta's most vulnerable users and ban Meta from profiting off kids and teens. It's no surprise to see Meta balk at the legal consequences of its many privacy violations, but this action is well within the Commission's power to take."

Haley Hinkle, Policy Counsel, Fairplay: "Meta has been under the FTC's supervision in this case for over a decade now and has had countless opportunities to put user privacy over profit. The Commission's message that you cannot monetize minors' data if you can't or won't protect them is urgent and necessary in light of these repeated failures to follow the law. Kids and teens are uniquely vulnerable to the harms that result from Meta's failure to run an effective privacy program, and they can't wait for change any longer."

R.J. Cross, Director of U.S. PIRG's Don't Sell My Data campaign: "The business model of social media is a recipe for unhappiness.
We're all fed content about what we should like and how we should look, conveniently presented alongside products that will fix whatever problem with our lives the algorithm has just helped us discover. That's a hard message to hear day in and day out, especially when you're a teen. We're damaging the self-confidence of some of our most impressionable citizens in the name of shopping. It's absurd. It's time to short-circuit the business model."

###

-
Press Release
FTC proposed order on COPPA case against Microsoft ensures COPPA keeps pace with increasingly sophisticated practices of marketers
"By clarifying what types of data constitute personal data under COPPA, the FTC ensures that COPPA keeps pace with the 21st century and the increasingly sophisticated practices of marketers," said Katharina Kopp, Director of Policy at the Center for Digital Democracy. "As interactive technologies evolve rapidly, COPPA must be kept up to date and reflect changes in the way children use and access these new media, including virtual and augmented realities. The metaverse typically involves a convergence of physical and digital lives, where avatars are digital extensions of our physical selves. We agree with the FTC that an avatar's characteristics and its behavior constitute personal information. And as virtual and augmented reality interfaces allow for the collection of extensive sets of personal data, including sensitive and biometric data, this data must be considered personal information under COPPA. Without proper protections, this highly coveted data would be exploited by marketers and used to further manipulate and harm children online."

-
Contact: Katharina Kopp, kkopp [at] democraticmedia.org

"We welcome the FTC's action to address the rampant commercial surveillance of children via Internet of Things (IoT) devices, such as Amazon's Echo, and its enforcement of existing law," said Katharina Kopp, Director of Policy at the Center for Digital Democracy. "Children's data is taken from them illegally and surreptitiously on a massive scale via IoT devices, including their voice recordings and data gleaned from kids' viewing, reading, listening, and purchasing habits. These violations in turn lead to further exploitation and manipulation of children and teens: they violate children's privacy, manipulate them into being interested in harmful products, undermine their autonomy, hook them on digital media, and perpetuate discrimination and bias. As Commissioner Bedoya's separate statement points out, with this proposed order the FTC warns companies that they cannot take data from children and teens (and others) illegitimately to develop even more sophisticated methods of taking advantage of them. Both the FTC and the Department of Justice must hold Amazon accountable."
-
News
The Kids Online Safety Act: Protecting LGBTQ+ Children & Adolescents Online - How changes to the Kids Online Safety Act will protect LGBTQ+ youth
FACT SHEET
Summary of the Kids Online Safety Act

As Congressional hearings, media reports, academic research, whistleblower disclosures, and heartbreaking stories from youth and families have repeatedly shown, social media platforms have exacerbated the mental health crisis among children and teens, fostering body image issues, creating addiction-like use, promoting products that are dangerous for young audiences, and fueling destructive bullying. The Kids Online Safety Act (KOSA) provides children, adolescents, and parents with the tools, safeguards, and transparency they need to protect against threats to young people's health and wellbeing online. The design and operation of online platforms have a significant impact on these harms, such as recommendation systems that send kids down rabbit holes of destructive content, and weak protections against relentless bullying.

KOSA would provide safeguards and accountability through:

• Creating a duty of care for social media platforms to prevent and mitigate specific dangers to minors in their design and operation of products, including the promotion of suicidal behaviors, eating disorders, substance use, sexual exploitation, advertisements for tobacco and alcohol, and more.
• Requiring social media platforms to provide children and adolescents with options to protect their information, disable addictive product features, and opt out of algorithmic recommendations. Platforms are required to enable the strongest settings by default.
• Giving parents new tools to help support their children and providing them (as well as schools) a dedicated reporting channel to raise issues (such as harassment or threats) to the platforms.

How Online Harms Impact LGBTQ+ Communities

Social media can be an important tool for self-discovery, expression, and community. However, online platforms have failed to take basic steps to protect their users from profound harm and have put profit ahead of safety.
Companies have operationalized their products to keep young users on their sites for as long as possible, even if the means of getting people to use their platforms more are harmful. According to documents provided by a whistleblower, Facebook's own researchers described Instagram itself as a "perfect storm" that "exacerbates downward spirals" and produces hundreds of millions of dollars in revenue annually. This "perfect storm" has been shown by academic research and surveys to weigh most profoundly on LGBTQ+ children and adolescents, who are more at risk of bullying, threats, and suicidal behaviors on social media. Some harms and examples of the protections KOSA would provide include:

LGBTQ+ youth are more at risk of cyberbullying and harassment.

• LGBTQ+ high school students consistently report higher rates of cyberbullying than their heterosexual peers, and suffer more severe forms of harassment, such as stalking, non-consensual imagery, and violent threats.
• Surveys have found that 56% of LGBTQ+ students had been cyberbullied in their lifetime, compared to 32% of non-LGBTQ+ students.
• One in three young LGBTQ+ people have said that they had been sexually harassed online, four times as often as other young people.

LGBTQ+ youth are more at risk for eating disorders and substance use.

• Young LGBTQ+ people experience significantly greater rates of eating disorders and substance use compared to their heterosexual and cisgender peers.
• Transgender and nonbinary youth are at even higher risk for eating disorders, and Black LGBTQ+ youth are diagnosed at half the rate of their white peers.
• Prolonged use of social media is linked with negative appearance comparison, which in turn increases the risk of eating disorder symptoms.
• Engagement-based algorithms feed extreme eating disorders by recommending more eating disorder content to vulnerable users (every click or view sends more destructive content to a user).
• For example, TikTok began recommending eating disorder content within 8 minutes of creating a new account, and Instagram was found to deluge a new user with eating disorder recommendations within one day.

How KOSA Will Help:

KOSA would require that platforms give users the ability to turn off engagement-based algorithms, or options to influence the recommendations they receive. A user would be able to stop recommendation systems that are sending them toxic content. KOSA's duty of care requires platforms to prevent and mitigate cyberbullying. It also requires that platforms give users options to restrict messages from other users and to make their profiles private. It would require platforms to provide a point of contact for users to report harassment, and it mandates that platforms respond to these reports within a designated time frame.
LGBTQ+ youth are more at risk of suicide and suicidal behaviors.

• Exposure to hateful messaging online, in tandem with self-harm material on social media, increases young people's risk of suicidal behaviors and/or suicide.
• These risks are exacerbated when platform recommendation systems amplify hateful content and self-harm content. For example, after creating a new teen account on TikTok, suicide content was recommended within three minutes.
• Surveys have found 42% of LGBTQ+ youth seriously considered attempting suicide, including more than half of transgender and nonbinary youth.
• Moreover, eating disorders, depression, bullying, substance use, and other mental health harms that fall harder on LGBTQ+ communities further increase risks of self-harm and suicide.

How KOSA Will Help:

In addition to the core safeguards and options provided to kids, such as controls and transparency over algorithmic recommendation systems, KOSA's duty of care would require platforms to consider and address the ways in which their recommendation systems promote suicide and suicidal behaviors, creating incentives for the platforms to provide self-help resources, uplift information about recovery, and prevent their algorithms from pushing users down rabbit holes of harmful and deadly content.

Protections for LGBTQ+ Communities

The reintroduction of the Kids Online Safety Act takes into account recommended edits from a diverse group of organizations, researchers, youth, and families. The outcome from experts in the field and those with lived experience is a thoughtful and tailored bill designed to be a strong step in advancing a core set of accountability provisions to provide children, adolescents, and families with a safer online experience.
Below is a summary comparing previous bill text and changes that were made for reintroduction.

Concern with previous draft: The "duty of care" is too vague, creating liabilities for broad and undefined harms to children and teens.
How the current draft protects LGBTQ+ youth: The duty of care is now limited to a set of specific harms that have been shown to be exacerbated by online platforms' product designs and algorithms. Specific harms are focused on serious threats to the wellbeing of young users, such as eating disorders, substance use, depression, anxiety, suicidal behaviors, physical violence, sexual exploitation, and the marketing of narcotics, tobacco, gambling, and alcohol. The terms used to describe those harms are linked to clinical or legal definitions where there is a perceived risk of misuse. In addition, the duty of care includes a limitation to ensure it is not construed to require platforms to block access to content that a young user specifically requests, or to block access to evidence-informed medical information and support resources.

Concern with previous draft: The inclusion of "grooming" in the duty of care could be weaponized against entities providing information about gender-affirming care.
How the current draft protects LGBTQ+ youth: "Grooming" was cut from the bill. Sexual exploitation and abuse are now defined using existing federal criminal statutes to prevent politicization or distortion of terms.

Concern with previous draft: The duty of care to prevent and mitigate "self-harm" or "physical harm" could be weaponized against trans youth and those who provide information about gender-affirming care.
How the current draft protects LGBTQ+ youth: The specific reference to "self-harm" has been removed from the duty of care. "Physical harm" has been changed to "physical violence" to enhance clarity. Other covered harms related to "self-harm" are covered using terminology that is anchored in a medical definition.

Concern with previous draft: KOSA will allow non-supportive parents to surveil LGBTQ+ youth online.
How the current draft protects LGBTQ+ youth: The legislation clarifies the tools available to protect kids and accounts for the developmental differences between children and young teens. KOSA has always included requirements that children and adolescents are notified if parental controls are turned on, and that kids know before parents are informed about the creation of a new account. For teens, the bill requires platforms to give parents the ability to restrict purchases, view metrics on how much time a minor is spending on a platform, and view - but not change - account settings. It does not require the disclosure of a minor's browsing behavior, search history, messages, or other content or metadata of their communications.

Concern with previous draft: KOSA will lead to privacy-invasive age verification across the internet.
How the current draft protects LGBTQ+ youth: KOSA never required age verification or age gating, nor did it create liability for companies if kids lie about their age. The bill explicitly states that companies are not required to age-gate or collect additional data to determine a user's age. Additionally, a knowledge standard is applied more consistently across the bill to clarify that companies are not liable if they have no knowledge of whether a user is a child or adolescent.

Concern with previous draft: KOSA will affect access to sexual health information, schools, or nonprofit services.
How the current draft protects LGBTQ+ youth: KOSA's requirements only apply to commercial online platforms, such as social media and games, which have been the largest source of issues for kids online. Nonprofits, schools, and broadband services are exempt from KOSA, and a previous reference to "educational services" was removed from the "covered platform" definition in the bill. KOSA does not apply to health sites or other information resources.

-
CDD urges Congress to adopt stronger online safeguards for kids and teens

Contact: Katharina Kopp, kkopp [at] democraticmedia.org

The Children's Online Privacy Protection Act (COPPA 2.0), introduced by Senators Markey and Cassidy, will provide urgently needed online safeguards for children and teens. It will enact real platform accountability and limit the economic and psychological exploitation of children and teens online, thus addressing the public health crisis they are experiencing. By banning targeted ads to young people under 16, the endless streams of data collected by online companies to profile and track them will be significantly reduced. The ability of digital marketers and platforms to manipulate, discriminate against, and exploit children and teens will be curtailed. COPPA 2.0 will also extend the original COPPA law's protections for youth from 12 to 16 years of age. The proposed law provides the ability to delete children's and teens' data with a click of an "eraser button." With the creation of a new FTC "Youth Marketing and Privacy Division," COPPA 2.0 will ensure young people's privacy rights are enforced.
-
Meta's Virtual Reality-based Marketing Apparatus Poses Risks to Teens and Others

Whether it's called Facebook or Meta, or known by its Instagram, WhatsApp, Messenger or Reels services, the company has always seen children and teens as a key target. The recent announcement opening up the Horizon Worlds metaverse to teens, despite calls to first ensure it will be a safe and healthy experience, is lifted out of Facebook's well-worn political playbook: make whatever promises necessary to temporarily quell any political opposition to its monetization plans. Meta's priorities are intractably linked to its quarterly shareholder revenue reports. Selling our "real" and "virtual" selves to marketers is its only real source of revenue, a higher priority than any self-regulatory scheme Meta offers claiming to protect children and teens.

Meta's focus on creating more immersive, AI/VR, metaverse-connected experiences for advertisers should serve as a wake-up call for regulators. Meta has unleashed a digital environment designed to trigger the "engagement" of young people with marketing, data collection and commercially driven manipulation. Action is required to ensure that young people are treated fairly and not exposed to data surveillance, threats to their health and other harms.

Here are a few recent developments that should be part of any regulatory review of Meta and young people:

Expansion of "immersive" video and advertising-embedded applications: Meta tells marketers it provides "seamless video experiences that are immersive and fueled by discovery," including the "exciting opportunity for advertisers" with its short-video "Reels" system. Through virtual reality (VR) and augmented reality (AR) technologies, we are exposed to advertising content designed to have a greater impact by influencing our subconscious and emotional processes.
With AR ads, Meta tells(link is external) marketers, they can âcreate immersive experiences, encourage people to virtually try out your products and inspire people to interact with your brand,â including encouraging âpeople who interact with your ad⌠[to]take photos or videos to share their experience on Facebook Feed, on Facebook and Instagram Stories or in a message on Instagram.â Meta has also been researching(link is external) the use of AR(link is external) and VR(link is external) that will ensure that its ad and marketing messaging becomes even more compelling.Expanded integration of ads throughout Meta applications: Meta allows advertisers to âturn organic image and video posts into ads in Ads Manager on Facebook Reels,â including adding a âcall-to-actionâ feature. It permits marketers to âboost their Reels within the Instagram app to turn them into adsâŚ.â It enables marketers âto add a âSend Messageâ button to their Facebook Reels ads [that] give people an option to start a conversation in WhatsApp(link is external) right from the ad.â This follows last yearâs Meta âBoosted Reelsâ product(link is external) release, allowing Instagram Reels to be turned into ads as well.âAds Managerâ âoptimization(link is external) goalsâ that are inappropriate when used for targeting young people: These include âimpressions, reach, daily unique reach, link clicks and offsite conversions.â âAd placementsâ to target teens are available for the âFacebook Marketplace, Facebook Feed, ⌠Facebook Stories, Facebook-instream video (mobile), Instagram Feed, Instagram Explore, Instagram Stories, Facebook Reels and Instagram Reels.âThe use of metrics for delivering and measuring the impact of augmented reality ads: As Meta explains, it uses:(link is external)Instant Experience View Time: The average total time in seconds that people spent viewing an Instant Experience. An Instant Experience can include videos, images, products from a catalog, an augmented reality effect and more. 
For an augmented reality ad, this metric counts the average time people spent viewing your augmented reality effect after they tapped your ad.

Instant Experience Clicks to Open: The number of clicks on your ad that open an Instant Experience. For an augmented reality ad, this metric counts the number of times people tapped your ad to open your augmented reality effect.

Instant Experience Outbound Clicks: The number of clicks on links in an Instant Experience that take people off Meta technologies. For an augmented reality ad, this metric counts the number of times people tapped the call-to-action button in your augmented reality effect.

Effect Share: The number of times someone shared an image or video that used an augmented reality effect from your ad. Shares can be to Facebook or Instagram Stories, to Facebook Feed or as a message on Instagram.

These ad effects can be designed and tested through Meta’s “Spark Hub” and ad manager. Such VR and other measurement systems require regulators to analyze their role and impact on youth.

Expanded use of machine learning/AI to promote shopping via Advantage+: Last year, Meta rolled out “Advantage+ shopping campaigns, Meta’s machine-learning capabilities [that] save advertisers time and effort while creating and managing campaigns. For example, advertisers can set up a single Advantage+ shopping campaign, and the machine learning-powered automation automatically combines prospecting and retargeting audiences, selects numerous ad creative and messaging variations, and then optimizes for the best-performing ads.” While Meta says that Advantage+ isn’t used to target teens, it deploys it for “Gen Z” audiences.
How Meta uses machine learning/AI to target families should also be on the regulatory agenda.

Immersive advertising will shape the near-term evolution of marketing, where brands will be “world agnostic and transcend the limitations of the current physical and digital space.” The Advertising Research Foundation (ARF) predicts that “in the next decade, AR and VR hardware and software will reach ubiquitous status.” One estimate is that by 2030, the metaverse will “generate up to $5 trillion in value.”

In the meantime, Meta’s playbook in response to calls from regulators and advocates is to promise some safeguards, often focused on encouraging the use of what it calls “safety tools.” But these tools do not ensure that teens aren’t reached and influenced by AI- and VR-driven marketing technologies and applications. Meta also knows that today, ad targeting is less important than so-called “discovery,” where its purposeful melding of video content, AR effects, social interactions and influencer marketing will snare young people into its marketing “conversion” net.

Last week, Mark Zuckerberg told investors of his vision of bringing “AI agents to billions of people,” as well as into his “metaverse” that will be populated by “avatars, objects, worlds, and codes to tie” online and offline together. There will be, as previously reported, an AI-driven “discovery engine” that will “increase the amount of suggested content to users.”

These developments reflect just a few of the AI- and VR-marketing-driven changes to the Meta system. They illustrate why responsible regulators and advocates must be at the forefront of holding this company accountable, especially with regard to its youth-targeting apparatus.

Please also read Fairplay for Kids’ account of Meta’s long history of failing to protect children online.
Jeff Chester
Press Release
Reining In Meta’s Digital “Wild West” as FTC Protects Young People’s Safety, Health and Privacy
Contacts:
Jeff Chester, CDD, 202-494-7100
David Monahan, Fairplay, 781-315-2586

Children’s advocates Fairplay and the Center for Digital Democracy respond to today’s announcement that the FTC proposes action to address Facebook’s privacy violations in practices impacting children and teens. See also important new information compiled by Fairplay and CDD, linked below.

Josh Golin, executive director, Fairplay:

The action taken by the Federal Trade Commission against Meta is long overdue. For years, Meta has flouted the law and exploited millions of children and teens in its efforts to maximize profits, with little care for the harms faced by young users on its platforms. The FTC has rightly recognized that Meta simply cannot be trusted with young people’s sensitive data and has proposed a remedy in line with Meta’s long history of abuse of children. We applaud the Commission for its efforts to hold Meta accountable and for taking a huge step toward creating the safe online ecosystem every young American deserves.

Jeff Chester, executive director, Center for Digital Democracy:

Today’s action by the Federal Trade Commission (FTC) is a long-overdue intervention into what has become a huge national crisis for young people. Meta and its platforms are at the center of a powerful commercialized social media system that has spiraled out of control, threatening the mental health and wellbeing of children and adolescents. The company has not done enough to address the problems caused by its unaccountable data-driven commercial platforms. Amid a continuing rise in shocking incidents of suicide, self-harm and online abuse, as well as exposés from industry “whistleblowers,” Meta is unleashing even more powerful data-gathering and targeting tactics fueled by immersive content, virtual reality and artificial intelligence, while pushing youth further into the metaverse with no meaningful safeguards.
Parents and children urgently need the government to institute protections for the “digital generation” before it is too late. Today’s action by the FTC limiting how Meta can use the data it gathers will bring critical protections to both children and teens. It will require Meta/Facebook to engage in a proper due-diligence process when launching new products targeting young people, rather than its current “release first and address problems later” approach. The FTC deserves the thanks of U.S. parents and others concerned about the privacy and welfare of our “digital generation.”

NEW REPORTS:
META HAS A LONG HISTORY OF FAILING TO PROTECT CHILDREN ONLINE (from Fairplay)
META’S VIRTUAL REALITY-BASED MARKETING APPARATUS POSES RISKS TO TEENS AND OTHERS (from CDD)