Digital Citizen

  • Alvaro Bedoya, Center for Digital Democracy, Common Sense Kids Action, Consumer Action, Consumer Federation of America, Consumer Watchdog, Privacy Rights Clearinghouse, and U.S. PIRG

    The “Privacy Best Practice Recommendations for Commercial Facial Recognition Use” that have finally emerged from the multistakeholder process convened by the National Telecommunications and Information Administration (NTIA) are not worthy of being described as “best practices.” In fact, they vindicate the decision by consumer and privacy advocates to withdraw from the proceedings. In aiming to provide a “flexible and evolving approach to the use of facial recognition technology,” they provide scant guidance for businesses and no real protection for individuals, and make a mockery of the Fair Information Practice Principles on which they claim to be grounded. That is not surprising. It was clear to those of us who participated in this process that it was dominated by commercial interests and that we could not reach consensus on even the most fundamental question: whether individuals should be asked for consent before their images are collected and used for purposes of facial recognition. Under these “best practices,” consumers have no say. Instead, those who follow these recommendations are merely “encouraged” to “consider” issues such as voluntary or involuntary enrollment; whether the facial template data could be used to determine a person’s eligibility for things such as employment, healthcare, credit, or housing; the risks and harms that the process may impose on enrollees; and consumers’ reasonable expectations. No suggestions are provided, however, for how to evaluate and deal with those issues.

If entities use facial recognition technology to identify individuals, they are “encouraged” to provide those individuals the opportunity to control the sharing of their facial template data – but only for sharing with unaffiliated third parties that don’t already have the data, and “control” is not defined. Just as there is nothing that “encourages,” let alone requires, asking individuals for consent for their images to be collected and used for facial recognition in the first place, there is nothing that “encourages” offering them the ability to review, correct, or delete their facial template data later. The recommendations merely “encourage” entities to disclose to individuals that they have that ability, if in fact they do. Further, if facial recognition is being used to target specific marketing to, for example, groups of young children, there is no “encouragement” to follow even these weak principles. There is much more lacking in these “best practices,” but there is one good thing: this document helps to make the case for why we need to enact laws and regulations to protect our privacy. If this is the “best” that businesses can do to address the privacy implications of collecting and using one of the most intimate types of individuals’ personal data – their facial images – it falls so short that it cannot be taken seriously, and it demonstrates the ineffectiveness of the NTIA multistakeholder process.
  • Blog

    AT&T: See the data brokers they use in the attached doc, "Best Practices"

    How to optimize results on this groundbreaking platform

    Addressable TV was launched in 2012 by DIRECTV. Multichannel Video Programming Distributors (MVPDs), such as DIRECTV, are currently the only entities offering true Addressable TV, because doing so requires access to both the video distribution system and the data center. MVPDs offer Addressable TV in the ad breaks they receive from program networks, such as ESPN and CNN, as part of their carriage agreements. Currently, four MVPDs (DIRECTV, DISH, Comcast, and Cablevision) offer Addressable TV, and that footprint is set to grow to 40 million households by the end of the year. With AT&T’s acquisition of DIRECTV in 2015, AT&T AdWorks now has the largest national addressable platform, offering Addressable TV advertising across nearly 13 million DIRECTV households out of the 26 million combined DIRECTV and U-verse TV households. In a recent study conducted by Adweek and AT&T AdWorks, a survey of leading marketers indicated that current TV buying (without addressability) isn’t meeting marketing needs, and that there is both frustration and a desire to reach relevant audiences more effectively. Nearly all respondents agree that there is too much waste associated with TV and that traditional methods of measurement are outdated. As a result, over 80% are shifting TV dollars into digital for greater accountability and effectiveness. However, nearly all agree that TV would be more attractive if they “could target more finely.” As the leader in Addressable TV, AT&T AdWorks has run hundreds of campaigns across a wide array of advertisers and verticals. The purpose of this white paper is to share the learnings from those experiences to inform future campaigns and advertisers – and to demonstrate that TV remains the most impactful advertising medium, made even more effective by addressability.

For more information, see the AT&T White Paper PDF.
  • Blog

    Katharina Kopp Joins CDD as Deputy Director and Director of Policy

    Leading Privacy Advocate Will Direct CDD’s Work on Big Data and the Public Interest

    Katharina Kopp, Ph.D., will join the Center for Digital Democracy (CDD) on 6 June 2016 as its deputy director and director of policy. Dr. Kopp comes to CDD with decades of experience as an advocate, scholar, policy analyst, privacy expert, and corporate leader. She will develop and oversee a range of new initiatives at CDD, expanding the scope of its work on the role and impact of “Big Data” in contemporary society. Dr. Kopp will focus particularly on developing new policies to advance individual autonomy and consumer protections, as well as social justice, equity, and human rights. She will also play a leadership role in the organization’s ongoing constituency-building and grassroots efforts. Dr. Kopp worked with the Center for Media Education during the 1990s and served as a key policy advocate during the passage and implementation of the Children’s Online Privacy Protection Act (COPPA). In addition to her work with the Aspen Institute, the Benton Foundation, and the Health Privacy Project, Dr. Kopp served as vice president at American Express, leading its global privacy risk management program. Most recently she was the director of the Privacy and Data Project at the Center for Democracy and Technology. “We are privileged to have Katharina Kopp join CDD,” said Jeff Chester, its executive director. “Her unique leadership qualities and rich background will shape the organization’s agenda in this important new phase of our work. This includes forging partnerships with global NGOs, research institutions, and community-based organizations.” Dr. Kopp added: “I look forward to exploring the effects of technology and discriminatory data practices on democracy and social justice, particularly the effects on individual autonomy and increasing inequality. How to respond to these trends with appropriate public policy will be core to my work. 
Furthermore, I want to see CDD engaged in shaping the public’s understanding of these processes and to frame the solutions not in individualistic but collective and systemic terms. I believe that this will be key to the success of any policy proposals.” The Center for Digital Democracy is a leading nonprofit organization focused on empowering and protecting the rights of the public in the digital era. Since its founding in 2001 (and prior to that through its predecessor organization, the Center for Media Education), CDD has been at the forefront of research, public education, and advocacy protecting consumers in the digital age.
    Jeff Chester
  • Phone and cable ISPs pose a major threat to the privacy of their subscribers and consumers. They have a growing arsenal of “Big Data” capabilities that eavesdrop on their customers—including families. Internet Service Providers are gathering data on what we do and where we go, using sophisticated algorithms and predictive analytics to sell our information to marketers. As CDD documented in a report released last week, ISPs have been on a data buying and partnering shopping spree so they can build in-depth digital profiles of their customers (for example, Verizon/AOL/Millennial Media and Comcast/Visible World). Consumers should have the right to make decisions about how their information can be collected, shared, or sold. With a set of FCC safeguards, Americans will have some of their privacy restored. We look forward to placing on the record all the ways those ISPs threaten—and will threaten—the privacy of Americans. Several commissioners appear uninformed about the ability of the FTC to protect consumer privacy. The FTC does not have the regulatory authority to ensure the privacy of Americans is protected (except in rare cases, such as under the children’s privacy law we helped develop). The FTC’s framework has failed to do anything to check the massive collection of our data that everyone online confronts. It’s the role of the FCC to ensure that broadband networks operate in the public interest, including protecting consumers. Today’s vote reaffirms that the FCC takes its mission to do so seriously.
    Jeff Chester
  • Americans face new privacy threats from the use of their facial and other biometric information, as personal details of our physical selves are captured, analyzed, and used for commercial purposes. Facial recognition technologies are part of the ever-growing data collection and profiling being conducted daily on Americans—whether we are online or offline. Companies want to be able to use the power of facial recognition to make decisions about us—including how we are to be treated in stores and on websites. Consumer groups have called on industry to support pro-consumer and pro-rights policies that would ensure an individual can decide whether facial and other personal physical information can be collected in the first place. Last June, however, the industry-dominated process led by the Department of Commerce refused to support respecting a person’s right to control how their biometric data can be gathered and used. As a result, consumer and privacy groups withdrew from the Commerce Department “stakeholder” convening on facial recognition. These meetings—primarily dominated by industry lobbyists—are part of a White House-initiated effort to design “codes of conduct” to ensure Americans have greater privacy rights. But instead of trying to address the concerns of the consumer and privacy community about meaningful safeguards for facial recognition used for commercial purposes, the Commerce Department merely continued the process without their participation. For the Commerce Department, the priority is to help grow the consumer data profiling industry—regardless of whether Americans face a serious threat to their privacy and the consequences of potentially discriminatory and unfair practices.

Today, the Commerce Department is considering industry proposals on facial recognition that fail to ensure the American public is protected from the growing use of facial data collection for commercial purposes. The drafts allow unlimited use of our most personal data without effective safeguards. Instead of ensuring basic rights—such as giving people the right to make informed decisions prior to the collection of their facial data—the industry proposes a scheme that would allow it to harvest our faces, skin color, age, race/ethnicity, and more without any limit. By allowing such a clearly inadequate and self-serving industry proposal to be considered at all, the Department of Commerce (and its NTIA division) demonstrates that it cannot be trusted to protect consumers. It is putting the commercial interests of the data industry ahead of its responsibilities to the American public. The process and the proposals are not reflective of America today. We cannot believe that President Obama endorses how his Commerce Department has transformed the idea of a “Consumer Privacy Bill of Rights” into one that gives carte blanche to the unfettered use of our faces and other highly personal biometric information.
  • Julie Brill faces a formidable task as she tries to balance what she knows are industry-wide practices that undermine privacy with the intense commercial pressures to financially harvest our data. Whether she can successfully act as a one-woman privacy truth squad is too early to tell. She recognizes all the ways that companies take advantage of consumers today, including tracking them on every device and wherever they are. Consumer groups will expect Julie to push her powerful clients to change the way they do business today, where—despite lip service to the contrary—privacy is viewed as an impediment to success in the marketplace. Whether Julie Brill can survive as a privacy Wallenda is too early to tell. She will face an industry that is largely in denial about what it does with our data. Julie does have the skills to walk the privacy tightrope. She has often reconciled her decades-long role as a consumer advocate with work to get industry to act more responsibly—especially regarding how it collects and uses consumer information. Julie Brill has been an extraordinary FTC commissioner who has played an important role supporting the strongest possible consumer protection actions by the agency. She has a unique, deep, and personal relationship with many in the consumer community. She is already missed. The hiring of Julie Brill is not only recognition by Hogan Lovells that she is extraordinarily well-connected, knowledgeable, and skilled. Hogan recognizes that powerful changes are now underway, led by the EU, that will potentially transform how consumer information is treated. The EU’s new General Data Protection Regulation (GDPR)—its privacy law—is going to give the public new rights that will force companies to treat our data differently. The uncertainty over the future of digital trade between the U.S. and the EU also plays a role in Julie’s hiring, I believe.

With Safe Harbor struck down by the EU Court of Justice, and privacy advocates threatening to seek the same legal outcome for its replacement—the Privacy Shield—Hogan’s clients (and the industry) need someone like Julie who is respected by many powerful EU officials. With a US/EU trade deal that focuses on digital trade and data flows now being negotiated (the Transatlantic Trade and Investment Partnership, TTIP), the stakes are enormous in whether U.S. data companies can win favorable rules.
    Jeff Chester
  • 1. Why has the FTC waited so long to review this serious threat to our privacy?

The Federal Trade Commission’s November 16, 2015, workshop on cross-device tracking is an examination of a very disturbing practice that emerged several years ago. The online industry’s business model of identifying specific individuals and following them on whatever device they may use (PCs, mobile, etc.), so their behaviors can be analyzed for more effective micro-targeting, is well-known. For example, companies such as Drawbridge, which “track how [an] individual user traverses the web on his or her smartphone, tablet, laptop and PC” and analyze “billions” of pieces of data on us, have been around since 2010.[1] Cross-device targeter Tapad has been operating since 2011.[2] During the last few years there has been a veritable explosion of cross-device tracking of individuals—illustrating how our privacy has been lost regardless of what device we may use.[3] The FTC should be monitoring much more closely the industry’s efforts to expand its data-driven profiling and targeting techniques. The serious erosion of our privacy is reported daily by leading trade publications and is not a secret.[4] The commission—and other agencies responsible for consumer privacy, such as the FCC and CFPB—need to become much more proactive if they are to actually protect the public.

2. Why isn’t the FTC bringing complaints against both Google and Facebook under their respective “consent decrees” for their own cross-device surveillance of consumers?

Both Facebook and Google—as the two dominant online marketing companies—have significantly expanded their own collection, analysis, and use of data from individuals for cross-device tracking. Both companies are under 20-year legal agreements with the FTC that are supposed to ensure that their practices protect our privacy.[5] But the commission has been silent regarding a major violation of our privacy, given the range of Facebook and Google cross-device practices. For example, Facebook’s acquisition of Atlas and its incorporation of new ways to engage in cross-device tracking have not been challenged by the agency.[6] Nor has the FTC pursued Google’s cross-device tracking as a consent decree matter.[7] The commission’s consent decrees are only as good as their enforcement. The FTC’s inaction regarding Facebook’s and Google’s expansion of cross-device data harvesting undermines its claims that its decrees actually protect our privacy.

3. How has the FTC’s and Department of Justice’s (DoJ) failure to stop “Big Data” mergers furthered the expansion of cross-device gathering of our information?

As one of the two U.S. antitrust and competition regulators, the FTC plays a key role reviewing mergers and acquisitions. Yet the agency has approved Big Data-related mergers that have further weakened consumer privacy and expanded the ability of marketers to track our behaviors across devices. While the FTC’s consumer protection and competition bureaus are separate, the commission has a responsibility to protect the public. Even with mergers reviewed by the DoJ, the FTC should speak out against deals that erode privacy and place consumers at further disadvantage through the use of Big Data. For example, Oracle was allowed to acquire both BlueKai and Datalogix—significantly expanding its sources of data used to profile Americans and to engage in cross-platform targeting. Alliance Data Systems was permitted to acquire Conversant, which bolstered its cross-device applications. The FTC should acknowledge that it is helping weaken the privacy of the American public by allowing data-driven mergers to be approved without effective consumer safeguards.[8]

4. Isn’t cross-device tracking and targeting just a part of an ever-growing commercial Big Data surveillance complex that continually gathers and uses all our information?

Anyone who follows the online industry recognizes that our privacy is being continually undermined. Every major company has become its own “data broker,” harvesting all the data it directly gathers on a person (when you come to its site, for example). They now merge that data with the abundance of so-called third-party information available for sale or use today. A key goal is to engage in what they call “identity management,” which means using information on us to help influence our actions, purchases, and behaviors. Cross-device tracking is made possible through the unlimited ability companies now have to use our online and offline information without any serious consideration of our privacy.[9]

5. What role do Big Data companies and technologies—such as Data Management Platforms (DMPs)—play in cross-device tracking of individuals?

The most powerful U.S. companies are using sophisticated data engines and analytics to gather data on individuals. The growing use of technologies such as DMPs, along with the real-time data targeting now embraced by the industry (known as “programmatic”), is at the core of cross-device practices.[10] While the FTC is aware of these practices, it has not taken any actions to protect consumers.[11]

6. Isn’t the gathering of information from our use of mobile phones, including for cross-device targeting, a major privacy violation?

The answer is yes. The mobile phone is the digital spy in our pockets that we take and use nearly everywhere. Gaining access to, and insights from, our mobile phones serves as a veritable digital gold mine for brands and advertisers. Marketers continually research how we use mobile devices, in order to help their clients identify our actual or intended location, as well as other data about us (such as income, race, ethnicity, and gender). Companies such as Facebook, Google, and many others have developed ingenious ways to encourage consumers to use “apps” that, once they are downloaded, report on our actions. Google, for example, explains that it can help marketers take advantage of what it calls a person’s “micro-moments”—when through the use of our mobile phone we reveal we are searching for a product, store, activity, or location.[12] Cross-device tracking and targeting is fueled by the unchecked data gathering from our mobile devices.[13]

7. Will the FTC address how cross-device tracking is helping marketers and brands reach us when we are in retail, grocery stores, and other “real-world” locations?

As we use our phone or tablet to search for information, download coupons, or scan for price information, these signals allow marketers to learn about our location and quickly connect data they have about us. So-called “hyper-location” tracking enables companies to identify what neighborhoods we live and work in, for example. As stores deploy so-called “beacons,” Wi-Fi networks, “geo-fences,” and other ways to connect to people in and around stores, the data they gather from our use of multiple devices becomes more complex and valuable to them.[14] Online and offline distinctions are quickly fading, as cross-device tracking merges with sophisticated data targeting services.

8. Can the FTC protect consumers from cross-device tracking when we watch video online?

There is an explosion of video consumption, as more people use their mobile devices to watch online video content. Internet-delivered video to TVs (so-called “over-the-top”) is another key way we see such programming. Incorporating our video viewing into the cross-device tracking apparatus is the latest way our media behaviors are being closely observed, whether we watch on small- or large-screen devices.[15]

9. Will the FTC investigate how consumers are tracked and analyzed by cross-device “measurement” services?

Measurement is built into today’s online tracking and targeting system. Marketers wish to know whether we see an ad or promotional message and how we responded. Since we use multiple devices, measurement techniques now reflect an analysis of what we do on all our devices. With advances in measurement having a direct impact on our privacy, as well as on the transactions we make, the commission should investigate the impact of cross-platform “attribution” techniques now broadly deployed.[16]

10. Will the FTC call on the online industry to “cease and desist” from cross-device tracking until privacy safeguards can be proposed and implemented?

The online ad lobby has—for decades—worked to keep the FTC relatively powerless to protect privacy. It has opposed calls to provide the agency with “rulemaking” authority so it could develop safeguards that would protect the public.[17] The lack of FTC authority to effectively address privacy threats is a key reason why U.S. data-driven marketers are able to expand their commercial surveillance activities. Despite its lack of power to require companies to engage in a moratorium on cross-device tracking, the FTC should use its moral authority. The commission should declare that the use of cookie syncing, probabilistic or deterministic attribution, unique identifiers, and other methods of stealthily following us from device to device should not be permitted. Leading data companies such as Google and Facebook, digital marketing trade groups such as the Interactive Advertising Bureau and Mobile Marketing Association, data brokers such as Acxiom, Oracle, and Merkle, and cross-platform companies such as Tapad and Drawbridge should all be asked to support the commission’s call to stop the tracking of individuals across devices. During the period established for the data-gathering moratorium, the FTC should propose safeguards. The commission’s policies should empower individuals to decide whether and how they can be tracked and analyzed on any device. It’s time for action by the FTC. It knows that Americans confront the loss of their privacy—and it should speak out against the eavesdropping practices that enable online companies to gather data on us—whether we use a PC, mobile phone, or even TV.

Notes: [2] http://techcrunch.... [4] “Mobile creativity: Track mobile performance,” AdMap, September 2015. Personal copy.
  • Blog

    Safe Harbor on Data Declared Illegal: Message to U.S.—Time to Enact Privacy Law that Protects Americans and Supports Global Data Protection

    Case illustrates why FTC is legally unable to effectively protect the public and why Safe Harbor cannot be "fixed"

    Today’s historic decision by the European Court of Justice, which overturned the purposely ineffective “Safe Harbor” deal enabling data to flow to the U.S., is very welcome. As one reads the court’s findings, it’s clear that for the EU, fundamental and human rights include the right to have your personal privacy protected—both from governmental surveillance (such as the NSA and other intelligence agencies) and from commercial Internet companies such as Google or Facebook. Advocates always recognized that the Safe Harbor agreement brokered by the Clinton Administration was a digital privacy “house of cards.” All U.S. companies needed to do was sign up for some inadequate principles that allegedly would protect the EU public. The Federal Trade Commission was supposed to investigate problems. But as CDD demonstrated last year in its complaint to the FTC on how leading U.S. companies were thumbing their data-collecting noses at Safe Harbor, the system doesn’t really do much of anything. Safe Harbor is run by the U.S. Department of Commerce, whose political loyalties (and revolving door) lie with the data collection industry.

The message to America from the EU is clear: enact comprehensive privacy legislation. It has to meet (and should try to exceed) the high bar set by the EU. It can’t be the weak, self-regulation-based “Privacy Bill of Rights” proposed this year by the White House. It has to define strong and enforceable rights, including limits on Big Data-style collection—which is now a pervasive part of our online landscape. The law should empower an independent privacy commissioner and give the FTC real regulatory clout. The U.S. also should endorse the EU’s framework on privacy, which is supported by many countries around the world.

In its decision, the European Court of Justice reaffirmed what its Advocate General explained earlier: that the U.S. Federal Trade Commission does not have the statutory authority and legal powers to protect a person’s privacy as required by the EU. In the EU, privacy is a “fundamental right.” In the U.S., consumers have very few such rights online. The court explained yesterday (in referring to the 2000 decision by the EU approving the Safe Harbor deal with the U.S.) that: “Decision 2000/520 does not contain any finding regarding the existence, in the United States, of rules adopted by the State intended to limit any interference with the fundamental rights of the persons whose data is transferred from the European Union to the United States, interference which the State entities of that country would be authorised to engage in when they pursue legitimate objectives, such as national security... Nor does Decision 2000/520 refer to the existence of effective legal protection against interference of that kind. As the Advocate General has observed in points 204 to 206 of his Opinion, procedures before the Federal Trade Commission... are limited to commercial disputes...”

The business lobby has consistently fought against legislation that would empower the FTC to regulate privacy and other commercial practices. Consequently, while the commission does what it can (and is very active working to help the public), it cannot address the fundamental issue: U.S. companies gather and use our information in far-reaching, non-transparent, and often troubling ways (think of all the secret “scoring” of people that goes on to assess how to treat them, or the use of race, ethnicity, income, and location to track and target us, regardless of device). Safe Harbor cannot be fixed without the U.S. enacting comprehensive privacy legislation that brings it in sync with the EU. The time to do so is long overdue. Kudos to Max Schrems, who brought the case, and who is a tireless and effective privacy campaigner.
See the BEUC, PI, and TACD statements as well.
    Jeff Chester
  • The Internet as a whole has become an important part of our global public sphere. Internet provides access to a wealth of information and knowledge, and the possibility to participate, create and communicate. This public space made up of internet infrastructures is increasingly threatened from two sides; by the centralization and commercialization through the dominant positions held by giant telecom and Internet companies, as well as by an increasing trend in state regulation and censorship of the net. This poses important questions about how we choose to organize and regulate our digital societies, and how Internet governance models can be developed and implemented to ensure fair and democratic participation. When it comes to the future of the Internet, a key discussion is one of infrastructures; who owns, runs and controls them. The question of regulation, and who oversees the regulators, is made complicated by the transnational nature of the net. As much as people expect a broadly and equitably accessible Internet open to diversity, we are, slowly but surely, moving away from it. Monopolization of Internet infrastructures and services by companies such as Facebook and Google has gone hand in hand with privacy intrusions, surveillance and the unbounded use of personal data for commercial gain. As we all interact in these centralized commercial platforms that monetize our actions we see an effective enclosure and manipulation of our public spaces. Decentralization and democratization of the Internet infrastructure and activities is essential to keep a free, open and democratic Internet for all to enjoy equitably. But can the “small is beautiful”-idea be compatible with the building of state-of-the-art successful infrastructure in the future? The debates around net neutrality, infrastructure neutrality and Internet monopolies reflect the important choices that are to be made. 
It is essential that the EU formulate a comprehensive vision on the Internet that addresses the protection of civil liberties such as free speech and privacy, but also the growing commercialization of our digital public spaces and the commodification of personal data, with the effect of the market encroaching on all aspects of our daily lives. Only then can it make relevant interventions regarding the Internet and its governance. Let's discuss how to re-decentralize and reclaim the Internet for all. This conference (link is external) is organised in cooperation with Commons Network and the Heinrich Böll Foundation.

DRAFT PROGRAMME (TBC)
15:00-15:30 Introduction
15:30-16:45 1st panel, The big picture: What, if anything, in the current model of Internet governance is clashing with a decentralized, resilient Internet viewed as a common good? And what steps should policymakers take to foster the best environment for decentralized, community-managed projects to grow? Confirmed speakers: Renata Avila, Aral Balkan
16:45-17:00 Coffee break
17:00-18:15 Decentralised infrastructure: Examples: What examples of local and decentralized projects do we have today, and what obstacles do they face? Confirmed speakers: Edmon Chung, Robbert Mica, Olivier Schulbaum
18:15-18:30 Conclusions and final remarks

This conference will be live streamed at: (link is external)
Background & Programme: (link is external)
Join the event community on Facebook: (link is external)
Registration: (link is external)
  • Blog

    My personal data, nobody's business but my own:

    Key consumer demands for the trilogue on the General Data Protection Regulation

    Summary: BEUC reiterates the urgent need to put consumers back in control of the way their personal data is processed online and hopes an agreement on the General Data Protection Regulation will be reached under the Luxembourg Presidency. However, the urgency to adopt the Regulation must not take its toll on consumers' fundamental rights. Weak provisions on fundamental data protection principles (e.g. purpose limitation) and/or too much flexibility for commercial entities to process personal data based on their alleged legitimate interests could have devastating effects on consumers' privacy, especially if coupled with flawed rules on highly sensitive aspects like profiling. In general terms, we believe that the European Parliament's first reading position provides a good basis for an agreement. We also welcome the proactive stance taken by the European Data Protection Supervisor, who has provided some useful recommendations. In contrast, the Council's General Approach contains some provisions that would even weaken current protection standards, a clear red line set out at the beginning of this reform. That being said, we urge the Commission, the Parliament and the Council to be ambitious. The objective is to modernise and improve Europe's data protection regime, not merely to maintain the status quo, and certainly not to weaken existing protection. The outcome of these negotiations must provide consumers with greater transparency and control over how their personal data is collected and used. Otherwise consumers will be left with little option but to systematically give up their privacy in order to access online goods and services. This would be unacceptable. A robust Data Protection Regulation must comprise:

A broad and future-proof scope. Every company doing business in Europe or targeting users based in Europe must comply with EU laws, regardless of the company's nationality or the place where it is established. Any kind of information that would allow an individual to be identified or singled out shall be considered personal data, including pseudonymous data.

Solid data protection principles and strict legal grounds for data processing. Principles such as "purpose limitation" and "data minimisation" are at the core of the EU data protection regime and must not be weakened. The amount of personal data processed should be kept to the minimum necessary. Further processing of personal data for purposes incompatible with those that justified the initial processing should not be allowed.

An enhanced set of data subjects' rights. Strong and clear provisions are needed with regard to fundamental issues such as the information that must be provided to data subjects, profiling and the right to object. Restrictions on user rights should be strictly limited and include sufficient guarantees.

A comprehensive enforcement scheme, including effective mechanisms for consumer redress. The Regulation must be effectively and uniformly enforced across the EU. It is crucial that consumers can easily access effective mechanisms to seek redress and that consumer organisations are allowed to proactively defend the rights of data subjects.

[see attached for rest of this important document]