
Consumer, Privacy Groups decry failure by Commerce Department and industry lobbyists to protect consumer facial and biometric privacy

Alvaro Bedoya, Center for Digital Democracy, Common Sense Kids Action, Consumer Action, Consumer Federation of America, Consumer Watchdog, Privacy Rights Clearinghouse, and U.S. PIRG

The “Privacy Best Practice Recommendations for Commercial Facial Recognition Use” that have finally emerged from the multistakeholder process convened by the National Telecommunications and Information Administration (NTIA) are not worthy of being described as “best practices.” In fact, they reaffirm the decision by consumer and privacy advocates to withdraw from the proceedings. In aiming to provide a “flexible and evolving approach to the use of facial recognition technology,” they provide scant guidance for businesses and no real protection for individuals, and they make a mockery of the Fair Information Practice Principles on which they claim to be grounded.

That is not surprising. It was clear to those of us who participated in this process that it was dominated by commercial interests and that we could not reach consensus on even the most fundamental question of whether individuals should be asked for consent for their images to be collected and used for purposes of facial recognition. Under these “best practices,” consumers have no say.

Instead, those who follow these recommendations are merely “encouraged” to “consider” issues such as voluntary or involuntary enrollment, whether the facial template data could be used to determine a person’s eligibility for things such as employment, healthcare, credit, or housing, the risks and harms that the process may impose on enrollees, and consumers’ reasonable expectations. No suggestions are provided, however, for how to evaluate and deal with those issues. If entities use facial recognition technology to identify individuals, they are “encouraged” to provide those individuals the opportunity to control the sharing of their facial template data – but only for sharing with unaffiliated third parties that don’t already have the data, and “control” is not defined.

Just as there is nothing that “encourages,” let alone requires, asking individuals for consent for their images to be collected and used for facial recognition in the first place, there is nothing that “encourages” offering them the ability to review, correct or delete their facial template data later. The recommendations merely “encourage” entities to disclose to individuals that they have that ability, if in fact they do. Further, if facial recognition is being used to target specific marketing to, for example, groups of young children, there is no “encouragement” to follow even these weak principles.

There is much more lacking in these “best practices,” but there is one good thing: this document helps to make the case for why we need to enact laws and regulations to protect our privacy. If this is the “best” that businesses can do to address the privacy implications of collecting and using one of the most intimate types of individuals’ personal data – their facial images – it falls so far short that it cannot be taken seriously, and it demonstrates the ineffectiveness of the NTIA multistakeholder process.