As with every other kind of risk-based assessment, a privacy assessment has to start with identifying risk. But in the context of privacy compliance, which risks are most pertinent? Reputational damage? Litigation exposure? Regulatory scrutiny? Financial harm?
Recent guidance from the European Data Protection Supervisor (EDPS) reminds us that sometimes the answer can be “none of the above.”
Context is key.
The EDPS—which monitors and advises EU institutions and bodies on matters relating to the processing of personal data—is not an authority that addresses the practices of private corporations, but its Survey on Data Protection Impact Assessments notes how compliance professionals can (and do) trip up when it comes to conducting DPIAs.
Rights and Freedoms in the EU
By way of background, EU institutions (which are governmental entities such as the European Parliament and the EU Court of Justice) are governed not by the General Data Protection Regulation (GDPR), but by a complementary measure, EU Regulation 2018/1725, sometimes known by the acronym EUDPR. While there are significant differences between the two regulations, the provisions addressing DPIAs are nearly identical. So businesses should take note.
The EDPS Survey summarizes the results of a review of DPIAs implemented by EU institutions under EUDPR Art. 39 (which closely mirrors GDPR Art. 35). The goal was not to “name and shame,” but rather to provide guidance and best practices.
On the topic of risk, the EDPS emphasizes that DPIAs should focus on the “rights and freedoms of natural persons,” not on the “reputational or physical damage” to the data controller. As explained by the EDPS: “[A]lways think about how the processing could affect those whose data you process. What does it do to them? How does it affect them?”
While some of the DPIAs reviewed by the EDPS paid lip service to data subject rights, the EDPS found evidence that risk related to those rights “is a concept difficult to be grasped, as there is no immediate connection between the processing of personal data and how adversely that could affect rights and freedoms.”
For example, a number of DPIAs addressed the use of CCTV along potential routes for political demonstrations (identifying demonstrators as potential security risks), but the EDPS noted that almost all of those DPIAs failed to mention “the potential impact [that] this surveillance measure might have on the freedom of expression and assembly.”
Therefore, a DPIA in the European Union evaluating the use of CCTV in publicly accessible areas beyond a building’s entrance must ensure that “the privacy and other fundamental rights of the participants caught on the cameras, including, importantly, their rights to freedom of assembly, are not disproportionately intruded upon.”
Guidance on the GDPR originally drafted by the Article 29 Working Party (and since adopted by the European Data Protection Board) reinforces the focus on high risks to individuals’ rights and freedoms. It further clarifies that, while those rights and freedoms primarily concern data protection and privacy, they “also involve other fundamental rights such as freedom of speech, freedom of thought, freedom of movement, prohibition of discrimination, [and] right to liberty, conscience and religion.”
The U.S. and Other Regimes
Outside the context of the European Union, the notion of risk in impact assessments may take on different meanings.
Risk assessments required by New York’s cybersecurity regulations, for example, focus not on the rights or concerns of individuals whose data is held by financial service companies, but rather on the “particular risks of a company’s business operations” sufficient to inform the design of the required cybersecurity program (23 NYCRR 500.09(a)).
Similarly, Singapore’s Guide to Data Protection Impact Assessments, published by the Personal Data Protection Commission Singapore (PDPC), focuses on organizational operations rather than individual rights. Among the risks cited are security controls, vendor practices, and data retention policies. According to the PDPC, the goal of a DPIA is to foster compliance with Singapore’s data protection law.
Likewise, the Privacy Framework Version 1.0, published in January 2020 by the U.S. National Institute of Standards and Technology (NIST), highlights “problematic data actions,” which can result in “noncompliance costs, revenue loss arising from customer abandonment of products and services, or harm to … external brand reputation or internal culture.” While the potential impact to individuals is certainly taken into consideration, NIST notes that the impact assessment “is where privacy risk and organizational risk intersect.”
While the California Consumer Privacy Act contains no requirement for a risk assessment, the CCPA 2.0 ballot initiative (otherwise known as the California Privacy Rights Act), if passed, will authorize the California Attorney General to issue regulations requiring businesses to submit a risk assessment “on a regular basis” to the yet-to-be-created California Privacy Protection Agency in cases where processing “presents significant risk to consumers’ privacy or security.” The risk assessment will require businesses to identify and weigh “the benefits resulting from the processing to the business, the consumer, other stakeholders, and the public, against the potential risks to the rights of the consumer associated with such processing.” Whether processing constitutes a “significant risk” will depend on “the size and complexity of the business and the nature and scope of processing activities.”
In contrast, Brazil’s General Data Protection Act, Lei Geral de Proteção de Dados (LGPD), follows the lead of the EU by identifying risks as those affecting civil liberties and fundamental rights. While the LGPD does not specify when a DPIA is required, it permits the National Data Protection Authority to require an organization to prepare an assessment when directed (LGPD Art. 38) and to draft regulations on the topic (LGPD Art. 55-J).
Without a doubt, reputational damage, litigation exposure, regulatory scrutiny, and financial harm will play a part in any risk assessment, but businesses should first identify those risks material to the context of the assessment. For purposes of the GDPR and Brazil’s LGPD, that means addressing the individual rights and fundamental freedoms of those whose data is being processed.
But even if a given law or regime does not specifically factor the rights of affected individuals into the risk assessment, businesses may want to consider those rights nonetheless. Doing the right thing is always a good business practice.