Google, Apple, Facebook, Amazon, Twitter, Equifax, and Uber: since the European Union's data privacy law became fully applicable, regulators have brought multiple enforcement actions, especially against these corporate behemoths, and have been demanding that organizations move beyond mere compliance with data protection regulations toward demonstrating accountability.
The accountability concept was originally established in 1980 in the Organisation for Economic Co-operation and Development’s Guidelines on the Protection of Privacy and Transborder Flows of Personal Data and has been echoed in the Asia-Pacific Economic Cooperation’s Privacy Framework (IX), the modernized Council of Europe Convention 108+ (Article 10) and in the EU’s General Data Protection Regulation (Articles 5(2); 24), to name but a few.
Essentially, an “accountable” organization must:
- Be both responsible and answerable for the personal data under its control;
- Embed personal data protection into corporate governance;
- Implement appropriate and effective measures for data protection;
- Inculcate an organizational culture of responsibility through training and awareness programs;
- Provide information on its data protection policies and practices to individuals;
- Ensure that data flows across borders under adequate safeguards; and
- Be able to demonstrate proper management and efficiency of personal data protection systems.
A quick glance at publicly available privacy notices shows that most are statements focused on achieving compliance with current regulations, aiming to help individuals understand, in a generally intelligible form, how their personal data is processed. This phenomenon of organizations providing, at best, only the mandatory information (i.e., under Articles 12 to 14 of the GDPR), but rarely a deeper insight, may be explained by liability concerns and the need to protect trade secrets.
Practices Paint Different Picture
In reality, however, authorities have found that practices paint a very different picture: despite established policies, organizations shift responsibility to individuals and subvert their choices on data processing to serve business interests.
For example, in the Facebook case, the Federal Trade Commission (FTC) found that before revoking third-party developers’ access to Affected Friend data, “Facebook explicitly evaluated whether apps affected by the changes spent money on advertising with Facebook, generated revenue for the company, or otherwise offered something of value such as reciprocal access to user data.”
Some organizations have also failed to provide sufficient resources, eroding customers' trust. According to a House Oversight Committee report, Equifax failed to prioritize a secure technology environment, a posture at odds with its aggressive growth strategy of acquiring multiple companies with large information technology (IT) systems and data. This shortcoming led to the 2017 hack that compromised the information of 148 million individuals. In the hearing, Graeme Payne, former Senior Vice President and Chief Information Officer for Global Corporate Platforms at Equifax, pointed to a lack of investment and security measures.
Hence, accountability as a governance model must keep pace with the evolving technological and cultural facets to encourage preventive and effective measures, closely enforced by supervisory bodies.
We term this “accountability plus” and base it on the following four principles:
The first is a duty of care and loyalty. As a "data fiduciary" or "data steward," an organization must always prioritize its users' interests and rights, and act as a reasonable individual would expect. To accomplish both of these objectives, privacy should become its number one concern beyond mere legal requirements, ensuring proper collection, usage, disclosure, and storage of personal data.
Clear Governance Structure
Second, corporate executive officers and directors serving on corporate boards must establish robust mechanisms holding executives accountable for their decisions on data protection and privacy, subjecting those decisions to meaningful oversight, and accepting civil and criminal liability for any violation.
For example, the FTC agreement required Facebook to form an independent privacy committee to make regular assessments of the company's practices and to remove unfettered control by Facebook's CEO Mark Zuckerberg over decisions affecting user privacy. Facebook is also required to appoint "compliance officers" responsible for its privacy program, whose oversight and dismissal are reserved exclusively to that committee, not to Facebook's CEO or employees.
Proper governance structures are also prescribed by China's Personal Information Security Specification (Sec. 10.1 a), under which management must ensure that the Data Protection Officer (DPO) and committee are provided with human, capital, and asset resources, and that there is no conflict of interest in their activities.
A clear division of tasks between routine business functions and the DPO is key. Shortcomings here can result in conflicts of interest and lead to sanctions or even to the dismissal of the DPO, as striving for compliance is primarily the task of the organization.
DPOs that draft entire data processing agreements, notices, consent forms, etc. without any significant involvement of the organization, or that respond to requests from data subjects on their own authority (recently criticized by the Belgian Supervisory Authority), exceed their role and will struggle to untie the Gordian knot of controlling themselves.
Strong Accountability, Transparency and Control
Third, the European Data Protection Supervisor recalled that the responsibility for handling personal information should be much more proactive and transparent, getting out of the ‘Black Box’ tendency of secrecy on business practices. Maintaining records of processing activities or conducting privacy or data protection impact assessments (PIAs/DPIAs) is one thing, explaining their essence to users in an intelligible and trustworthy way is another.
Fourth, organizations must adhere to ethical standards of mutual trust and confidentiality, focusing on users' expectations, risk minimization, and future-proofing the business. A charter of ethical principles and corporate values, especially for large-scale processing, special categories of personal data, or the deployment of AI (see the declarations by the ICDPPC, the EU Commission, and the German supervisory authorities), may help individuals better realize the benefits of technology for society and the economy while reinforcing their rights and freedoms.
As data protection practices play catch-up with the explosion of the digital age, in which personal data is being, and can further be, weaponized, organizations should progress beyond accountability and work toward "accountability plus." After all, stakeholders can only be expected to trust and adopt new technologies smoothly when an organization and its employees live and breathe data protection as an indisputable value.
This column does not necessarily reflect the opinion of The Bureau of National Affairs, Inc. or its owners.
Luis Alberto Montezuma (CIPP/C, CIPP/E, CIPP/US, CIPM, FIP, and Privacy by Design) is currently deputy assistant to the chair of Colombia's Data Protection Authority (Superintendent Delegate for the Protection of Personal Data of the Superintendence of Industry and Commerce). He also serves as a member of the IAPP Privacy Bar Section Advisory Board and is co-chairman of the IAPP Bogota, Colombia KnowledgeNet Chapter.
Christopher Schmidt (CIPP/E CIPM CIPT) currently works at the International and European Affairs Department of the Hessian Data Protection Commissioner. He is a German Magister of Law with IT and data protection knowledge and a BTA Certified Blockchain Solution Architect. He regularly speaks on current data protection matters in German, English, French, and Italian.
Qian Li Loke (FIP CIPP/A CIPM) is a consultant at Straits Interactive and an ambassador of the Data Protection Excellence Network. He regularly speaks, writes and trains on data protection with a business flavor.
The views expressed in this article do not necessarily correspond to those of the authors’ respective organizations.