It seems ironic to be using the term “makeover” in the midst of the global pandemic, as many of us are in need of a haircut and as my wife says “namaste” to hairdressers during Zoom yoga classes. But the term “makeover” fits transparency, a key pillar of trustworthy data processing that needs to be both clearer and more clearly defined as part of building a foundation for a long-term, workable and adaptable legal framework.
In a recent IAF blog on Demonstrable Accountability, it was posited that trust based on organizational accountability must be advanced. The blog went further to suggest that trusted governance based on accountability by the very organizations that are the stewards of the data, the developers of technology and the decision makers is a foundational building block for a long-term, workable and adaptable legal framework. The blog also reiterated that while individual participation is key, a backward step to relying on consent would do little to provide or advance trusted data processing or data protection. Consent as a means to achieve an objective of transparency often does neither. It was flippantly suggested that some policymakers accessorize “consent” with gems like affirmative, express, clear, informed, specific, and separate. Once modified with trendy adjectives, consent is familiar, easy, and dressed up, but it is not ready to go. Why? If people do not understand well enough to agree, object, or exercise their rights and choices, how can they consent? The underlying problem, therefore, lies in the type of “openness” these transparency mechanisms provide, not in the governing mechanism itself. In the IAF’s recent work in Canada on People Beneficial Data Activities, one of the recommendations to policymakers was that organizations be subject to both internal and external oversight, which is a form of verifiable transparency. There are several other reasons why transparency needs a makeover.
First, the term transparency is not defined, and by extension its objectives are not clear. While “being transparent” is both an explicit and an implicit requirement embedded in the GDPR, and while the GDPR requires many specific communications that must also meet a standard of fairness, transparency itself is never defined. The Article 29 Data Protection Working Party’s Guidelines on Transparency (A29WP Guidance) acknowledge this gap. One objective of transparency is to enable trust through “openness.”
Second, transparency is bundled with other requirements and with other communications vehicles. In Europe, the GDPR, like the Directive before it, treats the constructs of consent (Article 6(1)(a)) and transparency (Article 5(1)(a)) as separate requirements. Over time and in practice, however, both constructs have been bundled together in privacy notices so that the notices implicitly or explicitly double as consent (e.g., “Click I Agree”). The A29WP Guidance states that “transparency, when adhered to by data controllers, empowers data subjects to hold data controllers and processors accountable and to exercise control over their personal data by, for example, providing or withdrawing informed consent and actioning their data subject rights.” This statement conflates the objective of transparency with the exercise of consent and other rights.
The U.S. has been even more explicit about combining consent and transparency in “Notice and Choice” models, and this approach has been codified in some sectoral laws such as the Gramm-Leach-Bliley Act (GLBA). Regulations under GLBA created standard, template notices that have been implemented by literally thousands of financial institutions. The result is that these notices add no value, either in terms of increasing transparency or in terms of explaining complex data uses. Combining transparency with legally driven communications requirements produces legal communications not at all suited to context or to the goal of openness.
Third, the complexity of technology and the associated data flows and uses makes it increasingly difficult to meet even the simple objective of transparency. In AdTech, for example, the complexity of the players and data flows, and by extension of “explainability,” is perceived to inhibit the ability of individuals to exercise their right to object to receiving an ad or to the profiling necessary to deliver it. Regulators believe the processes are so complex and the descriptions so opaque that individuals cannot knowledgeably exercise their rights. This criticism not only bundles the objective of transparency with legal requirements, as noted above, but it also suggests that an objective of transparency is “verifiability.”
In Artificial Intelligence (AI), transparency is increasingly subsumed by “interpretability.” As the authors of the paper Toward Trustworthy AI Development note, “AI systems are frequently termed ‘black boxes’ due to the perceived difficulty of understanding and anticipating their behavior. This lack of interpretability in AI systems has raised concerns about using AI models in high stakes decision-making contexts where human welfare may be compromised. Having a better understanding of how the internal processes within these systems work can help proactively anticipate points of failure, audit model behavior, and inspire approaches for new systems.” This lack of understanding points to an objective of transparency beyond being able to explain what is going on: in addition to helping people understand the reasons for the processing and the means used to achieve those objectives, transparency and interpretability help achieve “verifiability.” While the authors note that the problem is compounded by a lack of consensus on what interpretability means, the goal is to “verify claims about ‘black box’ AI systems that make predictions without explanations or visibility into their inner workings.” They go on to suggest that “a model developer or regulator may be more interested in understanding model behavior over the entire input distribution whereas a novice layperson may wish to understand why the model made a particular prediction for their individual case.” This discussion highlights that there are entirely different solutions and goals for each audience and for each objective of transparency.
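To make the contrast between those two audiences concrete, consider the minimal Python sketch below. It is not from the paper; it uses a deliberately simple linear model, and every feature name, weight and value is invented, purely to illustrate the difference between a “global” view of model behavior (the developer’s or regulator’s interest) and a “local” explanation of one prediction (the layperson’s interest).

```python
import numpy as np

# A deliberately simple, fully interpretable "model": a linear scorer.
# Real AI systems are far more opaque; this only illustrates the two
# audiences' views, not how black-box models are actually interpreted.
feature_names = ["income", "debt_ratio", "account_age"]   # hypothetical
weights = np.array([0.6, -1.2, 0.3])                      # hypothetical

def predict(x: np.ndarray) -> float:
    """Score one case (higher = more favorable)."""
    return float(weights @ x)

# Global view (developer/regulator): behavior over the whole input space.
# For a linear model, the weights themselves describe that behavior.
for name, w in zip(feature_names, weights):
    print(f"global: {name} has weight {w:+.1f} for every input")

# Local view (layperson): why the model scored *this* case as it did.
applicant = np.array([0.8, 0.5, 0.2])   # one individual's inputs
score = predict(applicant)
for name, c in zip(feature_names, weights * applicant):
    print(f"local: {name} contributed {c:+.2f} to a score of {score:.2f}")
```

For anything more complex than a linear model, the “global” and “local” views require entirely different tooling, which is exactly why one communication vehicle cannot serve both audiences.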
As these three issues illustrate, if trusted governance is based on accountability by the very organizations that are the stewards of the data, the developers of technology and the decision makers, then more transparency about their stewardship and decision-making processes is not only desirable but necessary for them to be both responsible and answerable. This is demonstrable accountability.
Demonstrable accountability reveals the need for a makeover of transparency, especially given the complexities of technology and data processing, the goals of individual participation, and the need for more transparency around data stewardship processes. In short, the means for achieving each of these three objectives of transparency should be rethought, and the current focus on transparency as providing information only to individuals should be broadened. Individuals may want to know what is going on, but burying these descriptions in long, complex notices, often drafted with legal obligations in mind, fails to deliver on any meaningful objective and makes it next to impossible to convey meaningful information in complex data or technology scenarios. If one needs an advanced engineering degree to understand the notice, the average individual will not have a clue. Transparency should instead be approached in terms of the audience and the communications objective. This construct was first explored in the IAF’s Effective Data Protection Governance and leads to three recommendations:
- Individual Participation – Individuals have a right and a desire to know about data processing that concerns them. Individual participation can provide them access to specific choices but does not necessarily do so. Because of the complexity of the different ways data is processed, the different purposes, the different processors and the different technologies that may be involved, the timing, method, content and format should depend on the circumstances. Communication methods should therefore be approached in entirely different ways if the objective is to inform individuals when it matters. For example, both Microsoft (see Privacy Dashboard) and Google (see Privacy Controls) take a very consumer-focused approach to informing and involving their customers. TELUS has recently provided very informative content in a consumer-centric way covering Data Analytics. The development and delivery of context-based transparency should be required and should not be conflated with consent or other legal notice requirements. The goal is transparency for “explainability.” Organizations could be required to develop these forms of context-based vehicles of explainability without being told how to do it.
- Privacy Notices – These notices are principally designed to meet a legal objective. They should be drafted as a comprehensive statement for regulators and others interested in the details around processing. They should be accessible to individuals, but individuals are not the targeted audience. They are written by lawyers and are intended for lawyers. The goal is transparency for verifiability (accountability). Organizations could be required to provide these forms of notice, much as they do now, but they should be independent of explainability communications.
- Demonstrable Accountability – Organizations should demonstrate transparency to the public, to employees and to the government. For example, on its website an organization might disclose: the types of complex, advanced data activities in which it engages, how data are used to achieve each beneficial purpose, and the types of third parties to which personal data may be transferred in order to achieve each people-beneficial purpose; a description of the governance processes it employs (e.g., policies and procedures) regarding data activities; a description of its assessment process, including how benefits and risks are determined and assessed; and the types of oversight it uses. The goal is transparency for verifiable, demonstrable accountability. Organizations could be required to provide this type of transparency, along the lines of the sketch below.
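As a thought experiment only, and not any prescribed IAF format, such a public accountability disclosure might be structured as data along the following lines; every field name and value below is hypothetical.

```python
# Hypothetical shape of a public accountability disclosure. The field
# names and values are invented for illustration, not a required format.
accountability_disclosure = {
    "data_activities": [
        {
            "activity": "network analytics",              # a complex data activity
            "beneficial_purpose": "improving service reliability",
            "third_party_types": ["infrastructure vendors"],
        },
    ],
    "governance_processes": ["data ethics policy", "pre-launch privacy review"],
    "assessment_process": {
        "description": "documented benefit/risk assessment for each activity",
        "benefits_and_risks_weighed": True,
    },
    "oversight": ["internal review board", "external audit"],
}

# A regulator or interested member of the public could then verify claims
# against this disclosure rather than parsing a lawyer-drafted notice.
print(accountability_disclosure["oversight"])
```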
Transparency is a linchpin for both individual participation and demonstrable accountability. How it should be applied as part of the goal of trusted data protection needs to be rethought. These concepts should be considered by policymakers as part of building a foundation for a long-term, workable and adaptable legal framework.
The IAF is committed to an ongoing dialogue on the multiple types of effective transparency.