What Does Information Accountability 2.0 Look Like in a 21st Century Data World?

In 1980, the OECD issued Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (“OECD Guidelines”). This was the first international articulation of substantive principles of data protection. Twenty-nine years later, the Global Information Accountability Project sought to articulate the process elements that implement the OECD’s data protection principles.

Accountability is not new to data protection. It has served as a rallying point to articulate existing operational concepts that had been used in practice and to provide a roadmap for future effective implementation. However, up until the Accountability Project, this important principle had not been fully dissected and described.

In the last six years, accountability has been better defined and has been reflected in policy instruments issued in Europe, the Americas and Asia. Data protection agencies in Canada have provided guidance to define their expectations of accountable companies. Mexico and Colombia have incorporated elements of accountability into their data protection frameworks. The Hong Kong Privacy Commissioner, going beyond the plain language of the Hong Kong law, has garnered support for comprehensive programs to implement accountability. Finally, in response to these advances in public policy, many companies have taken important steps toward implementing accountability programs.

In short, a lot of progress has been made. However, just as information flows at the outset of the Accountability work six years ago differed markedly from those of thirty-five years ago, when the OECD principles were created, information flows and information use today have diverged even further. Further, with the coming technology boom involving sensors and wearables, coupled with today’s advanced analytical capabilities, these flows and the resulting information use are going to change even more dramatically. The question then becomes: what does Information Accountability 2.0 look like in tomorrow’s world?

To achieve the full potential of innovative data use within a framework of information under control, a different regime of corporate commitment, policies, practices, risk assessment and monitoring mechanisms will be needed. It should include a doctrine of fairness covering a broader set of rights and expectations. We will need new ways to engage with consumers around contextual information use that provide appropriate autonomy choices, and new approaches to remediation and regulatory assurance.

It is worth looking at this framework through the lens of each of the five essential elements of Accountability.

1. Corporate commitment to internal policies (maybe codes of conduct) that link to external criteria – data protection law

Much progress was achieved during phase one of the Accountability work in growing organizational understanding of, and commitment to, internal policies, but as this element describes, those policies were and are linked to external criteria centered on today’s law. The challenge is that laws and regulations have not yet been updated for today’s (and tomorrow’s) world of data use. It is also reasonable to assume that the historical lag between laws and regulations and technological advances will not only continue but widen. However, there are broader implications at play with this element. Advanced data flows, analytics and the resulting information use are going to require a different focus of policy and commitment; one framed around a doctrine of “fairness,” not just legal requirements. Key to this doctrine will be considering the value that innovative data use delivers alongside a broader set of rights and consumer expectations. Organizations will face many more ethical trust decisions about information and information use, decisions that look both to customer trust and to the ability to demonstrate to the policy and regulatory environment that information is under control. Legal compliance will be a bare minimum; information use will require a much broader, more nuanced contextual approach.

2. Mechanisms to put those policies into effect, including identifying risk to individuals and mitigating those risks (privacy-by-design)

Over the course of phase one of the Accountability work, organizations grew compliance mechanisms along with overall investment in privacy programs. The IAPP reports that the average surveyed Fortune 1000 company’s privacy program has a budget of $2.4 million. Included in this investment was a growth in risk assessment capability. However, that assessment was mostly directed at mitigating corporate risk. Assessing and mitigating risk to the individual is not yet a core competency of many organizations. As with the need for a broadened set of corporate policies, the practices associated with those policies will need to include broader, more complete thinking about risks to individuals over and above the requirements of laws. New mitigation strategies and new processes will be required. For example, in an advanced analytics world, the provenance and accuracy of data will become more important as that data is used to affect or suggest decisions about individuals. New de-identification processes will be required to perform advanced analytics and to share information. New protection methods will be required for data in transit, particularly in machine-to-machine scenarios. Privacy by Design concepts and programs, which have roots in product and service design, will need to evolve to address new information use. Data will become the new product.
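To make the de-identification point concrete, here is a minimal sketch (in Python, using hypothetical field names and a placeholder key) of one common approach: replacing direct identifiers with keyed pseudonyms and generalizing quasi-identifiers before records are shared for analytics. It illustrates the kind of process this element anticipates, not any particular organization’s program.

```python
import hmac
import hashlib

# Hypothetical secret held by the data custodian; without it, pseudonyms
# cannot be reversed or linked to other datasets.
PSEUDONYM_KEY = b"replace-with-a-securely-managed-secret"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a stable keyed pseudonym (HMAC-SHA256)."""
    return hmac.new(PSEUDONYM_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

def deidentify_record(record: dict) -> dict:
    """Prepare a record for sharing or analytics: direct identifiers are
    pseudonymized or dropped, and quasi-identifiers are generalized to
    reduce re-identification risk."""
    return {
        "person_id": pseudonymize(record["email"]),  # stable join key, no raw email
        "birth_year": record["birth_date"][:4],      # generalize full date to year
        "region": record["postal_code"][:3],         # truncate postal code
        "purchases": record["purchases"],            # analytic payload retained
    }

# Example usage with a hypothetical customer record.
raw = {
    "email": "alice@example.com",
    "birth_date": "1980-06-15",
    "postal_code": "94103",
    "purchases": 12,
}
print(deidentify_record(raw))
```

Keyed pseudonyms of this kind allow records to be linked for analysis within a controlled environment while keeping raw identifiers out of the analytic and shared datasets; they are one building block, not a complete de-identification program.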

3. Internal monitoring to assure mechanisms work

Much progress has been made in growing internal assurance mechanisms, in both capacity and capability. The number of resources added to corporate privacy groups has shown tremendous growth and is forecast to continue. The number of Certified Privacy Professionals has grown to close to 7,000, and privacy professionals and organizations have developed sound ways to assure that commitments are being met. Today and tomorrow, more information use is being generated from the parts of the organization to which privacy professionals feel least connected. New risks associated with information use will be created and will require different assessment and assurance systems than those that have matured in the product and service parts of companies. The “I” in PIAs (Privacy Impact Assessments) has largely developed against a backdrop of compliance with legal expectations. That “I” will need to evolve to address “Information Impact” in ways that reach beyond today’s information use. New input mechanisms will be needed that are founded in more than just legal requirements: risk and value analysis against a broad set of individual needs and expectations. Assessment and assurance mechanisms will increasingly have to address context-based information use, not just the data itself.

4. Individual participation (transparency and consent where appropriate)

Considerable academic analysis and literature has outlined the drawbacks of the reliance on consent that is so prevalent in today’s data protection models. At the same time, individuals should and will want to participate in making their own decisions about how information about them is used. This includes how to benefit more fully from its innovative value to them. With the expected growth of machine-to-machine data transfer, sensors and the internet of things, there will be increasing challenges related to individual engagement. New, context-based ways of engaging with consumers will be required that balance the interests of individuals and the legitimate interests of other stakeholders in various information value chains.

5. Standing ready to demonstrate to a regulator on request, and remediation where necessary

Arguably, the largest challenge for an accountable, 21st-century information governance system is how we will know it is working and how remediation will work. We will need to evolve toward being able to demonstrate risk and impact capability rather than outcome capability measured against prescribed legal requirements. The same data will increasingly serve very different uses; organizations will not only have to develop new and broadened governance systems, but will also have to be able to demonstrate them. Regulators, on the other hand, will have to evolve from assessing outcomes alone to also assessing the capability of organizations to effectively manage and mitigate risk and to appropriately involve consumers in multiple context-based information uses.

We have built a foundation around information governance with Information Accountability 1.0 – today’s, and certainly tomorrow’s, information value chains are going to require a different approach to an accountable information governance system. Much of the Information Accountability Foundation’s work is grounded in advancing solutions toward a system that allows for innovation and is viewed as keeping information under control. Work on defining the components of Accountability 2.0, on the Big Data Ethical Framework, and on building a Holistic Governance Model will all play a role in ensuring that everyone benefits from the innovative value of information use within a control framework of fairness.

About the Author
Peter is the Executive Strategist, Policy Innovation at the Foundation and is also the President of Global Information Governance Solutions. He can be reached at pcullen@informationaccountability.org.
