
The Need for An Ethical Framework

The vast amount of data made possible and accessible through today’s information technologies, and the ever-increasing analytical capabilities applied to that data, are unlocking tremendous insights that enable new solutions to health challenges, new business models, personalization, and broader benefits to individuals and society. At the same time, new risks to individuals can be created. Against this backdrop, policy makers and regulators are wrestling with how to apply current policy models to achieve the dual objectives of data protection and beneficial uses of data.

One of the emerging market pressures is a call for data to be processed not only in a legal manner but also in a fair and just manner. The word “ethics” and the phrase “ethical data processing” are in vogue. Yet, today, we lack a common framework for deciding both what might be considered ethical and, just as importantly, how an ethical approach would be implemented. In their article on building trust, Jack Balkin and Jonathan Zittrain posit: “To protect individual privacy rights, we’ve developed the idea of ‘information fiduciaries.’” In the law, a fiduciary is a person or business with an obligation to act in a trustworthy manner in the interest of another. But what would acting in a trustworthy fashion look like? It is an interesting approach that also illustrates how, over time, ethical models are translated not just into law but also into commonly accepted practice (e.g., doctors and the Hippocratic oath).

So, while we do not have a lingua franca, privacy and data protection enforcement agencies are increasingly asking companies to understand the ethical issues that complex data processing raises for individuals, groups of individuals, and the broader society. These issues go beyond explicit legal and contractual requirements to the question of whether processing is fair and just, not merely legal. Translating ethical norms and values-based uses of data into internal policies, mechanisms, and obligations that can be codified into operational guidance, processes, and people-centered thinking is a challenge for many organizations.

Even before this translation stage, it is important to recognize that the word “ethics” and the phrase “ethical approach” do not appear in many privacy or data protection laws. However, related concepts such as “fairness” exist today, will be strengthened under the EU General Data Protection Regulation, and are increasingly being examined by global regulators. An ethical framework is a careful articulation of an “ethos,” or set of norms and guiding beliefs, that is expressed in terms of explicitly stated values (or core principles) and made actionable through guiding principles.

Today, organizations with mature privacy programs have internal policies that cover their legal requirements as well as requirements that go beyond the law. Examples include industry standards or positions that an organization has chosen to take for competitive reasons. These policies are usually the basis for operational guidance and for the mechanisms that put such guidance into place.

While privacy law may be clear, newer requirements, such as assessment processes that address fair and just processing and the impact on individuals, are less so. Organizations are challenged to translate ethical norms for data use into values and principles that become policy and ultimately operational guidance, which includes data processing assessments. Such guidance can serve as guiderails for business units that need to meet ethical standards for data use that go beyond privacy.

While the IAF has defined a broad ethical framework as part of the Big Data Ethics Initiative, there is currently a gap in the guidance: specifically, the translation of this ethical framework into values and actionable principles that an organization can express as internal Policy. This is a key connection point on the path to operational guidance, such as the Comprehensive Data Impact Assessment (which can include privacy impact assessments and data protection impact assessments) developed as part of the IAF’s Effective Data Protection Governance Project, that incorporates ethical data use objectives. This translation step also helps organizations establish the ultimate guiderails they want to use.

As a parallel example, many organizations have standards of (business) conduct. These often start from a describable set of Values that are then codified into a set of Principles and ultimately into Policy, which serves as the means to communicate expected behavior to employees. In short, the Principles often serve as a key bridge between Values and Policy, creating a meaningful framework that can then be operationalized within the organization.

What is needed to advance this dialogue is a starting point for what an “ethical framework” might look like and how the various layers or levels might be described.  In a pictorial model, such a framework could look like this:
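[Figure: pictorial model of the ethical framework]

As a purely illustrative sketch of that layering (the class and field names below are hypothetical and not part of the IAF framework), the Values, Core and Guiding Principles, and Policy layers described above could be kept connected in a simple data model such as the following:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class GuidingPrinciple:
    """An actionable statement that makes a core principle concrete."""
    text: str


@dataclass
class CorePrinciple:
    """A stated value, such as 'Beneficial' or 'Fair, Respectful, and Just'."""
    name: str
    guiding_principles: List[GuidingPrinciple] = field(default_factory=list)


@dataclass
class EthicalFramework:
    """Values (core principles), made actionable through guiding principles,
    which in turn feed internal Policy and operational guidance."""
    core_principles: List[CorePrinciple] = field(default_factory=list)
    policies: List[str] = field(default_factory=list)  # internal Policy statements derived from the principles


# Hypothetical example: one core principle, two guiding principles, one policy.
framework = EthicalFramework(
    core_principles=[
        CorePrinciple(
            name="Beneficial",
            guiding_principles=[
                GuidingPrinciple("Uses of data should provide proportional benefit to individuals."),
                GuidingPrinciple("Where a use may impact individuals, benefits should be weighed against risks."),
            ],
        )
    ],
    policies=["New data uses must complete a data impact assessment before launch."],
)
```

The point of such a representation is only that each layer remains traceable to the one above it, so that operational guidance can always be tied back to a stated value.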

Key to the ethical framework is a starting point for what the Principles (Core and Guiding) layer could look like. Below is an example of what this layer might consist of. It was developed using a combination of the IAF’s Big Data and Ethical Values work, AI principles and values, “How to Hold Algorithms Accountable” from the MIT Technology Review, and “Principles for Algorithmic Transparency and Accountability” from the ACM. The principles are written in “neutral” language, as it is envisioned that organizations would adapt them to fit their own environments and potentially translate them for external communication as they see fit. They go beyond legal requirements.

Ethical Data Use Core and Guiding Principles

  • Beneficial

o   Uses of data should be proportional in providing benefits and value to individual users of the product or service. While the focus should be on the individual, benefits may also accrue at a higher level, such as to groups of individuals and even society.

o   Where a data use has a potential impact on individual(s), the benefit should be defined and assessed against potential risks this use might create.

o   Where a data use does not impact an individual, risks should still be identified and addressed, such as by adequately protecting the data and reducing the identifiability of individuals.

o   Once all risks are identified, appropriate ways to mitigate these risks should be implemented.

  • Fair, Respectful, and Just

o   The use of data should be viewed by a reasonable individual as consistent, fair, and respectful.

o   Data use should support the value of human dignity – that individuals have an innate right to be valued, respected, and to receive ethical treatment.  Human dignity goes beyond individual autonomy to interests such as better health and education.

o   Entities should assess data and data uses for inadvertent or inappropriate bias, or for labeling, that may have an impact on reputation or the potential to be viewed as discriminatory by individuals.

o   The accuracy and relevance of data and algorithms used in decision making should be regularly reviewed to reduce errors and uncertainty.

o   Algorithms should be auditable and be monitored and evaluated for discriminatory impacts.

o   Data should be used in a manner consistent with the ethical values of the entity.

o   The least data-intensive processing that effectively meets the data processing objectives should be used.

  • Transparent and Autonomous Protection (engagement and participation)

o   As part of the dignity value, entities should always take steps to be transparent about their use of data. Proprietary processes may be protected, but not at the expense of transparency about substantive uses.

o   Decisions that are made about, and applied to, an individual should be explainable.

o   Dignity also means providing individuals and users appropriate and meaningful engagement and control over uses of data that impact them.

  • Accountability and Redress Provision

o   Entities are accountable for their use of data to meet legal requirements and should be accountable for using data consistent with the principles of Beneficial, Fair, Respectful & Just and Transparent & Autonomous Protection. They should stand ready to demonstrate the soundness of their accountability processes to those entities that oversee them.

o   They should have accessible redress systems available.

o   Individuals and users should always have the ability to question the use of data that impacts them and to challenge situations where use is not consistent with the core principles of the entity.
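As an illustrative sketch only (the questions and helper function below are hypothetical and not part of the IAF’s Comprehensive Data Impact Assessment), principles like these could be translated into assessment questions that business units work through before a complex data use proceeds:

```python
# Hypothetical assessment questions derived from the principles above; an
# organization would define its own set as part of its operational guidance.
ASSESSMENT_QUESTIONS = {
    "Beneficial": [
        "What benefit does this data use create, and for whom?",
        "Have the risks this use creates for individuals been identified and mitigated?",
    ],
    "Fair, Respectful, and Just": [
        "Has the processing been reviewed for inappropriate bias or discriminatory impact?",
        "Is the least data-intensive approach that meets the objective being used?",
    ],
    "Transparent and Autonomous Protection": [
        "Can decisions that affect an individual be explained to that individual?",
    ],
    "Accountability and Redress Provision": [
        "Is an accessible redress mechanism available to affected individuals?",
    ],
}


def open_assessment_items(answers: dict) -> list:
    """Return the questions that have not yet been answered affirmatively."""
    open_items = []
    for principle, questions in ASSESSMENT_QUESTIONS.items():
        for question in questions:
            if not answers.get(question, False):
                open_items.append(f"{principle}: {question}")
    return open_items


# Example: a partially completed assessment still has open items to resolve.
answers = {"What benefit does this data use create, and for whom?": True}
for item in open_assessment_items(answers):
    print(item)
```

The checklist form is only one possible translation; the essential step is that the principles, not just legal requirements, drive the questions an assessment asks.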


The IAF believes it is important to have a lingua franca that enables a broad dialogue around not just what constitutes fair data processing but also how an ethical framework helps implement the resulting values and principles.

Let us know what you think.

Detailed Overview of the Effective Data Protection Governance Framework and Components—A Data- and Use-Based Approach

In the Information Accountability Foundation’s blog in September, the Effective Data Protection Governance (EDPG) project was introduced. The EDPG proposes a re-alignment of responsibilities, the introduction of new obligations, and a different way to think about obligations for each participant in increasingly complex information ecosystems. The objective of the framework is to better align responsibilities while… Continue Reading

The Data Use Imperatives – Effective Data Protection Governance

We are on the cusp of many dramatically new ways to think about data and data use that will increasingly place pressure on public policy models and organisational governance. This overall challenge was introduced in our blog last month and is the cornerstone of The Information Accountability Foundation’s (IAF) work on an Effective Data Protection Governance… Continue Reading

EDPG Project: Enhancing Benefits from Information Flows While Improving Regulatory Certainty in a Digital Age

Today’s information ecosystems are complex and set to become even more complicated. Business today is making increasing use of information as a means to create new products and services and drive value creation. IoT environments offer a terrific example of this complexity, as does the whole area of Big Data analytics, which can involve the… Continue Reading

Enhancing the Benefits of Information Through a Values Based Holistic Approach to Information Governance

Businesses today are increasingly using information as a means to create new products and services and to drive the creation of benefits. Access to data and advanced analytical capabilities are enabling new opportunities for both current information-intensive industries and new players, even those traditionally in core product segments or where there is no direct business-to-consumer… Continue Reading

The Complexity of Information Flows

I recently spoke at two events in Hong Kong: the International Big Data Conference, sponsored by the Privacy Commissioner of Hong Kong, and the Asia Pacific Privacy Authorities (APPA) conference. There were two observations from the overall dialogue at these events. One is the “evolving realization of reality”. What I mean by this is that… Continue Reading

IAF Launches Effective Data Protection Governance Project to Address Policy Risk

More and more business strategies involve deriving value from the use of information. At the same time, regulators and policy makers are grappling with data-driven innovation, as they seek to address not just what policy response is correct, but how organizations can and should demonstrate the governance of this data. In addition, how individuals will… Continue Reading

What Does Information Accountability 2.0 Look Like in a 21st Century Data World?

In 1980, the OECD issued Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (“OECD Guidelines”). This was the first international articulation of substantive principles of data protection. Twenty-nine years later, the Global Information Accountability Project sought to articulate the process elements that implement the OECD’s data protection principles. Accountability is not… Continue Reading