Category Archives: Governance

Trust Deficits, Bright Lines and Verification

Freight trains heading through a tunnel are pretty hard to miss. The equivalent of a freight train in the world of privacy is a trust deficit related to information use in an observational age driven by analytics. The many indicators that have piled up over recent days, weeks and months are coming at us like a “trust deficit train”, and it too is hard to miss. A trust deficit simply means an organisation does not get the benefit of the doubt: the doubters believe the organisation will put its own interests ahead of those of individuals and society when making robust use of data.

The U.S. Federal Communications Commission’s (FCC) new privacy regulations for broadband service providers are just the latest example of how trust deficits lead to bright-line rules, which are generally ineffective in protecting individuals in today’s observational, data-driven world. The FCC has essentially said that oversight of broadband data use will rest on the backs of individuals, who are not equipped to decide what is appropriate and what is not. The IAF’s public comments on the draft regulation contain more detail on our views, but, as my Australian colleague Malcolm Crompton recently argued in Marrakech, “this (privacy) is the only regulatory environment in which policy makers so burden individuals.”

That does not mean I do not support consent, or individuals having meaningful control of data that pertains to them. I just believe that means of governance should be effective, and consent is not always the most effective means.

Organisations are smart; they will figure out how to get consent and overcome the “take it or leave it” prohibition. Just look at Europe and the ePrivacy Directive: I visit a site, a pop-up appears, and I grant consent. There is no thought; I just do so. Backward-facing regulation that relies heavily on consent is disturbing at a time when the limits of consent are being recognised.

We see some dissonance among privacy and data protection regulators. On the one hand, they increasingly understand the limits of consent, as we see in the consent consultation by the Office of the Privacy Commissioner of Canada. They understand society’s need to assure both protection for individuals and the beneficial impacts on health, education, safety and economic growth that come from robust data use. This increasingly means data protection by design that looks at the full range of human interests, not just autonomy. On the other hand, the trust deficit that exists in all markets drives regulators to question whether processes that are subjective in nature (all assessment processes that require judgement are subjective) can ever be conducted consistently and with integrity. That leads to the conclusion that subjective processes must have some form of scalable, independent verification. At present, we have no good models for how such verification might work, which leads back to bright-line rules, like consent.

The IAF has concluded that effective data governance requires a family of assessment frameworks that range from simple triage of new processing to determine the level of assessment necessary, through traditional privacy impact assessments, to ethical big data assessments, and finally, to comprehensive data impact assessments. It is about utilising the right assessment for the specific governance challenge. We further believe that assessment processes must be supported by some oversight process to create trust, turning self-regulation into co-regulation.

I personally cut my teeth on co-regulatory processes with the Individual Reference Services Group (IRSG), which developed a set of governance principles in 1997. The principles covered the use of data to create reference and look-up services on individuals. They required organisations to assert that they were following the principles, so that compliance would be enforceable under the deceptive practices authority of the U.S. Federal Trade Commission (FTC). Furthermore, the principles required an annual external assessment using criteria developed in cooperation with PricewaterhouseCoopers. A statement of compliance for that assessment would then be made public and sent to the FTC, creating compliance certainty across the ecosystem.

The fourteen companies that came together to create the IRSG principles did not do so for altruistic reasons. An industry player had created a product that provided social security numbers (SSNs) to customers without either masking the data or qualifying the users and the uses; one could buy one’s neighbour’s SSN. Public concern threatened a legislative solution that would have banned SSNs in bright-line fashion. The co-regulatory process offered far more nuanced governance of SSNs and fixed the trust deficit at the time. The IRSG eventually disappeared after the Gramm-Leach-Bliley Act was passed, regulating the primary sources of SSNs previously covered by the IRSG principles. However, the lessons learned from that process are still applicable today.

Today’s trust deficits are related to many public concerns. Data security concerns have led to data breach notification laws. The adoption of broad observation technologies has led to rules, such as the FCC privacy regulations, that are intended to stop tracking by certain entities.

The European General Data Protection Regulation (GDPR) creates some flexibility for data-driven research that was not part of the Directive. However, that flexibility requires clarifying guidance from the EU member states, and it also requires the types of assessments that we are developing at the IAF. Viktor Mayer-Schonberger and Yann Padova argue, in “Regime Change? Enabling Big Data Through Europe’s New Data Protection Regulation,” that this limited flexibility in the GDPR is the opening for further reform in data governance. That will only happen if there is an oversight process for assessments that breaks through the trust deficit.

Towards that end, the IAF is proposing a project to create options for oversight of the assessments required by modern data-driven processing. We need corporate participation first, and then participation by other stakeholders, including regulators. The driver for business is not altruism but the corporate need for rigorous processes that are trusted. As we have seen with the FCC, trust deficits produce backward-focused bright-line rules that neither protect people nor facilitate innovation that improves our lives.

IAF/FPF Side Event at ICDPPC – 18 October 2016

Technology, Challenges and Effective Governance Side Event, Co-Hosted by the Future of Privacy Forum and the IAF
DATE: 18 October 2016
TIME: 14.15 to 16.15
LOCATION: Roseraie Room, Conference Center of the Palmeraie Golf Palace
EVENT: 38th International Data Protection and Privacy Commissioners Conference
ADMISSION: To register for this event, contact us at, or RSVP via…

IAF Holds Brainstorming Event on Ethics, Created Data, AI and Accountability

On 13 September, the IAF held its Annual Brainstorming Meeting, hosted by HP Inc. This year’s sessions focused on ethics and data protection, created data and accountability, and artificial intelligence and governance. Below are documents related to the meeting and photos of the event.
Meeting Documents
Brainstorming Session Agenda
Session Attendee List
Background Reading
IAF…

EDPG Project: Enhancing Benefits from Information Flows While Improving Regulatory Certainty in a Digital Age

Today’s information ecosystems are complex and set to become even more complicated. Business today is making increasing use of information as a means to create new products and services and to drive value creation. IoT environments offer a terrific example of this complexity, as does the whole area of Big Data analytics, which can involve the…

Enhancing the Benefits of Information Through a Values Based Holistic Approach to Information Governance

Businesses today are increasingly using information as a means to create new products and services and to drive the creation of benefits. Access to data and advanced analytical capabilities are enabling new opportunities both for current information-intensive industries and for new players, even those traditionally in core product segments or where there is no direct business-to-consumer…

IAF Will Hold Event at 37th International Privacy Conference

The Information Accountability Foundation will host a side event entitled “Data Stewardship for a 21st Century Data-Driven World: Ethical Big Data Assessment, Holistic Governance Beyond Big Data and Enforcement Models” at the 37th International Data Protection and Privacy Commissioners Conference. The discussion will be held on 29 October from 16:00 to 18:00. Following the meeting,…

Moving Big Data Governance Forward

On 14 July, the Information Accountability Foundation (IAF) convened a very successful meeting, hosted by the Garante in Rome, to discuss how big data ethical assessments might be subject to enforcement. Big data raises numerous challenges not only for today’s data protection and privacy laws but also for the new regulation that might be enacted…

Legal, Fair and Just – The Benchmark for Big Data Analysis

Last month, the IAF presented our big data assessment process to industry representatives in Washington, D.C. One of the attendees, genuinely trying to be helpful, asked why any U.S. company would conduct an ethical assessment of a big data project, since there really are not many restrictions on the use of data to develop insights…

IAF Discussion on Big Data Ethical Assessment | 22 July | New York City

IAF will host a meeting entitled “Big Data Ethical Assessment: A System for Assuring Big Data Processes are Legal, Fair and Just.” The event will take place on 22 July 2015 at 8:30 AM at Dentons LLP, 1221 Avenue of the Americas, New York. The 90-minute session is designed for privacy lawyers, chief privacy officers, chief…

IAF Discussion on Big Data Governance | Washington, DC | 18 May

The Foundation will host “Discussion on Big Data, Governance, and Regulation” on Monday, 18 May, from 10 to 11 am ET. The meeting will take place in Washington, DC. A recent WSJ article entitled Big Data Looms as Next Battle in Europe underscores the friction and uncertainty caused by the changing nature of how digital assets for…