Trust Deficits, Bright Lines and Verification

Freight trains heading through a tunnel are pretty hard to miss. The equivalent of a freight train in the world of privacy is a trust deficit related to information use in an observational age driven by analytics. The many, many indicators that have piled up over recent days, weeks and months are coming at us like a “trust deficit train”, and that train is pretty hard to miss. A trust deficit simply means an organisation does not get the benefit of the doubt: doubters believe the organisation will put its own interests ahead of individuals and society when using data in a robust manner.

The U.S. Federal Communications Commission’s (FCC) new privacy regulations for broadband service providers are just the latest example that trust deficits lead to bright-line rules, which are generally ineffective in protecting individuals in today’s observational, data-driven world. The FCC has essentially said it will place oversight of data use arising from broadband participation on the backs of individuals, who are not equipped to decide what is appropriate and what is not. The IAF’s public comments on the draft regulation contain more detail on our views, but, as my Australian colleague Malcolm Crompton recently argued in Marrakech, “this (privacy) is the only regulatory environment in which policy makers so burden individuals.”

That does not mean I oppose consent, or individuals having meaningful control over data that pertains to them. I simply believe the means of governance should be effective, and consent is not always the effective means.

Organisations are smart; they will figure out how to obtain consent and overcome the “take it or leave it” prohibition. Just look at Europe and the ePrivacy Directive: I visit a site, a pop-up appears, and I grant consent without a thought. Backward-facing regulation that relies heavily on consent is disturbing at a time when the limits of consent are being recognised.

We see some dissonance among privacy and data protection regulators. On one hand, they increasingly understand the limits of consent, as we see in the consent consultation by the Office of the Privacy Commissioner of Canada. They understand society’s need to assure both protection for individuals and the beneficial impacts in health, education, safety and economic growth that come from robust data use. This increasingly means data protection by design that looks at the full range of human interests, not just autonomy. However, the trust deficit that exists in all markets drives regulators to question whether processes that are subjective in nature (all assessment processes that require judgement are subjective) can ever be conducted consistently and with integrity. This leads to the conclusion that subjective processes must be supported by some form of scalable, independent verification. At present, we have no good models for how such verification might work, and that gap leads back to bright-line rules, like consent.

The IAF has concluded that effective data governance requires a family of assessment frameworks that range from simple triage of new processing to determine the level of assessment necessary, through traditional privacy impact assessments, to ethical big data assessments, and finally, to comprehensive data impact assessments. It is about utilising the right assessment for the specific governance challenge. We further believe that assessment processes must be supported by some oversight process to create trust, turning self-regulation into co-regulation.
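To make the tiering concrete, here is a minimal sketch in Python of how the triage step might route new processing to the right assessment. The tier names mirror the family described above; the questionnaire attributes and the escalation thresholds are purely illustrative assumptions, not IAF specifications.

```python
from dataclasses import dataclass
from enum import Enum, auto


class AssessmentTier(Enum):
    """The family of assessments described above, lightest to heaviest."""
    TRIAGE_ONLY = auto()                  # routine processing; a triage record suffices
    PRIVACY_IMPACT_ASSESSMENT = auto()    # traditional PIA
    ETHICAL_BIG_DATA_ASSESSMENT = auto()  # analytics raising fairness questions
    COMPREHENSIVE_DATA_IMPACT = auto()    # full data impact assessment


@dataclass
class ProposedProcessing:
    """Hypothetical attributes a triage questionnaire might capture."""
    uses_sensitive_data: bool
    is_novel_use: bool
    affects_many_individuals: bool
    involves_automated_decisions: bool


def triage(p: ProposedProcessing) -> AssessmentTier:
    """Map a proposed processing activity to an assessment tier.

    The escalation logic is an illustrative example only; real criteria
    would be set by the organisation's own governance programme.
    """
    if p.involves_automated_decisions and p.uses_sensitive_data:
        return AssessmentTier.COMPREHENSIVE_DATA_IMPACT
    if p.is_novel_use and p.affects_many_individuals:
        return AssessmentTier.ETHICAL_BIG_DATA_ASSESSMENT
    if p.uses_sensitive_data or p.is_novel_use:
        return AssessmentTier.PRIVACY_IMPACT_ASSESSMENT
    return AssessmentTier.TRIAGE_ONLY


if __name__ == "__main__":
    proposal = ProposedProcessing(
        uses_sensitive_data=True,
        is_novel_use=True,
        affects_many_individuals=True,
        involves_automated_decisions=False,
    )
    print(triage(proposal).name)  # ETHICAL_BIG_DATA_ASSESSMENT
```

In practice an organisation’s triage criteria would be far richer, but the point stands: the assessment tier, rather than a blanket consent requirement, determines the depth of review.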

I personally cut my teeth on co-regulatory processes with the Individual Reference Services Group (IRSG), which developed a set of governance principles in 1997. The principles covered the use of data to create reference and look-up services on individuals. The IRSG principles required organisations to assert they were following those principles, so that the principles would be enforceable under the deceptive practices authority of the U.S. Federal Trade Commission (FTC). Furthermore, the principles required an annual external assessment using criteria developed in cooperation with PricewaterhouseCoopers. A statement of compliance from that assessment would then be made public and sent to the FTC, creating compliance certainty across the ecosystem.

The fourteen companies that came together to create the IRSG principles did not do so for altruistic reasons. An industry player had created a product that provided social security numbers (SSNs) to customers without masking the data or qualifying the users and the uses; one could buy one’s neighbour’s SSN. Public concern threatened a legislative solution that would have banned SSN sales in bright-line fashion. The co-regulatory process offered far more nuanced governance of SSNs and fixed the trust deficit at the time. The IRSG eventually disappeared after the Gramm-Leach-Bliley Act was passed, regulating the primary sources of SSNs previously covered under the IRSG principles. However, the lessons learned from that process are still applicable today.

Today’s trust deficits are related to many public concerns. Data security concerns have led to data breach notification laws. The adoption of broad observation technologies has led to rules, such as the FCC privacy regulations, that are intended to stop tracking by certain entities.

The European General Data Protection Regulation (GDPR) creates some flexibility for data-driven research that was not part of the Directive. However, that flexibility requires clarifying guidance from the EU member states. It also requires the types of assessments that we are developing at the IAF. Viktor Mayer-Schönberger and Yann Padova argue, in “Regime Change? Enabling Big Data Through Europe’s New Data Protection Regulation,” that this limited flexibility in the GDPR is the opening for further reform in data governance. That will only happen if there is an oversight process for assessments that breaks through the trust deficit.

Towards that end, the IAF is proposing a project to create options for oversight of the assessments required by modern data-driven processing. We need corporate participation first, and then participation by other stakeholders, including regulators. The driver for business is not altruism, but rather the corporate need for rigorous processes that are trusted. As we have seen with the FCC, trust deficits produce backward-facing bright-line rules that neither protect people nor facilitate innovation that improves our lives.

Big Data Ecosystem, Fairness and Enforcement

Big data has become an increasingly scary phrase for all stakeholders in data protection. For privacy advocates, it often means loss of control, asymmetrical power and hidden discrimination. For regulators, it often means regulatory round pegs in operational holes of different sizes, in constantly moving locations, with mismatches that begin with vocabulary. For companies, it… Continue Reading

IAF Participates in OPC of Canada Stakeholder Meeting

On 6 October, the Office of the Privacy Commissioner of Canada (OPC) convened a stakeholder meeting in Halifax, Nova Scotia, for a discussion that included OPC’s strategic privacy priorities for the next five years. In particular, the meeting was part of a dialogue on consent and privacy that the OPC launched with the publication of… Continue Reading

Autonomy and Fair Processing: Proportioning the Governance

The free flow of data to facilitate individual, business and societal needs, as well as the protection of individuals to whom the data pertains, has always required proportionality. That proportionality is demonstrated in privacy law that protects both individual autonomy and fair processing. The constant evolution of information and communications technologies further emphasises… Continue Reading

Look North

Canada, from a data protection perspective, has often been the bridge between U.S. harms-based approaches to privacy and European rights-based approaches to data protection. Canada is again showing its leadership, and this time has done so in the discussion and consultation paper released by the Office of the Privacy Commissioner on 11 May 2016. The title… Continue Reading