Comprehensive Data Impact Assessment Master Project

Problem Statement

Today’s digital world increasingly uses data pertaining to individuals for purposes that serve societal, personal and corporate needs but are beyond the understanding and/or reasonable expectation of the individual. There is also growing agreement that consent is not fully effective in governing such data and its use. Many national laws include limited exemptions for processing when consent is unavailable, while others, notably European law, provide a legal justification based on the legitimate interests of an organisation when they are not overridden by the interests of the individual. However, these exemptions are limited, unclear and/or outdated; for example, legitimate interests require a balancing procedure that has yet to be developed. Moreover, neither exemptions nor legitimate interests as currently practised fill the gap between the processing that is taking place and the processing that, with legal certainty, may take place.

Privacy Impact Assessments (PIAs), which emerged in the 1990s as a means of assessing a project against Fair Information Practice Principles (FIPPs), also do not fully address this gap. PIAs remain adequate for many projects. As the data intensity of projects increases, however, a growing number of data use areas and legitimising processes require more expansive assessments.

Last year, the Information Accountability Foundation (IAF)[1] launched the Comprehensive Data Impact Assessment (CDIA) Master Project. The CDIA Master Project is a component of the IAF’s work on the Effective Data Protection Governance Project and serves as the umbrella for a number of subprojects, described below, that combine research with collaboration among industry and regulators to address the gap between what is needed and what is currently available and acceptable. In May 2016, the IAF started working with Canadian business to develop an assessment process that might be used to create confidence that data used beyond individual understanding is being used in a legal, protected and appropriate manner in a Canadian context, and to demonstrate how that determination was reached.

Now, the IAF intends to address the corresponding gap in Europe created by the passage of the General Data Protection Regulation (GDPR). The regulation requires organisations to practise privacy by design. As part of privacy by design, organisations will determine the types of impact assessments that are needed, such as balancing interests where legitimate interest is the legal basis for processing and Data Protection Impact Assessments (DPIAs) where there are high risks to individuals. By defining the balancing process required under the legitimate interests justification (Article 6) and the DPIA requirements (Article 35), the IAF will develop an assessment process that achieves the proper balance while recognising the full range of societal and personal issues in play when data is processed. This process will help organisations determine whether data-driven activities that are beyond the understanding and/or reasonable expectation of the individual achieve that balance in a manner that guards against paternalism, and demonstrate how that determination was reached.
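To make the decision concrete, the sketch below, purely illustrative and not part of any IAF or regulatory process, encodes the rule described above: a legitimate interests basis calls for a balancing test (an LIA under Article 6), and processing likely to present high risks to individuals calls for a DPIA (Article 35). All class, field and function names are invented for this illustration, and the balancing procedure itself, as noted, has yet to be developed.

    # Illustrative sketch only: a toy encoding of which GDPR assessments a
    # processing activity would trigger. All names are invented for this
    # example; the actual balancing procedure has yet to be developed.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ProcessingActivity:
        name: str
        legal_basis: str   # e.g. "consent", "contract", "legitimate_interests"
        high_risk: bool    # likely to result in a high risk to individuals

    def required_assessments(activity: ProcessingActivity) -> List[str]:
        """List the impact assessments the activity would call for."""
        needed = []
        if activity.legal_basis == "legitimate_interests":
            # Article 6(1)(f): the organisation's interests must not be overridden
            # by the interests of the individual, hence a balancing test (LIA).
            needed.append("Legitimate Interest Assessment (balancing test)")
        if activity.high_risk:
            # Article 35: processing likely to present high risks requires a DPIA.
            needed.append("Data Protection Impact Assessment (DPIA)")
        return needed

    print(required_assessments(
        ProcessingActivity("device analytics", "legitimate_interests", high_risk=True)))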

Opportunity

Given the heightened demand from global regulators to define and implement balanced assessments for a variety of current and emerging regulatory needs, there is an opportunity for business to collaborate in meeting this demand and to achieve more effective and efficient data protection.

Overview

The CDIA Master Project includes four distinct subprojects plus a communications plan that links to all four subprojects. Funders may support the Master Project or participate in one or more of the four subprojects. The subprojects are:

  1. Development of a Canadian assessment process for data used beyond expectations of the individual. This project began in May 2016.
  2. Development and socialisation of a legal basis for legitimate interests in Colombia. Date TBD.
  3. Creation and socialisation of an EU GDPR Assessments Project that consists of an EU Legitimate Interest Assessment (LIA) and an EU Data Protection Impact Assessment (DPIA). The EU GDPR Assessments Project will define the balancing test required by the GDPR and regulator expectations of what is legal, fair and just. This project launches in the fall of 2016.
  4. Establishment of a connection between LIAs and unfairness assessments for purposes of interoperability between the United States and Europe. Initial discussions have already been held in San Jose, California, and Washington, DC.

Project Descriptions

Canada
The IAF, in partnership with AccessPrivacy, is developing a Canadian-specific assessment process for big data and other processing that is not appropriately governed by consent. The partners are working with an industry group to customise the IAF ethical big data assessment. The initial phase of this project will conclude early in 2017 with a multi-stakeholder consultation. The IAF received a funding grant from the Office of the Privacy Commissioner of Canada for this multi-stakeholder evaluation process.

Colombia
Colombia’s data protection law requires consent for all processing of personal data, with very limited exceptions. This legal provision has been an impediment to data-driven innovation. The Colombian Superintendent of Industry and Commerce will likely recommend amendments to the Colombian Congress to reflect current concerns. Among the suggested provisions will be a section on legitimate interests. The IAF will provide guidance to the policy discussion on legitimate interests and on trustworthy assessments designed to ensure that processing pursuant to legitimate interests is legal, fair and just. The Latin American partners of Baker McKenzie will provide logistical support and legal expertise to the project.

Europe
The GDPR will go into effect in spring 2018. The GDPR still requires an appropriate legal basis for any processing that takes place; however, where consent is not fully effective, it can no longer be stretched to serve as that basis. In addition, other parts of the GDPR require assessing risk. A model assessment process will be needed to inform the debate on how LIAs and DPIAs should be implemented. The balancing process being developed in the Canadian project is a logical starting point for the balancing process for the LIA, but this development will require European participation. The goal is to develop an assessment process in late 2016 and to socialise it throughout 2017 as part of the development of implementation guidance for the GDPR.

United States
Privacy in the United States is enforced most often under the Federal Trade Commission Act, which prohibits unfair or deceptive practices. The FTC has a test for determining whether a practice is unfair. The FTC has often required organisations under consent decrees to create accountable privacy programmes that include an assessment of the risks to consumers created by processing data. The orders do not discuss how such assessments should be conducted. However, according to senior FTC staff, organisations should take note of how the FTC conducts an unfairness test. There is great similarity between an unfairness assessment and a balancing/LIA, and those similarities could facilitate interoperability. The IAF, hosted by the Future of Privacy Forum, held a session in Washington, DC, in May 2016 to continue a discussion that began in March 2016 in Santa Clara, California, with the Conference Board Chief Privacy Officers Council. Further work on this front is planned.

[1] The IAF is a tax-exempt non-profit research and education organisation under Section 501(c)(3) of the United States Internal Revenue Code whose mission is forward-looking, balanced information policy.