Category Archives: Consent

Demonstrable Accountability and People Beneficial Data Use

Data-driven societies and economies must create a means for data that pertains to people to be used in a productive and protected manner.  Some of that processing is so complex that governing it effectively through consent requires specialized knowledge beyond many people's capabilities.  If consent is beyond a person's capabilities and is therefore less effective for permissioning data processing, then a trustworthy alternative to consent must be found.  An alternative is especially necessary when processing creates real value for people, groups of people, and society.

Use of an alternative means of permissioning does not mean a reduction in transparency.  Transparency, robust corporate data governance subject to the law, and internal and external oversight are necessary to hold organisations accountable.  Innovation, Science and Economic Development Canada (“ISED”) shares the IAF’s interest in this issue and helped fund a multi-stakeholder project to suggest “A Path to Trustworthy People Beneficial Data Activities.”[1]

Canada’s national and provincial private sector privacy laws typically require consent in order to process personal data.  Newer privacy laws such as the European General Data Protection Regulation specify that personal data processing must rest on a legal basis, and consent is only one of six legal bases.  The IAF’s task was to set forth the elements of an accountability process for situations where consent is not effective but the processing creates tangible, well-documented benefits for people.  This process is demonstrable accountability, which includes, among other requirements, an assessment process that balances the risk of harm against the benefits to people.

The components of a demonstrably accountable process include:

  • Designating senior officers who are accountable for People Beneficial Data Activities
  • Conducting People Beneficial Impact Assessments (PBIAs)
  • Achieving an enhanced standard of transparency covering People Beneficial Data Activities and their associated governance processes
  • Having specific internal oversight
  • Being subject to independent external oversight
  • Keeping records of People Beneficial Data Activities
  • Protecting individual rights and implementing transparent redress systems

The IAF suggests that all of these components must exist for people beneficial processing to be trustworthy.

The IAF report includes recommendations for ISED to consider as it drafts any proposals related to existing privacy laws that might be considered by Parliament:

  • Explicitly recognize People Beneficial Data Activities as serving a legitimate purpose because, on balance, the activities are beneficial to people when the risks to people are reduced to an acceptable level.  Limiting People Beneficial Data Activities to those provided within the “conditions of the supply of a product or service” might prevent personal data from being used for people beneficial purposes.
  • Explicitly move beyond consent as the primary authority to process personal information for People Beneficial Data Activities where consent is not fully effective.
  • Explicitly recognize People Beneficial Data Activities as a new or expanded authority to process personal information beyond reliance on the concepts of consent and legitimate purposes tied to the provision of products and services.  This recognition would lessen reticence risk (i.e., reduce inhibitions to data-driven innovation) and would provide more benefits to stakeholders because these activities would not be limited to those provided within the “conditions of the supply of a product or service,” as long as the People Beneficial Data Activities meet all the elements of demonstrable accountability and are aligned with the objectives, culture, and values of the organization.
  • Expressly provide a method for determining what data activities are people beneficial and provide clarity regarding this processing through public policy. This express provision will reduce reticence risk to people, society and organizations and help to implement Canada’s Digital Charter while protecting the privacy of Canadians.
  • To make People Beneficial Data Activities transparent, expressly provide in policy requirements the elements of demonstrable accountability: be accountable, conduct a People Beneficial Impact Assessment (PBIA), be transparent, have internal oversight, be subject to independent external oversight, keep records, and protect individual rights.

The IAF also suggests that further work is needed to determine the degree and type of effective oversight and governance for organisations processing data for people beneficial uses. Effective oversight and governance are necessary for this type of processing to be fully trusted.

Canadian businesses contributed to the research and the development of the report and participated in a multi-stakeholder session that included academics, privacy advocates and regulators.

While this project was specific to Canada, the IAF believes the findings are applicable to other jurisdictions.  Europe is struggling to make legitimate interest work as a trusted legal basis to process data, and the United States is only beginning the public policy process for advanced data use.  The IAF will be updating its prior work on legitimate interests based on the people beneficial data activity report.

The project report, which includes a model assessment process, can be found here.

Please let us know what you think.

[1] The IAF is solely responsible for the report findings.  They do not necessarily reflect the views of ISED, participants, or the IAF Board.

Knowledge Discovery Alone Is Not a Similarly Significant Effect

New knowledge drives mankind forward.  Sometimes the knowledge is used wisely; sometimes it is not.  Sometimes inappropriate uses have negative impact on individuals.  Data, much of it relating to individuals, is key to the generation of knowledge.  However, it is important to separate out the distinct functions of data driven knowledge creation, knowledge discovery, and… Continue Reading

Accountability is As Enforceable as Any Other Privacy Management Mechanism

Accountability has increasingly become the nucleus of effective data protection in a world where the observation of people is critical to how machines and systems work and drives advanced analytics.  Canada was the first country to explicitly capture accountability as part of its privacy law, and therefore actions in Canada have impact beyond Canada.  Now… Continue Reading

Legitimate Processing Invented Here in the U.S.

Privacy law began in the United States when Congress enacted the Federal Fair Credit Reporting Act (“FCRA”) in 1970.  While framed as a consumer protection law, it was most certainly fair processing legislation.  FCRA established rights of access, correction, and accuracy, but most importantly it created the concept of permissible purpose.  The concept of permissible purpose is the seed… Continue Reading

Fourth Privacy Legislative Wave

Privacy legislation is again a hot topic in the United States. The California Consumer Privacy Act has added to the pressure provided by the European Union General Data Protection Regulation (GDPR). Think tanks, trade groups and consumer organizations are all proposing frameworks for the United States. The Information Accountability Foundation (IAF) is one of those… Continue Reading

IAF Releases U.S. Privacy Framework Discussion Document

The time is right to discuss an updated privacy framework for the United States that maintains the ability to think and learn from data while also protecting individuals in a highly observational digital ecosystem. Respected voices from all sides of the privacy debate are seeing the unintended consequences, from controls that are not keeping up… Continue Reading

Trust Deficits, Bright Lines and Verification

Freight trains heading through a tunnel are pretty hard to miss. The equivalent of a freight train in the world of privacy is a privacy trust deficit related to information use in an observational age driven by analytics. The many, many indicators that have piled up over the recent days, weeks and months are coming… Continue Reading

Big Data Ecosystem, Fairness and Enforcement

Big data has become an increasingly scary phrase for all stakeholders in data protection. For privacy advocates, it often means loss of control, asymmetrical power and hidden discrimination. For regulators, it often means regulatory round pegs in operational holes of different sizes, in constantly moving locations, with mismatches that begin with vocabulary. For companies, it… Continue Reading

IAF Participates in OPC of Canada Stakeholder Meeting

On 6 October, the Office of the Privacy Commissioner of Canada (OPC) convened a stakeholder meeting in Halifax, Nova Scotia, for a discussion that included the OPC’s strategic privacy priorities for the next five years. In particular, the meeting was part of a dialogue on consent and privacy that the OPC launched with the publication of… Continue Reading

Autonomy and Fair Processing: Proportioning the Governance

The free flow of data to facilitate individual, business and societal needs as well as to facilitate the protection of individuals to whom the data pertains has always required proportionality. That proportionality is demonstrated in privacy law that protects both individual autonomy and fair processing. The constant evolution of information and communications technologies further emphasises… Continue Reading