Author Archives: Martin Abrams

Demonstrable Accountability and People Beneficial Data Use

Data driven societies and economies must create a means for data that pertains to people to be used in a productive and protected manner.  Some of that processing is so complex that governing it effectively through consent requires specialized knowledge beyond many people’s ability.  If consent is beyond a person’s capabilities and is therefore less effective for permissioning data processing, then a trustworthy alternative to consent must be found.  An alternative is especially necessary when processing creates real value for people, groups of people, and society.

Use of an alternative means of permissioning does not mean a reduction in transparency.  Transparency, robust corporate data governance, adherence to the law, and internal and external oversight are all necessary to hold organisations accountable.  The Canadian Ministry for Innovation, Science and Economic Development (“ISED”) shares the IAF’s interest in this issue and helped fund a multi-stakeholder project to suggest “A Path to Trustworthy People Beneficial Data Activities.”[1]

Canadian private sector privacy laws, both national and provincial, typically require consent in order to process personal data.  Newer privacy laws such as the European General Data Protection Regulation specify that personal data processing must rest on a legal basis, and consent is only one of six such bases.  The IAF’s task was to set forth the elements of an accountability process for situations where consent is not effective but the processing creates tangible, well-documented benefits for people.  This process is demonstrable accountability, which, among other requirements, includes an assessment process that balances the risk of harm against the benefits to people.

The components of a demonstrable accountable process include:

  • Designating senior officers who are accountable for People Beneficial Data Activities
  • Conducting People Beneficial Impact Assessments (PBIAs)
  • Achieving an enhanced standard of transparency covering People Beneficial Data Activities and their associated governance processes
  • Having specific internal oversight
  • Being subject to independent external oversight
  • Keeping records of People Beneficial Data Activities
  • Protecting individual rights and implementing transparent redress systems

The IAF suggests all of these components must exist for people beneficial processing to be trustworthy.

The IAF report includes recommendations for ISED to consider as it drafts any proposals related to existing privacy laws that might be considered by Parliament:

  • Explicitly recognize People Beneficial Data Activities as serving a legitimate purpose because, on balance, the activities are beneficial to people when the risks to people are reduced to an acceptable level.  Limiting People Beneficial Data Activities to those provided within the “conditions of the supply of a product or service” might prevent personal data from being used for people beneficial purposes.
  • Explicitly move beyond consent as the primary authority to process personal information for People Beneficial Data Activities where consent is not fully effective.
  • Explicitly recognize People Beneficial Data Activities as a new or expanded authority to process personal information beyond reliance on the concepts of consent and legitimate purposes tied to the provision of products and services. This recognition would lessen reticence risk (i.e., it would reduce inhibitions to data driven innovation) and would provide more benefits to stakeholders because these activities would not be limited to those provided within the “condition of the supply of a product or service,” as long as the People Beneficial Data Activities meet all the elements of demonstrable accountability and are aligned with the objectives, culture, and values of the organization.
  • Expressly provide a method for determining what data activities are people beneficial and provide clarity regarding this processing through public policy. This express provision will reduce reticence risk to people, society and organizations and help to implement Canada’s Digital Charter while protecting the privacy of Canadians.
  • So that People Beneficial Data Activities are transparent, expressly provide in policy requirements the elements of demonstrable accountability: be accountable, conduct a People Beneficial Impact Assessment (PBIA), be transparent, have internal oversight, be subject to independent external oversight, keep records, and protect individual rights.

The IAF also suggests that further work is needed to determine the degree and type of effective oversight and governance for organisations processing data for people beneficial uses. Effective oversight and governance are necessary for this type of processing to be fully trusted.

Canadian businesses contributed to the research and the development of the report and participated in a multi-stakeholder session that included academics, privacy advocates and regulators.

While this project was specific to Canada, the IAF believes the findings are applicable to other jurisdictions.  Europe is struggling to make legitimate interest work as a trusted legal basis to process data, and the United States is only beginning the public policy process for advanced data use.  The IAF will be updating its prior work on legitimate interests based on the people beneficial data activity report.

The project report, which includes a model assessment process, can be found here.

Please let us know what you think.


[1] The IAF is solely responsible for the report findings.  They do not necessarily reflect the views of ISED, participants, or the IAF Board.

Knowledge Discovery Alone Is Not a Similarly Significant Effect

New knowledge drives mankind forward.  Sometimes the knowledge is used wisely; sometimes it is not.  Sometimes inappropriate uses have a negative impact on individuals.  Data, much of it relating to individuals, is key to the generation of knowledge.  However, it is important to separate out the distinct functions of data driven knowledge creation, knowledge discovery, and… Continue Reading

IAF Releases Model Legislation Summary

The California Consumer Privacy Act went into effect on January 1, and a ballot initiative to update that law is slated for November.  State privacy legislation has been reintroduced in Washington state, and Nevada is following.  Privacy Shield may be overturned by the European Court of Justice, and more countries are adopting legislation that… Continue Reading

Digital Activities Go Beyond Privacy and Data Protection

On Sunday, November 10, the New York Times ran a story on the ability of bad actors to hide and distribute child pornography on the Internet.  On Tuesday, November 12, the New York Times ran a story on a unit of Google assisting Ascension, the second largest U.S. health organization, to mine data on millions of patients… Continue Reading

IAF Releases “Advanced Data Analytic Processing – 2019 Update”

Central to the work of the Information Accountability Foundation is the concept that using data to discover new insights about people raises a different set of risks than using data to make decisions about people.  That foundational idea was first explored in a paper published by the Centre for Information Policy Leadership entitled “Big Data… Continue Reading

The Fair Information Policy Development Vacuum

Over the past decade, policy development in the data protection field has been very robust, with some good and bad results and some results that are a muddle.  Yet, with all this activity, there still seems to be a sense that there is a policy vacuum that cries to be filled.  In the simplest terms,… Continue Reading

Hard Law Must Mesh With Contextual Based Fair Use

The 41st International Conference of Data Protection and Privacy Commissioners (ICDPPC) will take place in late October in Tirana, Albania. The ICDPPC is a conclave of enforcement agencies seeking the means to create commonalities in a world where data flows ubiquitously.  ICDPPC will explore a key dilemma that challenges the growing community of data protection… Continue Reading

Trust Deficit Acceleration Means Trust but Verify

Dirty diesel cars, opiates, income disparities, and institutional failures.  The trust deficit caused by these abuses or plain mistakes seems to be accelerating beyond red to bright red.  This acceleration has huge ramifications for new privacy laws and for interpretations of existing laws. The IAF staff recently visited a privacy regulatory agency to discuss how… Continue Reading

A Great Visionary Has Died

My good friend Giovanni Buttarelli has passed away far too soon.  Giovanni was the European Data Protection Supervisor when he died on 20 August.  However, I wish to think of Giovanni as a visionary and philosopher.  You will often see Giovanni’s words quoted in IAF research papers.  Those words, “data should serve people,” are more… Continue Reading

Privacy Law’s First Objective Is That Data Should Serve People – The U.S. Opportunity To Get Privacy Legislation Right

Laws to govern the data age are extremely hard to draft.  Policymakers will encounter this when they revise competition law to deal with data rich conglomerates.  They have already tried to address this in the privacy area through the European General Data Protection Regulation (GDPR), which recently had its one-year anniversary. However, there already are… Continue Reading