Since a multi-stakeholder group first defined them in 2009, the essential elements of accountability have become well established in the field of data protection. These elements are reflected in guidance from data protection and privacy authorities in Canada, Hong Kong, Australia, Colombia and France. They have been adopted as key elements of the European Union's General Data Protection Regulation (GDPR), set to go into effect in 2018. They have also become part of the emerging guidance for law enforcement and national security data uses.
What has made the essential elements so useful is that they are technology and data neutral and that they were developed and refined through a multi-stakeholder dialog. Yet, with recent developments in areas such as artificial intelligence, machine learning, the Internet of Things (IoT) and Big Data, there are a number of sound reasons why this dialog should be re-engaged.
The observational nature of data collection in sensor-rich ecosystems, coupled with advancing analytical capabilities, has made information ecosystems much more complex. The speed, variety and volume of data in these systems have increased exponentially. While this growth was predicted, the reality is much richer than any forecaster might have anticipated. Smart cars, smart meters, smart shirts, smart pills and smart medical devices all generate data stored in the cloud just to operate. Yet this data is increasingly accessible to a broad range of participants and available for a broad range of uses. This data and its uses require broader and deeper governance to ensure they are managed responsibly and accountably. For example, this accountability requires policies that link not just to established criteria but also to emerging ethical standards that cover the full range of interests of all stakeholders. In short, data must be created and used in a legal, fair and just way.
This observational data will be processed. This means new mechanisms will be required to govern both "thinking with data" and "acting with data." Many of these information systems will be learning systems that self-correct and evolve without human intervention. These systems will need ethical analysis built into the processes themselves. Privacy by design, part of a fully accountable methodology, will mean something very different for these self-learning systems.
Increasingly, all data has the potential to impact individuals. What are the requirements for a new governance model that achieves accountable, responsible data stewardship? How would such stewardship be demonstrated? Internal governance, including monitoring, will need to be as sophisticated as the systems it oversees. Does this mean new guidance for data stewards?
Transparency and individual participation are different when observation is an element of effective operation. What types of participation are possible, and which will be effective? How will this participation align with the objectives of autonomy and dignity?
To be effective, enforcement will likely need to be distributed throughout these ecosystems. This likely means a different process for regulators. It also could mean regulators will need to delegate some of their role to others in the ecosystems. What would be reasonable expectations for those new trust agents?
Today, while there is considerable global focus on the GDPR, the Information Accountability Foundation (IAF) believes it is now time to re-activate the Accountability Dialog, with the specific initial goal of updating the commentary associated with the essential elements. Planning for the dialog will take place during the remainder of this year (2016). In the first half of 2017, the IAF will convene a small, multi-stakeholder dialog with the goal of providing content to the 2017 International Conference of Data Protection and Privacy Commissioners and then facilitating ongoing discussions. Ultimately, the IAF's objective is to achieve an updated view of the elements of accountability and a shared understanding of what it means to be an accountable steward of data in today's digital economy.