Elizabeth Denham, the United Kingdom’s Information Commissioner, published a blog on 16 August busting the myth that consent will be required for all processing under the General Data Protection Regulation (GDPR). The myth persists in part because, over the years, many businesses have relied predominantly on consent as the means to achieve lawful processing of data. However, the blog made clear that, consistent with her office’s previous guidance, “the GDPR sets a high standard for using consent”. She went on to state that businesses should be planning now for other means of achieving legitimacy, such as legitimate interests, rather than waiting for regulator guidance. The IAF agrees with that position, as do a number of other privacy tool creators.
Trust in credible assessment processes is now in play, so it is time for the business community to demonstrate that it will make responsible use of legitimate interests as the means to permission innovative data use. Responsible use means balancing the full range of stakeholder interests, meeting the true test for fair processing. This is extremely important because it is innovative and complex data uses that challenge the core elements of consent. The future of innovation, and of creating real value for people through advanced analytics, artificial intelligence, machine learning, and the harnessing of observational data, depends on what the business community puts forward now. If it fails to act, an ePrivacy Regulation or other policy or regulatory approach could be enacted that runs counter to the flexibility built into the GDPR. In Europe, we are confronted with “affirmative consent only” in the proposed ePrivacy Regulation; in Latin America, we are seeing next-generation privacy laws that look to the past; and in the US, we are grappling with “affirmative consent only” state legislation that could be signed into law by the respective states’ governors. These “affirmative consent only” policy approaches simply are not practicable in our complex future world of advanced analytics, machine learning, and artificial intelligence.
To address the gap in the practical use of legitimate interests to demonstrate accountability, the IAF has developed a legitimate interests assessment process that supports businesses’ application of the GDPR. The IAF framework includes the regulatory analysis and the business-process considerations necessary to demonstrate responsibility. For the assessment portion of the framework, the IAF worked with TrustArc and thanks them for their collaboration. Additionally, the IAF’s policy analysis helps create the justification for using the legitimate interests assessment processes being proposed by the UK Data Protection Network and Nymity. The framework builds on the IAF’s work on ethical assessments and ethical big data assessments in Canada.
This work is not just important for Europe. The Ibero-American Data Protection Network has adopted standards for next-generation privacy law in that region, and the standards include legitimate interest as a basis for processing. However, the standards still need to move through the political process in each country, pushing against legacy laws that were almost entirely consent-based.
Asia, too, is looking to next-generation laws and to the flexible implementation of laws just enacted, as one finds in Japan.
Set against this movement towards flexible data use is the political sense that people are losing control of their data in a highly observational world. We see this not only in the proposed ePrivacy Regulation in Europe but also in state legislation in the United States, such as California’s AB 325, which forces “affirmative consent” on a specific sector of digital players (broadband ISPs), and Illinois’ Geolocation Privacy Protection Act, which requires that an “entity may not collect, use, store, or disclose geolocation information from a location-based application on a person’s device unless the entity receives prior affirmative, express consent after adequate notice.” Consent will remain an important part of overall data governance, as will individual participation. However, it is not fully effective in protecting the full range of personal rights and interests that come into play as data and data analytics become an ever-increasing part of how many things work. For example, personal bots in mental health treatment and anti-collision braking both depend on observational data gathered through connected devices.
The balancing answer, where all rights and interests are considered, is assessment processes, conducted with competency and integrity, that can be communicated to the public and demonstrated to regulators. The 39th Conference of Data Protection and Privacy Commissioners, taking place September 28th and 29th in Hong Kong, will dedicate a whole afternoon to consent and to governance when consent is not fully effective. The afternoon will include a concurrent session focused on trust as it relates to assessments; that session will begin the work of designing the elements necessary to make these assessments trustworthy. While effective assessment processes will be key, so will developing trust that these assessments are being done with competency and integrity. The IAF has begun work on this element, having received a generous grant from the Office of the Privacy Commissioner of Canada to explicitly explore trust in assessments.
In conclusion, every corporate privacy team is in an all-hands-on-deck rush to build the compliance programs necessary for the GDPR. A big piece of that work will be mechanisms to justify the use of data that pertains to people. Being able to demonstrate that legitimate interests will be used responsibly is not just a necessary tactical move to assure compliance; it is a strategic move necessary to assure that companies will have the ability to use observational data to build benefits for people. The future of data use is being built now.