Information Accountability Foundation

Knowledge Discovery Alone Is Not a Similarly Significant Effect

New knowledge drives mankind forward.  Sometimes the knowledge is used wisely; sometimes it is not.  Sometimes inappropriate uses have a negative impact on individuals.  Data, much of it relating to individuals, is key to the generation of knowledge.  However, it is important to distinguish the two functions of data-driven knowledge creation: knowledge discovery and the application of the resulting knowledge, knowledge application.  The GDPR encourages knowledge discovery but also requires the accountable process of impact assessments that identify risks to individuals and make sure those risks are documented.  The best public policy is one that encourages both knowledge discovery, even at the commercial level, and knowledge application in a legitimate and fair fashion.

The IAF team is concerned that society is sleepwalking into an era where knowledge discovery will be precluded by a restrictive reading of data protection law, especially with respect to knowledge application.  For that reason, the IAF filed comments on the UK ICO’s draft code of practice on direct marketing (Draft Code).  The IAF is concerned that the Draft Code suggests the mere processing of data to generate insights, absent any tangible negative effects, will be considered to have consequential effects.  By extension, the GDPR requirements triggered by such effects would be applied to knowledge discovery, where there is often less direct impact on individuals.  For example, one issue in the comments is profiling to segment markets.  The question at issue is: when does profiling have a legal or similarly significant effect?  Part of the IAF’s comments are as follows:

Underlying the IAF’s concerns are the differences between privacy and data protection as fundamental rights.  The right to privacy relates to individual autonomy and family life, while the right to data protection relates to the risk to people arising out of the processing of data pertaining to them.  The right of individuals to control their data, as a privacy right, is always important, but it is particularly so in instances where individuals should have the ability to protect themselves and their families and to form and socialize new ideas with a small circle of chosen friends.  Consent as a governance mechanism works most effectively in situations where individuals knowingly provide data.  Increasingly, data have their origin either in individuals’ interaction with the world (observed data) or in the insights that come from processing data (inferred data).  The legal basis for that processing increasingly is legitimate interests or fulfillment of a contract.  In those instances, the processing must be fair.  Fairness includes transparency, and transparency is challenging in the direct marketing ecosystem; there is room for improvement.  Fairness also requires a series of assessments to determine that data bring value to people and do not cause actual harm.  The General Data Protection Regulation (“GDPR”) created data protection impact assessments (“DPIAs”) to make sure organisations consider both benefits and harms to stakeholders when processing data.  Individuals benefit from competitive markets, so it is reasonable to consider whether reduced competition, resulting from overly cautious interpretations of data protection law, creates tangible harms to individuals.

As stated earlier, observation has become ubiquitous in today’s society.  The IAF believes that the movement to limit third-party cookies will have some societal benefits in this area.  However, even with those changes, the technology and processing behind market segmentation will be complex, and understanding that process will not be most individuals’ main concern.  So, the role of organisations and regulatory agencies becomes more important.  Organisations must conduct assessments at almost every stage of the processing and must be able to demonstrate those assessments were conducted in an honest and competent fashion.  Regulators must oversee and enforce substantially enough that organisations believe the likelihood of enforcement is high.

The segmentation process uses probability to segment individuals into cohorts of those likely to do something and those who are not likely to do so.  Segmentation logically fits into the GDPR’s definition of profiling.  The GDPR requires consent where the profiling has a legal or similarly significant effect.  It is the IAF’s view that a lack of individual awareness of the robustness of the processing alone does not meet the test of being a similarly significant effect.  Similarly significant effect may come from the actual use of insights to make decisions.  DPIAs are designed to identify similarly significant effects, justify or mitigate them, and document the outcome.  The IAF sees indications in the Draft Code that the ICO is leaning in the direction of finding that the processing of data for segmentation has significant impact on the individuals the data pertain to.  Requiring knowledge discovery to be subject to consent would diminish the societal value brought by direct marketing and would therefore itself have a negative impact on individuals.
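The segmentation step described in the comments can be pictured as a simple probability-threshold split.  The sketch below is purely illustrative: the scores, threshold, and identifiers are hypothetical stand-ins for the output of whatever propensity model a marketer might use, not any organisation's actual process.

```python
# Hypothetical sketch: partition individuals into cohorts by the
# modelled probability that they will respond to a campaign.
# The scores stand in for the output of any propensity model.

def segment(propensities, threshold=0.5):
    """Split {person_id: score} into 'likely' and 'unlikely' cohorts,
    where scores are modelled probabilities in [0, 1]."""
    likely, unlikely = [], []
    for person_id, score in propensities.items():
        (likely if score >= threshold else unlikely).append(person_id)
    return likely, unlikely

# Illustrative scores only; no decision about any individual is made here.
scores = {"a": 0.82, "b": 0.35, "c": 0.61, "d": 0.12}
likely, unlikely = segment(scores, threshold=0.5)
```

The point of the sketch is that the knowledge-discovery step produces only cohort membership; any legal or similarly significant effect would arise from what is subsequently done with those cohorts.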

While these comments are directed at the ICO, there are indications that other data protection authorities may have similar views.  Knowledge discovery may create insights that are detrimental to individuals when used in an inappropriate fashion.  This type of potential risk is why the GDPR is “risk based” and requires assessment of risk.  But restricting profiling and knowledge discovery to contexts where consent is an effective governance process creates reticence risk.  The assessment and balancing of risks through processes as outlined in the GDPR, conducted honestly and competently, is the better answer.  The IAF will be scheduling a policy call on March 19 to discuss the issues raised by the Draft Code.

Hard Law Must Mesh With Contextual Based Fair Use

The 41st International Conference of Data Protection and Privacy Commissioners (ICDPPC) will take place in late October in Tirana, Albania. The ICDPPC is a conclave of enforcement agencies seeking the means to create commonalities in a world where data flows ubiquitously.  ICDPPC will explore a key dilemma that challenges the growing community of data protection… Continue Reading

The Incubation of Trust Based Governance Systems Should Be Encouraged

Increasingly, around the world, there is a shift in the way privacy legislative and regulatory approaches are being developed. This is in response to the recognition that much of our collective future economic growth, benefiting both individuals and society, will be driven through digital transformation. Most recently, Canada’s Digital Charter has recognized that digital transformation… Continue Reading

Privacy Law’s First Objective Is That Data Should Serve People – The U.S. Opportunity To Get Privacy Legislation Right

Laws to govern the data age are extremely hard to draft.  Policymakers will encounter this when they revise competition law to deal with data rich conglomerates.  They have already tried to address this in the privacy area through the European General Data Protection Regulation (GDPR) which recently had its one-year anniversary. However, there already are… Continue Reading

Accountability is As Enforceable as Any Other Privacy Management Mechanism

Accountability has increasingly become the nucleus of effective data protection in a world where the observation of people is critical to how machines and systems work and drives advanced analytics.  Canada was the first country to explicitly capture accountability as part of its privacy law, and therefore actions in Canada have impact beyond Canada.  Now… Continue Reading

The Glacial Movement of Privacy and the Implications to Accountability

As we see more and more draft privacy laws being introduced in Congress and in state legislatures, an increasing number of enforcement actions in Europe, and more media interest in perceived privacy abuses by big tech companies, it may seem strange to equate privacy to the consistent, continuous movement of glaciers. The pace today seems… Continue Reading

Data Stewards Not Fiduciaries

The FTC held its hearing on its Approach to Consumer Privacy on April 9 and 10 in Washington, DC. At that hearing, the questions discussed included: what are the existing and emerging legal frameworks for privacy protection, and if the U.S. were to enact federal privacy legislation, what should such legislation look like?  In response… Continue Reading

Foundational Issues for New Privacy Law in the United States

A federal privacy law in the U.S. seems increasingly likely.  When?  It is not yet clear.  However, we can say with much certainty that in the coming months we will see many draft laws that will join the ones we have already seen from Senators, members of Congress, Intel, CDT and others.  The current series… Continue Reading

Data Driven Knowledge Creation Needs to be Protected

Our collective desire to have a space where we are free from observation is increasingly under pressure from modern technology, and our confidence that data that pertains to us will be used fairly is in a deficit mode. At the same time, data are being used to create new knowledge by gaining insights that would… Continue Reading

A Pivot (Back) to Accountability

In a recent article, Sheila Colclasure, Senior Vice President and Global Public Policy Officer at LiveRamp, wrote:  “If you want your company to exist now and in  the future, you will have to think and act with data. . . . With this [responsibility] comes accountability . . . . Business leaders must think strategically… Continue Reading