We are in the midst of an explosion in the generation of observational data that feeds artificial intelligence, big data, and the operation of the ever-expanding Internet of Things. In 2007, when I began working on big data, the pundits claimed data creation would double, reaching 5 Exabytes per year. More recently, some experts have predicted data generation will reach 44 Zettabytes a year by 2020. A Zettabyte is a thousand Exabytes, so in 2007 we were talking about a doubling to 5 Exabytes, while today we are talking about 44,000 Exabytes by 2020. If that is not an explosion, I do not know what is. It raises the question, however, of whether this level of observation equates to the surveillance described by Oscar Gandy, Jr.
Well before the first consumer browser, in his 1993 book "The Panoptic Sort", Gandy considered all information about individual status and behaviour to be potentially useful in the production of intelligence about a person's economic value. He articulated the clear issues inherent in living in a society where everything a person does is captured and recorded in digital form, and he discussed the risks that constant surveillance poses to individual freedoms. If observation is surveillance, then today we live in the world Gandy described.
Gandy would have made a great speaker at this year’s International Conference of Data Protection and Privacy Commissioners (ICDPPC) in Marrakech. Gandy retired in 2006, but his book clearly articulated the issues in the room. Both at the ICDPPC’s closed session, which touched on artificial intelligence, and the open session, where the issues ranged from data protection as a development issue to government use of private sector data and digital education, observational technology was the elephant in the room.
Observation has become a feature necessary for technology to work. Autonomous cars require constant observation to avoid running into each other, medical devices require monitoring to function properly and safely with prescribed medications, and the more "smart" phones know, the "smarter" they are. Individuals are not able to avoid observation, since technological development has come to rely increasingly on sensing.
So, if observation of the individual is unavoidable, how are the consequences Gandy described to be avoided? Can the knowledge derived from observed and digitalised behaviour be limited to what is appropriate? I believe that such limited seeing is an essential component of fair processing, and fair processing has always been a component of data protection and privacy law. As consent at the point of collection becomes more limited, as it has in an observational world, fair processing becomes ever more important. It is the need for fair processing that has made accountability a more prominent feature of law and practice.
In 1993, Gandy speculated that people would self-censor if under constant surveillance. However, in 2016, there is a difference between surveillance and observation. The Oxford Dictionary defines surveillance as close observation, especially of a suspected spy or criminal. I increasingly believe that the purpose of observation matters. When observation is conducted for the purpose of control, as it might be in a police state or a prison, it does have an impact on one's sense of self and freedom of expression. However, when observation is a feature of how a system operates, it can be less intrusive by design. This is where fair processing comes into play.
Over the past three years, the Information Accountability Foundation (IAF) has conducted research on how to utilise observational data in a fashion that is not exploitative or damaging to a sense of self. The work has included Dynamic Data Obscurity as a means of mitigation and Effective Data Protection Governance as a means to assure innovation with protection. The IAF's work on assessment processes is keyed to this proposition that observation should not be surveillance, as is its work on government use of private sector data.
As 2017 approaches, the IAF will re-energise the Global Accountability Dialog with a focus on the Essential Elements related to policies and enforcement. There will be working parties exploring aspects of accountability such as external oversight, a requirement for trusted fair processing. The challenge is to make oversight both fair and scalable.
I was moved by Gandy's book when I read it 22 years ago. The IAF takes as a given that the observational world exists and will only become more pervasive. There is nothing minimal about a universe that is expected to generate 44 Zettabytes of data.
The Marrakech conference featured a side event on Privacy Bridges, one of which is accountability. The consensus in the room was that accountability, much more than the other bridges, would take care of itself. While wonks like me will push that work forward, we are now at the stage where all stakeholders need to be involved in designing the key components of digital accountability that assure trust. This is particularly so for data-dependent organisations in regions such as Europe, where new rules are being written as we speak. It is now time for responsible organisations to act, to assure that Oscar Gandy's vision of the world does not come true.
Anthony Adshead, "Data set to grow 10-fold by 2020 as internet of things takes off", Computer Weekly, http://www.computerweekly.com/news/2240217788/Data-set-to-grow-10-fold-by-2020-as-internet-of-things-takes-off.