As information policy experts, we have always known that data can be misused to create misery. The direct marketing industry began when snake oil salesmen sold sucker lists. Yet the ability to think with data has contributed to drug discoveries, safer cars, lighter airplanes, and greater opportunities. We have always known that observational technologies combined with math can increase the risks to fundamental rights and interests. But the output can also enhance fundamental rights and interests. The reporting on Cambridge Analytica is a robust demonstration of the potential harms. Very smart people can create tools to manipulate not just people but the ecosystem in which they think. We should in no way underestimate the ability of people to cause societal risks. However, that does not change the basic premise that a more crowded world requires more knowledge creation through data and math. We still want better healthcare, less congestion, better education, and more efficient markets.
What an event like Cambridge Analytica means is that we need to double down on sound process and, yes, more inter- and intra-organizational oversight. The IAF's forthcoming work on how the virtues or objectives of Research Ethics Boards (the Canadian equivalent of Institutional Review Boards) might be applied to data analytics finds that common processes, dictated by robust standards for reviewing the appropriateness of processing, are necessary. This form of oversight is built on Comprehensive Data Impact Assessments that explore the full range of interests and issues for all stakeholders. Furthermore, as suggested by the IAF's work on effective data protection governance, it is now time to consider codes of conduct for advanced analytics overseen by accountability agents, funded in a manner consistent with both the value that processing can release and the risk that some agents pose.
The IAF fully agrees with Eduardo Ustaran's analysis in today's IAPP Daily Dashboard. Ustaran said: "These revelations expose data practices that have been an area of concern for regulators for a while. Essentially, they feed the regulators' worst fears about the digital economy. … We can expect a tightening in the level of tolerance by policy makers and regulators for data collection practices through our daily interaction with technology."
The push for bright-line rules will be strong. But there are no guarantees. Even with the best assessment processes, people who misrepresent themselves may gain access to data they shouldn't have, or use it in questionable ways. At the end of the day, a data-driven age must be driven by data. It is time to double down on the processes necessary to ensure we don't trade the risk that comes from the misuse of data by people who will manipulate to get their way for reticence risk: value-creating potential that goes unrealized. Both risks must be managed. We still need to think with data.