Who Is Paying the Policy Debt?


There is a software engineering concept called "tech debt."  It accumulates when development teams apply the easy, quick, incremental answer to complete or change a project or to fix glitches in a system.  One short-term fix is layered upon another, and the debt compounds over time.[1]  Eventually, the interest on that debt must be repaid.

There is a corollary in the information policy world that could be called "policy debt."  It occurs in both the public and corporate policy arenas and is linked to the innovation cycle, where any break in the cycle, whether in data, information, knowledge, or action, impacts business resiliency.  This policy debt has been accruing interest since the mid-1990s, and the interest payment now seems to be coming due.

Western liberal democracies are based on capitalism and its corporate structure.  Corporations have shareholders that elect boards to govern the corporations for the benefit of stakeholders.  Those boards are responsible for four governance principles: strategy, financial performance, business resiliency, and compliance.  Which of those four principles are touched by current regulatory direction and by privacy laws that lag digital technologies?  Compliance, obviously, comes first, with billion-dollar fines for failure to comply.  But the debt payment does not stop there.  Regulators are now requiring that data be purged and that software developed with that data be erased as well.  In Denmark, regulators required that schools brick student computers linked to the cloud.  Such actions implicate business resiliency.  And in the latest Meta case, the European Data Protection Board questioned whether Meta's advertising-based business model is even a legitimate business strategy.  Three of the four governance principles are triggered by regulatory attempts to pay down the policy debt.

What follows is drawn from my first-hand experience with the policy world, dating back to 1988 when I went to work at TRW as consumer policy director.  I visited Brussels when the Data Protection Directive was being drafted, sat at the table when the U.S. FTC and the states took on the credit reporting industry, and, most important, was present as the regulation of the consumer Internet was debated in Washington, DC meeting rooms.  It is in that Internet debate that the policy debt began to accelerate.

Third-party cookies were introduced in the mid-1990s as a means to facilitate an advertising-funded consumer Internet.  This approach required triggers on browsers that linked to tracking software stored on consumers' hard drives.  Was this tracking within the public commons, or was it comparable to families' homes, where some level of seclusion was expected?  The policy decision was not to answer this question, and instead to push the answer to a future when the consumer Internet was better established.  The focus fell instead on transparency in the form of privacy notices that, over time, would grow increasingly long and dense.  The interest due on that policy decision has been accruing ever since.

That policy indecision facilitated nearly two generations of digital and economic growth: new products and services, the rise of new brands and jobs, all fueled by personalized advertising.  It made possible many new business models and, in some cases, disinformation and manipulation.  Eventually it focused policymaker attention on the policy debt.  That first generation of observational technologies became not just a means of facilitating advertising; it became central to how things actually work in the interconnected world in which we live.  Smart cars, medical devices, cybersecurity, fraud detection, communications, and the whole Internet of Things were, and remain, dependent on observation.  How do we sustain a digital ecosystem fueled by observation without first paying this debt?

Many academics and others have given this observation a name, "surveillance capitalism," which suggests simple solutions for paying down the policy debt's accrued interest.  Terms like data minimization, do-not-track, and do-not-sell have become part of the policy vocabulary.  But simple answers rarely work.  As Professor Dan Solove points out in his new article, "Data Is What Data Does: Regulating Use, Harm, and Risk Instead of Sensitive Data," legislation based on use, harm, and future harm in the form of risk is complicated, while regulating on the proposition that some data is more sensitive than other data is simple but not effective.

Some would say that the EU GDPR was designed to create a pathway to paying the policy debt, but the essential nature of observation, and the new technologies it facilitates, has made implementation of the plain language of the GDPR, and of GDPR-inspired legislation in other regions, less than optimal.  The GDPR was intended to be risk-based, yet it did not define the risks to be considered as organizations manage digital processes.  Data protection and privacy speak to three tasks: assuring a space not subject to observation, where private thoughts and family life might prosper; allowing people to define themselves rather than be defined completely by their digital waste; and ensuring fair outcomes when data is processed.  Does the risk-based approach place the emphasis on personal controls over observation and processing, that is, data subject rights, or on the fairness of outcomes?  Both sides of that equation are important, but risk management requires prioritizing one over the other.

The IAF's work on "risk of what" has led us to understand that visualizing risk is not a matter of weighing one outcome against data subject rights; rather, it depends on a stakeholder's first impression of what is most at issue.  Stakeholders go beyond the data subject and the controller to include parties impacted by the processing who are not active participants.  So a risk-based policy system must be stakeholder-based.

There is a great deal of evidence that privacy regulators are doubling down on data subject rights, with a narrow focus on a single stakeholder, the data subject.  Recent cases have narrowed the available legitimate legal bases, demanded transparency that must somehow be both simple and complete, values that conflict, and applied necessity tests that reach to the legitimacy of business processes themselves.

So, as we celebrate Privacy Week, let's spend a minute thinking about the policy debt.  We might consider new policy models that embrace the complexity of the digital age, take all stakeholders' interests into account, and ensure that the policy debt is paid in a fashion equitable to all in the many roles they play: data subject, patient, employee, citizen, student, shareholder, pensioner, and more.

[1] "Technical Debt," https://www.productplan.com/glossary/technical-debt/