In 2021, the Policy Currents Will Blow Hard from Numerous Directions

A generation ago privacy was much simpler.  By “a generation ago,” I mean the beginning of this century.  Yes, the early years of the Internet as a consumer medium and a newly enacted EU data protection directive were forces to reckon with, but the complexities of California’s data breach law, smartphones, big data, and the EU’s ePrivacy Directive and General Data Protection Regulation (GDPR) didn’t exist yet.  Additionally, the growth in analytics-driven data (and technology) use, such as artificial intelligence (AI), had not yet been experienced.

The IAF’s charge is to consider policy answers about factors that will emerge in the next 18 to 36 months, with the assumption that, for horizons under 18 months, policy will be fairly stable.  That is no longer a fair assumption.  Data breaches like SolarWinds, court cases like Schrems II, and decisions like the one in Singapore that contact-tracing data might be used for criminal investigations change the calculus.  For the IAF to meet its mandate, it needs to consider all the cross currents that impact policy related to impactful data flows.  Stakeholders such as regulators, policymakers, NGOs, and companies also must consider these cross currents.  Toward that end, the IAF team aggregated all of these cross currents existing at the beginning of 2021.  Several non-IAF colleagues reviewed and contributed to a list of these issues.  These cross currents, which influence the initiatives undertaken by the IAF, include (not necessarily in order of importance):

  • Data insecurity
  • Observation, tracking, context, and control
  • Data as an asset, advanced analytics, and norming
  • Global data transfers
  • Public interests versus individual privacy
  • Organizational operational friction
  • Similar regulatory operational friction
  • Degradation of enforcement and regulatory controls
  • Impact of non-governmental entities
  • Unsettled direction of privacy and data protection law
  • Public trust diminished by misinformation enabled by social networks and other media
  • Emergence of dominant digital players
  • Changing global balance of power

This list is set forth in greater detail in an Appendix at the end of this article.

This list contains thirteen different conflicting currents.  All are important, but thirteen inputs are hard to manage.  So, the staff at the IAF took those thirteen currents and derived five themes that can be used to understand the frictions that get in the way of mapping more productive information policy:

1.  Accelerating trust deficit.  The deficit reflects a broader societal distrust in institutions, ranging from decreased trust in aircraft design and vaccine creation to fears that an honest election is impossible.  In data protection, individuals’ mistrust begins with their data not being used to serve their interests and with frequent data breaches.  For regulators, it is the impression that organizations can’t be trusted to be accountable.  This distrust is reflected in regulators expressing a lack of confidence in the efficacy of legitimate interest assessments and in court cases like Schrems II.

2.  Dissonance between policy objectives and political rhetoric related to privacy.  Policymakers say they want a digital future where advanced analytics drive economic growth, efficiency, and global competitiveness.  Such an ambition requires more effective laws to govern data use pertaining to individuals.  At the same time, reading the public’s distrust, policymakers announce privacy reforms that look back to 1980 rather than forward to 2030.  This same dissonance is seen in organizations.  While organizations look for legal certainty, they covet a digital future that changes everything.  Digital ambitions require flexibility, and flexibility is inconsistent with certainty.  Flexibility is not inconsistent with sound, demonstrable process, but such processes are hard to implement and hard to oversee.

3.  Operational Overdrive.  Data protection authorities are overwhelmed by the volume of complaints they receive, guidance they must draft, audits they must conduct, negotiations in which they must participate, and new technologies they must understand.  The magnitude of this work drives them to a state of operational overdrive.  The same result is happening to the data protection offices in organizations.  There is today’s California law and tomorrow’s changes to it.  There are the GDPR and Brazil’s new law, which are to be followed by China’s privacy law developments.  There are Schrems II supplemental measures and investigations by the FTC.  On one hand, a vision to lessen dissonance should be created; on the other hand, with flat budgets, which activities will be cut to make room for new ones?  Organizations find these decisions incredibly difficult to make.

4.  Resource Mismatch.  Traditionally, privacy enforcement agencies and privacy offices were staffed by lawyers, investigators, and administrators.  Today, they must also be staffed by project managers, ethicists, scientists, technologists, and operational experts, yet they still have the same number of lawyers, investigators, and administrators.  Today’s data environment requires people who can make judgments in areas that are increasingly grey against a broad cross section of interests, and there is a resource mismatch between what is necessary and what is funded.  Each day that mismatch lingers, the friction between the cross currents listed in this article grows.

5.  Conflicting Cultures.  Last, and in many ways the most important, are conflicting cultures.  There are obvious conflicts between Western European concepts of individual sovereignty and Asian concepts of community harmony.  However, there also are the cultural conflicts between civil and common law, independence and collegiality, and rights pluralism and a focus on singular rights.  Organizations have cultural conflicts between compliance and scientific (research) curiosity; such conflicts are to be expected. However, there must be a means for harmony between cultures, and the lack of that harmony spins the conflicting currents into wind shear.

This is the IAF’s view of how cross currents turn into actionable trends.  Your input is important to the IAF team.  Please let me know if you have additions or corrections to this list by emailing me at mabrams@informationaccountability.org.

Appendix – Cross Currents in Detail

  • Data insecurity
    • Nation-state cybersecurity actors are getting more aggressive.
    • Data breach notifications are overwhelming regulators.
  • Observation, tracking, context, and control
    • Observation technologies are becoming increasingly necessary for the implementation of health, public safety, security, service and Internet of Things device solutions.
    • The relationship between observation, AdTech tracking, and data use is often conflated in the policy/advocacy arena, creating friction for both observation technologies and marketing.
      • This conundrum is often framed in the following manner:  advertising (and its support) is necessary for competitive markets, but is observation (and by extension data) necessary for effective advertising?
    • Use in context is increasingly based on trustworthy organizational decision making, but trust in that decision making is in a deepening deficit.
    • The more data origination is observed or inferred, the more individuals lose control over that data and the less transparent data use is to all parties.  Nevertheless, consumers still expect seamless experiences.
  • Data as an asset, advanced analytics, and norming
    • More organizations recognize that data is an asset that must be utilized aggressively through advanced analytics, including AI, to stay competitive.
    • The impacts of decisions made based on flawed analytics are becoming more visible.  AI and machine learning (ML) exacerbate this challenge.
    • Automated decisions well within risk parameters are impacted by bright-line rules related to fears about profiling.
    • Humans rely on machine-generated conclusions that are probability based, even when there is human involvement in the decision.
  • Global data transfers
    • Data localization acceleration has been powered in part by the effect of Schrems II.
    • More jurisdictions, like Quebec, are demanding adequacy with their laws and cultures.
    • Difficulties in bringing the right parties together to create accountability norms for government use of private sector data for national security interests continue to exist.
  • Public interests versus individual privacy
    • Governments increasingly use new technologies to carry out mass surveillance of citizens.
    • Stress between governments and courts over use of private sector data for national security and law enforcement interests is increasing.
    • Stress is caused by private sector opposition to government reliance on their data and by resulting conflicting regulations.
    • The COVID-19 pandemic has created new and enduring challenges for organizations processing personal information and for the privacy of those affected.
  • Organizational operational friction
    • Privacy offices increasingly are consumed with implementation of new laws, regulations, interpretations, and court cases at a time when greater adoption of data as an asset requires more strategic intervention by them.
    • Internal data governance tends to be siloed, and this isolation impacts accountability.
    • New skills in IT, Data Management, Data Science and Governance will be required and integrated within organizations to support technology driven data application/use governance (e.g., AI).
    • Compliance costs are increasing while budgets for resources, training, and operational priorities are marginalized, resulting in potential harm to the organization as well as the individual and in sub-optimization of value creation.
  • Similar regulatory operational friction
    • Regulators must balance mandates to write new guidance, interpretation and responses to legislators while also investigating and enforcing the law.
    • Appropriations grow at a very slow pace, and resources need to be reallocated to understand new technologies and business processes.
  • Degradation of enforcement and regulatory controls
    • Overstressed and under-resourced regulatory agencies are grappling with very complex balancing equations and are resorting to bright-line answers.  This is due to lack of:
      • Time to develop strategy and vision
      • Trust in controllers – particularly in complex data environments
      • People resources
      • Harmonization of legal cultures in a global arena
      • Trust in organizational capabilities and accountability
      • Understanding of the complexities of the digital landscape, their impact on individuals, and their intersection with the problem they are seeking to solve
    • Sometimes inflexible legal mandates to respond to every individual complaint limit the resources available for strategic self-initiated investigations.
    • There is a perceived lack of discretion to interpret context against the full range of interests (including beneficial interests), not just autonomy.
    • Lack of meaningful enforcement to date causes frustration for some stakeholders that have invested heavily in compliance based on the threat of enforcement.
    • Uncertainty about the role for global or regional organizations, such as OECD, COE, APEC and GPA exists.
  • Impact of non-governmental entities
    • NGO direct action has impacted the ballot initiatives in California.
    • Direct NGO lawsuits are increasing in Europe and potentially in other jurisdictions.
  • Unsettled direction of privacy and data protection law
    • European and Californian models are evolving.
    • The Canadian draft is muddled.
    • Singapore is culturally specific.
    • U.S. federal privacy legislation isn’t innovative and is slow paced.
    • Impact of China’s regulatory path and approach is uncertain.
    • There are multiple non-data protection driven laws in areas such as AI.
    • The increasing intersection of competition law, telecom rules, data and system security, and content moderation causes inconsistent application of the law.
    • Regulatory guidance increasingly acts as a proxy for law.
    • Growing frustration among all stakeholders with regulatory structures that increasingly disappoint is raising questions about data protection effectiveness.
  • Public trust diminished by misinformation enabled by social networks and other media
    • In government
    • In business
  • Emergence of dominant digital players
    • During the 25 years of the observation age, a number of dominant players have emerged in different markets and in different geographies.
    • Late industrial age competition law has been found wanting in achieving fair markets in an observational age.
    • Focus is on dominant technology providers and their power.
    • These developments are resulting in a redefinition of competition law.
    • There is a debate among regulators and policy makers whether to stick to the current hands-off antitrust approach or whether it is necessary to move to a more interventionist approach which takes account of the value of personal data.
  • Changing global balance of power
    • China’s assertion of its global role, even in data protection, is based on a very different vision of the sovereignty of the individual versus the sovereignty of the polity.
    • U.S. leadership is declining.
    • Questions about Europe as the norm maker for data use increasingly are being asked.