There Is Only One Opportunity for Initial Design of the First Data Protection Agency in the United States

This blog reflects the views of Marty Abrams

The style, substance and leadership of data protection agencies make a difference.  Every agency understands its charge to protect individual rights.  In Europe, that charge is to protect autonomy, seclusion, and fair processing.  In California, it is similar but expressed in terms of informational privacy interests and autonomy privacy interests.[1]

An increasingly critical task for these agencies is parsing the interplay between privacy and other personal rights.  For example, the nature and substance of the Singapore PDPC and the French CNIL, both of which are recognized as excellent agencies, define how they confront the balancing of rights and interests.  Since this parsing sets the rules for innovation in an age of observation-driven analytics, it should be front of mind for the architects of any new agency.

The fifth largest economy in the world, California, will soon create a brand-new data protection agency.  This U.S. state's economy is larger than that of all but four EU member states.  California has more tech headquarters than any political entity other than China.  In enacting the California Privacy Rights Act of 2020 (CPRA), voters also approved a dedicated California Privacy Protection Agency.  This agency will have broad powers to educate the public, formulate regulations, enforce the law, advise the legislature, and participate in international venues.  The agency will have $5 million to work with in its first year and $10 million in subsequent years.  That amount is not large (the Irish Commissioner's budget is $21.5 million); however, the agency may be able to retain the proceeds from enforcement actions.  Everyone should care about how this agency is structured and led: its actions will shape both the rights of consumers and the responsibilities of businesses, and its goals include both strengthening consumer privacy and attending to the impact on business and innovation.

Individual rights to privacy are very important in a digital age but are not absolute.  Section 3 of the CPRA and Recital 4 of the GDPR make this clear.  These rights must be weighed against individuals' other substantive interests, such as health, education, employment, and economic wellbeing.  Furthermore, it is increasingly understood that privacy needs to be considered in light of the interests of other individuals.  The global pandemic makes this point even clearer: one individual's interest in autonomy may conflict with another's interest in avoiding COVID-19.

The charge to protect privacy is crystal clear in the CPRA.  However, the responsibility to balance the full range of interests is implied by the concept of proportionality.  This balancing of interests creates the hardest tests for regulatory agencies.  Such challenges do not mean ignoring individuals' collective rights to autonomy and transparency.  They mean employing reasoned approaches that accommodate data uses for knowledge creation where the risks, for example to fair processing, are minimal.

Richard Thomas, when he was the UK Information Commissioner, stressed that agencies must be selective to be effective.  This approach means agencies must prioritize individual rights based on how many persons are affected and on the severity of the harm.  Some risks relate to procedural rights that, if not enforced, make regulations ineffective.  For example, ineffective transparency makes it very difficult for many persons to exercise their rights to object to processing, gain access to their data, and understand the purposes for which data will be used.  However, other procedural-rights concerns are marginal.  At times, focusing on process risk distracts from the risk that insufficient data will leave people unprotected from disparate treatment and from real harms to health, financial status, education, and even freedom.  Selective to be effective means a focus on these impacts.  So, with all the activities that are open to a privacy agency, which should it select?  And based on those activities, for which talents should it recruit?  The Information Accountability Foundation (IAF) believes the time to consider those questions is when an agency is being formed, not after the agency has already allocated all of its resources.

The IAF, as a research and education organization, has worked with agencies worldwide.  The IAF team has watched agencies evolve from database registrars into powerful enforcement agencies.  It would be good to see an agency structured to address the challenges of the future, not just those of the present or the past.  There is no better place to start than a discussion of the new California agency.

This discussion will begin, but not end, with the IAF policy video call on 17 December.  If you would like to attend that call, please let Stephanie know at spate@informationaccountability.com.  That call will feature a diverse range of views.  The IAF intends to continue the discussion in 2021 while the agency is in its formative stage, and will hold a multi-stakeholder dialogue in May 2021.  Please let us know what you think by contacting Marty at mabrams@informationaccountability.org.


[1] De la Torre, Lydia F., "California's Constitutional Right to Privacy," Medium, October 15, 2020.