There Is Privacy Law Innovation in the United States

U.S. states are leading innovation in data protection law and regulation. Four states (California, Colorado, Connecticut, and Virginia) have enacted laws that require data protection assessments (DPAs), and three states (Indiana, Tennessee, and Montana) have passed legislation requiring DPAs that awaits their governors’ signatures. These DPAs consider the benefits to a broad range of stakeholders, the full range of potential adverse processing impacts to consumers, and the mitigation necessary to offset those potential adverse impacts. This is where the innovation lies.

Colorado has gone a step further and adopted rules that specify the content of the DPAs. Colorado’s new privacy rules go into effect in July, and they are a game changer. Part 8 of the Colorado Rules is entitled “Data Protection Assessments.” Rule 8.02, entitled “Scope,” states:

A data protection assessment shall be a genuine, thoughtful analysis of each Personal Data Processing activity that presents a heightened risk of harm to a Consumer … that: 1) identifies and describes the risks to the rights of Consumers associated with the Processing; 2) documents measures considered and taken to address and offset those risks, … 3) contemplates the benefits of the Processing; and 4) demonstrates that the benefits of the Processing outweigh the risks offset by safeguards in place.

Notably, the assessment must reach beyond just the individual consumer. Section A.5 of Rule 8.04, which specifies the DPA content, states:

The core purposes of the Processing activity, as well as other benefits of the Processing that may flow, directly and indirectly, to the Controller, Consumer, other expected stakeholders, and the public.

Section A.5 requires the DPA to consider not only the benefits of the processing to the controller and the consumer but also the benefits to other stakeholders. To do that, the DPA will have to actually catalog who those stakeholders might be.

Section A.6 then lists the adverse consequences that the DPA needs to assess against (contained in the table below):

Colorado examples of risks to the rights of consumers that may be considered in a DPA
- Constitutional harms, such as speech harms or associational harms
- Intellectual privacy harms, such as creation of negative inferences about an individual based on what an individual reads, learns, or debates
- Data security harms, such as unauthorized access or adversarial use
- Discrimination harms, such as a violation of federal antidiscrimination laws or antidiscrimination laws of any state or political subdivision thereof, or unlawful disparate impact
- Unfair, unconscionable, or deceptive treatment
- A negative outcome or decision with respect to an individual’s eligibility for a right, privilege, or benefit related to financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment opportunities, health-care services, or access to essential goods or services
- Financial injury or economic harm
- Physical injury, harassment, or threat to an individual or property
- Privacy harms, such as physical or other intrusion upon solitude or seclusion or the private affairs or concerns of Consumers, stigmatization or reputational injury
- Psychological harm, including anxiety, embarrassment, fear, and other mental trauma; or
- Other detrimental or negative consequences that affect an individual’s private life, private affairs, private family matters or similar concerns, including actions and communications within an individual’s home or similar physical, online or digital location, where an individual has a reasonable expectation that Personal Data or other data will not be collected, observed, or used.

Section A.7 covers mitigation measures, and Section A.8 requires a balancing of the benefits against the risks described in Section A.6 and the measures used to reduce those risks.

The IAF team has looked at requirements in Europe and other jurisdictions, and none contains both this breadth of parties to be considered and a description of what significant risk might entail.

On the one hand, few companies currently have the capacity for these assessments. On the other hand, Part 8 of the Colorado Rules begins to recognize that processing is not just about the consumer, as a data subject, and the controller. It recognizes that complex processing requires an assessment that looks horizontally, both through the organization and externally, to find the appropriate multi-factor balancing. This type of balancing will be required to bridge the differences between legacy privacy governance systems and fair advanced processing, including machine learning and artificial intelligence (AI).

As for innovation, the Colorado rules will be studied and considered by other jurisdictions.  The IAF team believes that similar rules on assessments likely will be adopted in California and will cascade from there. 

The IAF has initiated a new project, the “Colorado Project,” which will develop an assessment template based on Part 8 of the Colorado Rules and expected future regulations in California. The Colorado Project will include a multi-stakeholder dialogue, most likely to be held in Colorado. The IAF June retreat in San Francisco also will include a discussion of how these DPAs affect the way fair AI is balanced.