Enacting Privacy Legislation Requires Defining Desired, Obtainable Outcomes

There is little debate that the United States needs a comprehensive privacy law. There is also little debate that the U.S. is no closer to enacting such legislation than it was twenty years ago. Many have argued that federal preemption and private rights of action are the impediments to enactment. If the rest of the legislation were agreed upon, I believe those issues would be solvable. The fundamental problem is deciding what privacy legislation should actually try to solve. If the purpose of legislation is to protect against negative outcomes, then those outcomes need to be identified. So, what is the purpose of comprehensive privacy legislation?

I read Neil Richards’s new book “Why Privacy Matters” over the holiday break. He makes the case that privacy matters because it is about the power that comes from knowledge generated from human data, and about who wields that power.

After reading the book, I went back and listened to the NTIA’s listening session on protected classes and advanced analytics. There is general agreement that individuals cannot govern the power that comes from human data by reading privacy notices.

Richards also suggests that the term “abusive” join the terms “unfair” and “deceptive” among the tools the FTC has to work with. So, what does “abusive” mean?

In thinking about that issue, I went back to the IAF’s model legislation, the FAIR and OPEN USE ACT. The footnotes in the model legislation link the legislative language to its sources. The footnote to the term ADVERSE PROCESSING IMPACT states:

“The IAF Model does not use the terms “harm” or “injury.” Instead, the IAF Model defines a broad concept of “Adverse Processing Impact.” The definition of Adverse Processing Impact aligns with the approach to privacy risk and “privacy problems” codified in the National Institute of Standards and Technology’s publication, NIST Privacy Framework: A Tool for Improving Privacy Through Enterprise Risk Management, Version 1.0 (2020) (“NIST Privacy Framework”). NIST defines privacy events as “potential problems individuals could experience arising from system, product, or service operations with data, whether in digital or non-digital form, through a complete life cycle from data collection through disposal.” NIST Privacy Framework at p. 3. NIST identifies the range of problems an individual can experience as a result of processing as ranging from dignity-type effects such as embarrassment or stigmas to more tangible harms such as discrimination, economic loss, or physical harm. Id. The definition of Adverse Processing Impact is also generally consistent with NIST’s Catalog of Problematic Data Actions and Problems, which is a non-exhaustive, illustrative set of problematic data actions and problems that individuals could experience as the result of data processing.”

I suggest that the term “adverse processing impact,” as defined and used in the model legislation, describes the negative outcomes that we, as a society, want to manage and prevent.

ADVERSE PROCESSING IMPACT.— The term “Adverse Processing Impact” means detrimental, deleterious, or disadvantageous consequences to an Individual arising from the Processing of that Individual’s Personal Data or to society from the Processing of Personal Data, including—

  1. direct or indirect financial loss or economic harm;
  2. physical harm, harassment, or threat to an Individual or property;
  3. psychological harm, including anxiety, embarrassment, fear, and other mental trauma;
  4. inconvenience or expenditure of time;
  5. a negative outcome or decision with respect to an Individual’s eligibility for a right, privilege, or benefit related to—
    1. employment, including hiring, firing, promotion, demotion, reassignment, or compensation;
    2. credit and insurance, including denial of an application, obtaining less favorable terms, cancellation, or an unfavorable change in terms of coverage;
    3. housing;
    4. education admissions;
    5. financial aid;
    6. professional certification;
    7. issuance of a license; or
    8. the provision of health care and related services.
  6. stigmatization or reputational injury;
  7. disruption and intrusion from unwanted commercial communications or contacts;
  8. discrimination in violation of Federal antidiscrimination laws or antidiscrimination laws of any State or political subdivision thereof;
  9. loss of autonomy[1] through acts or practices that are not reasonably foreseeable by an Individual and that are intended to materially—
    1. alter that Individual’s experiences;
    2. limit that Individual’s choices;
    3. influence that Individual’s responses; or
    4. predetermine results or outcomes for that Individual; or[2]
  10. other detrimental or negative consequences that affect an Individual’s private life, private affairs, private family matters, or similar concerns, including actions and communications within an Individual’s home or similar physical, online, or digital location, where an Individual has a reasonable expectation that Personal Data or other data will not be collected, observed, or used.

I also suggest that, in managing against adverse processing impacts, we create the means to use human data flexibly to create real value for people.

So please think about the definition of “adverse processing impact,” and think about how you would build a risk management program to manage against the risk of adverse outcomes; one possible starting point is sketched below.
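To make that exercise concrete, here is a minimal sketch, in Python, of one way an organization might begin operationalizing the definition: encoding the enumerated impact categories as a checklist and scoring each contemplated processing activity against them. The category names, the 1–5 likelihood/severity scale, and the review threshold are illustrative assumptions of mine, not taken from the IAF model or any statute.

```python
# A minimal sketch of a risk register keyed to the "Adverse Processing
# Impact" categories enumerated above. All names, the 1-5 scoring scale,
# and the review threshold are illustrative assumptions, not drawn from
# the IAF model or any statute.
from dataclasses import dataclass, field
from enum import Enum, auto


class AdverseProcessingImpact(Enum):
    """Categories paraphrased from the model definition."""
    FINANCIAL_LOSS = auto()
    PHYSICAL_HARM = auto()
    PSYCHOLOGICAL_HARM = auto()
    INCONVENIENCE = auto()
    ELIGIBILITY_DECISION = auto()      # employment, credit, housing, etc.
    REPUTATIONAL_INJURY = auto()
    UNWANTED_CONTACT = auto()
    UNLAWFUL_DISCRIMINATION = auto()
    LOSS_OF_AUTONOMY = auto()
    OTHER_PRIVATE_LIFE_INTRUSION = auto()


@dataclass
class ProcessingAssessment:
    """One contemplated use of personal data, scored per impact category."""
    activity: str
    # Map each relevant category to (likelihood, severity) on a 1-5 scale.
    scores: dict[AdverseProcessingImpact, tuple[int, int]] = field(default_factory=dict)

    def risk(self, impact: AdverseProcessingImpact) -> int:
        """Simple likelihood x severity score; 0 if the category was not assessed."""
        likelihood, severity = self.scores.get(impact, (0, 0))
        return likelihood * severity

    def needs_review(self, threshold: int = 12) -> list[AdverseProcessingImpact]:
        """Return categories whose risk score exceeds the threshold."""
        return [i for i in self.scores if self.risk(i) > threshold]


# Example: scoring a hypothetical behavioral ad-targeting program.
assessment = ProcessingAssessment(
    activity="behavioral ad targeting",
    scores={
        AdverseProcessingImpact.LOSS_OF_AUTONOMY: (4, 4),
        AdverseProcessingImpact.UNWANTED_CONTACT: (5, 2),
        AdverseProcessingImpact.UNLAWFUL_DISCRIMINATION: (2, 5),
    },
)
for impact in assessment.needs_review():
    print(f"{impact.name}: risk score {assessment.risk(impact)}")
```

Whatever form such a program takes, the point is that the definition gives organizations a concrete list of outcomes to assess and manage, rather than an abstract notion of “harm.”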

On January 27, the IAF will hold a special Privacy Week session of our monthly “Strategy and Policy Call” on adverse processing impact. Look for the Save the Date.

[1] The concept of “loss of autonomy” is widely recognized in many bills and frameworks, including the NIST Privacy Framework, which provides that “[l]oss of autonomy includes losing control over determinations about information processing or interactions with systems/products/services, as well as needless changes in ordinary behavior, including self-imposed restrictions on expression or civic engagement.” Catalog of Problematic Data Actions and Problems.

[2] The IAF Model applies the well-accepted drafting convention that “or” means “either or both” or, if there is a series of items, “any one item or combination of items.”