Category Archives: FTC

Guidance and Un-Legislated Law

In 2016 and 2017, the Article 29 Data Protection Working Party (WP29) adopted Action Plans setting forth its global implementation strategy for the General Data Protection Regulation (GDPR).  Pursuant to these Action Plans, the WP29 has produced seven Guidelines and has indicated it will produce at least eight more.  As the data protection community digests the massive amount of guidance coming from both the WP29 and the individual data protection authorities, it is worth remembering that the impact of such guidance is not new; for more than thirty years, regulator guidance has changed markets.  What has changed is the velocity, descriptiveness, and volume of this guidance and, by extension, its impact on markets.

One of the most consequential privacy decisions was made by the U.S. Federal Trade Commission in the 1980s, when it issued an unofficial staff opinion that credit prescreening was permissible because the minimal invasion of privacy was counterbalanced by the consumer benefit of competition in credit card markets.  That decision, made pursuant to the federal Fair Credit Reporting Act, while specific to pre-approved credit offers, unleashed the power of data-driven advertising in the United States by demonstrating how data-driven marketing can be effective.  The broad and deep credit files of Americans were used to segment markets granularly.  By 2001, one bank executive told me that his 60-terabyte prospecting file made it possible to pick the right credit product, out of the 6,000 he offered, for each consumer in the United States.

The key concepts in credit marketing soon expanded to other markets as well.  With the introduction of the consumer Internet in 1993, digital marketing gained a rich overlay of highly granular data that could segment markets in a fraction of a second.  The whole movement, however, began with that FTC opinion issued back in the 1980s, which balanced a minimal privacy invasion against much more competitive credit markets.

Today, as we rush towards GDPR implementation, guidance keeps accumulating, and we should be cognizant of how that guidance will shape the way organisations operationalize data protection law requirements.  Moreover, regulatory guidance is not just a European issue.  The FTC has been holding workshops and issuing reports since the late 1990s, and those reports have had a real impact on data use.  The recent report from the Canadian Office of the Privacy Commissioner on its consent consultation contains business guidance that will influence privacy practices well before any of its recommendations are, or are not, enacted into law.  Recent guidance on data transfers from the Superintendent of Industry and Commerce in Colombia has caused a vigorous debate among all stakeholders.

The IAF, as a research organisation focused on the future, finds some guidance very helpful and other guidance backward-looking.  Our views on the WP29 draft guidance on profiling and automated decision-making can be found here.

To date, the IAF, as an organisation, has not stepped back to ask hard questions about the role of guidance in shaping the legal, fair and just use of data at a time when technology continually changes the way data is created, collected, and used.  In January that focus will change: the IAF will hold a policy call with its stakeholders to begin exploring the role of guidance and the role of policy centers in responding to it.  That call will be followed by a brainstorming session in the Spring of 2018.

As always, your comments are welcome.

Accountability Does Work

143,000,000 people were the victims of a recent data breach when their data was stolen from Equifax, a company that has an obligation to keep their data safe. Data security is tough. The bad guys only need to be successful once, while companies need to win every time. However, from the perspective of many consumers,… Continue Reading

Fairness and Unfairness Moving Farther Apart

Fairness has become a huge data protection policy driver in Europe and the Americas.  Fairness is often hard to define in definitive terms, but the parameters of fairness are well known.  A fair data application creates identifiable value for individuals, mitigates risks to those individuals, and confirms the data is used within the context of… Continue Reading

Trust Deficits, Bright Lines and Verification

Freight trains heading through a tunnel are pretty hard to miss. The equivalent of a freight train in the world of privacy is a privacy trust deficit related to information use in an observational age driven by analytics. The many, many indicators that have piled up over the recent days, weeks and months are coming… Continue Reading

Legal, Fair and Just – The Benchmark for Big Data Analysis

Last month, the IAF presented our big data assessment process to industry representatives in Washington, D.C. One of the attendees, really trying to be helpful, asked why would any U.S. company conduct an ethical assessment of a big data project, since there really are not many restrictions in the use of data to develop insights.… Continue Reading

An Alternative Approach to Establishing Legitimacy in the U.S.

In an earlier post, I suggested that an alternative to notice and consent should exist in the United States and posited that the alternative could be balancing of interests relying upon the FTC’s unfairness authority.[1] The Administration’s Discussion Draft of the Consumer Privacy Bill of Rights of 2015 (“Draft Law”)[2] suggests other bases for such… Continue Reading

Legitimate Interests as an Alternative to Notice and Choice in the U.S.

The PCAST Report on Big Data and Privacy recognized the problems with notice and consent (i.e., often notices are unread, their legal implications are not understood, their terms cannot be negotiated). Yet, since notice and consent is so deeply rooted in current practice, rather than explore alternatives to notice and consent, the PCAST Report explored… Continue Reading