More Comprehensive US Privacy Laws Are Inevitable — What Do We Want Them to Be?

“Data should serve people, people should not serve data.” With those words Giovanni Buttarelli summarized the intersection of privacy and ethics. Privacy law and regulatory approaches are best when they don’t frame privacy as a conflict between the users of data and those affected by the processing. That non-adversarial framing is increasingly being challenged by regulatory guidance issued as a complement to the GDPR. Privacy law is best when it forces organizations to be responsible and to demonstrate, in detail, how they are responsible. Core accountability concepts such as privacy by design, ethics by design, and balancing that recognizes the full range of all stakeholders’ rights and interests are the future of data stewardship governance.

The new California law, AB 375, never once uses the term accountability. And while the new California law and the GDPR make comprehensive U.S. privacy legislation more likely, it isn’t clear that either the California law or the GDPR, and specifically the regulatory guidance issued in response to it, creates a path that will assure the full range of stakeholder interests and enable the data-driven innovation so key to economic growth. Neither the GDPR, which is deeply rooted in EU constitutional values, nor California Assembly Bill 375 is the answer for balanced protection and data-driven innovation.
The California law is an update to the legacy American notice-and-choice approach. There are new consumer rights of access to their data, and a delineation of data sales versus data-driven fulfillment. There are special protections for children. The broad definitions of what data is covered and which consumers are protected add to its sweeping impact. And the law will be costly, requiring system changes to meet its consumer-facing requirements. But the law isn’t next generation, and it will likely dampen data-driven innovation as organizations seek to minimize their risk exposure. While this may ultimately drive U.S. federal legislation, the larger risk is a fragmented patchwork of “me too” laws in other states.
Since the early 1970s, the U.S. legacy system has been on a path to split the two parts of privacy: autonomy and fair processing. Autonomy has been served by notice and the ability to object, while fair processing has been the product of sector-specific laws. The California law adds new requirements relative to autonomy and consumer control, but not a novel approach. The U.S. will now have an access requirement that goes beyond healthcare and credit reporting. Furthermore, the law provides consumers with the right to understand how data markets work and to prohibit the sale of data that pertains to them. However, there is no requirement that transparency be effective in helping individuals understand how data is being used. Digital markets are hard to understand, and advanced analytics systems even more so. The IAF has conducted numerous workshops on how digital markets link and target, and educating people about these complex environments requires time and the best teachers available.
We are not suggesting the new law isn’t rigorous and expensive; it is both. But it is not groundbreaking. It is not forward-looking, and it is not suited to enabling the data-driven innovation on which economies increasingly depend. The resulting compliance systems will be costly, and the compliance staffing organizations require will be significantly greater than today, but will data use be fairer for all stakeholders? In fact, there is a very real possibility that beneficial uses, even those that do not directly affect a specific individual, could be caught up in this process, making companies think twice about the cost of this sort of data use.
The California law doesn’t require fair processing. Instead, it requires that data use be transparent and gives individuals rights to limit some uses of their data. This is necessary but not sufficient, because responsible and accountable data processing is increasingly the backbone of modern, fair data protection. The GDPR has specific individual protections, but it also has legitimate interests, privacy by design, a balancing of interests, and the potential for codes of conduct, all for assuring that stakeholders are protected when data is used in innovative ways that create real value. Despite its flaws, the GDPR creates a better infrastructure for accountability.
Canada may be on the brink of revising its private-sector law at the federal level. While consent and accountability are both built into the fabric of Canadian privacy law, it too needs to reflect today’s more intensive data-driven scenarios, facilitating the benefits of data use while effectively keeping individuals at the center. Canadian law and tradition, however, create room for data uses that are responsible and in context. The concept of “beneficial applications” has been suggested in Canada as a means of creating real social value from data while achieving an appropriate level of protection.
In Hong Kong, in work commissioned by the Privacy Commissioner, the IAF is developing ethical data stewardship models and ethical assessment guidance for big data and advanced analytics. Singapore will revise its digital legal infrastructure in a fashion that is compatible with modern accountability. Accountability is also a feature of the newer privacy laws in Latin America, and it will be part of the next generation of laws there as well.
The California legislation, with its focus on the sale (broadly defined) of data, will have an impact not only in the other 49 states but in other jurisdictions as well. Limiting the sale of data is an easier and more attractive concept for policymakers to grasp than accountable and enforceable data stewardship. The resulting risk is that limits on data sales will capture the soul of future legislation in places like Canada. But if our goal is to assure that data serves people rather than people serving data, we have work to do.
The concept of responsible data use, and in particular “beneficial activities,” needs to be explored, and processes need to be developed to demonstrate that activities are truly beneficial. Ethical assessment frameworks need to be developed and socialized. Trustworthy oversight mechanisms, which fill a trust gap that goes beyond what regulators have the resources to address, need to be developed, tested, and discussed. This research needs to be done, and done quickly.
Other states will likely follow California’s lead, but with what model? Consumer power, as captured by AB 375, is important, and transparency is a necessary prerequisite, but these alone are not enough. When legislation is drafted, those of us who believe that data should enhance innovation need to be ready with solid research and examples. We cannot and should not back away from data-intensive activities that involve advanced analytics and artificial intelligence. Being people-centered means creating value for people. We need to create economic and social value while preserving a space free from digital predestination. We need to be ready with accountable ethical assessment tools and oversight models that are trusted. And we need means to use data from across platforms for beneficial purposes never anticipated.
These are the themes the IAF is exploring and will continue to explore over the next 18 months. We need others to join us in this quest for something better.
