I was well into writing a blog post related to the Federal Trade Commission’s advance notice of proposed rulemaking (ANPR) on “Commercial Surveillance and Data Security” when the FTC announced it had sued Kochava, a data aggregator whose services are location based. This suit follows the Federal Communications Commission’s announcement that FCC Chair Jessica Rosenworcel had released information from 15 mobile phone providers on their use of location data. Chair Rosenworcel said, “This information and geolocation data is really sensitive. It’s a record of where we’ve been and who we are.” These press releases by the FTC and the FCC complicated my blog writing.
From every direction, the seeing and recording of behavior are being framed as surveillance, specifically “commercial surveillance.” The most common example of surveillance is location data. Regulator after regulator is focused on location data. To be perfectly honest, the issue isn’t the location itself; it is the context of the location. What, other than the device, is being tracked? Is the location a hotel? A store? A bar? A health center? A women’s health center?
But while location is the issue of the day, the debate will affect all observation. Any smart device that has sensors is in scope. All smart devices are observing, and they are observing for a set of purposes. Furthermore, seeing can be separated from remembering. However, not all location information needs to be treated the same. Location data can be considered special, and boundaries can be put around its use, for all the reasons noted by the FCC chair.
The FTC’s ANPR is another step in the necessary global discussion of what fully balanced data protection means in an observational age. The ANPR led me to reread the chapter on surveillance in Professor Neil Richards’ book “Why Privacy Matters.” My favorite statement from that chapter is: “. . . a more sophisticated understanding of surveillance reveals a problem: we need an explanation for when and why surveillance is dangerous and when it is not.”
The New York Times recently printed an opinion piece by Alex Kingsbury, “We’re About to Find Out What Happens When Privacy Is All But Gone.” Kingsbury starts with the fact that all smartphones, by their very nature, are surveillance devices. Observation is a necessity if the phone is to work; it needs to know where the user is to function as a phone. While some apps need to be observational, others do not. A navigation app could work like an old static paper map, but users would not be happy. Maps and navigation systems need to know the user’s location in order to function for the user’s purposes. Sometimes location-based ads serve the user’s purposes, like one for the pizza restaurant nearby, and sometimes they serve someone else’s, like an algorithm that predicts whether the user can be enticed to eat pizza in the first place.
The FTC’s ANPR uses the term “commercial surveillance,” and the use of that emotionally loaded term is an issue. I prefer the terms “observation” and “observed data” because they better describe what watching and recording mean in a connected age where some observation is necessary for things to work, some observation is recorded for nefarious purposes, and other observed data is repurposed for political control. While I understand the attractiveness of the term “commercial surveillance,” it is not useful because it injects too much emotion into a discussion that requires dispassion.
The term “surveillance” suggests a target and an intended outcome. “Observation” is a more neutral word. In 2014, when I wrote the “Taxonomy of Data Based on Origin,” I used the term “observed data.” The governance of observed data is intended to answer the question raised by Neil Richards: when is the use of observed data dangerous, and when is it not? The New York Times opinion piece is useful but not nuanced enough. Data policy governance covers three privacy interests: seclusion, autonomy, and fair processing. Observation is a growing feature of our digital age that overwhelms seclusion, making governance for autonomy and fair processing ever more important. I absolutely agree that there is frivolous observation, where observation is a tool of power, not empowerment. But by framing the ANPR as being about commercial surveillance, an emotional driver is triggered that gets in the way of precise policymaking.
The IAF’s comments on the ANPR will start with the premise that data should serve people, and do so with respect for the full range of people’s interests impacted when data is created and used, and used to create even more data. Professor Richards’ point is front and center in this discussion, but I would frame it differently. I would start with a recognition that observation is necessary for connected technologies to work. It also is necessary to create new insights. For both points to be understood, several questions must be asked: When is observation targeted in a manner that controls others’ basic interests? When should observation be off limits to protect secluded space? How are appropriate limits placed on observation and processing so that the use is fair and respects autonomy? How are processes governed, illuminated, and made addressable by external oversight and enforcement? Should processes be governed so they serve the individual’s interests and not someone else’s? When is retention a matter of asymmetric power? The nuance in these questions, and in the answers to them, is important.