How obsessive self-tracking is eroding privacy for everyone
Sociologist Deborah Lupton says the growing use of health tracking technology is conditioning people to reveal more personal information about themselves, often giving it to corporations interested only in turning a profit.
Self-tracking is rapidly changing how people think about their bodies. Using gadgets and sensors to chart everything from heart rates to moods is altering attitudes about how much information people are willing to reveal about themselves, sociologist Deborah Lupton said, and even what they're willing to let corporations collect and analyze.
But all too often, says Ms. Lupton, a Centenary Research Professor at the University of Canberra in Australia, consumers aren't aware of how their information is used after it's collected. In some cases, she said, tracking can look like it's a benefit when it's actually "about furthering corporate interests."
Lupton explores the privacy implications and the broader cultural changes associated with our obsessive tracking in her forthcoming book, "The Quantified Self: A Sociology of Self-Tracking Cultures." I recently spoke with her about the privacy consequences of personal data tracking. Edited excerpts follow.
Selinger: What are people currently tracking and what do they hope to accomplish?
Lupton: Today, people are tracking anything that mobile devices and wearables make it possible to examine. While the technology is new, the self-tracking ethos isn’t. It’s been around for a really long time. The ancients talked about the importance of self-reflection and the insights gained from thinking about habits that can change who you are as a person.
The real crux of the latest self-tracking practices, which cover everything from quality of sleep to exercise, moods, sexual activity, and work productivity, is that technology automates a lot of the data gathering and analysis. People don’t have to put in a lot of effort to track, and social media makes it easy to share what’s learned with others. My research suggests people really enjoy seeing detailed data about their behavior, using software that presents clear visual results, and sharing accomplishments with others.
Selinger: Are people self-tracking without even realizing it?
Lupton: The Facebook "like" economy, and the highly metricized culture of sharing on social media more broadly, draws people into self-tracking. You can’t really be on Facebook and opt-out of seeing how many likes your posts get.
Selinger: What’s the difference between self-tracking and being tracked by others?
Lupton: To some extent, self-tracking is a voluntary practice, and that makes it different from the "dataveillance" Edward Snowden pointed out – corporations taking data we didn’t realize we were giving them, or hackers stealing information. But the voluntary aspect is changing, and self-tracking is starting to get pushed on people, sometimes coercively. That’s important to realize. Self-tracking is expanding beyond people’s personal, private reasons for doing it, like wanting to improve their health, to situations where people have less choice about whether they want to confront their numbers. Health insurance and life insurance companies are very interested in inviting people to upload self-tracking data. Car insurance companies are customizing their premiums and risk assessments based on self-tracking information. In the future, you might be required to provide some of this information.
Selinger: Are there any corporate practices going on right now that we should be concerned about?
Lupton: Much of the corporate wellness tracking information that employers in the United States receive focuses on the good employee as the healthy worker who is productive and doesn’t generate medical expenses. Fitbit and Jawbone certainly send this message. The discourse of teamwork – teams that compete with and motivate each other to become healthier – has become a big part of what’s now called social fitness.
Selinger: Are you suggesting self-interested companies are exploiting pro-social rhetoric and ideals?
Lupton: Yes, there are pernicious aspects of self-tracking embedded in their representations. It’s all about the value of being a productive worker, even when it appears otherwise. In my book I discuss how the Virgin Pulse website provides corporations with wellness programs that encourage workers to track their sleep – when they’re out of the office and in their own homes. The ideal is that a well-rested workforce is a productive workforce. And so self-tracking that can look like it’s being done to help you is really about furthering corporate interests. Beware of overt discourse that features happy, self-tracking workers.
Selinger: Taylorism with a happy face?
Selinger: Is it significant that even when people choose to self-track, they often select commercial tools and applications?
Lupton: Yes, proprietary data can be stored in the cloud, where it becomes vulnerable to hacking, and it can be used by second and third parties for purposes that users don’t intend and aren’t aware of. The entire data economy is based on fluid information that can be transformed and repurposed in ways that we can’t keep tabs on or control.
Selinger: Has limited consumer understanding played a role here?
Lupton: Yes. There have been instances where people’s self-tracking details appeared on the Internet without them intending this to happen. For example, people have used Fitbit to track their sexual activity and had this information become public. They didn’t realize they needed to make their profiles private to avoid such data leaks.
Selinger: Are most self-tracking consumers informed about risk?
Lupton: What I’m finding from all of the different groups I’m talking to about personal data is that, on the one hand, people don’t really have a clear sense of how it can harm them, and, on the other hand, they really get pleasure from using tools that require detailed knowledge about themselves.
Selinger: If people could see their situation through the eyes of a sociologist like yourself, how would things look?
Lupton: It would be interesting for more people to use browser extensions that show which companies use your search information and harvest your data for purposes like advertising. That can be a real eye opener. In a big data research project that I did with Mike Michael, we asked participants to write up a timeline of all the ways their data is collected from when they wake up in the morning until they go to sleep. After doing this activity, they said they hadn’t really thought about how they were tracked before because they hadn’t been asked to engage in the task!
Selinger: So, are the invitations to self-track more pervasive and powerful than the requests to reflect on the implications of self-tracking?
Lupton: Yes. People now talk about data literacy as a new form of literacy and expect school children to learn about it as part of their general education. But at the moment, average adult users don’t have a high level of data literacy because they haven’t been confronted by the implications of routinely agreeing to the terms and conditions of all the services they use. Unless people have been the victims of data breaches, they tend to think privacy problems are just problems for other kinds of people – celebrities who get their iCloud accounts hacked or cheaters who use online dating services.
Selinger: How early should these conversations be introduced in school? Depending on the age level and topic, some parents will object and say they’re the ones who should be establishing parameters.
Lupton: A lot of parents don’t have the data literacy to know what the problems are. We’re talking about a generation that uploads ultrasound images and childbirth videos to Facebook, YouTube, and Instagram and then continues generating digital profiles of their children into infancy and beyond, putting up, for example, videos of their kids learning to walk. By engaging in these practices, parents aren’t thinking about the implications for their own children’s privacy. By the time an eight-year-old asks for a Facebook account, there already are plenty of images of them on that platform posted by their proud parents. That’s why I’ve argued parents need to first confront the digital presences they create for their children – even with the best intentions, like pride and pleasure.
As for the schools, well, they’re giving young kids iPads and encouraging teachers to use all kinds of student-tracking software. I’m working on a project on student tracking in physical education, and it’s clear that those teachers aren’t aware of how the data they collect about their students can invade those students’ privacy. In fact, they’re not yet thinking about how, down the line, their own jobs might be assessed differently by administrators who analyze the information they’ve been collecting through tracking tools.
Selinger: Would you say that when schools encourage students to use self-tracking devices they’re actually indoctrinating them into self-tracking culture?
Lupton: Yes, these practices tell students that the more data that’s gathered about them, the better off they’ll be. This is part of the big data phenomenon. The idea we keep hearing is that more data is better than less. But of course the more data that’s generated about you, the more data sets there are that can be aggregated by data miners who are trying to create detailed profiles. Most people don’t grasp how easy this can be or how revealing the resulting portraits are.
Selinger: Any other advice to give about how people can better protect their privacy?
Lupton: Before signing up for a self-tracking app, think about whether it’s necessary to give all of the information that’s requested. You might not want to give your real birthdate, for example.
Selinger: Are you saying it’s OK to engage in what Finn Brunton and Helen Nissenbaum call obfuscation – disrupting surveillance with misinformation?
Lupton: Yes, they can do that.
Selinger: Are you saying they should?
Lupton: If people are concerned about their privacy they should think carefully about whether all of the details an app requests are essential for the software to perform its core functions. Protecting yourself by being a little dishonest with your personal information is not a bad thing.