“I closed my rings!”: An Interactive Journey Through the Datafication of Fitness Tracking

Margaret Baker
12 min read · May 3, 2021

By Margaret Baker and Emma Stanley


Abstract

This work explores smartwatches and fitness trackers through the lens of data privacy and surveillance, and is paired with an interactive video that communicates to a broad audience the options individuals have for protecting their data. The article examines the problematic infrastructure of data gathering and surveillance, opening a discussion about how we can protect the most vulnerable users of these devices. Individuals who lack data literacy skills are often targeted by the major companies tracking their data, because these users do not understand their options for protecting it. The interactive video presents the article’s concepts in a narrative format, and audiences can engage with the video to learn how to protect themselves against companies and other actors who prey on their current lack of data literacy.

Introduction

Smartwatches and fitness trackers have taken the consumer wearables market by storm, offering wearers the opportunity to datafy and track their health and personal fitness goals while also connecting to other important devices, including cell phones, headphones, and other wearables. In fact, Gartner (2021) reports that global spending on wearable devices topped $68.9 billion in 2020, with smartwatches and wristbands earning nearly 40% of the wearables market share. Additionally, Pew (2020) reports that about one-in-five Americans use a smartwatch or fitness tracker to compile data about their health and fitness. However, wearers of these devices are often left unaware of the specific types of data the technologies collect; there exists a significant disconnect between the average user’s understanding of personal health and fitness data tracking and where, how, and when that data gets used. In the same Pew (2020) study, users of fitness trackers “are far more supportive of sharing data from these devices with health researchers than those who don’t use these devices. Roughly half of users of fitness trackers (53%) say this is acceptable….” Data from fitness trackers and smartwatches is being collected by companies, then sold and exploited for its usefulness in a variety of arenas. Individuals’ privacy is put at risk when this data is handed to companies, and on a macro level, nation-states’ security can be at risk as well. In response to these concerns about data mining, ethics, and accessibility, we have constructed an interactive video project that seeks to drive users toward data governance and data justice for smartwatch and fitness tracker data.
By asking users to consider their own technology usage and the implications of data collection, the video offers participants choices to either opt out of data collection or, where opting out is not possible, learn more about how their data is being used. By empowering individuals to continue using their technology with an informed mindset, this project brings recognition to issues surrounding datafication of the body via smartwatches and fitness trackers in the United States.

Related Work


In greater US culture, some politicians and media outlets are beginning to raise awareness about issues surrounding datafication and privacy (see the GDPR restrictions in the EU and the CCPA in California), but a large gap in consumer awareness still exists regarding the datafication of the individual body in society. According to a news update from the University of Pennsylvania, the California Consumer Privacy Act “regulates how data is collected, managed, shared and sold by companies and entities doing business with or compiling information about California residents. Some observers contend that as no business would want to exclude selling to Californians, the CCPA is de facto a national law on data privacy, absent an overarching federal regulation protecting consumer information” (University of Pennsylvania, 2019). Ultimately, this regulation became a massive turning point for the rest of the United States. These restrictions still have their shortcomings, however. For example, in Ogury’s (2019) survey of nearly 300,000 individuals across the globe one year into the GDPR, 52% of respondents reported that even after reading consumer consent notices and privacy policies, they still did not understand how their data was used (Ogury, 2019). Additionally, a staggering 37% of respondents were still unaware of what the acronym GDPR meant (Ogury, 2019).

Restrictions like the GDPR and CCPA can also help preserve the benefits of aggregated data collection. For Kenneth Cukier (2016), despite the “dark sides” of big data,

“What we find is that when we have a large body of data, we can fundamentally do things that we couldn’t do when we only had smaller amounts. Big data is important, and big data is new, and when you think about it, the only way this planet is going to deal with its global challenges — to feed people, supply them with medical care, supply them with energy, electricity, and to make sure they’re not burnt to a crisp because of global warming — is because of the effective use of data” (1:47).

Despite the many troubling aspects of gathering significant data sets, there are sometimes beneficial elements, such as helping humanity gain access to knowledge and improve people’s physical well-being. The opportunity to use data for good should not be overlooked in the midst of the charged issues surrounding data justice.

However, as Madhumita Murgia (2017) explains, there are undoubtedly issues and concerns which must be addressed. In her TED Talk, she explains,

“I already knew about my daily records being collected by services such as Google Maps, Search, Facebook, or contactless credit card transactions. But you combine that with public information such as land registry, council tax, or voter records, along with my shopping habits and real-time health and location information, and these benign data sets begin to reveal a lot, such as whether you’re optimistic, political, ambitious, or a risk-taker. Even as you’re listening to me, you may be sedentary, but your smartphone can reveal your exact location, and even your posture. Your life is being converted into such a data package to be sold on. Ultimately, you are the product” (4:00).

The final phrase of that section of her talk stands out starkly: we are the product. We are our data. We are being sold, and we need to be aware of how our data is being used, and demand justice.

Furthermore, data justice and data feminism speak to how a disproportionate few hold the keys to the data kingdom. This project aims to increase data literacy among the general public and empower individuals to advocate for data governance, a topic that has become a primary research area within the academy. For instance, Lina Dencik et al. note that “data justice is a response to prominent and rather limited perspectives on the societal implications of data-driven technologies that have tended to focus on issues of efficiency and security on the one hand and concerns with privacy and data protection on the other” (Dencik et al. 2016). Dencik et al. also identify the “democratic procedures, the entrenchment and introduction of inequalities, discrimination and exclusion of certain groups, deteriorating working conditions, or the dehumanisation of decision-making and interaction around sensitive issues” as the primary harms of data collection, reinforcing the notion that we “need to position data in a way that engages more explicitly with questions of power, politics, inclusion and interests, as well as established notions of ethics, autonomy, trust, accountability, governance and citizenship” (Dencik et al. 2019). Ethics, autonomy, trust, accountability, governance, and citizenship are deeply embedded within the power dynamics that exist in the infrastructures of data privacy. According to researcher Deborah Lupton, on a sociological level, the ways that we track personal data have changed over time; individuals once used a diary or journal to track personal data (Lupton 2016, p. 4). Because the journal was typically private and physical, not networked on the internet, this lack of dissemination afforded a level of privacy that does not exist in the modern age, specifically with regard to personal fitness trackers.

Since data is now so readily accessible online, who can we trust with our data, and who is the most vulnerable? According to Wiebke Maaß, quoting Chai et al., “elderly people are particularly vulnerable to cyber attacks” (Chai et al., 2008, as cited in Maaß, 2011, p. 241). Maaß also notes that the number of elderly individuals using the internet and internet-enabled devices is increasing each year, and that their lack of data literacy is what leaves them open to these attacks (Maaß, 2011). In effect, this vulnerable population needs support in an interactive and accessible manner.

Methods

Interactive video was chosen as an ideal platform to facilitate and provoke critical awareness of the data being tracked and stored by fitness tracker makers. A variety of interactive video formats exist; this particular piece takes the form of an interactive video quiz, in which every few minutes users are prompted with guided questions to consider the impacts smartwatch and fitness tracker technology may be having on their daily lives. The quiz format helps keep users engaged in the content while also asking them to consider novel ideas about their own data. Ultimately, the goal of this video was to create a thoughtful teaching tool in an interactive, digital format broadly targeted at any smartwatch and fitness tracker user. Three versions of the video have been created: one without the interactive elements, the interactive version, and a screen recording of the video with the interactive elements. These three versions give users a range of options for accessing the video effectively.

In order to craft the video, a series of checkpoints was completed to ensure accuracy and feasibility. First, the video project was storyboarded, which included gathering relevant, recent, and specific statistics related to smartwatch and fitness tracking technologies. Brainstorming for the storyboards also included assessing opportunities for user inputs and interactive elements of the video. Based on the completed storyboards, a script was crafted to complement the visual and narrative elements. Upon completion of this script, filming began. The footage follows a fictionalized narrative of a user’s workout with smartwatch technology, with each scene mimicking opportunities where users might interact with their smartwatch or fitness tracker. Once filming was complete, editing began. After a narrative structure was compiled, the video was uploaded to Vizia, an interactive video platform designed to help content creators embed interactive elements on top of traditional video. Finally, discussion prompts were created and added to the end of the video to spur discussion among both individual viewers and small groups, helping synthesize the piece’s information and inform users of opportunities for justice and action. The completed video project is embedded below in each of its forms: interactive, non-interactive, and screen-recorded interactive.

Interactive Video (opens in a new tab)

Non-Interactive Version

A non-interactive version of the video.

Screen-casted Version

A screen-casted version of the video displaying interactivity.

Impacts

A variety of audiences would benefit from critically engaging in data privacy conversations, but this project was specifically designed for vulnerable populations — i.e., those who lack data and digital literacy skills, such as not understanding where their data goes or what is being done with it. As we learned from Maaß, elderly individuals are the most vulnerable population on internet-enabled devices (Maaß, 2011). Providing a structured environment for these vulnerable members of the population via video encourages viewers to consider the impacts of their own data collection choices; the inclusion of guided discussion questions empowers participants to examine how they choose to utilize smartwatch and fitness tracker technology in the future. Additionally, the layers of interactivity applied to the video guide users through thoughtful decision-making processes regarding their own personal data collection, and the greater implications of that data collection for themselves and for nation-states. Hosting the video on Medium, a public-facing platform, encourages viewers to share the post via social media, text, and other online platforms in order to facilitate a broader cultural conversation about data privacy.


Designed with user interactivity in mind, this project is tailored to engage the audience’s curiosity about and understanding of data collection via smartwatches and fitness trackers. As users of these technologies leave digital “breadcrumbs” of information about their daily routines, including tracking of sleep, diet, exercise, and fitness among other areas, the devices simultaneously collect additional incidental data, including location tracking, pattern recognition, and related metadata (Pingo and Narayan, 2016). Pingo and Narayan (2016) acknowledge the significant privacy concerns associated with this collection, stating, “…combination of personal data collected in different databases pose growing privacy risks and threats in two distinct ways: (i) using information other than its originally perceived approved purpose, and collected at one point in time (‘spot information’), and (ii) combining the fruits of ‘spot information’ to piece together information with other data to generate new information for profiling purposes.” Users’ information created at individual points in time, what Pingo and Narayan call “spot information,” is aggregated into sets; these sets allow companies, buyers, and others to understand the patterns, behaviors, and identities of user profiles on a global scale. As viewers come to understand the implications of the lack of privacy surrounding data generated by these lifelogging devices, the video prompts them with opportunities for action, including data justice through advocacy in government and lobbying.
As data is never neutral or independently created, stored, or aggregated, users are also called to critically consider their personal data literacy and data freedoms — to take control of their information and privacy through practical steps including reading terms and conditions when signing up for new products and services, changing privacy and security settings on devices to protect data confidentiality, and exploring alternative tools for communication and recording. Empowering individuals to continue to use their technology with an informed mindset is at the heart of this project. As Weiser (1991) predicted, the age of ubiquitous computing has arrived, and with this age comes an increased responsibility to protect and understand the data created by computing power.
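The “spot information” aggregation that Pingo and Narayan describe can be made concrete with a short sketch. All records, names, and coordinates below are hypothetical, invented purely to illustrate how two individually innocuous datasets can be joined into a behavioral profile:

```python
# Illustrative sketch (hypothetical data): how individually harmless
# "spot information" records can be combined to infer new facts.

# Spot dataset 1: a fitness tracker's workout log (time, GPS coordinates).
workouts = [
    {"user": "u42", "time": "06:30", "lat": 40.7128, "lon": -74.0060},
    {"user": "u42", "time": "06:35", "lat": 40.7130, "lon": -74.0058},
]

# Spot dataset 2: a public place-of-interest dataset.
places = [
    {"name": "Riverside Gym", "lat": 40.7129, "lon": -74.0059},
]

def near(a, b, eps=0.001):
    """Crude proximity check on raw latitude/longitude values."""
    return abs(a["lat"] - b["lat"]) < eps and abs(a["lon"] - b["lon"]) < eps

# Joining the two datasets yields a profile that neither dataset
# contains on its own: where and when this user exercises.
profile = {
    (w["user"], w["time"], p["name"])
    for w in workouts
    for p in places
    if near(w, p)
}
print(sorted(profile))
```

At scale, data brokers perform this kind of linkage across far more datasets — location, purchases, public records — which is precisely the profiling risk the video asks viewers to weigh.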

Conclusion

As the Netflix documentary The Great Hack puts it, “And my question to everybody else is, is this what we want? To sit back and play with our phones as this darkness falls?…So, as individuals, we can limit the flood of data that we’re leaking all over the place. But there’s no silver bullet. There’s no way to go off the grid. So, you have to understand how your data is affecting your life” (1:47:37). This project seeks to bring recognition in a similar way — understanding how datafication of the body via smartwatches and fitness trackers is affecting both individuals and the United States at large. Smartwatch and fitness tracker wearers must work together to advocate for data governance and data justice for personal health information (PHI) via lobbying at the federal, state, and local levels. Additionally, users should enable as many privacy and sharing restrictions as possible to ensure that as little of their data as possible is transferred out of their control. Although the world has moved into an age of ubiquitous computing, this project and others like it offer individuals an opportunity to become empowered users within this ubiquity, to understand the implications of their own data collection, and to move toward action. We look forward to future governance and greater public awareness of datafication in the United States in the years to come.

References

Chai S, Rao HR, Bagchi-Sen S, Upadhyaya SJ (2008). ‘Wired’ senior citizens and online information privacy. Paper presented at the tenth ETHICOMP international conference on the social and ethical impacts of information and communication technology, Mantua, 24–26 Sept 2008.

Cukier, K. (2016). Big Data is Better Data. TED Talk. (16 min).

Gartner. (2021, January 11). Gartner forecasts global spending on wearable devices to total $81.5 billion in 2021. https://www.gartner.com/en/newsroom/press-releases/2021-01-11-gartner-forecasts-global-spending-on-wearable-devices-to-total-81-5-billion-in-2021.

Lupton, Deborah (2016). “Introduction.” The Quantified Self.

Maaß, Wiebke (2011). “The Elderly and the Internet: How Senior Citizens Deal with Online Privacy.” Privacy Online: Perspectives on Privacy and Self-Disclosure in the Social Web.

Murgia, M. (2017). How Data Brokers Sell Your Identity. TED Talk. (16 min).

Netflix. (2019). The Great Hack. United States.

Ogury Ltd. (2019, May 23). GDPR one year on: Survey findings show consumer awareness with data use is concerningly low. https://www.prnewswire.com/news-releases/gdpr-one-year-on-survey-findings-show-consumer-awareness-with-data-use-is-concerningly-low-300855176.html.

Pingo, Z., & Narayan, B. (2016). When Personal Data Becomes Open Data: An Exploration of Lifelogging, User Privacy, and Implications for Privacy Literacy. Digital Libraries: Knowledge, Information, and Data in an Open Access Society, 3–9.

The University of Pennsylvania (2019). “Your Data Is Shared and Sold… What’s Being Done About It?” Knowledge@Wharton. https://knowledge.wharton.upenn.edu/article/data-shared-sold-whats-done/

Vogels, E. A. (2020, August 14). About one-in-five Americans use a smartwatch or fitness tracker. Pew Research Center. https://www.pewresearch.org/fact-tank/2020/01/09/about-one-in-five-americans-use-a-smart-watch-or-fitness-tracker/.

Weiser, M. (1991). The computer for the 21st century. In: Scientific American (pp. 94–104).
