With the rise of Big Data in the health sector, the datafication of the traditional health ecosystem has been creating new power asymmetries and disrupting established ethical norms. Previously, we discussed how an alternative data approach like Small Data can address the tension between data sharing and privacy, taking personal health trackers as a point of reference.
Small Data is a human-centered approach that places the individual at the center of data collection, analysis, and utilisation. Under this premise, the successful use of health data is judged by whether it achieves an individual’s desired health outcomes and impacts.
Beyond the motivation to advance self-knowledge and promote self-improvement, people also participate in collective pooling of their self-tracked health data to gain more meaning and insight, comparing their data with those of others who share similar goals, health conditions, or emerging data patterns. Others see this activity as a form of data activism, contributing to “the greater good” by allowing their data to be used for research purposes.
Personal health and wellbeing data, for example, can have social and political impacts when pooled to track a global pandemic like COVID-19, to demonstrate the link between physical and physiological stress and daily rhythms, and to understand the impact of neighbourhoods and the built environment on individual and community wellbeing.
But how do we know that the health data we share with community efforts are not used or appropriated beyond our intention and consent? How should data architectures and technologies be designed to ensure individuals’ control over their data, its use, and the outcomes of data-related processes?
Similar questions were discussed at last week’s IDIA2020 virtual conference during Payal Arora’s keynote session, “Privacy by Design for the Next Billion.” Professor Arora puts forward three proposals as theoretical considerations for operationalising the “privacy-by-design” principle in technology design and data architecture. One of the proposals is to decenter data ownership and consent while re-centering diversity and ethics in the privacy-by-design paradigm.
In this regard, Open Humans, a participant-driven data project that lets people share their health and medical data across data sources to support co-created research initiatives, is a good example of the proposal introduced by Professor Arora. Open Humans adopts the privacy-by-default principle: all personal health data stored on the platform are accessible only to the data donors themselves, unless they opt in to making the data available to other studies on the platform on a project-by-project basis. This granular consent option gives people control over their data, who can use it, and to what end. The platform also enables people to conduct self-research, shape research models, and develop new tools for new types of data.
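To make this consent model concrete, here is a minimal Python sketch of privacy-by-default storage with granular, project-by-project opt-in. The names used here (Member, DataSource, read_data, and so on) are illustrative assumptions, not Open Humans’ actual data model or API.

```python
from dataclasses import dataclass, field

@dataclass
class DataSource:
    """A stream of personal health data (e.g. activity, glucose, genomic)."""
    name: str

@dataclass
class Member:
    """A data donor. Privacy by default: no project may read any source
    until the member explicitly opts in, one (project, source) pair at a time."""
    username: str
    sources: dict = field(default_factory=dict)   # source name -> DataSource
    grants: set = field(default_factory=set)      # {(project, source_name)}

    def deposit(self, source: DataSource):
        self.sources[source.name] = source

    def opt_in(self, project: str, source_name: str):
        """Granular, project-by-project consent."""
        self.grants.add((project, source_name))

    def opt_out(self, project: str, source_name: str):
        """Consent is revocable at the same granularity it was given."""
        self.grants.discard((project, source_name))

def read_data(member: Member, project: str, source_name: str) -> DataSource:
    """A project's view of a member's data: denied unless explicitly granted."""
    if (project, source_name) not in member.grants:
        raise PermissionError(
            f"{project!r} has no consent to read {source_name!r} "
            f"from {member.username!r}")
    return member.sources[source_name]

# Usage: data are private by default; consent is scoped to a single project.
alice = Member("alice")
alice.deposit(DataSource("fitbit-activity"))
alice.opt_in("covid-19-symptom-study", "fitbit-activity")

read_data(alice, "covid-19-symptom-study", "fitbit-activity")  # allowed
# read_data(alice, "some-other-study", "fitbit-activity")      # PermissionError
```

Because access checks key on the (project, source) pair, revoking consent for one study never affects another, which is what gives donors fine-grained control over their data.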
While it remains to be examined whether Open Humans’ design caters to privacy values across cultures and contexts, the platform’s design is observably informed by the Small Data and privacy-by-design approaches. Through initiatives like Open Humans, individuals’ self-oriented motivation for collecting and sharing personal health data, understanding themselves, can be linked with their socially oriented motivations, while ensuring that the use of technology and data remains consistent with the ethical standards of society.
Suggested citation: Debora Christine, "Amplifying the Collective Capabilities of Personal Health Data through Human-Centered Design," UNU Macau (blog), 2020-04-06, https://unu.edu/macau/blog-post/amplifying-collective-capabilities-personal-health-data-through-human-centered.