A Black man and an Asian woman looking directly into the camera and wearing hospital gowns, as lines of numbers and code spill down behind them.
Image credit: Dall-E by OpenAI

Editors’ note: This piece is from Nonprofit Quarterly Magazine’s winter 2024 issue, “Health Justice in the Digital Age: Can We Harness AI for Good?”


In today’s rapidly evolving landscape, digital health innovations, driven by advanced technologies such as artificial intelligence and robotics, are reshaping clinical research in promising ways. However, the pressing need for equity in both the tech industry and healthcare makes digital health a complex challenge. This article explores the current state of data, data’s role in digital health and clinical research, old and emerging concerns vis-à-vis health inequities, and a vision for an equitable digital future that empowers patients and ensures inclusivity in clinical research.

The Complex Landscape of Data Ownership, Rights, and Controls

In the digital age, vast amounts of data are generated and processed, raising critical questions about data ownership, rights, and controls. These questions include how we define these concepts, how they are interpreted, and how they work in practice.

Data ownership typically lies with the entity that creates or collects the data, though this can vary based on context and agreements.1 Data rights encompass privacy, access, rectification (the right to have one’s personal data corrected if inaccurate or incomplete),2 and portability (the right to transmit or receive one’s personal data in an accessible format),3 regulated by laws such as the General Data Protection Regulation in Europe,4 the California Consumer Privacy Act in the United States,5 and the Health Insurance Portability and Accountability Act (HIPAA)6 for US health data. Data controls involve technical measures like encryption and organizational policies to protect data integrity and security.7

The definitions of data ownership, rights, and controls demonstrate how they are interconnected and often require a balance between individual privacy and the needs of organizations and society. These needs include consent,8 data protection, transparency,9 data sharing,10 accountability,11 and trust.12 Trust in data differs between organizations and individuals, as organizations tend to focus on legal requirements, while individuals consider several aspects. An independent report by BritainThinks highlighted that for individuals, the why and who of data use are crucial, with the what and how being less significant if the why is justified.13 Trust in data can affect perceptions and reactions to reported data, with historical events notably influencing this trust. What follows is a regrettable event in medical research related to this topic.

Data as a Human Right: The Henrietta Lacks Legacy

The concept of data ownership is gaining traction. Many find inspiration in the story of Henrietta Lacks, an African-American cancer patient whose tumor cells were taken without her consent in 1951—leading to a multibillion-dollar industry, while her family and descendants dealt with poverty.14 Henrietta Lacks’s narrative highlights several ethical concerns, including informed consent, health data privacy, transparency/communication with research participants, and the commercialization of data derived from individuals. The case stands out as one of the most contentious in the history of clinical research and medicine.

The story of Lacks has fueled the argument for data ownership as a human right. One US company has even proposed that the United Nations adopt the first decentralized human right: the right to legal ownership of one’s human data.15 If this were to be adopted, individuals would own their data as property and could profit from selling access to, or even full ownership of, their data for personalized services or research.16 Incentives, such as financial ones, could lead to exploitation, since individuals may not understand their data’s true value; thus, education, support, and transparency would be essential to protect people’s interests and minimize misuse.17 A decentralized human right as proposed above would require a clear framework with protections that include informed consent, encryption, access restrictions, audit trails, and regulatory oversight.

The definitions of data ownership, rights, and controls demonstrate how they are interconnected and often require a balance between individual privacy and the needs of organizations and society.

Companies are leveraging blockchain and AI to compensate individuals for their personal data.18 Platforms like Nebula Genomics and Hu-manity.co are exploring ways to help individuals manage and monetize personal data.19 Blockchain offers secure storage and smart contracts for controlled data sharing and compensation. In health data, non-fungible tokens (NFTs) are being proposed by researchers to serve as digital contracts that allow individuals to oversee access to their health records, potentially democratizing health data control and promoting transparency.20 The concept of monetizing personal and healthcare data using blockchain and NFTs is emerging but is still in early adoption, facing issues of mainstream availability and regulation. Selling data may create privacy inequalities, potentially turning privacy into a luxury and fostering a black market. Without regulation, this commercialization could worsen inequities in clinical research.21
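To make the idea concrete, the consent-gated access and compensation logic that such smart contracts encode can be sketched in a few lines. This is an illustration only, not the API of any platform named above; every name in it is hypothetical, and a real deployment would run on-chain with cryptographic guarantees this sketch does not provide.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: a smart contract behaves like a tamper-evident ledger
# that enforces rules automatically. This plain-Python class mimics two such
# rules for a health record: access is granted only with the patient's
# recorded consent, and every permitted read is logged and credited.

@dataclass
class HealthDataContract:
    patient: str
    fee_per_access: float                        # compensation owed per read
    consented: set = field(default_factory=set)  # parties the patient approved
    access_log: list = field(default_factory=list)
    balance: float = 0.0                         # patient's accrued compensation

    def grant(self, party: str) -> None:
        """Patient adds a researcher or company to the consent list."""
        self.consented.add(party)

    def revoke(self, party: str) -> None:
        """Patient withdraws consent; future reads are refused."""
        self.consented.discard(party)

    def request_access(self, party: str) -> bool:
        """Enforce the consent rule and record the outcome in the log."""
        allowed = party in self.consented
        self.access_log.append((party, allowed))
        if allowed:
            self.balance += self.fee_per_access
        return allowed
```

In use, a patient could grant a research lab access, accrue a small fee per read, and revoke consent at any time, with every attempted access (allowed or refused) preserved in the log for audit.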

Integrating Data Ethics and Diversity into the Narrative

In the United Kingdom, organizations can use personal data without consent under a rule known as “lawful basis.”22 Following notable events that sparked public outrage—such as the discovery that general practitioners’ surgeries in England were sharing patients’ complete medical records,23 the National Health Service’s sharing of patients’ data with Google DeepMind without explicit consent,24 and the (non–health related) Cambridge Analytica scandal25—the conversation in the United Kingdom has transitioned from data ownership to themes of control (the right to control one’s data, even in the absence of proprietary rights) and consent.26 (This contrasts with the US narrative, which focuses on data ownership and proprietary rights.) This raises the question: is it possible to have control without ownership? The answer is yes, especially regarding data. This concept is becoming increasingly significant as policymakers develop data ethics frameworks that prioritize transparency, fairness, and accountability.27 Various approaches, including data stewardship,28 data governance/accountability,29 consent management,30 data trusts,31 and regulatory oversight,32 are being utilized to achieve these goals.

Not all data are generated, valued, or treated in the same way. Medical data are particularly valuable, presenting challenges and opportunities for minoritized communities. The case of Henrietta Lacks indicates the need for trustworthy processes that consider data ethics. Data ethics frameworks have encouraged the partnership between the National Institutes of Health (NIH) and the Lacks family, which is a prime example of data sharing agreements (DSAs).33 This agreement allows controlled access to HeLa cell data (the name given to Henrietta Lacks’s cells) while respecting the family’s wishes.34 It demonstrates how data sharing can be done ethically and transparently. Ensuring data quality is a foundational aspect of effective data sharing, impacting everything from operational efficiency to legal compliance and user trust.35 The quality of ethnicity data can be affected by miscoding, perceived importance, and biased interpretation.36 Systematic bias often arises from incorrect coding during data collection, which can occur when patients provide information to the NHS.37 For example, individuals may choose anonymity regarding their ethnicity, or healthcare professionals may interpret it inaccurately, deepening health disparities.

We aim to prevent issues like those faced by Henrietta Lacks and her family. Our goal is to promote awareness and encourage inclusive practices in global digital health and clinical research.

Race has been used in clinical diagnosis and decision-making for a long time.38 Race-based medicine uses race as a proxy for biological differences, resulting in harmful treatment patterns for minoritized racial and ethnic groups, further contributing to health disparities.39 Scholars and health justice advocates critique this approach and advocate instead for race-conscious medicine, which emphasizes racism as a key determinant of health and encourages providers to focus on relevant data to reduce health inequities.40 The ethical problems vis-à-vis race-based medical data are increasing as digital technology use grows.41 Digital health uses technology for healthcare delivery, patient monitoring, and wellness management, relying on the data for improving operations, outcomes, research, and innovation. Digital health data use should leverage data for better health without ignoring concerns over exploitation of racialized—and indeed other marginalized—communities. Race-conscious medicine, data ethics, community engagement, and trustworthiness42 should be key considerations for digital health.

Digital Health Colonialism

In today’s world, vast amounts of health data are collected, processed, and analyzed. These data practices reveal various opportunities for enhancing healthcare delivery, workflow processes, and overall efficiency—but as discussed earlier, quality of ethnicity data remains challenging. Data colonialism—the exploitative and unequal relationship that exists between data-rich entities or companies and data-poor countries or communities43—allows large technology companies to create data monopolies, hindering competition and innovation while reinforcing existing power imbalances. Such practices of extraction, control, and exploitation of data from marginalized communities or nations by more powerful entities parallel historical colonialism.44 This dynamic exacerbates inequities worldwide: as of 2020, only 43 percent of the least developed countries had implemented data and privacy protection legislation, compared to 96 percent of European nations, leaving them particularly susceptible to exploitation.45 In regions lacking data privacy laws, researchers from high-income countries may perform studies they would avoid in their home countries. Exploitative practices in this context include “helicopter research,” where researchers from affluent or privileged backgrounds conduct studies in lower-income areas without local participation, and “ethics dumping,” where privileged researchers perform unethical experiments in less-privileged environments that have weaker ethical standards and oversight.46

The phenomenon of “digital health colonialism”—whereby the Global North monopolizes the supply of digital health technology—can hinder economies of the Global South, especially in Africa (where there are plenty of examples of helicopter research and ethics dumping),47 from developing their own digital economies, manufacturing capabilities, and other domestic industries.48 Additionally, US and Chinese multinational corporations are establishing an imperial-like control over digital ecosystems, leading to increased surveillance and disproportionate influence over economics, politics, and culture.49 The Global North’s dominance in the digital realm has led to monopolistic data monetization, characterized by platform capitalism and surveillance capitalism, where user data is collected without full consent.50 This exploitation exacerbates inequalities in the Global South, fostering dependency and raising concerns about digital colonialism, which highlights the need for urgent attention and action.51 Data colonialism raises significant ethical issues, including the need for informed consent, fair compensation, and community control over data in clinical research.

Data colonialism…allows large technology companies to create data monopolies, hindering competition and innovation while reinforcing existing power imbalances.

In the context of global digital health, there are four main areas that illustrate digital colonialism. The first is unregulated health data extraction52—or the exploitation of personal health data by external entities without the individuals’ consent or benefit. Even with consent, it may not be fully informed due to individuals’ limited understanding of the protocols of data sharing, usage, and storage. These data are often gathered via health apps, wearable devices, and electronic health records, leading to privacy issues, loss of control over personal health information, and potential discrimination. The second area is the perpetuation of existing power imbalances.53 Communities or countries with fewer resources may lack the means to fully leverage digital health technologies or use their health data for research, policymaking, or healthcare service enhancement. The third area is data monopolization.54 Dominant tech companies in the digital health sector can collect extensive health data, gaining substantial competitive edges. This can foster monopolistic behaviors, stifle competition and innovation, and widen the gap between data-rich and data-poor groups.55 And the fourth area is unethical research design and methodology practices, which can impact the ethics of data vis-à-vis global digital health.56 The way that data are currently gathered and used can obstruct equitable medical research and public health efforts. For example, insufficient and/or biased data capture from underrepresented groups or developing nations can delay the creation of specific health interventions and further entrench and even worsen existing health inequities. Incomplete data can overlook essential information necessary for understanding health conditions that affect specific ethnic groups,57 such as sickle cell anemia, which primarily impacts people of color.58 As previously noted, poor data quality can result in incorrect conclusions.

By acknowledging these areas of data inequality in digital health, we can begin to address them. Sharifah Sekalala and Tatenda Chatikobo, researchers in the areas of global health law and inequality and digital colonialism, respectively, propose a decolonial digital health agenda that challenges the simplistic view of digital innovation as a solution for health justice.59 They advocate for reimagining digital health by focusing on Indigenous and intersectional theories, prioritizing local contexts, and emphasizing regulatory infrastructures as sites for struggle and resistance. This approach evaluates who are in fact the beneficiaries of digital health systems, prioritizes community voices and those with lived experience, and establishes strong regulations to address the social harms of unethical/inequitable digitization.60 Recent debates on decolonizing global health focus on addressing power imbalances and knowledge hierarchies that reinforce colonial ideologies.61 Many individuals feel they cannot opt out, but enabling communities to take charge could enhance their benefits from digital health.62

Henrietta Lacks’s HeLa cells became the pioneering “immortal” cultured cell line, which is now extensively used in laboratories across the world. These cells have played a crucial role in research related to cancer, COVID-19, HIV, Parkinson’s disease, and much more.63 Many groundbreaking discoveries may not have been possible without them; however, we must also recognize the ethical concerns surrounding the way these cells were acquired. The history of clinical research is rife with contentious narratives and clashing moral principles, such as that of the HeLa cells.64 Shifts in societal values have a significant impact on the laws and regulations governing clinical research. Currently, the field is shifting toward incorporating digital health and data to propel advancements. In this context, we present a case study on decentralized clinical trials (DCTs) and explore how new technologies could help address persistent inequities in clinical research.

Decentralized Clinical Trials: A Promising Innovation

Clinical trials investigate biomedical or behavioral interventions with human participants, and are highly regulated.65 Ethical codes and guidelines66 were created for clinical trials as a result of unethical practices and human experimentation such as those carried out in Nazi Germany during World War II,67 the prescribing of thalidomide—a morning sickness drug that caused birth defects—in the 1960s,68 and the US Public Health Service (precursor of the Centers for Disease Control and Prevention) syphilis study at Tuskegee.69

The way that data are currently gathered and used can obstruct equitable medical research and public health efforts.

Traditionally, clinical trials have relied on in-person visits, which can be burdensome due to travel and costs. To mitigate these challenges, companies are creating digital health tools and remote data-collection technologies. Decentralized clinical trials allow activities to occur outside traditional sites, such as homes or local facilities, by sending healthcare professionals to patients and/or using technology.70 Using technologies like telemedicine and wearables, DCTs can enhance access to treatments, reduce trial duration and costs, improve participant diversity, and lower environmental impact.71 Regulatory focus is increasing, with agencies like the FDA providing guidelines for integrating DCTs, allowing for more remote assessments.72

More important, flexible and hybrid work arrangements have surged in popularity post-COVID-19, revealing a divide in how marginalized and nonmarginalized groups experience office and other settings. There is a clear racial divide: many people of color favor remote work as a way to escape microaggressions and feelings of exclusion in office settings.73 The Runnymede Trust, the UK’s leading race equality think tank, reports that 75 percent of women of color have experienced racism at work, with 27 percent facing racial slurs and 61 percent feeling pressured to change themselves to fit in.74 People of color also face prejudice in physical healthcare settings,75 which is one reason they may choose to avoid hospitals and clinics. Their experiences range from discriminatory remarks to systemic racism in healthcare, which is well documented in areas such as maternity care, particularly in the disparities in maternal death rates between white individuals and people of color.76 DCTs could open up clinical trials to groups that have previously been excluded from clinical research by improving access to research opportunities for those in underserved areas, as well as offer an alternative to hospital environments where racial discrimination occurs.77

However, DCTs, as with all digital technology, have limitations: data integrity (data collected from multiple sources can complicate data management and integrity, creating data quality issues);78 consent (obtaining informed consent remotely can be challenging, which could prompt another Henrietta Lacks situation);79 digital literacy (not all participants have equal access to the necessary technology or the digital literacy to participate in DCTs, which could reinforce digital colonialism);80 and the physician–patient interaction (ensuring the safety of participants can be more complex in DCTs due to the lack of direct oversight by clinical staff; remote monitoring, for example, may not always detect adverse events promptly, and we do not yet know how racism in healthcare could materialize in a digital setting).81 Early attention to these issues is crucial to prevent further exclusion of underserved patient groups, and addressing them will be essential to realizing the full potential of DCTs in promoting clinical trial equity.

DCTs and Diversity in Clinical Trials

Some patient subgroups, such as women versus men, may react differently to medical treatments, highlighting the need for diversity in clinical trial participants.82 This diversity ensures that trial populations accurately represent the patients who will use the medicine, making the results more generalizable.83 Historically, clinical trial cohorts have lacked diversity, even though the scientific community is aware that women, children, and racial and ethnic minorities are underrepresented.84 Many reasons have led to the skewed representation of patient populations in research,85 too many to discuss in this article; however, there is growing awareness about race and ethnicity representation in clinical trials.86 Beyond the unethical practices and human experimentation in clinical trials that exploited or excluded underrepresented groups such as women and minorities, other significant factors, such as lack of representation among clinical trial staff,87 fear and mistrust (for good reason),88 and socioeconomic realities, including access to healthcare and financial and time constraints,89 have contributed to the negative perception of clinical trials among these groups.90 Recently, the healthcare industry has focused on increasing diversity in clinical trials, partly due to recommendations by regulatory agencies.91

People of color also face prejudice in physical healthcare settings….Their experiences range from discriminatory remarks to systemic racism in healthcare, which is well-documented in areas such as maternity care.

DCTs present ethical challenges92 but also offer benefits for diverse enrollment. Researchers from the University of Birmingham recommend eleven strategies focused on trial design, support, and reporting to enhance equity and inclusion in DCTs.93 Solely digitizing and decentralizing trials won’t resolve underrepresentation—addressing barriers like historical mistrust and misconceptions requires broader initiatives.94 Despite the rapid expansion of digital tools and services that allow participants to take part remotely from their home or community setting, there is little analysis of their impact on equity in clinical research.95 Identified barriers and other challenges hinder the achievement of sufficient diversity in clinical trials, making it crucial to critically evaluate the current role of DCTs in promoting diversity and inclusion.96 We suggest that further research be done in this area.97

Moving Toward Inclusionary Digital Health

To harness the full potential of data and digital health in clinical research, we must prioritize data ethics, social justice, community engagement, equitable digital health technologies, and inclusive trial design. This involves developing the following ethical practices:

  1. Informed Consent. Individuals and communities should have control over their health data and be fully informed about how data will be collected, used, and shared. Consent processes should be transparent, understandable, and respectful of different groups’ cultural norms and values. The consent format should be accessible to all educational backgrounds, and incorporate imagery, text, and audio so that all learning styles are accommodated.
  2. Data Sovereignty. Efforts should be made to empower communities and countries to exercise sovereignty over their own health data. This may involve creating data governance frameworks that prioritize local ownership, control, and benefit sharing.
  3. Ethical Partnerships. Collaboration between technology companies, researchers, policymakers, and communities should be based on principles of equity, mutual benefit, and respect for local contexts. Partnerships should ensure that the data benefit the communities from which they are collected.
  4. Capacity Building. Resources should be allocated to build the capacity of marginalized communities and countries to leverage digital health technologies and effectively use their health data for their own benefit.
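One concrete piece of the consent and governance practices above is auditability: every recorded use of health data should trace back to a documented, still-valid consent. The sketch below shows the shape such an audit check might take; all identifiers are hypothetical and the data are invented for illustration.

```python
from datetime import date

# Hypothetical consent register: participant -> (purposes consented to, expiry).
consents = {
    "p01": ({"diabetes-study"}, date(2026, 1, 1)),
    "p02": ({"diabetes-study", "follow-up"}, date(2024, 6, 1)),
}

# Hypothetical log of actual data uses: (participant, purpose, date of use).
data_uses = [
    ("p01", "diabetes-study", date(2025, 3, 1)),
    ("p02", "diabetes-study", date(2025, 3, 1)),  # consent expired
    ("p01", "marketing", date(2025, 3, 1)),       # never consented
]

def audit(consents, data_uses):
    """Return the uses that lack a valid, unexpired consent."""
    violations = []
    for pid, purpose, when in data_uses:
        purposes, expiry = consents.get(pid, (set(), date.min))
        if purpose not in purposes or when > expiry:
            violations.append((pid, purpose))
    return violations
```

Run against the invented records above, the audit flags the expired consent and the unconsented purpose while passing the valid use, which is the kind of transparency a data governance framework is meant to guarantee.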

As well as the unethical practices and human experimentation that have occurred in clinical trials that exploited or excluded underrepresented groups…there are other significant factors, such as lack of representation of clinical trial staff.

It is important to acknowledge that the above represents merely a starting point. A commitment to caring for individuals through clinical research necessitates a critical reevaluation of how we conceptualize and implement clinical research, as well as how we design digital tools and services. With our experience working in clinical research, we recognize the role that data and digital health will play in the future, especially in clinical trials. We see the advantages of decentralized methods for patients who live far from hospital sites, as well as for people of color and other marginalized groups, who can participate without facing microaggressions or direct racism and other abuses that may occur in traditional hospital settings. We chose DCTs as a case study due to challenges in patient recruitment and retention in clinical trials, which has prompted the integration of decentralized elements in the clinical research industry.98 Addressing the intersection of technology and healthcare, particularly regarding data and digital health, requires careful planning, robust digital infrastructure, and ongoing oversight to ensure safe and ethical clinical research and other practices. While recognizing the benefits of HeLa cells, we aim to prevent issues like those faced by Henrietta Lacks and her family. Our goal is to promote awareness and encourage inclusive practices in global digital health and clinical research. There is an urgent need to prioritize data quality, digital equity, and inclusivity in clinical research. While DCTs are not the only solution, if done correctly, they represent one valuable approach. A variety of methods will be necessary to establish comprehensive guidelines in this area.

***

Racial and ethnic minority groups have faced exploitation in clinical research, as the case of Henrietta Lacks exemplifies, and that exploitation has led to a lack of trust. Technology has the potential to both empower and educate. Digital platforms can serve as valuable resources to inform the public about Henrietta Lacks’s significant contributions to clinical research. Websites and applications can share her story, along with the ethical issues it presents, ensuring that her legacy is respected and remembered. As we advance into the digital health era, it will be crucial to collaborate with both the education and healthcare sectors to address unconscious biases and comprehend the specific challenges that people of color and other marginalized groups encounter in clinical research. Addressing issues like digital sovereignty and advocating fair data practices is vital. By promoting ethical and inclusive digital health advancements, we can empower individuals, improve outcomes, and reduce health disparities, fostering a more equitable digital future that ensures clinical research and indeed all healthcare practices serve the interests of all rather than just a select few.

In the spirit of self-reflexivity, we acknowledge our standpoint. We have both worked in clinical trials and identify as women of color with African heritage. We made efforts to understand existing biases and assumptions during our analysis and writing process; however, it is likely that our ethnoracial backgrounds influence our interpretations.

 

Notes:

  1. Dame Ottoline Leyser and Genevra Richardson, Data ownership, rights and controls: reaching a common understanding; Discussions at a British Academy, Royal Society and techUK seminar on 3 October 2018 (London: The British Academy, 2018).
  2. General Data Protection Regulation (GDPR) includes a right for individuals to request that inaccurate or incomplete personal data be rectified. An individual can make a request for rectification verbally or in writing. However, there are some circumstances where the right is restricted, including when it conflicts with important objectives of public interest and when it conflicts with the right of freedom of expression and information. The right to rectification is covered in articles 16 and 19 of the General Data Protection Regulation (2018), gdpr-info.eu/, laid out in Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016, eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32016R0679.
  3. The right to transmit or receive personal data an individual has provided in a structured, commonly used, and machine-readable format is covered in Article 20 of the General Data Protection Regulation (2018).
  4. General Data Protection Regulation (2018); Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016; and Official Journal of the European Union, L 119, 59 (May 4, 2016): 1–88, eur-lex.europa.eu/legal-content/EN/TXT/?uri=oj:JOL_2016_119_R_TOC.
  5. The California Consumer Privacy Act of 2018, Civ. Code § 1798.100 (2018), accessed December 30, 2024, leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201720180AB375.
  6. U.S. Department of Health and Human Services, Summary of the HIPAA Privacy Rule, original ruling 1996, www.hhs.gov/hipaa/for-professionals/privacy/laws-regulations/index.html.
  7. Leyser and Richardson, “Data ownership, rights and controls.”
  8. General Data Protection Regulation (2018): See Art. 4 GDPR Definitions; Art. 6 GDPR Lawfulness of processing; Art. 7 GDPR Conditions for consent; Art. 8 GDPR Conditions applicable to child’s consent in relation to information society services; Art. 9 GDPR Processing of special categories of personal data; Art. 22 GDPR Automated individual decision-making, including profiling; Art. 49 GDPR Derogations for specific situations.
  9. General Data Protection Regulation (2018): See Arts. 13 and 14, The right to be informed (transparency). See Recitals 38 Special Protection of Children’s Personal Data; 58 The Principle of Transparency; 59 Procedures for the Exercise of the Rights of the Data Subjects; 60 Information Obligation; and 73 Restrictions of Rights and Principles, gdpr-info.eu/recitals/.
  10. General Data Protection Regulation (2018) sets out certain restrictions or conditions on disclosing personal data to third parties or between different parts of an organization. Data sharing can be routine, one-off, or in an emergency.
  11. Accountability is a principle of General Data Protection Regulation that requires organizations to implement measures to demonstrate their compliance with data processing principles. These measures can be technical or organizational.
  12. Centre for Data Ethics and Innovation, Independent report BritainThinks: Trust in data, December 15, 2021.
  13. Ibid., 27.
  14. Rebecca Skloot, The Immortal Life of Henrietta Lacks (New York: Random House, Inc., 2010).
  15. “Everyone has the right to legal ownership of their inherent human data as property.” “Human Right #31: Powered by Blockchain—Infographic,” Hu-manity.co, accessed September 5, 2024, hu-manity.co/human-right-31-powered-by-blockchain-infographic/; and United Nations, Universal Declaration of Human Rights, 1948, www.un.org/en/about-us/universal-declaration-of-human-rights.
  16. Kateryna Nekit, “The (im)possibility of personal and industrial (machine-generated) data to be subject to property rights,” International Journal of Law and Information Technology 32, no. 1 (June 2024): eaae008.
  17. Jacob Leon Kröger, Milagros Miceli, and Florian Müller, “How Data Can Be Used Against People: A Classification of Personal Data Misuses,” Social Science Research Network (December 30, 2021).
  18. Blockchain and AI have been suggested for compensating people for their data. The following references discuss how these technologies could work for data. See Stanton Heister and Kristi Yuthas, “How Blockchain and AI Enable Personal Data Privacy and Support Cybersecurity,” chapter 3 in Advances in the Convergence of Blockchain and Artificial Intelligence, ed. Tiago M. Fernández-Caramés and Paula Fraga-Lamas (London: IntechOpen, 2022); Renee Garett, Mohamed Emish, and Sean D. Young, “Cryptocurrency as a new method for participant compensation in research,” Health Policy and Technology 12, no. 2 (June 2023): 100746.
  19. Dennis Grishin, Kamal Obbad, and George Church, “Data privacy in the age of personal genomics,” Nature Biotechnology 37, no. 10 (September 2019): 1115–17.
  20. Kristin Kostick-Quenet et al., “How NFTs could transform health information exchange,” Science 375, no. 6580 (February 2022): 500–502.
  21. Individuals getting paid for use of their data would be positive if organized around patient-centric data ownership, allowing patients to control how their data are used and shared. This could empower patients and create new commercial opportunities for service providers. However, this would need to be carefully considered, as there is room for further exploitation, since people will not be starting on equal footing. As well as privacy issues, there are accessibility issues: individuals from lower-income backgrounds or rural areas often lack access to the necessary technology and internet connectivity. This can result in incomplete or inaccessible health records after their death, complicating the management and use of their data, which can affect the accuracy and usefulness of health data for commercial purposes. Also, if certain populations are underrepresented due to the digital divide, it can lead to biased research outcomes and inequities in health interventions. See Angela G. Winegar and Cass R. Sunstein, “How Much Is Data Privacy Worth? A Preliminary Investigation,” Journal of Consumer Policy 42, no. 3 (July 2019): 425–40.
  22. The lawful bases for processing personal data in the UK are set out in Article 6 of General Data Protection Regulation (2018). There are six lawful bases. See “A guide to lawful basis,” Information Commissioner’s Office, accessed December 31, 2024, ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/lawful-basis/a-guide-to-lawful-basis/.
  23. Talha Burki, “Concerns over England’s new system for collecting general practitioner data,” The Lancet Digital Health 3, no. 8 (August 2021): e469–70.
  24. The UK’s National Health Service (NHS) shared patient data with Google DeepMind to develop a health app. The data sharing was done without explicit patient consent under the premise of “public interest.” This led to public concern over privacy and data security. See Alex Hern, “Royal Free breached UK data law in 1.6m patient deal with Google’s DeepMind,” The Guardian, July 3, 2017, www.theguardian.com/technology/2017/jul/03/google-deepmind-16m-patient-royal-free-deal-data-protection-act; and Matthew Sparkes, “Google is shutting down controversial data-sharing project with NHS,” New Scientist, September 2, 2021, www.newscientist.com/article/2289101-google-is-shutting-down-controversial-data-sharing-project-with-nhs/.
  25. The Cambridge Analytica scandal was a data breach involving personal information that increased public awareness of data privacy concerns. It was revealed that Cambridge Analytica had harvested the personal data of millions of Facebook users without their consent. The data were used for political advertising purposes, justified under a “legitimate interests” basis. This caused significant public and regulatory backlash. See Carole Cadwalladr and Emma Graham-Harrison, “Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach,” The Guardian, March 17, 2018, www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election; and Mark Scott, “Cambridge Analytica helped ‘cheat’ Brexit vote and US election, claims whistleblower,” Politico, March 27, 2018, www.politico.eu/article/cambridge-analytica-chris-wylie-brexit-trump-britain-data-protection-privacy-facebook/.
  26. Valentina Pavel, “Rethinking data and rebalancing digital power,” Ada Lovelace Institute, November 17, 2022; UK Department for Digital, Culture, Media & Sport, National Data Strategy, December 9, 2020; and Leyser and Richardson, “Data ownership, rights and controls.”
  27. UK Central Digital & Data Office, “Data Ethics Framework,” September 16, 2020, gov.uk/government/publications/data-ethics-framework/data-ethics-framework-2020; General Services Administration, President’s Management Agenda, “Federal Data Strategy: Data Ethics Framework,” 2020.
  28. Data stewardship involves managing and overseeing data assets to ensure data quality, privacy, and security; for example, organizations appoint data stewards to ensure that data are accurate, accessible, and used appropriately.
  29. Data accountability is a framework for managing data availability, usability, integrity, and security; for example, implementing policies and procedures to ensure data consistency and compliance with regulations.
  30. Consent management comprises systems and processes that allow individuals to control how their personal data is used; for example, websites using consent management platforms to obtain and manage user consent for data processing.
  31. General Services Administration, President’s Management Agenda, “Federal Data Strategy: Data Ethics Framework.”
  32. These are government and regulatory bodies enforcing data protection laws and regulations; for example, the Information Commissioner’s Office (ICO) in the UK overseeing compliance with the General Data Protection Regulation.
  33. Data sharing agreements (DSAs) are essential for managing the exchange of data between organizations while ensuring compliance with legal and ethical standards.
  34. Lawrence A. Tabak, “10 years in, NIH-Lacks Family partnership holds strong,” National Institutes of Health, August 24, 2023, www.nih.gov/about-nih/who-we-are/nih-director/statements/10-years-nih-lacks-family-partnership-holds-strong.
  35. Benedikt Fecher, Sascha Friesike, and Marcel Hebing, “What Drives Academic Data Sharing?,” PLoS One 10, no. 2 (February 2015): e0118053; Steven Van Tuyl and Amanda L. Whitmire, “Water, Water, Everywhere: Defining and Assessing Data Sharing in Academia,” PLoS One 11, no. 2 (February 2016): e0147942; U.S. General Services Administration, GSA Information and Data Quality Handbook, May 2021; and Office of Management and Budget, General Services Administration, and Office of Government Information Services, “Data management & governance,” accessed January 8, 2025, resources.data.gov/categories/data-management-governance/.
  36. Tracey Bignall and Jess Phillips, “Improving the recording of ethnicity in health datasets,” Race Equality Foundation, November 2022; Serin Edwin Erayil et al., “The Value and Interpretation of Race and Ethnicity Data in the Era of Global Migration: A Change Is in Order,” American Journal of Tropical Medicine and Hygiene 105, no. 6 (December 2021): 1453–55; Lilla Farkas, Analysis and comparative review of equality data collection practices in the European Union: Data collection in the field of ethnicity (European Union, 2017); Gulnaz Iqbal et al., “UK ethnicity data collection for healthcare statistics: the South Asian perspective,” BMC Public Health 12 (March 2012): 243; and UK Government Cabinet Office, Race Disparity Audit: Summary Findings from the Ethnicity Facts and Figures website, October 2017, rev. March 2018, accessed December 30, 2024.
  37. Sarah Scobie, Jonathan Spencer, and Veena Raleigh, Ethnicity coding in English health service datasets: Research report (London: Nuffield Trust, 2021).
  38. Jessica Cerdeña et al., “Race-based medicine in the point-of-care clinical resource UpToDate: A systematic content analysis,” eClinicalMedicine 52 (October 2022): 101581; Richard S. Cooper, Jay S. Kaufman, and Ryk Ward, “Race and Genomics,” New England Journal of Medicine 348, no. 12 (March 2003): 1166–70; Jesutofunmi A. Omiye et al., “Large language models propagate race-based medicine,” npj Digital Medicine 6 (2023); Darshali A. Vyas, Leo G. Eisenstein, and David S. Jones, “Hidden in Plain Sight—Reconsidering the Use of Race Correction in Clinical Algorithms,” New England Journal of Medicine 383, no. 9 (June 2020): 874–82; and Bignall and Phillips, “Improving the recording of ethnicity in health datasets.”
  39. Jessica Cerdeña, Marie V. Plaisime, and Jennifer Tsai, “From race-based to race-conscious medicine: how anti-racist uprisings call us to act,” The Lancet 396, no. 10257 (October 2020): 1125–28.
  40. Ibid.
  41. Tina Hernandez-Boussard et al., “Promoting Equity In Clinical Decision Making: Dismantling Race-Based Medicine,” Health Affairs 42, no. 10 (October 2023): 1369–73.
  42. Dawei Lin et al., “The TRUST Principles for digital repositories,” Scientific Data 7, no. 144 (May 2020).
  43. Michael Kwet, “Digital colonialism: US empire and the new imperialism in the Global South,” Race & Class 60, no. 4 (April 2019): 3–26.
  44. Ibid.
  45. “Data and privacy unprotected in one third of countries, despite progress,” United Nations Conference on Trade and Development, April 29, 2020, unctad.org/news/data-and-privacy-unprotected-one-third-countries-despite-progress.
  46. “Nature addresses helicopter research and ethics dumping,” editorial, Nature 606 (May 2022): 7.
  47. Ibid.
  48. Kwet, “Digital colonialism”; Billy Perrigo, “Exclusive: OpenAI Used Kenyan Workers on Less Than $2 Per Hour to Make ChatGPT Less Toxic,” TIME, January 18, 2023, time.com/6247678/openai-chatgpt-kenya-workers/; and Sharifah Sekalala and Tatenda Chatikobo, “Colonialism in the new digital health agenda,” BMJ Global Health 9, no. 2 (February 2024): e014131.
  49. Kwet, “Digital colonialism.”
  50. Bitange Ndemo, “Addressing digital colonialism: A path to equitable data governance,” UNESCO Inclusive Policy Lab, August 8, 2024, unesco.org/inclusivepolicylab/analytics/addressing-digital-colonialism-path-equitable-data-governance.
  51. Ibid.
  52. Sekalala and Chatikobo, “Colonialism in the new digital health agenda.”
  53. Ibid.
  54. Ibid.
  55. Ibid.
  56. Ibid.
  57. Bignall and Phillips, “Improving the recording of ethnicity in health datasets.”
  58. Andrew Campbell et al., “An Analysis of Racial and Ethnic Backgrounds Within the CASiRe International Cohort of Sickle Cell Disease Patients: Implications for Disease Phenotype and Clinical Research,” Journal of Racial and Ethnic Health Disparities 8, no. 1 (February 2021): 99–106.
  59. Sekalala and Chatikobo, “Colonialism in the new digital health agenda.”
  60. Ibid.
  61. Ramya Kumar, Rajat Khosla, and David McCoy, “Decolonising global health research: Shifting power for transformative change,” PLOS Global Public Health 4, no. 4 (April 2024): e0003141; and “Nature addresses helicopter research and ethics dumping.”
  62. Kumar, Khosla, and McCoy, “Decolonising global health research”; and “Nature addresses helicopter research and ethics dumping.”
  63. “More than a cell: the legacy of Henrietta Lacks,” Research, University of Bristol, accessed December 31, 2024, bristol.ac.uk/research/impact/stories/hela-cells/.
  64. Ayah Nuriddin, Graham Mooney, and Alexandre R. White, “Reckoning with histories of medical racism and violence in the USA,” The Lancet 396, no. 10256 (October 2020): 949–51.
  65. “The Evolving Role of Decentralized Clinical Trials and Digital Health Technologies,” CDER Conversations, US Food & Drug Administration, May 2, 2023, fda.gov/drugs/news-events-human-drugs/evolving-role-decentralized-clinical-trials-and-digital-health-technologies.
  66. ICH Harmonised Guideline: Guideline for Good Clinical Practice E6 (R3) (Geneva, Switzerland: International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use [ICH], 2025).
  67. Evelyne Shuster, “Fifty Years Later: The Significance of the Nuremberg Code,” New England Journal of Medicine 337, no. 20 (November 1997): 1436–40.
  68. James Kim and Anthony Scialli, “Thalidomide: The Tragedy of Birth Defects and the Effective Treatment of Disease,” Toxicological Sciences 122, no. 1 (July 2011): 1–6, Erratum in: Toxicological Sciences 125, no. 2 (February 2012): 613; and Waqas Rehman, Lisa M. Arfons, and Hillard M. Lazarus, “The rise, fall and subsequent triumph of thalidomide: lessons learned in drug development,” Therapeutic Advances in Hematology 2, no. 5 (October 2011): 291–308.
  69. Raymond Vonderlehr et al., “Untreated Syphilis in the Male Negro: A Comparative Study of Treated and Untreated Cases,” JAMA 107 (September 1936): 856–60; Allan M. Brandt, “Racism and Research: The Case of the Tuskegee Syphilis Study,” The Hastings Center Report 8, no. 6 (1978): 21–29; and Vanessa Northington Gamble, “Under the shadow of Tuskegee: African Americans and health care,” American Journal of Public Health 87 (November 1997): 1773–78.
  70. “The Evolving Role of Decentralized Clinical Trials and Digital Health Technologies.”
  71. Carsten Sommer et al., “Building clinical trials around patients: Evaluation and comparison of decentralized and conventional site models in patients with low back pain,” Contemporary Clinical Trials Communications 11 (June 2018): 120–26.
  72. “The Evolving Role of Decentralized Clinical Trials and Digital Health Technologies.”
  73. Venessa Wong, “These People Of Color Are Anxious About Racist Microaggressions When They Return To The Office,” BuzzFeed News, June 29, 2021; and Heejung Chung, Shiyu Yuan, and Alice Arkwright, Making hybrid inclusive: Black workers experiences of hybrid working (London: Trades Union Congress, 2024).
  74. Michelle Gyimah et al., Broken Ladders: The myth of meritocracy for women of colour in the workplace (London: The Fawcett Society and The Runnymede Trust, 2022).
  75. Marie Plaisime, Marie-Claude Jipguep-Akhtar, and Harolyn M. E. Belcher, “‘White People are the default’: A qualitative analysis of medical trainees’ perceptions of cultural competency, medical culture, and racial bias,” SSM—Qualitative Research in Health 4 (December 2023): 100312.
  76. Marian Knight et al., eds., on behalf of MBRRACE-UK, Saving Lives, Improving Mothers’ Care: Lessons learned to inform maternity care from the UK and Ireland Confidential Enquiries into Maternal Deaths and Morbidity 2016–18 (Oxford: National Perinatal Epidemiology Unit, University of Oxford, 2020); and House of Commons Women and Equalities Committee, “Black Maternal Health: Third Report of Session 2022–23,” accessed December 30, 2024.
  77. Isaac Ashe and Rob Sissions, “Nottingham: New mums report racism in hospitals, says maternity lead,” BBC, February 29, 2024, bbc.co.uk/news/uk-england-nottinghamshire-68431157; and Ian James Kidd, “Black women are at greater risk of maternal death in the UK—here’s what needs to be done,” The Conversation, June 1, 2023, theconversation.com/black-women-are-at-greater-risk-of-maternal-death-in-the-uk-heres-what-needs-to-be-done-204709.
  78. Barbara Bierer and Sarah A. White, “Ethical Considerations in Decentralized Clinical Trials,” Journal of Bioethical Inquiry 20 (March 2024): 711–18; and Monica R. Chmielewski, Kyle Y. Faget, and Michael J. Tuteur, “Decentralized Clinical Trials: Research Misconduct Risks & How to Avoid Them,” Health Care Law Today (blog), Foley & Lardner, September 30, 2024, www.foley.com/insights/publications/2024/09/decentralized-clinical-trials-research-misconduct-risks/.
  79. Bierer and White, “Ethical Considerations in Decentralized Clinical Trials”; and Lily Vesel, “Exploring the Ethical Challenges of Decentralized Clinical Trials: 14th Annual CCTSI Research Ethics Conference,” CU Anschutz newsroom, Colorado Clinical and Translational Sciences Institute, Anschutz Medical Campus, University of Colorado Denver, December 4, 2024, cuanschutz.edu/cctsi/exploring-the-ethical-challenges-of-decentralized-clinical-trials.
  80. Bierer and White, “Ethical Considerations in Decentralized Clinical Trials.”
  81. Ibid.
  82. Charles McCarthy, “Historical background of clinical trials involving women and minorities,” Academic Medicine 69, no. 9 (September 1994): 695–98.
  83. Ibid.
  84. US Food & Drug Administration, “FDA Guidance Provides New Details on Diversity Action Plans Required for Certain Clinical Studies,” news release, June 26, 2024, fda.gov/news-events/press-announcements/fda-guidance-provides-new-details-diversity-action-plans-required-certain-clinical-studies.
  85. Luther Clark et al., “Increasing Diversity in Clinical Trials: Overcoming Critical Barriers,” Current Problems in Cardiology 44, no. 5 (May 2019): 148–72; and Brandon E. Turner et al., “Race/ethnicity reporting and representation in US clinical trials: A cohort study,” The Lancet Regional Health—Americas 11 (July 2022): 100252.
  86. Stacey Versavel et al., “Diversity, equity, and inclusion in clinical trials: A practical guide from the perspective of a trial sponsor,” Contemporary Clinical Trials 126 (March 2023): 107092; “Quantifying DEI in clinical trials: Understanding real-world patients and their needs,” HealthMatch, accessed September 5, 2024; and National Institutes of Health, “Inclusion of Women and Minorities as Participants in Research Involving Human Subjects,” see link to “Policy for the Inclusion of Women and Minorities in NIH-funded research,” grants.nih.gov/policy-and-compliance/policy-topics/inclusion/women-and-minorities, accessed December 31, 2024.
  87. Kirsten Bibbins-Domingo and Alex Helman, eds., Improving Representation in Clinical Trials and Research: Building Research Equity for Women and Underrepresented Groups (Washington, DC: The National Academies Press, 2022).
  88. Shuster, “Fifty Years Later: The Significance of the Nuremberg Code.”
  89. National Academies of Sciences, Engineering, and Medicine, “Lack of Equitable Representation in Clinical Trials Compounds Disparities in Health and Will Cost U.S. Hundreds of Billions of Dollars: Urgent Actions Needed by NIH, FDA, Others to Boost Representation,” press release, May 17, 2022, www.nationalacademies.org/news/2022/05/lack-of-equitable-representation-in-clinical-trials-compounds-disparities-in-health-and-will-cost-u-s-hundreds-of-billions-of-dollars-urgent-actions-needed-by-nih-fda-others-to-boost-representation.
  90. Rebecca West, “Bridging the Ethnicity Gap in Clinical Trial Participation: Education and tailored communications needed,” Ipsos, March 11, 2024, ipsos.com/en-uk/bridging-ethnicity-gap-clinical-trial-participation-education-and-tailored-communications-needed; and HealthMatch, “Quantifying DEI in clinical trials: Understanding real-world patients and their needs.”
  91. See US Food & Drug Administration, “FDA Guidance Provides New Details on Diversity Action Plans Required for Certain Clinical Studies”; Gyimah et al., Broken Ladders; and Olalekan L. Aiyegbusi et al., “Recommendations to promote equity, diversity and inclusion in decentralized clinical trials,” Nature Medicine 30 (2024): 3075–84.
  92. Effy Vayena, Alessandro Blasimme, and Jeremy Sugarman, “Decentralised clinical trials: ethical opportunities and challenges,” Lancet Digital Health 5, no. 6 (June 2023): e390–e394.
  93. Aiyegbusi et al., “Recommendations to promote equity, diversity and inclusion in decentralized clinical trials.”
  94. Ibid.
  95. Versavel et al., “Diversity, equity, and inclusion in clinical trials.”
  96. “Quantifying DEI in clinical trials: Understanding real-world patients and their needs.”
  97. The biggest barrier to clinical trials is patient recruitment and retention. We would like to sensitize people to do more research in this area, as well as influence policymakers to include decentralized clinical trials (DCTs) more broadly in their recommendations. See Gaurav Kumar et al., “Barriers for cancer clinical trial enrollment: A qualitative study of the perspectives of healthcare providers,” Contemporary Clinical Trials Communications 28 (May 2022): 100939; Yemi Akala, “Exploring the promise of decentralised cancer clinical trials,” Cancer News, Cancer Research UK, July 31, 2024, news.cancerresearchuk.org/2024/07/31/exploring-the-promise-of-decentralised-cancer-clinical-trials/; Olalekan L. Aiyegbusi et al., “Digitally enabled decentralised research: opportunities to improve the efficiency of clinical trials and observational studies,” BMJ Evidence-Based Medicine 28 (2023): 328–31; Daniel F. Hanley et al., “Decentralized clinical trials in the trial innovation network: Value, strategies, and lessons learned,” Journal of Clinical and Translational Science 7, no. 1 (July 2023): e170; Gyimah et al., Broken Ladders; and Olivia Miller, “BME workers’ experiences of home working linked to other forms of discrimination,” News Centre, University of Kent, March 12, 2024, www.kent.ac.uk/news/social-justice-inequalities-and-conflict/34669/bme-workers-experiences-of-home-working-linked-to-other-forms-of-discrimination.
  98. Kumar et al., “Barriers for cancer clinical trial enrollment”; Hanley et al., “Decentralized clinical trials in the trial innovation network”; Aiyegbusi et al., “Recommendations to promote equity, diversity and inclusion in decentralized clinical trials”; Akala, “Exploring the promise of decentralised cancer clinical trials”; and “Advancing decentralised clinical trials: Patient Recruitment Centres cement their success,” Stories, National Institute for Health and Care Research, April 10, 2024, www.nihr.ac.uk/story/advancing-decentralised-clinical-trials-patient-recruitment-centres-cement-their-success.