In Conversation with Dr. Natalie Banner: Creating Trustworthiness and Transparency in Health Data
Health data is an incredibly valuable source of information for researchers, health professionals and those looking to improve public health services. The use and sharing of this data can enable earlier diagnosis of disease, the development of new treatments and improved standards of health and wellbeing across the population. That said, health data, in comparison to many other forms of data, is often extremely sensitive in nature. It is therefore crucial that we have robust ethical policies and regulatory standards in place to protect people’s privacy and to uphold patient confidentiality. Without this kind of ethical infrastructure, the consequences can be serious and profoundly damaging to people’s relationship with public health services.
As we find ourselves well over a year into the Covid-19 pandemic, maintaining strong relationships between health services and the public has become an increasingly urgent concern. Developing principled ethical guidelines that support transparency and trust around the uses of our health data will be critical in tackling this public health emergency.
To navigate and better understand these issues, WEDF conducted an interview with Dr. Natalie Banner, project lead at Understanding Patient Data (UPD). The UPD initiative is hosted at Wellcome in London, UK and aims to ensure that uses of patient data for care and research are more visible, understandable and trustworthy. Natalie's work explores the opportunities around trustworthy governance for the development of new data-driven technologies in healthcare and research, and focuses on bringing benefits back to patients and the health system. She formerly managed Wellcome’s UK Parliamentary advocacy on GDPR and data protection and led the research sector response to enable a supportive environment for research using patient and health-related data. Prior to joining Wellcome, Natalie was a postdoctoral research fellow in philosophy at King’s College London. In 2020 she was named in Intelligent Health AI's Top 50 AI Innovators.
Do you think the Covid-19 pandemic has provided an incentive for the wider use and sharing of health data internationally?
The pandemic has brought health data to enormous public prominence internationally and made the use of it much more tangible for people. The ‘R’ number occupies column inches and statistics on case rates and vaccinations are now forming part of our public discourse. In many cases, the benefits of being able to use, link across and analyse data at an unprecedented scale and pace have been really evident and this has helped create a shared vision across health systems for what the potential for data use might be beyond the pandemic. The rapid genomic sequencing of variants is a good example here, and has required exceptional levels of sharing and collaboration between countries.
However, the increased use of data has also highlighted health disparities in a shocking way and shown where there are real gaps in our data, or problems with data accuracy and quality. The variations between countries have also created challenges – for example, different ways of recording Covid rates and deaths make it hard to compare countries and the effectiveness of their interventions.
Do you think the justification that it is in the public’s best interests to have greater access to people’s health data has been exploited during the Covid-19 pandemic?
In many respects, wider and more extensive use of people’s health data to manage a global-scale public health emergency can be seen as justifying the overriding of other rights such as privacy, in these exceptional circumstances. But these measures and the grounds for them are temporary and specific to managing the pandemic, and require constant democratic oversight if they are to be trusted by citizens. There are of course a series of value judgements and political choices being made over what data counts and what scope of uses are acceptable; we should not fall into the trap of thinking data is value-neutral or objective, nor that the only ethical issue to care about is privacy. The risk now is that these wider uses of data may be treated as ‘business as usual’ as we shift to a longer-term view beyond the pandemic: this would be a huge error from the perspective of ensuring the institutions responsible for managing health data are trustworthy.
If the pandemic has shown us the positives of being able to use data better and more widely, any decisions about how to embed these broader uses have to be centred on meaningful engagement with the public, high standards of transparency and strong democratic accountability – or people will fear that legitimate and beneficial uses of data will creep into data being exploited unfairly for government or commercial ends. And even if these wider uses are privacy-preserving, people also care about public benefits being equitably shared, about health inequalities being addressed rather than exacerbated, and so on.
Finally, if there is going to be investment in more infrastructure to enable better use of data within and between countries, can this at least be accompanied by a commitment to ensure people’s own healthcare needs are served at the same time – better access to their own health data to use for managing their care, for example?
Do you think the language used by organisations using people’s data needs to be more accessible? If so, can you suggest ways of overcoming this problem?
Data is complex, dry and abstract. It’s hard to make people engage with stories about data and how it’s used because if you start with ‘data’ you’ve already lost most of your audience. So yes, language does need to be more meaningful and accessible but not lose accuracy, which is a tall order.
We’ve found through our research that flipping the ‘data story’ on its head works well: start with what matters to people, which in this context of health data is usually their own healthcare journey. Then identify points in that healthcare journey where their care or treatment may be informed by data – say, a new diagnosis, a switch in medication, or a blood test result. Here you have a key point at which you can provide a nugget of information about how data is informing that care or treatment: for example, how studies using data from other people’s records tell us what a ‘normal’ range is for this blood test for people like you. You can’t give a complete picture of the data pipeline, but you can provide a relevant insight that can then form the basis of further information if people are interested. Making data accessible is about making it resonate with what people care about – serving the health needs of their local community, or a condition they or a loved one manage – and articulating how data informs those important aspects of their lives.
Which aspects of health data do you think people are most reluctant to share with others?
Different things are sensitive for different people, and the context of data use is often also very important in determining what people are comfortable with being shared under different circumstances. Mental health is an interesting illustration here: for some people, their mental health history might feel especially sensitive and they’d be reluctant for this information to be shared with anyone outside those directly involved in their care. But others with a mental health condition may be frustrated that mental health needs are poorly understood, under-resourced and often overlooked, in part because we don’t have good enough data available for research and service planning to know at a population level what the needs are. I use this example to demonstrate that it’s not always clear cut what is sensitive and what is not, and what people are reluctant to share versus keen to be visible.
A further aspect of this complex picture is that ‘who’ is accessing and for what purpose matters hugely. There’s a big perceived difference between a university researcher conducting an ethically approved study and a data analytics company training a new diagnostics algorithm as part of a health tech product – both might be valuable purposes and result in benefits back to patients but many people will instinctively feel more uneasy about the latter project. This is in part why we advocate for greater transparency, accountability and public benefit to be at the heart of decisions about data, so that the rules for whoever is using health data are clear, consistent, and developed with public views and values embedded from the outset.