Page URL: https://www.progress.org.uk/ncobdata

Response to a Nuffield Council on Bioethics Consultation on The Collection, Linking, Use and Exploitation of Biological and Health Data: Ethical Issues

20 January 2014

This policy document is a response by the Progress Educational Trust (PET) to a Consultation (.doc 155KB) conducted by the Nuffield Council on Bioethics, as part of the Council's project The Collection, Linking, Use and Exploitation of Biological and Health Data: Ethical Issues.

PET is grateful to Professor John Galloway and Professor Marcus Pembrey for their contribution to this document.


This is a consultation about data and ethics. It prompts the fundamental question of whether the opportunities and threats represented by 'big data' in relation to biomedicine and health should be restricted by the stringencies of current ethical thinking, or whether current ethical thinking - and possibly current legal safeguards - should change to accommodate new opportunities, in order to realise the benefits that big data projects might bring.

PET's patron Baroness Mary Warnock has argued that 'occasionally science makes procedures possible that are so radical that those at the interface between science and politics are called upon to define moral standards for society' ('The ethical regulation of science', Nature, 28 November 2007). If this is true, then we might expect changes in the law to reflect changes in ethical standards prompted by a big data culture.

Historical precedents suggest that legal change following the introduction of radical new medical techniques can lead to what is generally accepted to be a good ethical outcome, accompanied by widespread public acceptance of these once-radical techniques. Big data not only challenges established ideas about health research and (for example) reasonable expectations of privacy on the part of research participants, but also prompts novel considerations in relation to genomics and new technology.

We begin by drawing two historical parallels - heart transplants and in vitro fertilisation (IVF).

Anaesthetist John Bunker, recalling his role in the USA's first ever heart transplant (at Stanford in 1968), has said 'there were only two of us willing to have anything to do with the transplant, because we had been warned by the local district attorney that we could expect to be indicted for murder'. Pioneering British heart surgeon Donald Ross has pointed out that 'in Japan, they indicted surgeon Juro Wada for murder, and I think Ake Senning in Zurich was also threatened with being indicted for murder if he did a transplant'. (Wada was not ultimately prosecuted, but it would take three decades before another heart transplant was performed in Japan.)

Bunker adds: 'There was some talk about, "Why weren't the lawyers there to set the guidelines that would have protected us in advance?" The lawyers responded that, "You in medicine have to establish your procedures and we will try to determine how to legalise them afterwards."' (Early Heart Transplant Surgery in the UK, ed LA Reynolds and EM Tansey, 1999)

The first heart transplant in the UK resulted in those involved being required to appear before a coroner's court investigating the nature of the donor's death. One of the most profound changes brought about by the ability to perform heart transplants was an updating of the legal definition of death.

An example closer to home for PET is IVF. The implications of the work done by Robert Edwards, Jean Purdy and Patrick Steptoe in the early 1970s were both wide-ranging (in the sense of how many people took advantage of a novel biotechnology) and profound (in terms of how society's ethical norms seemed to change to accommodate its use, both by those seeking treatment and by those offering it). Of course, law was subsequently enacted to provide benchmarks for behaviour, and to enforce observance of those benchmarks by the creation of regulatory powers.

These two examples illustrate quite clearly the truth of Baroness Warnock's claim. Ethical and legal frameworks were changed by novel procedures.

A major question for the present consultation exercise is whether the emergence of big data, and more particularly its deployment in medicine and healthcare, has shown signs so far of categorical changes to the ethical landscape analogous to the changes brought about by heart transplants and IVF. It is also worth considering whether the responsibilities of clinicians and researchers at the intersection of clinical care and research participation - as exemplified by big data projects such as the 100K Genome Project, which participants can join only through clinical indications relating to a prioritised set of diseases - require legal clarification.

We believe that those whose health or biological data is shared in research projects should continue to be seen as health research participants, with all of the existing legal protections that this status entails. Ethically, and in terms of established social norms, this type of data is treated as a special category of data.

We mention this because we are aware of thoroughgoing changes to the way health research participants are regarded, and to the terms on which it is thought reasonable for people to participate in research. Perhaps the most salient example is the care.data project administered by the NHS Health and Social Care Information Centre, which aims to create a dataset stretching across the whole patient care pathway, and which permits the sharing of GP and hospital records from patients cared for by the NHS in England. Researchers from public or private organisations, based in any global jurisdiction and conducting any type of research (not just medical research), will be able to apply to use this data.

The participants in this research project will be anyone who has had contact with the NHS in England at any time in their life (thereby generating a health record), and who has not contacted their GP to opt out of the project. This model of research participation, and sharing of medical records, is novel in the scale of its departure from some of the typical aspects of health research participation - for example, participants consenting individually on the basis of the objectives of a clearly defined research project (run by individuals or organisations who can be identified), or consenting more broadly to a widely-defined (but not open-ended) research aim. Searchable linkage to health records under the auspices of care.data will last a patient's lifetime, and beyond.

Media coverage of the care.data project has drawn attention to the issues inherent in big data projects more broadly, and could result in public pressure for legal changes to be made - for example, there could be demand for safeguards beyond those supplied by 'pseudonymisation', a rather loosely-applied term which in this context allows researchers access to individuals' NHS numbers, date of birth, postcode, ethnicity and gender. There is understandable concern that once such information is matched with other publicly available information, individuals will be neither anonymous nor pseudonymous, but rather easily identifiable.
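To make this concern concrete, the following is a minimal illustrative sketch in Python of how records stripped of names, but retaining date of birth, postcode and gender, can be re-identified by a simple match against publicly available information. All names, records and field names below are invented for illustration; this is not a description of any actual dataset.

# Illustrative sketch only: re-identification of 'pseudonymised' records by
# matching on quasi-identifiers (date of birth, postcode, gender).
# All data here is invented for illustration.

pseudonymised_records = [
    {"study_id": "A001", "dob": "1974-03-12", "postcode": "SW1A 1AA", "gender": "F",
     "diagnosis": "type 2 diabetes"},
    {"study_id": "A002", "dob": "1961-07-30", "postcode": "M1 2AB", "gender": "M",
     "diagnosis": "hypertension"},
]

# Publicly available information (for example, an electoral roll or a social media profile).
public_records = [
    {"name": "Jane Example", "dob": "1974-03-12", "postcode": "SW1A 1AA", "gender": "F"},
]

def reidentify(pseudonymised, public):
    """Match records on shared quasi-identifiers and return any re-identifications."""
    matches = []
    for p in pseudonymised:
        for q in public:
            if (p["dob"], p["postcode"], p["gender"]) == (q["dob"], q["postcode"], q["gender"]):
                matches.append({"name": q["name"], "diagnosis": p["diagnosis"]})
    return matches

print(reidentify(pseudonymised_records, public_records))
# [{'name': 'Jane Example', 'diagnosis': 'type 2 diabetes'}]

The point of the sketch is simply that the attack requires no special expertise: a handful of quasi-identifiers shared between two datasets is enough to link a named individual to a sensitive record.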

Having said this, it is important to recognise that such pressure for legal change may be countervailed by changing social norms in respect of personal data of other kinds, shared willingly with commercial organisations via retailer loyalty cards or social networking websites. The case will have to be examined as to why the sharing of health data might require special protections, and why it would be ethical to apply these protections if they are likely to conflict with researchers' freedom to make new discoveries of likely benefit to the public. The present consultation is therefore very timely.

The consultation has been prompted by 'technological advances...leading to new global opportunities', and the consultation document discusses biobanks (among other examples). It was the prospect of big data some 15 years ago that led to the inception of UK Biobank, whose independent Ethics and Governance Council has since done much to address major issues raised by the large-scale linkage of biological, social and health data - work which is applicable far beyond UK Biobank's own endeavours. UK Biobank's Ethics and Governance Framework (.pdf 292KB) makes for particularly instructive reading in relation to the present consultation.

It is also worth noting that one of the reasons for the widely-perceived success of UK Biobank's conduct in this area is the fact that its Ethics and Governance Council is independent from UK Biobank proper, and has a clear mandate to hold UK Biobank to account via an agreed framework. This makes for an interesting contrast with the example of Genomics England, the company established by Government to deliver the 100K Genome Project. Genomics England has no comparable independent governance committee. An ethics committee exists to advise its Board of Directors, but there is no published framework for ethical conduct according to which a more independent committee might hold the organisation to account.

UK Biobank, like the examples of organ transplantation and IVF given above, draws upon the obvious widespread willingness of people to behave unselfishly towards others in need of medical help - and perhaps also a sense of reciprocity or solidarity that comes from people envisaging that one day, they too may be in need of similar help. The effectiveness of UK Biobank, like the effectiveness of fertility treatment and organ transplantation, relies upon people adopting an altruistic view that the welfare of others justifies challenges to their own welfare.

Surely it behoves big data projects benefiting from such public altruism - particularly projects with public funding - to provide assurances of probity and sound ethical practice. It seems surprising, therefore, that a national flagship initiative like the 100K Genome Project - which enjoys Prime Ministerial backing, and which is funded to the tune of £100 million by the public purse - would not put in place the safeguard (or perceived safeguard) of a governance framework similar to that of UK Biobank.

A significant aspect of big data genomics projects is the way they arguably cast two enduringly important bioethical concepts - 'autonomy' and 'ownership' - in a new light.

The concept of 'autonomy', in bioethics and also more broadly, is associated closely with the concept of the individual. However, genotype information belongs (quite literally) to the family. It is potentially ethically problematic if informed consent is obtained solely from individual research participants, but subsequent findings have ramifications for the genetic relatives of those individuals. This can occur if genotype data is fed back to the research participant - for example, within translational research feeding into health services.

One way of mitigating this problem is to make explicit assurances, as part of the process of obtaining informed consent, that research will avoid making certain discoveries - for example, ensuring that triplet repeat length in the gene linked to Huntington's Disease will not be studied. Such assurances run contrary to the prevailing idea that genotype research is an 'all or nothing' proposition, and that 'one genotyping platform for all' is ideal on grounds of cost and efficiency. This popular idea raises issues of broad consent: people are encouraged to participate in research, but are then obliged to accept whichever genotyping platform the research team has selected.
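As an illustration of how such an assurance might be enforced in practice, the sketch below (in Python) filters out of an analysis any findings that fall within a region the research team has undertaken not to study or report. The genomic interval labelled HTT, the variant list and the field names are invented placeholders, not authoritative reference values; this is a sketch of the principle, not of any project's actual pipeline.

# Illustrative sketch only: excluding an agreed region (a placeholder interval
# standing in for the HTT locus on chromosome 4) from any results eligible for
# study or feedback. Coordinates and variants are invented placeholders.

# Regions that participants have been assured will not be examined or reported.
EXCLUDED_REGIONS = [
    {"chrom": "4", "start": 3_000_000, "end": 3_300_000, "label": "HTT (illustrative interval)"},
]

variants = [
    {"chrom": "4", "pos": 3_074_877, "ref": "CAG", "alt": "CAGCAG"},   # inside the excluded interval
    {"chrom": "7", "pos": 117_559_590, "ref": "C", "alt": "T"},        # outside any excluded interval
]

def in_excluded_region(variant, regions=EXCLUDED_REGIONS):
    """Return True if the variant falls inside any region covered by the consent assurance."""
    return any(
        variant["chrom"] == r["chrom"] and r["start"] <= variant["pos"] <= r["end"]
        for r in regions
    )

reportable = [v for v in variants if not in_excluded_region(v)]
print(reportable)  # only the chromosome 7 variant remains eligible for analysis and feedback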

Granted, no assurances can be given in respect of future developments in science and technology, and the way these may result in genomic information becoming significant in as yet unanticipated ways. Nonetheless, it is wilfully perverse to insist on exposing research participants and/or their families to possible discoveries that are known to pose difficulties of ethics and consent. It is not ethically satisfactory to insist that researchers should have automatic latitude to apply the most advanced technology of the day to the genome, to the fullest extent possible, simply because this is expedient and fashionable.

This is especially the case when participation in a big data project is clinically indicated - for example, for participants in the 100K Genome Project who have cancers or rare diseases, and who may have exhausted all other routes to obtaining a diagnosis and/or a more personalised form of treatment. Non-participation may seem too great a burden to bear for those who do not wish to receive unwanted feedback, and systems should therefore be designed so that (specific and appropriate) incidental findings are fed back on an opt-in basis only.
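The design principle is simple enough to express in code. The following minimal sketch (in Python, with hypothetical identifiers, preference fields and findings) releases incidental findings only where a participant has explicitly opted in, with no feedback as the default.

# Illustrative sketch only: feeding back incidental findings strictly on an opt-in basis.
# Participant identifiers, preference fields and findings are invented for illustration.

participants = {
    "P-0001": {"optin_incidental_findings": True},
    "P-0002": {"optin_incidental_findings": False},   # has not opted in, so receives nothing
}

incidental_findings = [
    {"participant": "P-0001", "finding": "carrier status for condition X"},
    {"participant": "P-0002", "finding": "carrier status for condition Y"},
]

def findings_to_feed_back(findings, participants):
    """Return only those findings whose participant has explicitly opted in."""
    return [
        f for f in findings
        if participants.get(f["participant"], {}).get("optin_incidental_findings", False)
    ]

print(findings_to_feed_back(incidental_findings, participants))
# [{'participant': 'P-0001', 'finding': 'carrier status for condition X'}]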

As for the concept of 'ownership', we maintain - following the pioneering researcher and policymaker Richard Titmuss - that anyone relinquishing ownership (for example, of their biological samples and the information therein) has a choice. Either they relinquish ownership through a commercial transaction (which encompasses barter), or they relinquish ownership through a gift. Titmuss promulgated this view in his 1970 book The Gift Relationship: From Human Blood to Social Policy, and it has had a profound influence upon biomedical research participation in the UK, where it is perceived by many to be philosophically consonant with the ethos of the NHS.

The gift relationship seems to play less of a role in research participation in the USA, where there has latterly been pressure to feed back selected information to research participants - a practice which could be perceived as tantamount to barter. Indeed, the USA has seen examples of healthcare (that would ordinarily be paid for) being offered in exchange for participation in research.

Whether and in what circumstances researchers should feed back personalised information arising from research, using increased linkage with healthcare records, is a crucial issue to consider. Widespread linkage risks blurring the distinction between research and service. This should not be done lightly, because the distinction has served us well.

Different standards of accuracy and clarity are required for service provision than are required for research results. The latter can be legitimately presented as provisional pending further investigation, whereas the former are required to be as clear as possible, in order to hold meaning for the individual patient and serve as a basis for how the patient is advised and treated. When research and service are distinguished clearly, then even if it does transpire that research results have potentially important applications in service provision, a case can be made for taking universally beneficial, service-wide (rather than patient-specific) measures accordingly.

We would like to conclude with some reflections on the role of technology in the use of biomedical data. At the risk of being platitudinous, we feel it is worth pointing out that no matter how impressive a new technology is and no matter how daunting the ethical and practical challenges that accompany it, the technology has no volition of its own. It is human decisions that dictate the use to which technology is put.

Concerns about 'privacy' and 'trust' can serve to obscure this fact when applied to technology, because these are concepts that have traditionally been applied to people - or to institutions made up of people - rather than being applied to the tools used by people. It may make sense to talk of concerns about 'data protection' in relation to technology, or of 'confidence' in technology, but concerns about 'privacy' and 'trust' have their origins in the negotiation of relations between individuals and institutions.

Interestingly, however, 'data protection' is fast becoming coterminous with 'privacy' in discussion of data, and 'confidence' is fast becoming coterminous with 'trust'. This suggests that the technology itself is seen as being possessed of agency. We flag this up not necessarily as a fallacy that needs to be rectified, but as a significant development in the way the subject of this consultation is understood. Indeed, such elision between categories is present in the Nuffield Council's consultation document.

Finally, we would like to question the idea that there can be a truly 'safe haven' for data.

Obviously, organisations handling biomedical data should have rigorous policies and standards to ensure data protection, insofar as this is possible. At the same time, and without wishing to excuse laxity in these areas, it is also the case that any centralised collection of data will be a potentially attractive target for nefarious endeavours - if nothing else, the act of successfully compromising such a collection of data will have the cachet of a big prize among hackers.

For this reason, we believe that this consultation would benefit from the input of experts in data security working outside, as well as inside, the field of biomedical research. After all, the recent case of defecting National Security Agency (NSA) contractor Edward Snowden is salutary - in terms of the supposedly secure data that the NSA was shown to have obtained from elsewhere, and also in terms of the supposedly secure data that Snowden was able to remove from the NSA - in encouraging us to regard any cast-iron assurances of the security of data with scepticism.