The sudden appearance of a truly new disease is a wake-up call. A new global pandemic of an infectious agent, transmitted through sexual contact and blood, affecting alienated and/or deprived people and communities, infectious throughout, that causes a slowly progressive breakdown of defence against other infectious diseases, as well as causing dementia in some, and leads to a premature death, occurring in an era of extensive travel and rapid communication, is a veritable tocsin. These crude ingredients of AIDS as a medical and social phenomenon, blended with the poignant personal histories revealed to clinical professionals, spiced — as if this were necessary — by the fear, the prejudice, the wild theorising and the voyeurism, have presented a substantial challenge to practical medical ethics.
Familiar ethical debates have been reopened with new perspective and insight, others have been examined thoroughly for the first time. Building on the major developments in medical ethics over recent decades, AIDS has provided an opportunity, indeed a necessity, for its maturation. It continues to present new issues for debate, while those that were present from the outset continue to perplex us, as this issue of the journal should show. In teaching, AIDS offers a plethora of examples to illustrate most of the central tenets and tensions of medical ethics. As an exemplar, it has provided preparation and precedent for addressing some of the issues emerging from new challenges, such as screening for susceptibility to genetic disease.
Bereavement
The word comes from the same Germanic root as “to be robbed excessively”. Bereavement is an intensely personal and disorienting experience, to which people can react with denial, anger, guilt or depression, sometimes prolonged. In Western countries nowadays it is most commonly experienced in mid-life through the death of a parent, or in old age through that of a partner, friends or contemporaries. Mourning, and the self-understanding of the bereaved person, may be helped by the feeling that such deaths are “natural”, especially if the deceased person was “ready to go”. Being bereaved of a child, or of a partner in mid-life, is much more deeply disorienting, and in a society where that experience is not widely shared, isolation can be added to the burden. Some bereaved parents whose adult children died as a result of HIV/AIDS have suffered this. Many others, and many bereaved partners, have been members of communities whose solidarity has been immensely supportive. Bereavement may have complications that can be helped medically or psychologically. But essentially it poses an existential question, about finding meaning in an experience, sometimes life-long, of apparent meaninglessness.
Consent
The legal requirement for consent to treatment is based on the ethical principle of respect for persons and their autonomy. For consent to be valid, it must be given voluntarily by a competent patient who understands the treatment's nature and implications. Whether the patient has been sufficiently informed is decided in law by either a “professional standard” or a “prudent patient” test. Both of these legal tests, however, involve judgments that are contestable not only on clinical and cultural, but also on moral grounds, in particular the need to take account of the principle of non-maleficence. Informing an intellectually or emotionally unprepared patient about all the clinically relevant, but remote or as-yet-unconfirmed, possibilities can result in information overload, disabling rather than assisting patients' decision-making, and alarming or even harming them. Because of this, pregnant women, for example, were not normally informed that their antenatal tests included one for the remote and treatable possibility of syphilis; and in general, when conducting investigations, most physicians and pathologists assumed that while consent was required to take blood, explicit consent was not required for each individual test on that blood. These assumptions were challenged when a test for HIV became available, and its use without consent was advocated, not only for diagnostic and epidemiological purposes, but also to protect hospital staff by identifying infected patients. The ensuing debate about HIV testing was the catalyst for establishing clearer ethical guidelines on testing and screening for serious communicable diseases generally.
In Britain, the General Medical Council (GMC) now states, in relation to such diseases, that: (1) for diagnostic tests, informed consent is always required (except when the test is in an unconscious patient's immediate clinical interests); (2) in the case of unlinked anonymised screening for epidemiological purposes, patients must be informed of their right to refuse sampling; (3) testing without the patient's consent, in order to protect hospital staff, is permissible only in the most exceptional circumstances, and only on an existing blood sample, after an identifiable health worker may have been infected (through a needlestick injury or other occupational exposure) by a patient who refuses or is unable to give consent; and even this, the GMC warns, could be challenged in the courts. Such guidelines, of course, still leave room for judgment in particular cases; and in this respect the main contribution of the HIV debate to changing attitudes on consent is recognition of the importance of mutual empowerment and concordance between patients and doctors.
General Medical Council. Serious communicable diseases. London: GMC, 1997
Duty to care
In the early days of the AIDS epidemic some doctors refused to care for patients who were suffering from an apparently highly contagious and incurable disease. In contrast, the range and depth of some lay care in this disease has fallen little short of the heroic and has raised the standard of good care in other conditions. AIDS has come like lightning in the mountains to outline and illuminate age-old questions. Who should care for the sick, and in what way and to what extent? What does the duty of care consist of and who defines it? How should an individual balance this duty with other competing duties? Is there a correlative right to care, and if so what are the limits to this right? If the duty to care for the sick does not fall on health care personnel, it falls on no one. The limits of professional response are defined by the context — the resources available and the structure in which care is delivered. But in reality the duty to care is incumbent on all. The recent stress in medical ethics on respecting the autonomy of the individual (in order to counteract the power of professional paternalism) masks a potentially deeper truth about reciprocity and interdependence in human life. Without these, human existence as we know and value it would be impossible. They form a moral cheque to be cashed particularly in times of sickness and suffering, when few would deny that there is a claim on others. The degree may depend on the relationship. For instance, partnership and special friendship mean little without a particular duty to offer attention and help in sickness: the relationships have been entered voluntarily and with promises, actual or implied. In modern Western society the ties of blood or heterosexual marriage have often been made to seem thin compared with the responses of gay friendships once illness has struck.
(Although we should remember there is also a duty to care by preventing harm, for which the latter community receives fewer accolades). However, the duty extends more widely than this, to society in general; and more deeply, to the self. In spite of Mrs Thatcher's famous disclaimer about society, the collective responsibility of citizens as a group to care for their sick individual members cannot seriously be doubted or dismissed from the political agenda: the debate is about how it should be delivered, and to what extent. Nor can this collective responsibility be minimised or cancelled by reference to the obvious responsibility of the individual to care for him- or herself. This is a shared duty which must be delivered across a broad front, with explicit negotiation. No one is let off the hook. No one should hang there alone.
Employment
Initially, employment concerns related especially to discrimination on the grounds of sexuality and/or HIV status. Managers who became aware of their employees' health status often dismissed them, although they may have used other grounds, some cynically recognising that employees would avoid litigation to protect their privacy. The role of the occupational physician as a physician to the employee, with strict duties of confidentiality and the right to transmit only the employment consequences of health to management, became crucial.
Employee health insurance schemes that depended on HIV testing or questioning effectively became tests for employment. Some employers (for example airlines) became disproportionately concerned about insidious loss of mental faculties through HIV encephalopathy. This went beyond the evidence and beyond the risk compared with, say, depression or alcohol/drug misuse. Such defects could be detected without reference to causation.
New issues arise from the greatly improved prognosis with combination therapy. Going back to work after a long break would raise concerns among prospective employers; yet patients would be keen not to reveal the true reason. De-skilling or new personal goals may necessitate re-training. Although better, many patients have still found that they are not completely fit, with disease or medication still causing some symptoms or disability. There may be pressure for disclosure, if concessions on employment conditions are needed. Adherence to treatment involving many pills and careful timing around meals can be challenging at work.
While some patients with HIV disease have indeed returned to work, others are inhibited by social and medical constraints, including fear of disclosure and continuing stigma about HIV/AIDS, influenced by early perceptions. Paradoxically, protecting confidentiality may help to keep people in work when they are ill, but may make it harder to get back into it when they are better.
Family
Families have a particular claim on the attention of those responsible for medical care, for several reasons. Firstly, because of the implications for such a group of an infectious or inherited disease, it may be necessary (both out of kindness and for practical considerations) to balance the requirements for individual confidentiality with the need to inform or instruct potential carers or to protect those at risk. The idea of a family rather than an individual being the operative “unit” for health care offers creative possibilities but is fraught with practical, moral and legal difficulties.
Secondly, the traditional view of kinship groups takes on an extra significance when a disease such as AIDS may be passed vertically to children, who may then both lose their parents and become ill and dependent on extended family members or even (as is the case in some areas of Africa ravaged by the disease) on unrelated adults from two generations above. This brings into the frame the question of who has a duty to care and why. When an adult has children, the responsibilities are obvious, although it is debatable whether they are lifelong or last only until the child reaches maturity. But claims on siblings, uncles or grandparents are derived not from promises or choices made but from traditional views of “family ties” or, where these have lost significance, from the duties incumbent on all members of society to those close to them—such as in the “neighbour” concept of the Christian gospels.
Thirdly, the word family reaches back into its origins (Latin “familia” = “household” and Indo-European words referring to house or neighbourhood) to encompass new groupings of those who choose to live together with a shared lifestyle but without offspring. A gay couple or group should be considered as much a family as any other similar group; yet the law and medical practice in many countries lag behind this social change, leaving the gay partner of an ill patient few rights or positive duties, and leaving “next of kin” status in the hands of biological relatives who may have lost contact through choice or chance and have nothing to offer when difficult decisions have to be made or loss occurs. Medical staff need to take the step of looking at the foreground relationship, not the social background, and sensitively balancing the needs of current partners and family against those of parents, whether estranged or supportive.
Partridge E. Origins: a short etymological dictionary of modern English. London: Routledge Kegan Paul, 1966.
No other disease has so clearly revealed the global nature of epidemics or the gross discrepancies between industrialised and developing worlds in access to and resources for health care. While the early impact of AIDS revealed particular sentinel populations that were behaviourally vulnerable, increasingly the burden falls on those who are situationally vulnerable, through deprivation and poverty. Freedom of travel, whether to and from rural and urban settings or from one country to another, has liberated previously constrained micro-organisms and humans. People leave the protective social structures in rural societies for the more chaotic townships, where darkness and alcohol conspire.
Short term economic necessity may be a stronger imperative than long term health. The most vulnerable are those least able to know and least able to protect themselves, whether by lack of education, money or social status (for example, women). Road, rail, ship and air routes have all provided easy routes for the disease to be spread by human beings. Transient encounters with travellers and people at war have been catalysts.
Most countries have had difficulties acknowledging and responding to AIDS. Political, economic, cultural and religious obstacles have had to be overcome repeatedly to produce, let alone sustain, an appropriate public health response. Political leaders, who may themselves have been vulnerable at the outset, were inclined to deny or ignore the issues. Others, who engaged early, found that community or religious backlash, or dwindling interest, meant that any early gains were not sustained. Taboos, old and new, abounded. Outsiders were seen not only as being to blame, but also as conspiring against affected countries; thus those who wanted to help were paralysed by politics and post-colonial xenophobia. Some tried to investigate the emerging pandemic in developing countries with insensitivity and dubious ethical standards; the local backlash delayed engagement further.
The self-evident multiple standards of care have created irresolvable controversies, for example in research on vaccines or the use of antiretrovirals to prevent vertical transmission. Should the standard of treatment, or of ethics, be that of the local or of the sponsoring country? Increasingly, the unavailability of preventive and therapeutic interventions of proven and substantial efficacy is sharpening the discrepancies of opportunity, creating greater gulfs. Short term political imperatives in industrialised countries seem to obscure the recognition that, quite apart from any moral perspective, investment in a global response will, in the long term, protect the health and economic prospects of industrialised countries themselves.
Health care workers
Potential risk to patients from HIV-infected health care workers has caused particular concern. It is vital but difficult to strike the right balance between reducing any genuine, if low, risk on the one hand and safeguarding the rights of the employee, especially to confidentiality, on the other.
Screening staff for HIV would be cumbersome, costly and relatively ineffective; it would necessitate frequent tests and still be likely to miss some cases. It could be discriminatory and subject to abuse. Any approach to this issue requires the active participation, through self-disclosure, of those at risk. The General Medical Council places a professional duty on those who perform exposure-prone invasive procedures and who are at risk from HIV to determine their status and seek appropriate guidance. This places appropriate responsibility on the individual. The key questions are: to whom must they disclose positive results, and what consequences flow from doing so?
Draconian guidance was briefly introduced in the UK (after media pressure), which required employers to be informed directly, with minimal safeguards for the individual's confidentiality. The guidance also undermined the role of occupational physicians, requiring them to disclose HIV status to management. During this time, fewer people came forward, even for testing, because of the adverse personal consequences. Thus, by not adequately recognising the rights and concerns of the individuals affected, these measures had the opposite effect from that intended.
Similar problems are posed by the pursuit of “look back” exercises to identify, notify and offer tests to patients treated while the health care worker was positive. Such exercises can readily threaten the individual's confidentiality, and even his or her mental health and safety. Data so far suggest the risk of patient infection is very low, but present. At what level does the duty to warn apply? Can potential hazard to the health care worker, through disclosure, affect that? Is there also a duty to investigate the risk in all such cases in order to inform management of future cases? If so, at what (human) cost is this necessary or acceptable?
See also, in this issue: Blatchford O, O'Brien SJ, Blatchford M, Taylor A. Infectious health care workers: should patients be told? Journal of Medical Ethics 2000;26:27–33.
Injecting drug use
Breaching the barrier of skin is responsible for non-sexual horizontal transmission of HIV. Apart from inoculation injuries and transfusion of unscreened blood/blood products, it occurs through sharing of needles and other equipment by drug users. Sharing is not intrinsic to drug misuse but is determined by necessity, though it has become part of the culture. The necessity arises from lack of money or of access to needles and syringes. Lack of access is frequently driven by political or policing approaches to drug misuse.
Thus, the Edinburgh epidemic of HIV arose in a deprived subculture when equipment ceased to be available after a supplying pharmacy closed and policing strengthened, focusing on equipment as evidence. Despite this experience, and the evident hazards in the “shooting galleries” of the USA, where multiple sharing was inevitable, the establishment of needle/syringe exchange schemes frequently ran into political objections: that they condoned drug misuse and would increase it. There were also fears that exchange schemes would increase accidents to children from needles left in the streets. There was a collision between public health approaches to containment of blood-borne infection and the unresolved controversies regarding control of drug misuse. The lessons of Prohibition seem to have been forgotten.
With a terrible irony, although the first publicly aired proposal to set up such a scheme was made by a Commissioner for Health in New York in 1983 (David Sencer), the USA has been the last industrialised country with a significant drug-use epidemic to introduce one. Dr Sencer lost his job shortly after his proposal, and Americans lost the opportunity to prevent this major part of their epidemic. Refusal to allow such schemes in the USA continued long after schemes set up in Europe and Australia had shown that they did not have the feared impacts and that they greatly reduced the epidemic among drug users. The political ethics of moral rectitude overpowered more liberal, realistic and pragmatic public health interventions, and at terrible cost.
Justice
In the health care context, distributive justice (allocating goods and responsibilities to equals equally, and to unequals unequally when the inequality is morally relevant) is more familiar than retributive justice (righting wrongs, either by restoration or by punishment), which is the province of the law. Justice has been compared both to scales (balance) and to a mighty stream (passion). It can be seen as a regulative ideal, something to be aimed at, and thereby got closer to, even when it cannot be perfectly achieved to the satisfaction of everyone involved. As the last of the four prima facie principles of health care ethics, following beneficence, non-maleficence, and respect for autonomy, justice can be seen as a regulative ideal for weighing, in particular contexts, the respective claims of the other principles. This presupposes, however, that the procedures and persons involved themselves also approximate to the ideal (procedural justice; “the just man justices”). HIV and AIDS have raised in sharp focus ethical questions about distributive justice (proportionate allocation of resources and responsibilities to people affected within developed, and between developed and developing countries) and metaphysical questions (Why has this happened to him, her, me? How can you still believe in Justice?).
Karma
Unable either to deny or to comprehend the coexistence of reason and goodness with suffering and evil, human thought has sought inspired explanations. In Indian religion and philosophy, the apparent unfairness of life is explained by karma, the burden accumulated, or lightened, by bad or good actions in one life, and carried forward to the next, until liberation from all burdens (and hence the need for rebirth) is achieved. The idea of karma reflects the mundane observation that people's past actions tend to be inscribed in their character. But using this idea to explain, for example, the diseases or disabilities from which particular individuals suffer, radically oversimplifies it. What to observers seem to be misfortunes, to believers may be challenges on the path towards, rather than away from, liberation; and indeed the mere fact of having been born as a human (rather than an animal) may be seen as a sign of spiritual progress. Nevertheless, for believers, karma does help to account for otherwise intolerable aspects of life, such as the suffering and death of children. By contrast, both secular materialism and Judaeo-Christian-Islamic religion (which alike posit only one earthly life) are hard pressed to explain such things without sacrificing the claims of meaningful human existence or loving divine providence.
Living with HIV/AIDS
Early perceptions and media insistence that people were dying of AIDS led to a strong rebuttal from those affected that they were “living with AIDS”, an affirmative assertion that changed language and hence perceptions. Along with the outlawing of words such as “victim” among those affected and working in the area, the phrase represented a strong culture of solidarity that used and changed language to shift attitudes and remove negative stigma. This was part of the community empowerment and activism that challenged the alienation. It helped individuals to focus on the positive phases of the typically episodic illness and to aspire to goals that were achievable, with a combination of realism and affirmation. This movement was generally beneficial. However, some patients, who were more continuously and progressively ill, found the need to be seen to be “living with AIDS” an untenable deceit and a linguistic tyranny.
Media
The double-edged sword of the media has been a substantial player in the AIDS pandemic. Unsolicited or solicited, the media have disseminated information and misinformation with characteristic speed and style. The transience of their immediate products belies the durability of their impact on awareness and influence on perceptions. Early reports were often sensationalised, stigmatising and voyeuristic, with a curious but potent mix of scare/reassure messages. Yet they provided some opportunity and a platform for at least reactive public health responses. Specialist media for affected communities, especially the gay press, had a more vital role to play in alerting those at especial risk, initiating community advocacy and providing peer-based health education advice. Yet they were also sometimes vehicles for denial and obscurantism.
In the UK in the mid-1980s, the media were involved in a unique partnership with government in disseminating health messages based on official information, but less encumbered than government in purveying it. Ministers putting condoms on cucumbers provided memorable images, through dissonance and humour. Government television advertisements served an alerting function. The media at their best (in public health terms) gave substance and emphasis. Unfortunately, their need for novelty and entertainment value meant that media-derived messages were neither sustained nor consistent.
The relentless and unrestrained pursuit by journalists of affected individuals (and of those close to them) was terrible to witness; there was no sustainable “right to know” justification. It did, however, lead to the landmark legal judgment in “X v Y”, which secured legal precedent for medical confidentiality and in which the public health gain from respecting individual confidentiality was eloquently articulated.
An extraordinary campaign was waged by a small group of influential media people to claim, or purvey claims, that HIV did not cause AIDS, that HIV was not transmitted heterosexually and that there was no AIDS epidemic in Africa. This seriously undermined public health in the UK and beyond. Freedom of speech provided freedom to mislead, even fatally. An unholy amalgam of anti-“establishment”-ism, conspiracy theory, anti-science attitudes and the promotion of seemingly iconoclastic hypotheses, unfettered by the constraints of data, appealed to the public almost as much as the early recipe of sex, blood and death, to which perhaps they had become inured.
No
“No” has become a word of major moral importance in current HIV care. With good treatment available to few on the planet and immunisation still to be developed, avoiding exposure to the virus remains the only effective way of reducing personal risk and containing the epidemic. Refusal of unsafe sexual or personal practices remains the key advice, but this is extremely hard for the young, the curious or the passionate. Risk-taking is part of the excitement of sex for some people: this dilemma affects all societies but in different ways. In some, the loosening of restrictive sexual prohibitions has joined with a more tolerant attitude towards different types of equal adult relationships to provide liberating opportunities for personal fulfilment. But with these have come suggested new responsibilities: those who have put themselves at potential risk of serious infection should make it safe for new partners, and so on. In other societies where sexual roles are more defined there may be a potential for greater power imbalance between partners, less tolerance of “deviation” and less openness: one of a partnership may become infected by the other and remain unaware, with all the potential consequences, but even if aware may be unable to modify the sexual contact with the infected partner. The ability of individuals to take control of their sexual lives is thus paramount to survival and is linked to issues such as the status of women, attitudes to homosexuality and the acceptance of barrier methods of contraception. But in every circumstance it is also linked to the individual's ability to make choices about specific practices which affect him or her, and to say no to unsafe or undesired encounters or practices in a way which offers a positive as well as negative message. 
The parallels with modern health care are fascinating: here professionals and patients have had to learn to limit what might be offered or requested without breaking off the relationship or creating antipathy, in order to cope with intrusions on personal autonomy or inappropriate demands for resources. The phrases may include the “cracked record” reiterated response, such as “that may be, but I'm afraid not”; and clinicians, like lovers, have had to learn to say “I care so much for you, that I must refuse you this”. Swahili has “nakupenda, lakini . . .” (I love you, but . . .). Or is it “Je t'aime, donc je le refuse” (I love you, therefore I refuse you this)?
Orphans
Having children is central to the lives of most people and to most cultures. The vertical transmission of HIV raises fundamental issues of how far the rights of the parent extend in making choices that affect the health of the unborn child. The high risk of early mortality in one or both parents from the effects of the same infection, whether or not the child is infected, adds the dimension of “planned orphanhood”.
This remains a critical issue for developing countries, notably Africa, where AIDS orphans are already affecting society and social structure. Grandparents or other extended family may take on the burden of these orphans, without the economic means or health to do so. The pyramidal age structure of a typical developing country shows how poorly these societies are placed to cope. The very countries most affected lack the resources to set up welfare schemes. Children may need to fend for themselves; older children often have to take on an early parenting role, with impact on their own education and opportunities. Stigma is a further constraint. Despite all this, parents, especially women, may not be in a position personally or socially not to have children. As one affected woman in rural Zambia said: “If I can't have children, I am no longer a woman”.
In industrialised countries, analogous but different issues arise for drug misusers. Their drive to have children may be just as strong as that of others, but some individuals and organisations will see drug misuse as a basis for questioning their parenting capabilities and for taking the children into care—another type of orphan? Adoption and fostering themselves raise complex issues where the child is infected. Adoptive or fostering parents would need to know the child's status in order to provide effective care. At what stage and with what safeguards to the child and the natural parents can this information be provided?
Parents in declining health may, through physical or mental illness, gradually cease to be effective parents. Questions arise about at what stage, and with what authority and agreement, others take on responsibility for the children. Relatives or even friends may take on these de facto orphans on the basis of immediate need, without the opportunity to consider long term consequences. Clinicians and social workers may inadvertently compound these tensions by the need to identify an immediate care package. Many HIV-infected parents of uninfected children are insistent that their children are not told of their infection. Pressure on clinicians to tell them, at the parent's death or when the children reach maturity, is often strong.
Prisons
Prisons have become a major moral concern in modern health care for several reasons. Individuals legitimately deprived of their liberty may also thereby lose other positive aspects of modern life which they could reasonably expect to enjoy. Prison health care is usually old-fashioned, inadequate in quantity as well as quality and frequently abused by both inmates and staff. Respect for a patient's autonomy may be at risk both because of the lack of related liberties for that patient and because of a general atmosphere of coercion and conformity. Funding in the prison service is usually spent on containment, not on safety: where it is spent on safety, the subsequent rules may be intrusive. Staff are often isolated from knowledge of developments elsewhere or criticism (constructive or otherwise) from the outside world. To this the AIDS epidemic has added another unpleasant twist. Since AIDS is a sexually transmitted disease common amongst intravenous drug users (who form an increasing proportion of the inmates of prisons worldwide but especially in Western countries), those in prison for other reasons who have not contracted the disease are at considerable risk of becoming infected from any sexual contact, as well as from drug use in prison. Where there are shared cells and overcrowding, rape by fellow prisoners is common. In the outside world this dire assault is likely to be an isolated incident; in prison it is likely to be repeated. Thus a prisoner may receive, in addition to the custodial sentence delivered to him, another much more grave punishment which is not deserved (and from which society should protect him), and which may be quite literally a life sentence. Law makers and enforcers find it politically expedient not to acknowledge this, or choose to turn a blind eye. Where prison populations rise, especially if more vulnerable young people or infected addicts are incarcerated, diseases like AIDS (or tuberculosis or both) may rise exponentially.
Quality of life
A concept that is as easy to grasp as it is hard to measure. HIV and AIDS have such a profound impact on life expectancy that approaches to care necessarily focus on quality of life. Ultimately, only those affected can judge the quality of their life. Personal adjustment to diagnosis and prognosis, and development of new priorities and goals, are essential prerequisites. External perceptions, including those of clinicians, can be misleading. Furthermore, the expectation of what an individual would regard as acceptable or unacceptable may change if and when the reality occurs. So people who expect to be unable to adjust to loss of physical capability or even mental faculties can take a very different view when these things happen. This underscores the need continually to reassess a patient's view of quality of life and to review living wills and prior instructions for care.
Clinical trials and other clinical research need tools with which to evaluate the impact of disease or treatment interventions on quality of life. Many such tools have been developed, some specifically adapted for HIV, yet they are necessarily crude. They make assumptions that can have broad relevance and yet be individually inapplicable.
Even more difficulties arise from the use of quality-of-life measures in health economic assessments. Up to a point it is appropriate to assess quality of life and extent of health impairment, along with life expectancy, to determine health loss or gain from disease or intervention. However, external indices are a long way from personal perspectives on quality of life. Many AIDS patients with short life expectancy and significant illness and disability may regard their quality of life as having been paradoxically enhanced, by the way that their disease has refocused their priorities and given meaning to their lives. It might be easier if we did not use the same value-laden phrase for such different applications.
A right is a just claim that ought to be respected. Various moral rights, and correlative duties, pertaining to particular relationships, have been acknowledged by different societies, and various legal rights have been recognised or created. The idea of natural or self-evident human (or creaturely) rights developed more slowly, and in some applications remains more aspirational than actual. A negative right (to be unhindered) is often easier to agree on and implement than a positive right (to be helped), because the latter is resource-dependent. A patient's right to refuse medical treatment, for example, is stronger than his or her right to be given medical treatment which his or her doctors consider futile. But negative rights are not absolute. A patient's right not to have his confidentiality breached may, on rare occasions, be overridden if respecting it puts another identifiable individual at clear risk of serious harm. In practice, however, a perceived need to do this may be symptomatic of an avoidable communication problem rather than an unavoidable moral dilemma. Problematic aspects of positive rights are that it may be difficult to identify who has the correlative moral (beyond legal or statutory) duty to help, and that insisting on another's, and especially one's own right to something, may “inhibit any possible impulse of charity on both sides” (Simone Weil).
The introduction of HIV antibody testing in 1985 and the rapidly improving sensitivity and specificity of the assays provided a ready means of screening populations and individuals for HIV. The personal and social consequences of HIV, however, necessitated considerable care and sophistication in applying such tests. Their appropriate use, as with any others, rests more on what is to be done with the result and, crucially, what actions follow than on the test itself. Failure to think through these consequences created a gulf between the political rhetoric of screening for HIV and its clinical applicability.
Screening tests are often used as part of the resolution of differential diagnosis, enabling the exclusion or confirmation of clinical hypotheses. The extent to which prior consent, explanation or consideration of the consequences is required can be problematic. Some clinicians feel that attempting to explain all the possible outcomes is unnecessary and impracticable; others regard informed consent as not only necessary but good practice. It is important to strike the right balance and enable each patient to negotiate the extent to which these issues are explored. For HIV and some other tests, it is widely considered that the personal and social consequences are so profound that explicit consent and advance preparation of the patient are essential.
Population screening poses some additional concerns. Consent remains central. In the case of blood donation, there is evidently a secure case for testing to protect recipients; people who do not wish to be tested do not have to be donors. The problems arise from calls for screening in settings where participation is not voluntary. Thus, screening of all members of a professional group, or a population, such as pregnant women, patients needing certain treatments (for example, surgery, renal replacement), or prisoners, raises vital issues of human rights. These relate to what will happen to those identified as positive and who is to be informed of any results. It is a continuing source of surprise how rarely these are thought through by those who advocate screening, which is all too often seen as a solution in itself.
The health and prognosis of people with HIV and AIDS have been transformed since 1996 by the introduction of triple (or more) combination anti-retroviral therapy. Such combinations are more potent than earlier single or dual therapies, and enable more durable suppression by reducing the emergence of drug-resistant strains. For the first time, the progressive development of immune deficiency could be substantially reversed. The remarkable falls in levels of detectable virus and the increases in CD4 lymphocyte number and immune function produced a sea change.
The optimism was reflected in the adoption by some of the term Highly Active Anti-Retroviral Therapy (HAART), though the limitations that have subsequently emerged suggest that the term may have been overstated. Furthermore, the massive reductions in virus load, often below the limit of the available assays, led some investigators to start talking very publicly about the possibility of eliminating the virus. This has proved premature. While elimination was a legitimate scientific question, its prominent public airing lacked the necessary restraint, fuelling unrealistic expectations. It greatly, and perhaps excessively, influenced the thinking of both clinicians and patients in their application of these treatments.
It is still unclear when it is best to start therapy. The early rhetoric to “hit hard and early” was theoretically and emotionally plausible; but given the prolonged containment of untreated HIV infection and the extent of recoverability, it may not be necessary. Exposing the patient's virus to the limited portfolio of drugs too soon may limit options later. Nor have these therapies been without their problems. Adherence to the very demanding regimes has proved difficult. Short term side effects can be minimised, but concerns are emerging about very visible body changes from lipodystrophy (and hence deductive disclosure) and about possible long term risk of cardiovascular disease.
Poor responders to therapy may feel demoralised and cheated amid the surrounding optimism. Those who followed earlier treatment rhetoric, which has perhaps reduced their prospects of responding to new strategies, may be angry or suspicious of clinicians and research data. Even those who have benefited may have significant adjustment problems if they were psychologically and socially adjusted to an early demise. This is reminiscent of the reaction to reprieve from the firing squad described by Dostoyevsky. None of this diminishes the extraordinary clinical and social benefit that has been achieved, albeit only in those industrialised countries that can afford it.
Unlinked anonymised surveillance
This is a means of epidemiological surveillance that enables assessment of the prevalence of disease markers — for example, HIV antibodies — in populations with broadly defined characteristics. Samples taken from unselected populations — for example women in antenatal clinics, people attending hospitals as emergencies — have personal identifiers removed and are used to show the prevalence of infection. The methodology prevents results being traced back to individuals.
Although an established approach to surveillance, its application to HIV raised concerns about consent and the duty to warn, especially in relation to antenatal screening. The debate ensured that the procedures used and the parallel opportunities for voluntary named testing were secure, but concern and misunderstanding continued. Participating centres provided information that the studies were being done and made it known that people could opt out. They emphasised that if individuals had personal concerns about HIV risk, they could discuss these with their clinicians and could, if appropriate, be tested individually.
Nevertheless, the importance of consent to HIV testing had been so well established that the idea of testing without personal consent gave rise to concern, though the centrality of consent rests on the result being linked to an identifiable individual. The other problem was the idea that a positive result could not be passed back to the individual who might need to know. This gave rise to discomfort that the personal benefits of the knowledge could not be applied to the individual.
However, the research data obtained by such screening have, as intended, enabled the prevalence of infection to be determined and publicised for defined populations in broad geographical areas. Where there is a significant discrepancy between the prevalence determined by unlinked anonymised screening and that shown by voluntary named testing, clinicians and public health planners can then develop approaches to raise awareness and, if appropriate, encourage individual testing.
See also, in this issue: de Zulueta P. The ethics of anonymised HIV testing of pregnant women: a reappraisal. Journal of Medical Ethics 2000;26:16–21; and: Pinching AJ. Commentary. The ethics of anonymised HIV testing of pregnant women: a reappraisal. Journal of Medical Ethics 2000;26:22–4.
When HIV was confirmed as the cause of AIDS in 1984, an over-enthusiastic US politician announced that this would lead to a vaccine in two years. Little did she appreciate the formidable technical barriers that lay ahead, let alone the ethical challenges. What she did implicitly recognise was the importance that a preventive vaccine would have in containing the pandemic. Suffice it to say that the nature of HIV, its routes of infection and its establishment of latency have posed real problems in vaccine development, along with limited understanding of what, if any, immune protection can be afforded.
Because HIV is a uniquely human infection, preclinical studies of vaccines in animals, even primates, can afford only limited insights, though some have been gained by looking at related primate viruses. Concerns have been raised about the appropriateness of using primates in challenge experiments which could lead to long term infection, especially since the information gained will still be of uncertain applicability to humans. Thus there has been a drive to study human populations.
Here the challenge is that a well-informed person at risk of HIV infection can reduce that risk to negligible levels by behavioural change. Yet a test of vaccine efficacy in protection would require risk to continue for a result to be obtained. Moreover, the vaccine trial could itself lead subjects to think that protection was likely, and hence encourage increased risk behaviour before efficacy had been established. The use of live attenuated vaccines, which are more likely to generate a protective response, has generally been considered too hazardous because of the risk of reversion to wild type by mutation or recombination. Technical solutions have generally been found to resolve earlier concerns that it would be hard to distinguish antibody responses to a vaccine from responses to actual infection.
The idea of “therapeutic vaccines” has also been explored, where enhancement of protective immune responses was suggested as a way of delaying or even preventing disease progression in subjects who were already infected. Results have been disappointing to date, though the idea is being revisited in patients who have responded to antiretroviral therapy as a way of boosting immune containment when the virus itself is no longer boosting it.
Wider issues about how to study vaccines in developing countries, where the need is greatest, when most of the products will have been developed in the industrialised world, are considered elsewhere in this issue.
Living wills, or advance directives, are statements of people's wishes about their medical treatment in the future, should they no longer be able to speak for themselves. If these are advance refusals, stating what kind of life-prolonging treatment the person would wish not to have, they can have legal force, although it is still up to the clinician to decide at the time whether the circumstances are such as were envisaged by the person when he or she made the living will; and advance statements which request illegal (for example, active euthanasia) or clinically inappropriate treatment have no such legal force. People with HIV, and their doctors, have played a significant part in making advance directives more widely accepted, partly because they valued mutual empowerment — advance directives can be helpful to clinicians in interpreting their patients' wishes where it is otherwise unclear how “officiously to strive to keep alive”. But this also illustrates the main limitation of advance directives, which is that a “good death” as part of a “good life” is an ideal that can be aimed at but never guaranteed. In the end, much or all depends on the solicitude as well as the skill of the human clinician.
British Medical Association. Advance statements about medical treatment. London: BMA, 1995.
Boyd KM. Advance directives: the ethical implications. Scottish Journal of Healthcare Chaplaincy 1999;2:3–7.
In most modern societies it is unlikely that the average person's first sexual contact will be with his or her life partner. In spite of the strictures of fundamentalist religion, experience of different partners probably contributes to the sum of human health and happiness in all sorts of ways. However, as AIDS is now one of the greatest single threats to human health worldwide, and depends almost entirely on sex for transmission, this “song from the sixties” now carries a discordant and disquieting note. Like syphilis, this disease may lie concealed for many months, even years. An anxious person may take a test immediately after a sexual contact and be falsely reassured by a negative result: yet one contact may be enough. It is transmissible vertically to children as well as horizontally between partners. It is particularly prevalent amongst those who have had unprotected gay sex or unsafe anal intercourse with an infected person: but other modes of sexual behaviour are not exonerated. It is not yet obviously curable or medically preventable. Although the disease is now tragically focused on some communities for historical reasons, increased global mobility (tourism, immigration), the breaking down of barriers between social groups, and a more relaxed attitude to sexuality—all in most people's eyes to be welcomed in themselves—have the unpleasant possibility of increasing the spread of this disease unless safe sex is the rule. Under these circumstances, knowing about the past sexual experiences of a new partner, being aware of one's own status and being open about one's own experience become a new imperative. “All's fair in love and war” moves from being a cynical and amusing catch phrase to an attitude of psychopathic destructiveness. A new openness and a new language are required for both personal and medical encounters.
Young people are society's future and may define the direction in which change takes place. There is a tendency for society to encourage, and for teenagers to seize, appropriate independence and the exercise of personal decision making (which is also likely to affect both the content and the style of health care delivery in future). Such independence may be particularly exercised in the area of sexual relationships, where teenagers may now insist on experimenting both with sex and with sexuality at an age which gives concern to older adults. Disturbing evidence has come to light about the prevalence of sexual abuse of children: while there is no evidence that this is necessarily always as damaging as the hysteria about it suggests, it is clearly very often so and must be presumed to be so for all in order to protect those at risk, whatever proportion of the whole these be. Risk-taking is part of youthful experimentation: perhaps it is this feeling of invulnerability that makes young people particularly vulnerable. The difficulty for medical ethics and for law-makers comes in distinguishing at what stage a young person may be competent to make a decision about sexual behaviour, and how to distinguish an abusive relationship from a creative one, particularly in the presence of the possibility of serious and long term effects of a sexually transmitted disease. In practice, priority must be given to protecting young people from harm and to enabling them increasingly to make their own good decisions, rather than always protecting them from those decisions. Within such positive relationships, the balance of power is crucial: young people should be helped wherever possible, by instruction, example and discussion, to make good judgments, including assessment of acceptable risk. But teenagers, impervious to messages about health gain and powerfully influenced by their peers, may be hard to reach.
In health care the clinician should always offer a proper discussion of confidentiality and its limits, which should be drawn this side of abuse, violence or threat of suicide.
A zoonosis is a human infection derived from animals. Often the organism is non-pathogenic or only mildly pathogenic in its natural host, but becomes a cause of serious disease in the new host, especially when it first strikes. Increasingly strong evidence now suggests that HIV is a zoonosis derived from an infection of non-human primates. Although this type of origin for infection is not unusual, the pursuit of the origins of HIV amongst humans and among primates became entangled in the polemic of denial and blame. Ill-judged speculation about the sexual behaviour of humans in other cultures was extended to some fanciful notions about possible contacts between humans and primates. These speculations were being aired at a time when African countries were already ambivalent about the emerging evidence of the new pandemic, and compounded the forces of denial and xenophobia.
Debates continue about the possibility that the transfer of HIV or its progenitor to humans was a result of human interventions. The idea that HIV was deliberately created for germ warfare was always intrinsically implausible and could readily be rejected from the timing of its appearance and the required knowledge-base. More recent questions have focused on inadvertent transmission in cell-derived vaccine programmes, though this too remains highly improbable. The greater likelihood would be occasional direct contact with primate carriers, for example through bites, with subsequent human-to-human spread.
Interestingly, AIDS has provided a timely reminder of the potential hazard that animal viruses could become human pathogens. The exploration of xenotransplantation as a solution to organ or tissue replacement has raised the spectre of iatrogenic zoonoses, with retroviruses among the most likely agents. The ethical approach to this issue, and the threshold of knowledge required before it is attempted, have been strongly informed by scientific, clinical and ethical insights from HIV and AIDS.
Kenneth M Boyd is Senior Lecturer in Medical Ethics, Edinburgh University Medical School and Research Director, Institute of Medical Ethics. Roger Higgs is Professor of General Practice and Primary Care, Guy's, King's and St Thomas' School of Medicine, King's College London. Anthony J Pinching is Louis Freedman Professor of Immunology and Fellow, Department of Human Science and Medical Ethics, St Bartholomew's and The Royal London School of Medicine and Dentistry, Queen Mary & Westfield College, London.