
Paper
Development of clinical ethics services in the UK: a national survey
Anne Marie Slowther,1 Leah McClimans,2 Charlotte Price3

1 Warwick Medical School, University of Warwick, Coventry, UK
2 Department of Philosophy, University of South Carolina, Columbia, South Carolina, USA
3 Department of Public Health Epidemiology and Biostatistics, University of Birmingham, Birmingham, UK

Correspondence to Dr Anne Marie Slowther, Warwick Medical School, University of Warwick, Coventry, CV4 7AL, UK; a-m.slowther{at}warwick.ac.uk

Abstract

Background In 2001 a report on the provision of clinical ethics support in UK healthcare institutions identified 20 clinical ethics committees. Since then there has been no systematic evaluation or documentation of their work at a national level. Recent national surveys of clinical ethics services in other countries have identified wide variation in practice and scope of activities.

Objective To describe the current provision of ethics support in the UK and its development since 2001.

Method A postal/electronic questionnaire survey administered to the chairs of all 82 clinical ethics services registered with the UK Clinical Ethics Network in July 2010.

Results The response rate was 62%, with the majority of responding services situated in acute trusts. All services included a clinical ethics committee, with one service also having a clinical ethicist. Lay members were present in 72% of responding committees. Individual case consultation has increased since 2001, with 29% of chairs spending more than 50% of their time on this. Access to and involvement in the process of case consultation is less for patients and families than for clinical staff. There is wide variation in committee processes and levels of institutional support. Over half of the responding committees undertook some form of evaluation.

Conclusion Clinical ethics services in the UK are increasing as is their involvement in case consultation. However, the significant variation in committee processes suggests that further qualitative research is needed to understand how these committees function and the role they play in their institution.

  • Clinical ethics committee
  • health care survey
  • clinical ethics
  • resource allocation
  • primary care
  • applied and professional ethics
  • feminism
  • quality of health care
  • philosophical ethics
  • general

Introduction

Although clinical ethics services (CESs) have been a part of North American healthcare since the early 1970s,1 and in some European countries since the 1980s, clinical ethics committees (CECs) were first described in the UK only in the mid-1990s.2 In 2001 Slowther et al identified 20 established CECs in the UK3 and in the same year the UK Clinical Ethics Network (UKCEN) was established to facilitate CEC training and networking and to provide support to CECs.4 By 2011 the number of CECs registered with the network had risen to 84.5

In parallel with the development of CESs worldwide there has been a continuing commentary on the nature, goals and competencies of these services, including critiques of their legitimacy and effectiveness.6–8 The described activities of CESs include individual case consultation where current or retrospective cases are considered and recommendations are provided to relevant individuals, contributing as a member of the multidisciplinary team, providing input into organisational policies, promoting ethics leadership in the institution and facilitating moral reflection and deliberation among health professionals.4 9 10 The models of CES provision and the institutional and political contexts in which they operate are also diverse.11 CESs can manifest as CECs (as they typically do in the UK), individual ethics consultants or even small teams. This variation creates challenges for evaluation of CESs. Previous suggestions for evaluation criteria for clinical ethics services have included the quality and efficiency of a service as well as the access individuals have to it. The variation in CESs makes it difficult to know precisely what constitutes good quality, efficiency and access.12

A further challenge is the scarcity of good quality empirical evidence on the functioning of CESs. A small number of surveys of CESs in various countries have been published since 2005. Fox et al surveyed a random sample of all US hospitals and at each interviewed the ‘best informant’ on the provision of an ethics consultation service.13 They concluded that, although the prevalence of ethics consultation services was high, there was a marked variation in practice. Gaudine et al surveyed all hospitals in Canada with over 100 beds and found that 84% had a CEC.14 They noted the increasing number of CECs over the previous 20 years, but identified a need for research to define what the scope of activities should be. Pedersen et al surveyed all 24 CECs in Norwegian hospitals and conducted qualitative group interviews with nine of them.15 They identified a need to improve processes, clarify the committee's role and secure adequate organisational support. In 2009 Whitehead et al surveyed 70 UK CECs focusing on committee composition, member qualifications and the number of active or retrospective cases considered in the past year.16 They questioned whether the committee is the appropriate model for CESs in the UK.

This empirical evidence of the variation in practice and the need to clarify the goals of CESs highlights the importance of having a solid understanding of what is happening before developing models of evaluation. We report here a national survey of CESs in the UK with the aim of describing the current provision of clinical ethics support and its development since 2000. Specifically we looked at the following areas of interest:

  • The structure of UK CESs.

  • How UK CESs function in consultation, education and policy review.

  • The range of ethical issues that present to a CES in the UK.

  • The level of institutional support for CESs in the UK.

Method

We developed a questionnaire with reference to the literature on the evaluation of clinical ethics, including the descriptive surveys conducted in the USA and Canada.13 14 The questionnaire was refined through six telephone interviews with selected chairs of UK CECs. Their comments informed the development of a final version, which was piloted to test user acceptability and time taken to complete. The final instrument contained 57 questions for chairs of CECs or groups and 40 questions for individual clinical ethicists. The study was approved by Warwick University Biomedical Research Ethics Committee.

Sample

The study population was the CESs registered with UKCEN. Our experience in working with the network leads us to believe that most if not all CESs in the UK are registered with UKCEN.

Data collection

In July 2010 the survey was sent electronically and by post to all (N=82) UK CESs known to UKCEN. A reminder was emailed to all chairs 4 weeks after the original mailing with a follow-up telephone reminder at 8 weeks. The information provided by chairs was anonymous. Data collection concluded on 31 October 2010.

Data analysis

Data were entered into and analysed using SPSS V.18.0. Descriptive statistics including frequencies with corresponding percentages were used to summarise the survey results.
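To illustrate the kind of summary reported in the following section, the sketch below shows how frequency counts and corresponding percentages for a coded survey item could be produced. This is an illustrative example only: the authors' analysis was carried out in SPSS V.18.0, and the column name and response values used here are hypothetical.

    # Illustrative sketch only (not the authors' SPSS workflow): computes
    # frequencies and percentages of the kind reported in the Results section.
    # The column name 'q2_setting' and the response values are hypothetical.
    import pandas as pd

    # Hypothetical coded responses to a single survey item (e.g. Q2, type of institution)
    responses = pd.DataFrame({
        "q2_setting": ["acute trust", "acute trust", "hospice", "mental health trust",
                       "children's hospital", "acute trust", "other", None],
    })

    # Frequency counts and corresponding percentages, excluding non-responses
    counts = responses["q2_setting"].value_counts(dropna=True)
    percentages = (counts / counts.sum() * 100).round(1)

    summary = pd.DataFrame({"n": counts, "percent": percentages})
    print(summary)

As in the paper, percentages here are calculated against the number of responders to the question (non-responses are excluded) rather than the full sample.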

Results

Of the 82 CESs registered with UKCEN, 51 completed the survey (response rate 62%; see supplementary online data). All geographical regions were represented in the responses with the exception of Scotland. The majority of responding CESs (32/50, 64%) were located in acute trusts (general hospitals), with five services in hospices, four in mental health trusts and four in children's hospitals (Q2). Five belonged to the 'other' category and one did not respond to this question. Two chairs noted links between their service and primary care.

Description of clinical ethics services

The majority (43/51, 84%) of surveyed services had been active for ≥4 years (Q3). Although 71% (36/51) of chairs reported having some kind of administrative support, the amount varied considerably, ranging from 1 h per month to 15 h per week (Q44, Q44a). When asked about avenues of accountability, 22/51 (43%) chairs said they reported to the clinical governance director (responsible for quality and governance in clinical care), 18/51 (35%) reported to the trust (healthcare institution) board and 19/51 (37%) reported to a medical director (Q39). Some services reported more than one avenue of accountability.

All the surveyed CESs were described as a ‘formal committee or group with specific membership’ (Q4). One service described an individual ethicist in addition to a formal committee. The number of scheduled committee meetings ranged from two to 12 a year, with a median of six (Q5). Minimum duration of scheduled meetings was reported as 60 min with 22/50 (44%) responders to this question reporting meetings of longer than 90 min (Q7). Unscheduled meetings also occurred with 19 out of 41 responders (46%) reporting two or more unscheduled meetings a year (Q6).

Committee membership numbers ranged from eight to 33 (median 16, interquartile range 12–20) (Q10). Committees were multiprofessional, with most (36/50, 72%) having lay membership (Q11). A member with a legal background was reported by 36/50 (72%) committees (table 1). Forty-nine committees (all of those who responded to the question) reported at least one member with a masters or doctorate in clinical/medical ethics (Q18). Not all members with an ethics qualification worked as an academic ethicist.

Table 1

Membership of clinical ethics committees

General procedures and practices

When asked how new members were recruited, chairs reported using a range of strategies, often in combination (Q16). Personal recommendation from a member of the committee was the commonest method of recruitment (39/50, 78%), with receipt of unsolicited requests from someone interested in being on the committee (32/50, 64%) and advertising within the trust (29/50, 58%) also used frequently. Only four committees advertised externally for lay members. A selection process for new members was reported by 30/49 (61%) committees. Processes included interview (13/30, 43%), application form (12/30, 40%) and/or CV (14/30, 47%) (Q17).

Functioning

Chairs of the service were asked how many hours a month were devoted to CES work (Q19). Twenty-eight (55%) spent <5 h per month on clinical ethics work, but 5/51 (10%) spent more than 15 h a month. Although there was some variation, the main focus of chairs' time was on case consultation with 14/49 (29%) spending more than half of their time on this compared with 3/48 (6%) who spent more than half of their time on education and 5/48 (10%) who spent more than half of their time on policy review (Q20).

Consultation

Over half (27/51, 53%) of the chairs reported receiving 1–5 requests for consultation in the previous 12 months and almost a third (16/51, 31%) reported receiving 6–10 (Q21). Four services had 11 or more requests and a further four reported not receiving any consultation requests in the previous 12 months. Of the 43/49 (88%) chairs who reported having a mechanism for semi-urgent or emergency requests, 28/43 (65%) reported receiving 1–5 requests in the previous 24 months (Q32).

We defined two categories of consultation requests: institutional requests, which refer to organisational ethical issues, and individual clinical case requests, which refer to ethical issues involving a particular patient or family member. Some services reported institutional consultation requests about audit (7/51, 14%) or service evaluation (15/51, 29%) (Q22). The majority of consultation requests related to clinical cases, and table 2 shows the range of issues that formed the basis of these requests over the previous 12 months (Q27).

Table 2

Issues that formed the basis of individual case consultation request over previous 12 months (N=49)

When asked about access to consultations, almost all chairs reported accepting consultation requests from doctors (50/51, 98%) and nurses (48/51, 94%), with fewer accepting referrals from non-clinical members of staff such as social workers and chaplains (42/51, 82%) (Q23). Consultation requests were accepted from patients and family members by 25/51 (49%) services, and 12 services placed no restriction on who could request a consultation. Several respondents commented that in practice most consultation requests came from clinicians. Most services accepted written referrals (44/51, 86%) and/or verbal referrals (37/51, 73%), while 6/51 (12%) accepted referrals via email (Q24). Standardised forms for referrals were used by 19/51 (37%) of committees. For semi-urgent or emergency consultations, 42/45 (93%) services who responded to this question used a small group model for consultation (Q30).

In gathering information for individual clinical case consultations, 42/45 (93%) reported having a group meeting with one or more members of clinical staff and 11/45 (24%) reported including the patient in meetings (Q25). Individual meetings were reported with members of clinical staff other than the referrer (29/45, 64%) and with the patient and/or family (15/45, 33%). Just under half, 21/45 (47%), said that they examined the patient's medical records. When asked about the use of decision-making frameworks, over half of the responders to this question (28/50, 56%) reported using the ‘four principles’ approach in the previous 12 months. Other less commonly used frameworks were the Ethox approach (18/50, 36%), the Four Quadrant approach (5/50, 10%) and the Dilemma method (3/50, 6%). Some used more than one framework and 15/50 (30%) said that they did not use any (Q26).

All responding chairs reported using consensus for arriving at a conclusion or recommendation, although two also reported sometimes voting on the outcome (Q33). Records of consultations were most commonly made in meeting minutes (45/51, 88%), with 30/51 (59%) keeping a separate consultation summary and 17/51 (33%) making notes in the patient's medical records (Q34).

Education and policy review/development

Chairs identified a range of educational activities for clinical staff organised or supported by the CES in the previous 12 months, including grand rounds (a formal presentation to the trust community; 29/51, 57%), conferences (23/51, 45%) and clinical unit seminars (18/51, 35%) (Q35). However, 41/51 (80%) chairs reported that their service was not living up to its potential in this area (Q36). Involvement of services in policy review or development was variable, and 9/48 (18%) responding chairs said that they had not reviewed any policies in the previous 12 months (Q37). Of the 39 services that had reviewed institutional policies, 36 reported reviewing fewer than five. Thirteen committees stated that they had developed policies in the previous 12 months. Chairs reported a wide variation in the kinds of policies that their service reviewed or developed, but 22/49 (45%) had looked at policies regarding end-of-life decisions/resuscitation/care pathways (Q37).

Service development and evaluation

Just over half of responding chairs (28/47, 60%) identified trust financial support for members attending external courses or conferences (Q43). Twenty-three committees had access to in-house educational workshops. Ten chairs, however, reported no resources available for education of committee members.

Evaluation of the service was reported by 30/51 (59%) of responding chairs (Q42). In addition, 35/51 (69%) responding services submitted an annual report (Q40). The most common forms of evaluation were individual referrer feedback forms for case consultation (13/51, 25%), quantitative assessment of the ethics service's workload (12/51, 24%) and discussion with other CEC members (12/51, 24%).

Discussion

In 2001 Slowther et al predicted that the number of CECs in the UK would increase rapidly.3 At the time of our study there were 82 known CESs in the UK compared with the 20 identified in 2001.3 Our study shows that CESs have not only increased in number but have also expanded their workload and scope. Given that CESs in the UK exist as a result of interest within individual trusts rather than a national recommendation, this increase suggests a growing awareness in NHS trusts of ethical issues in patient care and the need to address them.

In 2001 most chairs of CECs reported that they concentrated on guideline and policy development.4 In 2010, while policy development and review was still an important feature of the CESs surveyed, only 10% of chairs reported spending more than half of their time on it. The educational function of CECs, seen in 2001 as important but difficult to deliver, was still underdeveloped in 2010. The main change in workload has come from increased engagement in consultation. In 2001 the eight committees receiving consultation requests had fewer than two requests a year.4 In 2010, 39% of committees reported receiving six or more requests a year. Moreover, the scope of consultation has widened to include institutional concerns as well as individual clinical cases.

Despite this increase in consultation activity, referral numbers are still low. This low referral rate has been criticised by some authors. Whitehead et al have suggested that the low referral rate reflects some of the difficulties of a committee model for CESs, citing lack of committee responsiveness as a barrier to effective and timely support.16 Fox et al, however, found similar low rates of case referrals in their USA survey, which included a more varied range of models including individual ethicists and small group consultation.13 While there is a need for qualitative research into why referral rates for ethics consultation are low and which model of support is most effective, the difficulty might lie less with the model of support and more with the level of organisational awareness of ethical difficulties and perceived need for support in resolving them.17 18

A characteristic of CESs in the UK compared with services in North America and in some European countries is the lack of regulatory oversight. While CESs in North America are not regulated directly, the accreditation process for healthcare organisations requires them to have a mechanism for addressing ethical issues that arise in the provision of patient care, and those mechanisms are evaluated as part of accreditation. In the UK, the lack of regulatory guidance, or indeed of national recognition of CESs, means that a variety of practices can proliferate without appropriate support or scrutiny. We consider some examples below.

Selection and training of members

We found that 39% of surveyed services did not have a selection process for committee members. The selection of members is a significant undertaking and should not be left to chance. Although there is no consensus about the criteria for membership of a CES,19 it is reasonable to expect transparency and consistency of recruitment and appointment processes.

In 2010 UKCEN published core competencies for CEC members, and similar competency statements have been published elsewhere, most notably by the American Society for Bioethics and Humanities (ASBH).20 21 While new members of CECs need not already possess these competencies, they should be expected to acquire some of them in the near future. It is therefore worrying that 20% of services surveyed received no support from their trust for the education of committee members. Concerns about appropriate training for members of CESs are not unique to the UK. Gaudine et al found that fewer than half of Canadian responders reported that their committee members received special training on joining the committee.14 Fox et al found that 45% of US ethics consultation providers had learnt independently without direct supervision by an experienced member of an ethics consultation service.13

Despite legitimate concerns about the training and qualifications of CES members, it is worth highlighting that every UK service responding to our survey reported at least one CEC member with a masters degree in clinical or medical ethics. Whitehead et al found similar results.16 This is a marked improvement on 2001 and also compares favourably with US membership: Fox et al found that only 5% of ethics consultants had completed a fellowship or a graduate programme in bioethics.13

Referrals and collection of information

The differential access afforded to clinicians and to patients and their families by UK committees has been noted previously.22 This is not unique to UK CESs, and the involvement of patients and their families in ethics consultation has been debated in other national contexts.23 24 We also found variation in the extent to which patients and family members are involved in the information-gathering process for consultation, raising questions about the quality of the information on which case deliberations are based. Information considered relevant by patients and families may be omitted, and their personal perspectives may be excluded or, at best, represented second hand.

Further qualitative research is required to explore the reasons for and consequences of differential access to CESs. Although we plan to investigate these issues in our future work, one contributing factor in the UK might be the reason for the establishment of CESs. CESs in the UK initially developed in response to a perceived need by clinicians for support in facing difficult ethical decisions in caring for their patients.3 It is perhaps not surprising that the primary focus of the service was to respond to clinicians rather than to the wider hospital community or to patients and families. In countries where there has been a more top-down approach to the development of CESs, one might expect the aims of the service to be more broadly set and access to be more inclusive. For example, in the USA Fox et al found that the stated primary goal in 94% of services was 'intervening to protect patients' rights'.13 Services which articulate this as their primary goal are more likely to provide access to patients and their representatives than a service whose primary goal is clinician support. It is therefore encouraging that half of UK CESs provided access to patients, although further work is needed to explore reasons for the differences between services.

Transparency and accountability

All but one of our responding services required a written referral for case consultation and all kept written records of consultations. Less than half (31%), however, recorded the outcome of the consultation in the patient's notes, and given the variation in the involvement of patients in the consultation process it seems likely that many patients are unaware of the deliberation that has taken place about their case. (In the UK an ethics consultation report is considered part of the patient record if it has an impact on the patient's care, and as such it is available for the patient to see under the Data Protection Act 1998.)25 We therefore suggest that CESs have an obligation to consider how patients are informed about this process. The nature of this obligation, whether procedural, ethical, legal, or all three, requires further discussion that is beyond the scope of this paper.

It is encouraging to note that 59% of committees have performed some form of evaluation of their service. This compares favourably with the US and Canadian contexts: Fox et al found that only 28% of ethics consultation services reported having a formal evaluation process, and Gaudine et al reported that over 50% of Canadian CECs said that their committees were not effective in evaluating their activities.13 14 Nonetheless, the model of evaluation in the UK is variable and we have no evidence on the robustness of these evaluations. The need for good evaluative data on CESs has been recognised in the clinical ethics literature, but little has been done to create the tools for its provision.26 Providing robust evaluative data is a serious challenge for all those involved in providing ethics support in clinical practice, and one which must be addressed by CESs in the UK if they wish to maintain and strengthen their position in a changing health service.

Limitations of the study

This study has some important limitations. It is possible that not all clinical ethics services in the UK are registered with UKCEN, although we think this is unlikely based on our experience of working with UK CECs over the past 10 years. Unlike in the USA, Canada and Norway, there is no regulatory or government requirement for UK healthcare institutions to have a CES, so most services are likely to be involved with the only national body explicitly addressing their needs. An alternative sampling method using NHS trusts would be unlikely to identify further services not registered with UKCEN. Although our response rate (62%) is reasonable, our findings may not be representative of all UK CESs. We did not receive completed surveys from some regions, and we did not receive any from the three UKCEN-registered CESs based in primary care trusts. Some of the individual questions in our survey had a low response rate, which may reflect the level of detailed knowledge that the questions required. Finally, all of our respondents were chairs of committees. Although the chair is likely to be the most knowledgeable informant on the service, he or she nonetheless provides only one perspective on the committee's activities.

Conclusion

Clinical ethics services in the UK are increasing in number, as is their involvement in consultation. This expansion over the past 9 years suggests that there is a perceived place for CESs in the NHS. While this growth is largely encouraging, it has been accompanied by variation in practices and procedures. These differences are not unique to UK CESs and can be found in similar data from other countries. Differences in how CESs manage their affairs are not necessarily problematic as long as the practices are justified. We suspect, however, that just as CESs have grown throughout the UK in an ad hoc manner, so too have their practices and procedures. Further qualitative research is needed to understand better how these services conduct their business and why, and evaluation criteria also need to be developed. Research evidence alone, however, is not sufficient to stimulate the use of justifiable practices. As the presence of CESs in the UK increases there is also a need for national bodies such as the NHS and the Department of Health to become involved in coordinating best practice guidelines. This involvement could take a number of different forms, but perhaps the most obvious starting point is the formal recognition of CESs in the UK. If CESs were recognised as a feature of the NHS, it would become easier to put in place the appropriate support and scrutiny that other healthcare services require.

Acknowledgments

The authors thank all clinical ethics committee chairs who completed the questionnaire. This project benefited from facilities funded through Birmingham Science City Translational Medicine Clinical Research and Infrastructure Trials Platform, with support from Advantage West Midlands.

References

Supplementary materials

  • Supplementary Data


Footnotes

  • Funding The study was part of a clinical ethics development project funded by the Ethox Foundation. The funding supports a research fellow and research secretary.

  • Competing interests AMS has been a member of the board of trustees of the UK Clinical Ethics Network since 2001 and is chair of the board with effect from 25 June 2011.

  • Patient consent The study was a questionnaire survey and responses were anonymous. Return of the form was taken as consent. It is not possible to retrospectively seek consent from individuals and to have obtained individual consent initially would have broken anonymity.

  • Ethics approval University of Warwick Biomedical Research Ethics Committee.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data sharing statement Further data from the questionnaire survey in relation to questions not covered in this paper are available on request to the corresponding author.
