Midlands and Lancashire Commissioning Support Unit, West Bromwich, UK
Email: alison.turner14@nhs.net
Background
In a climate of increasing financial pressure and rising demand, local health and social care economies are being encouraged to develop new integrated models to transform the delivery of health and care services (NHS England, 2014; Naylor et al., 2015). Within England's National Health Service (NHS), the New Care Models programme was established by NHS England as part of the implementation of the Five Year Forward View (NHS England, 2014). A total of 50 vanguard sites were announced in 2015[1], to test five new care models:
- Integrated primary and acute care systems (PACS), which aim to integrate hospital, general practice, mental health and community services;
- Multispecialty community providers (MCP), which focus on moving specialist care from hospital to community settings;
- Enhanced health in care homes, which aims to integrate health, care and rehabilitation services for older people;
- Urgent and emergency care, which is designed to reduce pressures on emergency departments;
- Acute care collaborations, which are exploring different organisational models for collaboration.
There is a growing recognition of the complexity, ambiguity, volatility and uncertainty (Ghate et al., 2013; The Evidence Centre, 2010) inherent in public service transformation, and observers have advocated a “paradigm shift” (Cady and Fleshman, 2012) drawing on complex adaptive systems theory (Snowden and Boone, 2007; Best et al., 2012). This approach favours iterative and experimental change using co-produced solutions rather than a “big bang” approach.
Local health and care economies are required to demonstrate a clear evidence base for service reconfigurations (Nicholson, 2010). It is also recognised that this evidence base is critical to engaging people (Timmins, 2015) and to providing resilience (Wye et al., 2015). However, recent research has shown health planners’ use of evidence to be variable and inconsistent (Swan et al., 2012; Imison et al., 2015), pointing to a need for greater understanding of how evidence is perceived and used (Edwards et al., 2013).
This qualitative case study, sponsored by the Health Education West Midlands Research Fellows programme (part of the NHS in England), set out to explore how evidence is valued and applied in large-scale change in health and social care. The project aimed to capture a small sample of perspectives, to understand how evidence can be packaged to better support decision making. Specifically, the study considered:
- What constitutes evidence in the context of large-scale change in health care?
- Which evidence is deemed to be of value?
- What difference does evidence make?
- How is evidence framed to support decisions?
- What are some of the barriers to using evidence?
The case study centred on the New Care Models programme and was carried out across two units of analysis: the first focused on the New Care Models team within NHS England and the second on a purposive sample of New Care Models vanguard sites. This paper is a brief summary of some of the findings and conclusions.
Methods
Given the small scale of this study, data collection was focused on two of the six sources of evidence suggested by Yin (2014): interviews and documents.
Interviews
The total sample comprised:
- 6 senior managers from the New Care Models team within NHS England;
- 7 senior managers and clinicians from 5 vanguard sites.
Interviews were semi-structured, with questions focused around four key areas of inquiry, based on themes identified in the literature review: the context in which individuals are working; personal perspectives on evidence; processes for using evidence; and experiences of using evidence.
Documents
Key programme documentation from NHS England and the selected vanguard sites was also analysed, to contextualise and corroborate interview data (Yin, 2014).
Findings and discussion
Context
Interview participants were asked to describe their local context, in particular the challenges they face in designing, delivering and evaluating their vanguard programmes. The scale and pace of the change involved were referenced by several participants at both national and local level, indicative of the “zone of productive distress” described by Ghate et al. (2013) in their work on the complexity of change. There was a strong recognition that the vanguards are experiments to test new ways of delivering care and, as such, attract a great deal of interest across the health and social care sectors. One participant commented that this profile can lead to high expectations, including pressure to be innovative. There was a sense that whilst vanguards were being encouraged to be “bold” and “disruptive”, the daily pressures of financial and performance management remain, which can distract from or even conflict with the aims of the vanguard.
The responses from the various participants demonstrate the complexity of the task facing vanguards – the programmes are multi-faceted, involving new ways of working around contracting, information sharing, budgets, workforce and governance. The reality of working in a more system-oriented way was noted by participants from vanguard sites. In particular, the implications of financially challenged economies are a significant barrier to achieving expected efficiencies; for example: “our secondary care trust for example has a PFI [private finance initiative] and it’s fixed costs so it’s going to be difficult for us to defund it every time we save an admission. £1500 or whatever because ultimately they’ve got the same fixed costs.” [Interviewee G].
Alongside the financial challenges prevalent across much of the NHS, the vanguard programmes depend on robust relationships across organisations and sectors, as outlined by one of the participants. The challenge of managing sustainability against growing demand is often referred to as a “wicked problem”, a concept used by Grint (2008) to describe intractable problems in uncertain and unstable environments. Decisions and solutions have to be negotiated with multiple stakeholders, involving an element of consensus (Walshe and Rundall, 2001; Edwards et al., 2013):
“how you work in a system rather than a functional way. I think that is the bit where we’ve had to get our heads around – managing change in a completely different environment, you cannot use your old-fashioned hierarchical NHS [National Health Service] grip, command and control, that’s gone. There’s no system to pull. You can’t pull x and expect y to happen. So the approach is like spaghetti, so you will have to go through spaghetti hoops to actually get and you can’t be certain that you pull x and y will happen.” [Interviewee H].
Perspectives on evidence in whole system transformation
Participants were asked to reflect on what the term “evidence” meant to them personally. How evidence is defined is important – too narrow a definition risks missing sources relevant to the decisions to be made (Dobrow et al., 2004). The emphasis in medicine is very much on hierarchies of evidence, where experimental methods such as randomised controlled trials are viewed as the gold standard (Howick, 2011). It has been suggested that a broader definition is more appropriate in a management setting (Williams and Glasby, 2010; Briner et al., 2009), including theoretical knowledge, empirical research, expertise/judgement, evidence from the local context and the perspectives of those impacted by the decision being made. This is echoed in the interview responses, which suggest a broad interpretation of “evidence”, including quantitative data (on activity and outcomes), empirical research, staff expertise, consultation or co-production with patients and members of the public, in addition to organisational learning from prior programmes:
“I suppose it’s possibly information that helps support implementation of interventions to show a good or bad, positive or negative impact on a person, an individual or a society. And I think that evidence can vary from sort of hierarchical levels of write-up from randomized controlled trials to expert opinion.” [Participant J]
It was noted by one participant that “evidence has to be considered in its totality” [Participant B], acknowledging the need for the integration of these multiple types of evidence.
Evidence is perceived differently by individuals and this can be influenced by context, including professional background, experience, training and culture (Edwards et al., 2013; Swan et al., 2012). Participants were asked about what types of evidence they value; there was a suggestion here of the different lenses and “variance in value” suggested by Weber and Khademian (2008). Several participants mentioned patient-generated evidence as being of particularly high value, and this is reflected in recent research which highlights the impact of stories (Wye et al., 2015). There were also preferences towards practice-based evidence, particularly models applied successfully elsewhere; however, Edwards et al. (2013) caution that using practice-based evidence without critical analysis risks “being vulnerable to the latest fad”. The responses suggest that integration across sectors and co-production with citizens bring with them different perspectives on what constitutes valid evidence and how it is applied in practice.
In terms of the quality and characteristics of evidence, several participants highlighted the need for contextualised evidence: “the studies you do have are all being done on a slightly different basis. Some of them are compared to some sort of average, some of them are compared to their own previous performance – for varying periods of time, from various starting positions – y’know it’s really difficult.” [Participant K].
How evidence is used and applied in transformation
Participants made several references to the value of evidence during the earlier phases of transformation, specifically in establishing the case for change and informing the design of the programme and care model. There was little mention of the value of evidence to the implementation and evaluation of programmes and models, but this may reflect the current focus of programme teams on design.
Whilst one individual expressed a feeling of being overwhelmed by the volume of evidence (“There’s too much evidence out there actually, there’s not enough action” [Participant H]), another participant referenced a dearth of evidence (“So the evidence that we’ve used – there’s not a massive amount of it” [Participant K]), reflecting the “information poverty” and “information overload” described by MacDonald et al. (2011). It seems that for some aspects of new care models the evidence base is relatively sound; however, for other aspects, and for combinations of interventions, it can be lacking. There may also be different expectations of how evidence might inform decisions, which influence individuals’ perceptions. Some programmes are addressing such gaps through co-production with stakeholders, as evidenced at one site which has developed patient-derived outcomes to inform contracting and evaluation.
There were indications that programme teams were considering how they will contribute to an emerging evidence base, through evaluation but also through collaborations with academic or commercial partners. Knowledge sharing across the New Care Models programme appears to be particularly important to participants working at both national and local level. This is reflected in work by Best and Holmes (2010), who suggest that to encourage more evidence-informed practice, there needs to be more practice-informed evidence, through more evaluation.
“Although the vanguards have all got different titles, if you look at the common themes coming out of the vanguard programmes, a lot of them are around population planning and outcomes based delivery so I think there’s a lot of common themes that need to be shared and I think we’ve recognized that quite early” [Participant J].
However, one participant noted a potential tension between sharing learning quickly (supporting the “fail fast, learn fast” approach referenced by one participant) and risking the spread of poor practice.
Several researchers have commented on the reframing, integration and co-production of evidence in this context (Wye et al., 2015; Dobrow et al., 2004). A key challenge is translating findings from another setting to the local setting or, in relation to building the evidence base, generalising findings from a local setting. There is also a challenge in translating the evidence for a particular aspect of the programme so that it resonates with a programme-wide view.
Reflections on and experiences of using evidence
Participants were asked to reflect on their personal experiences of using evidence, specifically: expectations; barriers and enablers; and opportunities to improve the spread and use of evidence. Several participants shared their expectation that evidence simply wouldn’t be available, citing the experimental and innovative nature of the new care models.
Where evidence is available, it can be valuable in challenging assumptions (Lewis et al., 2013), understanding a problem and opening up communication (Kovner and Rundall, 2006), as noted by one of the participants:
“So you make assumptions and then you gather evidence, that either proves or disproves the assumption and then identify gaps and what you can do differently” [Participant M].
Participants mentioned a number of barriers and challenges to the effective use of evidence, at several different levels:
- Issues with the evidence base itself included: quality, completeness, relevance, timeliness, gaps, accessibility and replicability.
- At an individual level, participants referenced a lack of skills and confidence.
- At an organisational or programme level, participants identified time pressures, culture and capacity as particularly constraining.
- At a national programme level, there were some concerns that the support was “out of sync with delivery” [Participant H].
- At a wider system level, there was a sense that the fragmented nature of support, often with a competitive element, led to unnecessary duplication.
There is a growing evidence base on the barriers to using evidence (Edwards et al., 2013; Swan et al., 2012; Sosnowy et al., 2013; Humphries et al., 2014; Shepperd et al., 2013); understanding these is important if we are to avoid the underuse, overuse and misuse of evidence described by Walshe and Rundall (2001), which can lead to poor investment of resources and potential harm to patients.
Several ideas and suggestions were raised as opportunities to improve evidence use. Across a system such as the NHS, there are opportunities for a more systematic approach to generating practice-based evidence, with participants suggesting a collaborative approach to finding, translating and sharing evidence. It was noted that evidence is more useable and meaningful when it is accessible, succinct, “real” and acknowledges the importance of context, thus giving a sense not only of what works, but of why and how:
“it’s that context piece, it’s getting the context right, you know, this will not work if any of the following things are going on. So if you’ve got poor quality primary care this just won’t work. If you’ve got, you know, high turnover of staff above 25% this just won’t work. If you’ve got no access to Wi-Fi this just won’t work. And I suppose that’s the kind of why research you get these weird contradictory research papers that says this works and then says the same thing didn’t work because it doesn’t describe the context. […] And what is it you can vary. You can vary this, you absolutely can’t vary that because if you vary that it loses the essence of what it was. And those are the kind of tools I think people would find genuinely useful, kind of assessment tools to allow them to make an assessment of it” [Participant D].
The balance of timeliness and rigour is an enduring issue (Shaxson, 2005); importantly, given the time pressures involved, participants rated timeliness as critical:
“I think it’s the timeliness – it’s a key issue, because for me I’d rather have something that was 90% accurate quickly than 100% accurate in 6 months time because the pace at which we are expected to work doesn’t allow for that, if that makes sense.” [Participant L]
Support services clearly have a role in providing facilitation, synthesis and signposting, but several participants raised a potential risk of competition and duplication.
Conclusions
Evidence is important, particularly for informing design, building consensus and challenging assumptions. The findings suggest that whilst evidence is used to support the design of large-scale change, there is little to suggest this use is sustained through the lifecycle of the programme. The iterative change advocated by systems thinking is changing users’ requirements (demand side), but it seems that producers of evidence (supply side) have yet to respond with tailored services and products.
The findings have informed a change in my own approach – my team and I are working on a “living review”, inspired by the concept of the living systematic review (Elliott et al., 2014) and evidence mapping (Miake-Lye et al., 2016), to support formative evaluation of large-scale change (an approach also presented as a conference paper).
With regards to the wider system, there seems to be an opportunity for a more systematic and collaborative approach to the generation, identification and synthesis of evidence relating to large-scale change.
With regards to future research needs, a recent review (Langer et al., 2016) has explored what works in encouraging research uptake and there will be some important lessons here for health and social care and for knowledge brokers, in particular.
Keywords
Innovation, organizational; Evidence-based practice; Information specialists; Information dissemination; Organizational case studies; Decision making
References
Best, A., et al. (2012) Large-system transformation in health care: a realist review. Milbank Quarterly, 90 (3), 421-456.
Best, A. and Holmes, B. (2010) Systems thinking, knowledge and action: towards better models and methods. Evidence and Policy, 6 (2), 145-159.
Briner, R. B., et al. (2009) Evidence-based management: concept cleanup time? Academy of Management Perspectives, 23 (4), 19-32.
Cady, S. H. and Fleshman, K. J. (2012) Amazing change: stories from around the world. OD Practitioner, 44 (1), 4-10.
Dobrow, M. J., et al. (2006) The impact of context on evidence utilization: a framework for expert groups developing health policy recommendations. Social Science and Medicine, 63 (7).
Dopson, S. and Fitzgerald, L. (2005) Knowledge to action? Evidence-based health care in context. Oxford: Oxford University Press.
Edwards, C., et al. (2013) Explaining health managers’ information seeking behaviour and use. National Institute for Health Research.
Elliott, J.H. et al. (2014) Living systematic reviews: an emerging opportunity to narrow the evidence-practice gap. PLOS Medicine, 11 (2), e1001603.
The Evidence Centre. (2010) Complex adaptive systems: research scan. Health Foundation.
Ghate, D., et al. (2013) Systems leadership: exceptional leadership for exceptional times: synthesis paper. The Virtual Staff College.
Grint, K. (2008) Wicked problems and clumsy solutions: the role of leadership. Clinical Leader. British Association of Medical Managers.
Hardwick, R. et al. (2015) How do third sector organisations use research and other knowledge? A systematic scoping review. Implementation Science, 10, 84.
Howick, J. (2011) The philosophy of evidence-based medicine. Oxford: Wiley-Blackwell.
Humphries, S., et al. (2014) Barriers and facilitators to evidence-use in program management: a systematic review of the literature. BMC Health Services Research, 14, 171.
Imison, C. et al. (2015) Insights from the clinical assurance of service reconfiguration in the NHS: the drivers of reconfiguration and the evidence that underpins it – a mixed-methods study, Health Services and Delivery Research, 3 (9).
Kovner, A.R. and Rundall, T.G. (2006) Evidence-Based Management reconsidered. Frontiers of Health Services Management, 22 (3), 3-22.
Langer, L. et al. (2016) The science of using science: researching the use of research evidence in decision-making. London: EPPI-Centre, UCL Institute of Education.
MacDonald, J. et al. (2011) Information overload and information poverty: challenges for healthcare services managers? Journal of Documentation, 67 (2), 238-263.
McPake, B. and Mills, A. (2000) What can we learn from international comparisons of health systems and health system reform? Bulletin of the World Health Organization, 78, 811-820.
Miake-Lye, I.M. et al. (2016) What is an evidence map? A systematic review of published evidence maps and their definitions, methods and products. Systematic Reviews, 5, 28.
Naylor, C., et al. (2015) Transforming our health care system: ten priorities for commissioners. King’s Fund.
NHS England. (2014) Five Year Forward View. NHS England. URL: http://www.england.nhs.uk/wp-content/uploads/2014/10/5yfv-web.pdf (Accessed 16 April 2016).
Nicholson, D. (2010) NHS Reconfiguration Guidance. London: Department of Health. URL: www.gov.uk/government/uploads/system/uploads/attachment_data/file/216051/dh_118085.pdf (Accessed 14 April 2016).
Shaxson, L. (2005) Is your evidence robust enough? Evidence and Policy, 1 (1), 101-11.
Shepperd, S., et al. (2013) Challenges to using evidence from systematic reviews to stop ineffective practice: an interview study. Journal of Health Services Research and Policy, 18 (3), 160-166.
Snowden, D. J. and Boone, M. E. (2007) A leader’s framework for decision making. Harvard Business Review, 85 (11), 69-76.
Sosnowy, C. D., et al. (2013) Factors affecting evidence-based decision making in local health departments. American Journal of Preventive Medicine, 45 (6), 763-768.
Swan, J. et al. (2012) Evidence in management decisions (EMD): advancing knowledge utilization in healthcare management, NIHR Health Services and Delivery Research programme.
Timmins, N. (2015) The practice of systems leadership: being comfortable with chaos. King’s Fund.
Walshe, K. and Rundall, T. G. (2001) Evidence-based management: from theory to practice in health care. Milbank Quarterly, 79 (3), 429-457.
Weber, E. P. and Khademian, A. M. (2008) Wicked problems, knowledge challenges, and collaborative capacity builders in network settings. Public Administration Review, 68 (2), 334-349.
Williams, I. and Glasby, J. (2010) Making ‘what works’ work: the use of knowledge in UK health and social care decision-making. Policy and Society, 29, 95-102.
Wye, L. et al. (2015) Knowledge exchange in health-care commissioning: case studies of the use of commercial, not-for-profit and public sector agencies, 2011–14, Health Services and Delivery Research, 3 (19), 1-144.
Yin, R. K. (2014) Case study research: design and methods. London: Sage.
[1] https://www.england.nhs.uk/ourwork/futurenhs/new-care-models/