Evaluating the ‘real world’ differences that new initiatives can make

Health economist Wayne Smith explains how evaluation at ICHP can help assess the future promise of innovations

No matter how great an idea is, without an evaluation measuring and evidencing its efficacy, cost effectiveness and budget impact, it’s unlikely to get off the ground within the NHS.

In a publicly funded service that’s constantly working under financial pressure, it’s not enough just to believe in a new service or innovation; you have to prove to others that an investment in it is worthwhile over the long term.

As a health economist at ICHP I lead on economic evaluations that can help push innovations such as products, services or new ways of working into practice. We’re not here just to rubber-stamp a conclusion either way – pass or fail. We’re here to support innovations through ongoing evaluation, offering a steer where we can by using solid data and recommending changes that may increase the positive impact.

We can source such evidence by evaluating an initiative against a peer or alternative approach, and we can look at new ways of working. For example, we’re currently evaluating new models for delivering services in North West London for an early adopter site implementing some of the recommendations from the national Better Births report.

Because we are strong believers in our role of supporting a positive impact for patients, our evaluations are always focused on health outcomes – perhaps even more so than other organisations offering similar evaluation services.  For example, we can look at increased appropriate referrals or reduced waiting times, or measures of wellbeing through qualitative analysis. We also measure value for money, which is where the cost effectiveness and budget impacts come in, but we always keep the patient at the heart of the evaluation.
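To give a flavour of what ‘cost effectiveness’ means in practice (this is the standard textbook formulation rather than the method used on any particular ICHP project), a new initiative is typically compared with current practice using an incremental cost-effectiveness ratio:

    ICER = (C_new − C_current) / (E_new − E_current)

Here C is the cost of each option and E is a measure of health outcome, such as quality-adjusted life years; the ratio expresses the extra cost incurred for each additional unit of health gained by the new initiative.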

When we’re starting a project we look in depth at the outcomes and establish which measures would be the most appropriate to focus on to evaluate the initiative. To do this we meet stakeholders and interview clinicians and other interested parties to gain qualitative feedback on what they want to measure. We carry out literature reviews to find similar projects that have been done in the past, identifying the outcomes and methods they used and building on those. We can also work with clinicians to understand how the pathway changes with a new product or a new type of approach, making sure we understand the existing treatment so that we can better evaluate the impact of a change on patients and clinicians.

Another part of our role involves working with data analysts to model service evaluations (to predict the expected results). We then enter a pilot phase during which we test the evaluation approach, identify any problems and recommend ways to mitigate them before moving forward with the actual evaluation.
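As a purely illustrative sketch of what such a model can involve (a deliberately simplified, hypothetical example with invented numbers, not the actual models we build with our analysts), a basic budget impact projection might compare expected spend under the current and new pathways over a few years of uptake:

# Hypothetical, simplified budget impact sketch: compares projected annual
# spend under the current pathway with a new service, using made-up uptake
# rates and unit costs purely for illustration.

def budget_impact(population, uptake_by_year, cost_current, cost_new):
    """Return the projected net budget impact for each year.

    population      -- number of eligible patients per year
    uptake_by_year  -- fraction of patients on the new service each year
    cost_current    -- cost per patient per year under current care
    cost_new        -- cost per patient per year under the new service
    """
    impacts = []
    for uptake in uptake_by_year:
        patients_new = population * uptake
        patients_current = population - patients_new
        spend = patients_new * cost_new + patients_current * cost_current
        baseline = population * cost_current
        impacts.append(spend - baseline)  # positive = extra spend, negative = saving
    return impacts

# Illustrative numbers only (not drawn from any real evaluation)
print(budget_impact(population=10_000,
                    uptake_by_year=[0.1, 0.3, 0.5],
                    cost_current=450.0,
                    cost_new=380.0))

A real service evaluation would, of course, draw its population, uptake and cost figures from the data sources and clinical input described here, and would usually test how sensitive the projection is to those assumptions.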

Wherever possible, we make sure we get a ‘sense check’ on our projects from a panel of subject area experts and other health economists. As a networked organisation we have strong links with other academics and are able to provide external validation from expert panels on our health economics work – something that few other similar organisations are able to offer.

We also have access (via our data partners Harvey Walsh) to data that others don’t, including a range of national data sources such as HES data (linked at the HES record level over 10 years via HES ID)*, national SUS data, QOF, ePACT, MHMDS and ONS data. We can also use the North West London Whole Systems Integrated Care (WSIC) database – a unique system which links primary, secondary, mental health and social care data for over two million people and 370 GP practices in North West London.

As well as working directly with NHS organisations, we’re often approached by private companies and new start-ups looking to supply products or services into the NHS. As part of digitalhealth.london, a programme aiming to speed up the development and scaling of digital innovations across health and care, we can work with SMEs who perhaps have an app and need an evaluation or with those already involved in an NHS project. We recently delivered an evaluation of three digital behaviour change programmes for patients with type 2 diabetes which was commissioned by the North West London collaboration of CCGs and involved working with the SMEs who provided some of the data.

We’ve also started working with SMEs earlier in their development process, helping them understand what an evaluation would entail and what sort of information and evidence they’ll need to get their product adopted by the NHS.

What’s important about our evaluations is that they are robust, they consider all relevant outcomes as agreed by stakeholders, and the methods (including the stats) are validated by an external panel of experts. We present them in the most user-friendly way possible, tailored for the specific audience, with results presented by CCG, practice, trust and/or hospital as appropriate. Deciding on the evaluation approach is a staged process: first we consult with the client to understand what they want to get from the evaluation, the background and the long-term goal; then we set out clear objectives and produce a report which tells the story.

The second phase is deciding what the end product will be – are they expecting, for example, a modelling tool that they can use internally or something that they want to share? Is the end product a report or an iterative model? We work to understand who our client’s ‘clients’ are so that we can produce the most appropriate output – for a CCG, is the evaluation aimed at a national or trust level audience? Do they want us to focus on just one audience or create separate evaluation reports for more than one? We can also support with communicating the findings, whether through a poster or another type of publication.

Our evaluation reports are always geared towards giving recommendations moving forwards, highlighting any areas for further work. Through our evaluations we act as both a facilitator and a supporter of progress and service improvements that truly have a positive impact for both the NHS and the patients it supports, and that’s something I’m really proud of.

 

*Derived HES outputs are created in partnership with Harvey Walsh Ltd, who work collaboratively with ICHP in the provision of data insights from their proprietary database AXON 360 (Harvey Walsh NHS Digital DSA: DARS-NIC-05934-M7V9K)