Rita Faria’s journal round-up for 10th December 2018

Every Monday our authors provide a round-up of some of the most recently published peer reviewed articles from the field. We don’t cover everything, or even what’s most important – just a few papers that have interested the author. Visit our Resources page for links to more journals or follow the HealthEconBot. If you’d like to write one of our weekly journal round-ups, get in touch.

Calculating the expected value of sample information using efficient nested Monte Carlo: a tutorial. Value in Health [PubMed] Published 17th July 2018

The expected value of sample information (EVSI) represents the added benefit from collecting new information on specific parameters in future studies. It can be compared to the cost of conducting these future studies to calculate the expected net benefit of sampling. The objective is to help inform which study design is best, given the information it can gather and its costs. The theory and methods to calculate EVSI have been around for some time, but we rarely see it in applied economic evaluations.

In this paper, Anna Heath and Gianluca Baio present a tutorial on how to implement a method they previously published, which is more computationally efficient than standard nested Monte Carlo simulation.
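To make the computational challenge concrete, here is a minimal sketch of the standard nested Monte Carlo estimator of EVSI, i.e. the expensive baseline that efficient methods like Heath and Baio's improve upon. All numbers and the linear net-benefit function are invented for illustration, and a conjugate normal model is used so the posterior can be sampled directly:

```python
import random
import statistics

random.seed(1)

# Hypothetical decision model: incremental net benefit of a new treatment,
# INB(theta) = 1000 * theta - 200, where theta is the treatment effect.
def inb(theta):
    return 1000 * theta - 200

# Invented prior on theta and a proposed study of n observations with sd sigma.
mu0, sd0 = 0.21, 0.10
n, sigma = 50, 0.5

def posterior(ybar):
    # Conjugate normal update with known sampling variance.
    prec = 1 / sd0**2 + n / sigma**2
    mean = (mu0 / sd0**2 + n * ybar / sigma**2) / prec
    return mean, (1 / prec) ** 0.5

K, M = 1000, 200  # outer (datasets) and inner (posterior) simulation sizes

# Value of the current decision under the prior: adopt only if E[INB] > 0.
prior_value = max(0.0, inb(mu0))

outer = []
for _ in range(K):
    theta = random.gauss(mu0, sd0)              # simulate a "true" effect
    ybar = random.gauss(theta, sigma / n**0.5)  # simulate the study's mean
    pm, psd = posterior(ybar)
    # Inner loop: posterior expectation of INB by Monte Carlo.
    e_inb = statistics.fmean(inb(random.gauss(pm, psd)) for _ in range(M))
    outer.append(max(0.0, e_inb))               # best decision given the data

evsi = statistics.fmean(outer) - prior_value
print(f"EVSI per patient: {evsi:.1f}")
```

The nesting is what makes this costly: K×M model evaluations here, and far more for a realistic cost-effectiveness model, which is exactly why more efficient estimators are needed in practice.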

The authors start by explaining the method in theory, then illustrate it with a simple worked example. I’ll admit that I got a bit lost with the theory, but I found that the example made it much clearer. They demonstrate the method’s performance using a previously published cost-effectiveness model. Additionally, they have very helpfully published a suite of functions to apply this method in practice.

I really enjoyed reading this paper, as it takes the reader step-by-step through the method. However, I wasn't sure when this method is applicable, given that the authors note that it requires a large number of probabilistic simulations to perform well and is only appropriate when the EVPPI is high. The issue is: how large is large, and how high is high? Hopefully, these and other practical questions are on the list for this brilliant research team.

As an applied researcher, I find tutorial papers such as this one incredibly useful for learning new methods and implementing them in practice. Thanks to work like this, we're getting closer to making value of information analysis a standard element of cost-effectiveness studies.

Future costs in cost-effectiveness analyses: past, present, future. PharmacoEconomics [PubMed] Published 26th November 2018

Linda de Vries, Pieter van Baal and Werner Brouwer help illuminate the debate on future costs with this fascinating paper. Future costs are the costs of resources used by patients during the years of life added by the technology under evaluation. They can be classified as related or unrelated, depending on whether the resources are used to treat the target disease, and as medical or non-medical, depending on whether the costs fall on the healthcare budget.

The authors very skilfully summarise the theoretical literature on the inclusion of future costs. They conclude that future related and unrelated medical costs should be included and present compelling arguments to do so.

They also discuss empirical research, such as studies that estimate future unrelated costs. The references are a useful starting point for other researchers. For example, I noted that there is a tool to include future unrelated medical costs in the Netherlands and some studies on their estimation in the UK (see, for example, here).

There is a thought-provoking section on ethical concerns. If unrelated costs are included, technologies that increase the life expectancy of people who need a lot of resources will look less cost-effective. The authors suggest that these issues should not be concealed in the analysis, but instead dealt with in the decision-making process.

This is an enjoyable paper that provides an overview of the literature on future costs. I highly recommend it to get up to speed with the arguments and the practical implications. There is clearly a case for including future costs, and the question now is whether cost-effectiveness practice follows suit.

Cost-utility analysis using EQ-5D-5L data: does how the utilities are derived matter? Value in Health Published 4th July 2018

We’ve recently become spoilt for choice when it comes to the EQ-5D. To obtain utility values, just in the UK, there are a few options: the 3L tariff, the 5L tariff, and crosswalk tariffs by Ben van Hout and colleagues and Mónica Hernandez and colleagues [PDF]. Which one to choose? And does it make any difference?

Fan Yang and colleagues have done a good job of getting us closer to the answer. They estimated utilities from EQ-5D-5L data using both the 5L value set and crosswalk tariffs to the EQ-5D-3L, and tested the values in cost-effectiveness models of hemodialysis compared with peritoneal dialysis.

Reassuringly, hemodialysis always had greater utilities than peritoneal dialysis. However, the magnitude of the difference varied with the approach. Therefore, the choice between the EQ-5D-5L value set and a crosswalk tariff to the EQ-5D-3L can influence the cost-effectiveness results. These results are in line with earlier work by Mónica Hernandez and colleagues, who compared the EQ-5D-3L with the EQ-5D-5L.
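To see why the choice of tariff matters, consider a deliberately invented example (the costs, utilities and threshold below are hypothetical and not taken from the paper): the same patients' responses, valued under two different tariffs, can put the ICER on opposite sides of a cost-effectiveness threshold.

```python
# Hypothetical illustration: tariff choice can flip a funding decision.
cost_diff = 6000.0   # invented incremental cost of hemodialysis
years = 5.0          # invented time horizon

# Made-up mean utilities for the same patients under two value sets.
u_hd = {"5L value set": 0.70, "3L crosswalk": 0.66}  # hemodialysis
u_pd = {"5L value set": 0.64, "3L crosswalk": 0.62}  # peritoneal dialysis

threshold = 20000.0  # invented cost-per-QALY threshold
for tariff in u_hd:
    # Incremental QALYs = utility gain times the time horizon.
    icer = cost_diff / ((u_hd[tariff] - u_pd[tariff]) * years)
    verdict = "cost-effective" if icer <= threshold else "not cost-effective"
    print(f"{tariff}: ICER = {icer:,.0f} per QALY -> {verdict}")
```

With these invented numbers, the 5L value set yields an ICER of 20,000 per QALY (at the threshold) while the 3L crosswalk yields 30,000 (above it), so the utility source alone determines the verdict.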

The message is clear: both the type of EQ-5D questionnaire and the EQ-5D tariff make a difference to the cost-effectiveness results. This can have huge policy implications, as decisions by HTA agencies, such as NICE, depend on these results.

Which EQ-5D to use in a new primary research study remains an open question. In the meantime, NICE recommends the EQ-5D-3L or, if the EQ-5D-5L was collected, Ben van Hout and colleagues' mapping function to the EQ-5D-3L. Hopefully, a definitive answer won't be long in coming.


Meeting round-up: ISPOR Europe 2018 (part 2)

Missed ISPOR Europe 2018 but eager to know all about it? Time to keep reading! In yesterday's post, I wrote about ISPOR's outstanding short-course on causal inference and the superb sessions I attended on day 1. This blog post is about day 2, Tuesday 13th, which was another big day.

The second plenary session was on fairness in pharmaceutical pricing. It was moderated by Sarah Garner, with presentations by many key stakeholders. The thought-provoking discussion highlighted the importance of pharmaceutical pricing policy and the large role that HTA can have in shaping it.

The next session was on communicating cost-effectiveness analysis, in which Rob Hettle, Gabriel Rogers, Mike Drummond and I discussed the pitfalls of, and approaches to, explaining cost-effectiveness models to non-health economists. This was a hugely popular session! We were delighted by the incredibly positive feedback we received, which reassured us that we are clearly not alone in finding it difficult to communicate cost-effectiveness analysis to a lay audience. We certainly feel incentivised to continue working on this topic. The slides are available here; for the audience's feedback, search Twitter for #communicateCEA.

Lunch was followed by the open meeting of the ISPOR Women in HEOR Initiative, with Shelby Reed, Olivia Wu and Louise Timlin. It is really encouraging to see ISPOR taking a proactive stance on gender balance!

The most popular session in the afternoon was Valuing a cure: Are new approaches needed, with Steve Pearson, Jens Grueger, Sarah Garner and Mark Sculpher. The panel presented the various perspectives on the pricing of curative therapies. Payers call for a sustainable pricing model, whilst pharma warns that pricing policy is inextricably linked to the incentives for investment in research. I agree with Mark that these challenges are not unique to curative therapies. As pharmaceutical therapies deliver greater health benefits at ever larger costs, it is pressing that cost-effectiveness assessments are also able to consider the opportunity cost of funding more costly treatments. See here for a round-up of the estimates already available.

I then attended the excellent session on Drug disinvestment: is it needed and how could it work, moderated by Richard Macaulay. Andrew Walker explained that HTA agencies’ advice does not always go down well with local payers, highlighting this with an amusing imaginary dialogue between NICE and a hospital. Detlev Parow argued that payers find that prices are often unaffordable, hence payment schemes should consider other options, such as treatment success, risk-sharing agreements and payment by instalments. Bettina Ryll made an impressive case from the patients’ perspective, for whom these decisions have a real impact.

The conference continued late into the evening and, I suspect, long into the early hours of Wednesday, with the ever-popular conference dinner. Wednesday was another day full of fascinating sessions. The plenary was titled Budget Impact and Expenditure Caps: Potential or Pitfall, moderated by Guillem López-Casasnovas. It was followed by inspiring sessions exploring a wide range of topics, presented by leading experts in their fields. These really delved into the nitty-gritty of subjects such as using R to build decision models, the value of diagnostic information, and expert elicitation, to name just a few.

I don't think I'm just speaking personally when I say that ISPOR Barcelona was an absolutely brilliant conference! I've mentioned here a few of the most outstanding sessions, but there were many, many more. There were so many sessions at the same time that it was physically impossible to attend all of those with a direct relevance to my research. Fortunately, we can access all the presentations by downloading them from the ISPOR website. I'll leave a suggestion for ISPOR here: think about filming some of the key sessions and broadcasting them as webinars after the conference. This could become another key resource for our sector.

As in previous editions, ISPOR Barcelona truly confirms ISPOR Europe's place among the top HTA conferences in Europe, if not the world. It expertly combines cutting-edge methodological research with outstanding applied work, all with a view to better informing decision-making. As I'm sure you can guess, I'm already looking forward to the next ISPOR Europe, in Copenhagen on 2nd-6th November 2019, and the amazing sessions which will indubitably be featured!


Meeting round-up: ISPOR Europe 2018 (part 1)

ISPOR Europe 2018, which took place in Barcelona on the 10th-14th November, was an exceptional conference. It had a jam-packed programme on the latest developments and most pressing challenges in health technology assessment (HTA), economic evaluation and outcomes research. In two blog posts, I’ll tell you about the outstanding sessions and thought-provoking discussions in this always superb conference.

For me, proceedings started on Sunday, with the excellent short-course Adjusting for Time-Dependent Confounding and Treatment Switching Bias in Observational Studies and Clinical Trials: Purpose, Methods, Good Practices and Acceptance in HTA, by Uwe Siebert, Felicitas Kühne and Nick Latimer. Felicitas Kühne explained that causal inference methods aim to estimate the effect of a treatment, risk factor etc. on our outcome of interest, controlling for other exposures that may affect it and hence bias our estimate. Uwe Siebert and Nick Latimer provided a really useful overview of the methods to overcome this challenge in observational studies and RCTs with treatment switching. This was an absolutely brilliant course. Highly recommended to any health economist!

ISPOR conferences usually start early and finish late with loads of exceptional sessions. On Monday, I started the conference proper with the plenary Joint Assessment of Relative Effectiveness: “Trick or Treat” for Decision Makers in EU Member States, moderated by Finn Børlum Kristensen. There were presentations from representatives of payers, HTA agencies, EUnetHTA, pharmaceutical industry and patients. The prevailing mood seemed to be of cautious anticipation. Avoiding duplication of efforts in the clinical assessment was greatly welcomed, but there were some concerns voiced about the practicalities of implementation. The proposal was due to be discussed soon by the European Commission, so undoubtedly we can look forward to knowing more in the near future.


My next session was the fascinating panel on the perils and opportunities of advanced computing techniques with the tongue-in-cheek title Will machines soon make health economists obsolete?, by David Thompson, Bill Marder, Gerry Oster and Mike Drummond. Don’t panic yet as, despite the promises of artificial intelligence, I’d wager that our jobs are quite safe. For example, Gerry Oster predicted that demand for health economic models is actually likely to increase, as computers make our models quicker and cheaper to build. Mike Drummond finished with the sensible suggestion to simply keep calm and carry on modelling, as computing advances will liberate our time to explore other areas, such as the interface with decision-makers. This session left us all in a very positive mood as we headed for a well-earned lunch!

There were many interesting sessions in the afternoon. I chose to pop over to the ISPOR Medical Device and Diagnostic Special Interest Group Open Meeting, the ISPOR Portugal chapter meeting, along with taking in the podium presentations on conceptual papers. Many of the presentations will be made available in the ISPOR database, which I recommend exploring. I had a wonderful experience moderating the engaging podium session on cancer models, with outstanding presentations delivered by Hedwig Blommestein, Ash Bullement, and Isle van Oostrum.

The workshop Adjusting for post-randomisation confounding and switching in phase 3 and pragmatic trials to get the estimands right: needs, methods, sub-optimal use, and acceptance in HTA by Uwe Siebert, Felicitas Kühne, Nick Latimer and Amanda Adler is one worth highlighting. The panellists showed that some HTAs do not include any adjustments for treatment switching, whilst adjustments can sometimes be incorrectly applied. It reinforced the idea that we need to learn more about these methods, to be able to apply them in practice and critically appraise them.

The afternoon finished with the second session of the day on posters. Alessandro Grosso, Laura Bojke and I had a poster on the impact of structural uncertainty in the expected value of perfect information. Alessandro did an amazing job encapsulating the poster and presenting it live to camera, which you can watch here.


In tomorrow’s blog post, I’ll tell you about day 2 of ISPOR Europe 2018 in Barcelona. Tuesday was another big day, with loads of outstanding sessions on the key topics in HTA. It featured my very own workshop, with Rob Hettle, Gabriel Rogers and Mike Drummond on communicating cost-effectiveness analysis. I hope you will stay tuned for the ISPOR meeting round-up part 2!