Essen Economics of Mental Health Workshop

The third Essen Economics of Mental Health Workshop will take place on August 3 – 4, 2020 in Essen, Germany.

Half of those with a lifetime mental health problem first experience symptoms by the age of 14, and 75% before they reach their mid-twenties (Kessler et al., 2007). The conditions are often persistent and recurrent, meaning that they can affect the entire working life of the individuals concerned (OECD, 2012).

This workshop aims to gather (junior) researchers with an interest in applying the tools of economics to problems surrounding mental health. This includes, but is not limited to, mental health economics studies looking at informal care, loneliness, social exclusion, access to health care, insurance coverage, declines in physical health, age of onset, dementia, and suicide. Empirical analyses in this field are especially encouraged for submission.

Ezra Golberstein (University of Minnesota) and Martin Knapp (London School of Economics and Political Science) will deliver the keynotes for this workshop.

For further details, please see the flyer (which you may also use for further distribution) or consult the application page.

Chris Sampson’s journal round-up for 18th November 2019

Every Monday our authors provide a round-up of some of the most recently published peer reviewed articles from the field. We don’t cover everything, or even what’s most important – just a few papers that have interested the author. Visit our Resources page for links to more journals or follow the HealthEconBot. If you’d like to write one of our weekly journal round-ups, get in touch.

A conceptual map of health-related quality of life dimensions: key lessons for a new instrument. Quality of Life Research [PubMed] Published 1st November 2019

EQ-5D, SF-6D, HUI3, AQoL, 15D: they’re all used to describe health states for the purpose of estimating health state utility values, to get the ‘Q’ in the QALY. But it’s widely recognised (and evidenced) that they measure different things. This study sought to better understand the challenge by doing two things: i) ‘mapping’ the domains of the different instruments and ii) advising on the domains to be included in a new measure.

The conceptual model described in this paper builds on two standard models of health – the ICF (International Classification of Functioning, Disability, and Health), which is endorsed by the WHO, and the Wilson and Cleary model. The new model is built around four distinctions, which can be used to define the dimensions included in health state utility instruments: cause vs effect, specific vs broad, physical vs psychological, and subjective vs objective. The idea is that each possible dimension of health can relate, with varying levels of precision, to one or the other of these alternatives.

The authors argue that, conveniently, cause/effect and specific/broad map to one another, as do physical/psychological and objective/subjective. The framework is presented visually, which makes it easy to interpret – I recommend you take a look. Each of the five instruments previously mentioned is mapped to the framework, with the HUI and 15D coming out as ‘symptom’ oriented, EQ-5D and SF-6D as ‘functioning’ oriented, and the AQoL as a hybrid of a health and well-being instrument. Based (it seems) on the Personal Wellbeing Index, the authors also include two social dimensions in the framework, which interact with the health domains. Based on the frequency with which dimensions are included in existing instruments, the authors recommend that a new measure should include three physical dimensions (mobility, self-care, pain), three mental health dimensions (depression, vitality, sleep), and two social domains (personal relationships, social isolation).

This framework makes no sense to me. The main problem is that none of the four distinctions hold water, let alone stand up to being mapped linearly to one another. Take pain as an example. It could be measured subjectively or objectively. It’s usually considered a physical matter, but psychological pain is no less meaningful. It may be a ‘causal’ symptom, but there is little doubt that it matters in and of itself as an ‘effect’. The authors themselves even offer up a series of examples of where the distinctions fall down.

It would be nice if this stuff could be drawn up on a two-dimensional plane, but it isn’t that simple. In addition to oversimplifying complex ideas, I don’t think the authors have fully recognised the level of complexity. For instance, the work seems to be inspired – at least in part – by a desire to describe health state utility instruments in relation to subjective well-being (SWB). But the distinction between health state utility instruments and SWB isn’t simply a matter of scope. Health state utility instruments (as we use them) are about valuing states in relation to preferences, whereas SWB is about experienced utility. That’s a far more important and meaningful distinction than the distinction between symptoms and functioning.

Careless costs related to inefficient technology used within NHS England. Clinical Medicine Journal [PubMed] Published 8th November 2019

This little paper – barely even a single page – was doing the rounds on Twitter. The author was inspired by some frustration in his day job, waiting for the IT to work. We can all relate to that. The brief analysis tallies what the author calls ‘careless costs’, vaguely defined as time spent by an NHS employee on activity that does not relate to patient care. Supposing that all doctors in the English NHS wasted an average of 10 minutes per day on such activities, it would cost over £143 million (per year, I assume) based on current salaries. The implication is that a little bit of investment could result in massive savings.
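The arithmetic behind the headline figure is easy to reproduce. Here is a minimal back-of-envelope sketch, in which the doctor headcount, working days, and hourly salary cost are my own illustrative assumptions rather than the paper's inputs:

```python
# Back-of-envelope version of the 'careless costs' estimate.
# N_DOCTORS, WORKING_DAYS, and HOURLY_WAGE are illustrative assumptions,
# not figures taken from the article.
N_DOCTORS = 120_000        # doctors in the English NHS (assumed)
WASTED_MIN_PER_DAY = 10    # the paper's supposed daily IT downtime
WORKING_DAYS = 250         # working days per year (assumed)
HOURLY_WAGE = 29.0         # average hourly salary cost in GBP (assumed)

wasted_hours = WASTED_MIN_PER_DAY / 60 * WORKING_DAYS  # per doctor, per year
annual_cost = N_DOCTORS * wasted_hours * HOURLY_WAGE
print(f"£{annual_cost / 1e6:.0f}m per year")  # ~£145m under these assumptions
```

The point is how fragile the figure is: it scales linearly with every one of these assumed inputs, which is exactly why estimates of this kind should be read as speculative.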

This really bugs me, for at least two reasons. First, it is normal for anybody in any profession to have a bit of downtime. Nobody operates at maximum productivity for every minute of every day. If the doctor didn’t have their downtime waiting for a PC to boot, it would be spent queuing in Costa, or having a nice relaxed wee. Probably both. Those 10 minutes that are displaced cannot be considered equivalent in value to 10 minutes of patient contact time. The second reason is that there is no intervention that can fix this problem at little or no cost. Investments cost money. And if perfect IT systems existed, we wouldn’t all find these ‘careless costs’ so familiar. No doubt, the NHS lags behind, but the potential savings of improvement may very well be closer to zero than to the estimates in this paper.

When it comes to clinical impacts, people insist on being able to identify causal improvements from clearly defined interventions or changes. But when it comes to costs, too many people are confident in throwing around huge numbers of speculative origin.

Socioeconomic disparities in unmet need for student mental health services in higher education. Applied Health Economics and Health Policy [PubMed] Published 5th November 2019

In many countries, the size of the student population is growing, and this population seems to have a high level of need for mental health services. There are a variety of challenges in this context that make it an interesting subject for health economists to study (which is why I do), including the fact that universities are often the main providers of services. If universities are going to provide the right services and reach the right people, a better understanding of who needs what is required. This study contributes to this challenge.

The study is set in the context of higher education in Ireland. If you have no idea how higher education is organised in Ireland, and have an interest in mental health, then the Institutional Context section of this paper is worth reading in its own right. The study reports on findings from a national survey of students. This analysis is a secondary analysis of data collected for the primary purpose of eliciting students’ preferences for counselling services, which has been described elsewhere. In this paper, the authors report on supplementary questions, including measures of psychological distress and use of mental health services. Responses from 5,031 individuals, broadly representative of the population, were analysed.

Around 23% of respondents were classified as having unmet need for mental health services based on them reporting both a) severe distress and b) not using services. Arguably, it’s a sketchy definition of unmet need, but it seems reasonable for the purpose of this analysis. The authors regress this binary indicator of unmet need on a selection of sociodemographic and individual characteristics. The model is also run for the binary indicator of need only (rather than unmet need).
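The definition is mechanical enough to sketch in a few lines. The prevalence rates below are simulated for illustration and are not the survey's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_031  # sample size reported in the paper

# Simulated indicators (purely illustrative, not the survey responses)
severe_distress = rng.random(n) < 0.30   # assumed prevalence of severe distress
uses_services = rng.random(n) < 0.25     # assumed rate of service use

need = severe_distress                         # 'need' = severe distress
unmet_need = severe_distress & ~uses_services  # the paper's definition

print(f"need: {need.mean():.1%}, unmet need: {unmet_need.mean():.1%}")
```

In the paper, this binary unmet need indicator becomes the dependent variable in the regressions. The construction also shows one reading of why the definition is ‘sketchy’: anyone in severe distress who uses a service at all, however inadequately, does not count as having unmet need.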

The main finding is that people from lower social classes are more likely to have unmet need, but that this is only because these people have a higher level of need. That is, people from less well-off backgrounds are more likely to have mental health problems but are no less likely to have their need met. So this is partly good news and partly bad news. It seems that there are no additional barriers to services in Ireland for students from a lower social class. But unmet need is still high and – with more inclusive university admissions – likely to grow. Based on the analyses, the authors recommend that universities could reach out to male students, who have greater unmet need.


Chris Sampson’s journal round-up for 14th October 2019


Transparency in health economic modeling: options, issues and potential solutions. PharmacoEconomics [PubMed] Published 8th October 2019

Reading this paper was a strange experience. The purpose of the paper, and its content, is much the same as a paper of my own, which was published in the same journal a few months ago.

The authors outline what they see as the options for transparency in the context of decision modelling, with a focus on open source models and a focus on for whom the details are transparent. Models might be transparent to a small number of researchers (e.g. in peer review), to HTA agencies, or to the public at large. The paper includes a figure showing the two aspects of transparency, termed ‘reach’ and ‘level’, which relate to the number of people who can access the information and the level of detail made available. We provided a similar figure in our paper, using the terms ‘breadth’ and ‘depth’, which is at least some validation of our idea. The authors then go on to discuss five ‘issues’ with transparency: copyright, model misuse, confidential data, software, and time/resources. These issues are framed as questions, to which the authors posit some answers as solutions.

Perhaps inevitably, I think our paper does a better job, and so I’m probably over-critical of this article. Ours is more comprehensive, if nothing else. But I also think the authors make a few missteps. There’s a focus on models created by academic researchers, which oversimplifies the discussion somewhat. Open source modelling is framed as a more complete solution than it really is. The ‘issues’ that are discussed are at points framed as drawbacks or negative features of transparency, which they aren’t. Certainly, they’re challenges, but they aren’t reasons not to pursue transparency. ‘Copyright’ seems to be used as a synonym for intellectual property, and transparency is considered to be a threat to this. The authors’ proposed solution here is to use licensing fees. I think that’s a bad idea. Levying a fee creates an incentive to disregard copyright, not respect it.

It’s a little ironic that both this paper and my own were published, when both describe the benefits of transparency in terms of reducing “duplication of efforts”. No doubt, I read this paper with a far more critical eye than I normally would. Had I not published a paper on precisely the same subject, I might’ve thought this paper was brilliant.

If we recognize heterogeneity of treatment effect can we lessen waste? Journal of Comparative Effectiveness Research [PubMed] Published 1st October 2019

This commentary starts from the premise that a pervasive overuse of resources creates a lot of waste in health care, which I guess might be true in the US. Apparently, this is because clinicians have an insufficient understanding of heterogeneity in treatment effects and therefore assume average treatment effects for their patients. The authors suggest that this situation is reinforced by clinical trial publications tending to only report average treatment effects. I’m not sure whether the authors are arguing that clinicians are too knowledgeable and dependent on the research, or that they don’t know the research well enough. Either way, it isn’t a very satisfying explanation of the overuse of health care. Certainly, patients could benefit from more personalised care, and I would support the authors’ argument in favour of stratified studies and the reporting of subgroup treatment effects. The most insightful part of this paper is the argument that these stratifications should be on the basis of observable characteristics. It isn’t much use to your general practitioner if personalisation requires genome sequencing. In short, I agree with the authors’ argument that we should do more to recognise heterogeneity of treatment effects, but I’m not sure it has much to do with waste.

No evidence for a protective effect of education on mental health. Social Science & Medicine Published 3rd October 2019

When it comes to the determinants of health and well-being, I often think back to my MSc dissertation research. As part of that, I learned that a) stuff that you might imagine to be important often isn’t and b) methodological choices matter a lot. Though it wasn’t the purpose of my study, it seemed from this research that higher education has a negative effect on people’s subjective well-being. But there isn’t much research out there to help us understand the association between education and mental health in general.

This study adds to a small body of literature on the impact of changes in compulsory schooling on mental health. In (West) Germany, education policy was determined at the state level, so when compulsory schooling was extended from eight to nine years, different states implemented the change at different times between 1949 and 1969. This study includes 5,321 people, with 20,290 person-year observations, from the German Socio-Economic Panel survey (SOEP). Inclusion was based on people being born seven years either side of the cutoff birth year for which the longer compulsory schooling was enacted, with a further restriction to people aged between 50 and 85. The SOEP includes the SF-12 questionnaire, which includes a mental health component score (MCS). There is also an 11-point life satisfaction scale. The authors use an instrumental variable approach, using the policy change as an instrument for years of schooling and estimating a standard two-stage least squares model. The MCS score, life satisfaction score, and a binary indicator for MCS score lower than or equal to 45.6, are all modelled as separate outcomes.

Estimates using an OLS model show a positive and highly significant effect of years of schooling on all three outcomes. But when the instrumental variable model is used, this effect disappears. An additional year of schooling in this model is associated with a statistically and clinically insignificant decrease in the MCS score. The estimated effects on the likelihood of developing symptoms of a mental health disorder (as indicated by the MCS threshold of 45.6) and on life satisfaction were similarly insignificant. The same model shows a positive effect on physical health, which corresponds with previous research and provides some reassurance that the model could detect an effect if one existed.
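The pattern in these results – a positive OLS association that disappears under the instrument – is the classic signature of confounding, and can be sketched on simulated data. Everything below (the unobserved confounder, the coefficient values, the manual two-stage procedure) is an illustrative assumption, not the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000

# Simulated data: z is exposure to the schooling reform, 'ability' is an
# unobserved confounder, and the true effect of schooling on MCS is zero.
z = rng.integers(0, 2, n).astype(float)
ability = rng.normal(size=n)
schooling = 8 + z + 0.5 * ability + rng.normal(scale=0.5, size=n)
mcs = 50 + 0.0 * schooling + 2.0 * ability + rng.normal(size=n)

def ols(y, X):
    """Least-squares coefficients of y regressed on X."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Naive OLS: picks up the confounding through 'ability'
beta_ols = ols(mcs, np.column_stack([np.ones(n), schooling]))[1]

# Two-stage least squares: the first stage predicts schooling from the
# instrument; the second stage regresses the outcome on that prediction.
X1 = np.column_stack([np.ones(n), z])
schooling_hat = X1 @ ols(schooling, X1)
beta_2sls = ols(mcs, np.column_stack([np.ones(n), schooling_hat]))[1]

print(f"OLS: {beta_ols:.2f}, 2SLS: {beta_2sls:.2f}")
```

Here OLS recovers a spuriously positive coefficient, while 2SLS recovers something close to the true effect of zero. (A manual second stage like this gives the right point estimate but the wrong standard errors; proper IV software corrects them.)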

The specification of the model seems reasonable and a host of robustness checks are reported. The only potential issue I could spot is that a person’s state of residence at the time of schooling is not observed, and so their location at entry into the sample is used. Given that education is associated with mobility, this could be a problem, and I would have liked to see the authors subject it to more testing. The overall finding – that an additional year of school for people who might otherwise only stay at school for eight years does not improve mental health – is persuasive. But the extent to which we can say anything more general about the impact of education on well-being is limited. What if it had been three years of additional schooling, rather than one? There is still much work to be done in this area.

Scientific sinkhole: the pernicious price of formatting. PLoS One [PubMed] Published 26th September 2019

This study is based on a survey that asked 372 researchers from 41 countries about the time they spent formatting manuscripts for journal submission. Let’s see how I can frame this as health economics… Well, some of the participants are health researchers. The time they spend on formatting journal submissions is time not spent on health research. The opportunity cost of time spent formatting could be measured in terms of health.

The authors focused on the time and wage costs of formatting. The results showed that formatting took a median time of 52 hours per person per year, at a cost of $477 per manuscript or $1,908 per person per year. Researchers spend – on average – 14 hours on formatting a manuscript. That’s outrageous. I have never spent that long on formatting. If you do, you only have yourself to blame. Or maybe it’s just because of what I consider to constitute formatting. The survey asked respondents to consider formatting of figures, tables, and supplementary files. Improving the format of a figure or a table can add real value to a paper. A good figure or table can change a bad paper to a good paper. I’d love to know how the time cost differed for people using LaTeX.
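The reported figures can at least be cross-checked against one another. A quick sketch (noting that the paper mixes medians and means, so this is a rough consistency check rather than a reproduction):

```python
# Figures as reported in the paper (USD)
hours_per_year = 52     # median formatting time per person per year
cost_per_ms = 477       # cost per manuscript
cost_per_year = 1908    # cost per person per year

manuscripts_per_year = cost_per_year / cost_per_ms  # implied submissions
implied_wage = cost_per_year / hours_per_year       # implied hourly rate

print(f"{manuscripts_per_year:.0f} manuscripts/year at ~${implied_wage:.0f}/hour")
```

Four manuscripts a year out of 52 hours implies about 13 hours each, close to the 14-hour mean quoted above, so the reported numbers hang together reasonably well, even if the hours themselves seem outrageous.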
