Should virtual therapists come with a health warning?
The Covid pandemic has had a profound impact on our mental well-being, with research from Durham Business School highlighting the scale of the challenge.
“It is true that many workers faced new demands on their time, such as the need to learn new technologies like Zoom or navigate improvised work procedures, and new financial demands, as well as facing the loss of essential financial resources,” the researchers say. “However, the change created a number of trade-offs for most people. There were different constraints on how people allocated their time, energy and money that didn’t necessarily have negative consequences.”
With the cost of living crisis, these pressures have not abated, and research from Cambridge University highlights how this can have a knock-on effect on our careers. The authors highlight the significant gap between those who need mental health support and those who receive it, citing data from the United States revealing that less than half receive the treatment they need.
In addition, the researchers note that the availability of treatment significantly affected people’s career trajectories, removing about a third of the income penalty nominally associated with bipolar disorder.
Virtual treatment
Research last year from the West Virginia School of Medicine showed that virtual therapy sessions conducted via video link during the pandemic were generally as effective as their in-person counterparts. Research in the British Medical Journal reveals, however, that such an approach still requires a trained mental health professional, and in England alone there is such a shortage that around 1.5 million people are awaiting mental health treatment.
It is not surprising, therefore, that in the United Kingdom two-fifths of patients waiting for mental health support ended up turning to crisis services, with an estimated eight million people sitting on waiting lists. Others in less dire straits could turn to the growing number of AI-based services. These apps have proven to be reasonably popular with users, especially when face-to-face alternatives are so hard to come by.
For example, a few years ago, research from Brigham Young University found that 90% of users reported feeling more motivated, more confident, and better in terms of their overall mental and emotional health.
“Our findings show that apps focused on mental and emotional health have the ability to change behavior in positive ways,” the authors say. “This is great news for people looking for affordable, easily accessible resources to help combat mental and emotional health challenges and illnesses.”
Accessible support
Several factors underpin this popularity, including low cost, ease of access, and the fact that they are available 24 hours a day and at times convenient to users. Indeed, research from the University of Southern California found that virtual therapists can be especially helpful in areas like PTSD, where people may feel uncomfortable sharing difficult things with another human being.
It is perhaps not surprising, therefore, that the most popular AI-based therapists have amassed millions of users. Indeed, clinical trials are underway to test whether such a service could become part of formal mental health care.
While few professionals believe that such services can replicate formal mental health care, given the paucity of that care they can provide good-enough support for people who urgently need help.
Mixed messaging
However, they are not without risk, especially given the variability of the service provided. For example, a study from the University of Sydney expressed concern about the way the apps are being marketed. The researchers evaluated 61 of the top mental health apps in the US, UK, Canadian and Australian markets, and a couple of core themes emerged in their marketing material.
The first of these is that poor mental health is ubiquitous among the population, and the second is that mental health is something that can be easily managed (with the help of apps, of course). The researchers believe this presents a number of problems.
“Implying that mental health problems are present in everyone promotes the medicalization of normal states,” say the researchers. “The apps we evaluated tended to encourage frequent use and promoted personal responsibility for improvement.”
The authors believe this messaging suggests that the normal ups and downs of everyday life require app treatment, even when they are relatively minor concerns. Using the apps for such problems is likely to be time-consuming with little real reward, and there is a risk of overdiagnosis. They also believe that the medical profession should play a bigger role in the application of mental health mobile apps, especially around more serious issues.
“At the same time, people with serious mental health problems can be helped by GPs or mental health workers in discussions about the limitations of using the apps and the importance of seeking additional forms of medical support when necessary,” they say.
Conflicting results
Other research from the University of Warwick found that while some mental health chatbots could deliver positive results, they were also at high risk of bias and conflicting results. In fact, the researchers were concerned that the apps were creating an illusion of help rather than actual help.
This variance is largely a consequence of the lack of meaningful industry regulation, with no government oversight requirements for many applications. In fact, during the pandemic, the FDA actually loosened the rules surrounding such apps.
It is perhaps not surprising, therefore, that research from Indiana University found that app providers consistently tweaked their terminology after the change to appear more medically approved.
Suffice to say that human therapists aren’t perfect either, and quality inevitably fluctuates, but considerable concerns remain that mental health apps struggle to recognize when people are in dire need.
There is no doubt that the mental health sector is in dire straits, but this shortage of supply does not mean that chatbots are the inevitable answer. Just because it is so difficult to receive face-to-face support doesn’t mean we should automatically accept substandard but readily available technological support.
To date, it seems that while the technology passes the test in terms of cost and accessibility, it is not certain that it will pass the test in terms of quality of care, especially for those with the most significant problems.