
Health data platform: Transforming mental health treatment research

Researchers will be able to systematically study which talking therapies work best for different people.

A man expressing sadness with his head in his hands. Image by Tellmeimok. (CC BY-SA 4.0)

Oxford University researchers have been awarded £7 million to analyse UK National Health Service (NHS) talking therapy outcomes, with the aim of improving mental health care. A further £10 million has been granted to invest in new data research techniques.

Researchers at the University of Oxford’s Bennett Institute for Applied Data Science have received these two major awards from the medical charity Wellcome.

The first award will enable researchers to analyse anonymised NHS Talking Therapies data alongside GP medical records, in a highly secure setting. The Talking Therapies service is a major part of the NHS, delivering NICE-recommended psychological therapy for depression and anxiety disorders to over 670,000 patients in England each year.

The therapies include cognitive behavioural therapy, counselling, and guided self-help. Uniquely, outcome data is collected from 98 percent of people who complete a course of treatment.

Incorporating this data into the OpenSAFELY platform will help answer many vital questions about mental health treatment, including:

  • How talking therapies affect long-term health outcomes
  • Which approaches work best for specific conditions and patient groups
  • The best way to deliver services
  • The relationship between mental health treatments and physical health

The research will build on the success of OpenSAFELY, the secure analytics platform developed at Oxford during the COVID-19 pandemic. OpenSAFELY delivers whole population data analysis, using innovative new methods to protect patients’ privacy.

The platform’s findings directly informed UK public health policy decisions during the pandemic, particularly regarding protection for vulnerable groups.

Building on the Bennett Institute’s existing collaboration with NHS England, the project will also analyse outcome data from millions of patients who have used NHS Talking Therapies services while maintaining strict privacy controls.

Although the research uses patient data, researchers never interact directly with real patient records. This is achieved through new mechanisms for data linkage, in which datasets are minimised before moving between NHS England-controlled data centres.

Professor Ben Goldacre, Director of the Bennett Institute for Applied Data Science, based within the Nuffield Department of Primary Care Health Sciences, states: “This investment in mental health data will be transformative. Researchers will be able to systematically study which talking therapies work best for different people, using secure analysis of NHS GP data at unprecedented scale, for the first time.”

Goldacre adds: “This has the potential to fundamentally change how we deliver mental health care to patients in the NHS. In addition, the £10m data infrastructure investment will allow us to drive better use of data across the whole research community. We are hugely grateful to Wellcome for making this possible.”

Written By

Dr. Tim Sandle is Digital Journal's Editor-at-Large for science news. Tim specializes in science, technology, environmental, business, and health journalism. He is additionally a practising microbiologist and an author, with interests in history, politics, and current affairs.
