

Creating better coronavirus science through AI tools

Partly driven by the desire to publish as much information about the novel coronavirus as quickly as possible, given the level of scientific and public interest, some research has been published that is not robust. This includes research that has not been subject to the usual rigors of peer review. Some findings appear as ‘preprints’, meaning research pending review (as opposed to bypassing review altogether). While preprints can be interesting, they are often subject to change and, in some cases, may never be formally published at all.

Perhaps the most serious case involved French researchers who reported that a combination of hydroxychloroquine and azithromycin could provide a new and effective treatment option for those infected with the SARS-CoV-2 virus. The original paper was submitted to Cell Research on 25 January 2020 and accepted on 28 January, after only three days in peer review.

Based on the French findings, the U.S. Food and Drug Administration (FDA) issued an emergency use authorization that led to the drug combination being rolled out as a treatment for patients. Later, more detailed research raised serious concerns about the efficacy of the hydroxychloroquine and azithromycin combination therapy, and the FDA reversed its decision.

On this subject, Dr Angela Rasmussen, a virologist and associate research scientist at Columbia University’s Center for Infection and Immunity, told the magazine The Biologist: “The pace at which data is coming out is fantastic but there have also been some studies that have either been just not great, or misinterpreted. And I see a lot of problems with press reports and press releases being treated as if they are data.”

How can more reliable research be identified? According to Professor Tudor Oprea (University of New Mexico), in an interview with Biotechniques, artificial intelligence (specifically machine learning) could provide new tools to assist with the evaluation of newly peer-reviewed papers.

The researcher explains that text mining (the rapid examination of millions of pages to identify specific patterns) offers a way to filter out ill-founded claims or poorly executed research.
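At its simplest, text mining of this kind means scanning large volumes of text for recurring phrases that signal how a study was conducted. The sketch below is purely illustrative (the abstracts and the list of "rigor markers" are invented for this example, not taken from any real system), but it shows the basic pattern-counting idea:

```python
import re
from collections import Counter

# Hypothetical mini-corpus of paper abstracts (illustrative only).
abstracts = [
    "Hydroxychloroquine combined with azithromycin reduced viral load in a small open-label study.",
    "A randomized controlled trial found no benefit of hydroxychloroquine over standard care.",
    "We review preprints on SARS-CoV-2 transmission; many have not completed peer review.",
]

# Phrases a text-mining pass might look for as markers of study design (assumed list).
rigor_markers = ["randomized controlled trial", "peer review", "open-label", "preprint"]

def count_markers(texts, patterns):
    """Count how often each marker phrase appears across the corpus."""
    counts = Counter()
    for text in texts:
        lowered = text.lower()
        for pattern in patterns:
            counts[pattern] += len(re.findall(re.escape(pattern), lowered))
    return counts

print(count_markers(abstracts, rigor_markers))
```

A real system would scan millions of documents and use far richer linguistic features, but the underlying step is the same: turn raw text into counts of meaningful patterns that can then be analyzed.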

What is proposed is a digital, machine learning-based method that can assess heterogeneous data in a short period of time, providing interpretation at a scale and speed well beyond what is possible through human analysis.
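One way such a method could flag weaker studies is by scoring text against features associated with methodological rigor. The toy function below is not Professor Oprea's actual model; it is a minimal sketch, with invented terms and weights, of how a learned scorer might rank abstracts once trained:

```python
# Assumed feature weights (in a real system these would be learned from
# labeled examples, not hand-picked as they are here).
WEIGHTS = {
    "randomized": 2.0,
    "double-blind": 2.0,
    "placebo": 1.5,
    "preprint": -1.0,
    "anecdotal": -2.0,
    "press release": -1.5,
}

def robustness_score(abstract: str) -> float:
    """Sum the weights of rigor-related terms present in an abstract."""
    text = abstract.lower()
    return sum(weight for term, weight in WEIGHTS.items() if term in text)

# Higher scores suggest stronger study design; lower scores suggest weaker evidence.
print(robustness_score("A randomized, double-blind, placebo-controlled trial"))
print(robustness_score("An anecdotal preprint report"))
```

The point of automating this step is speed: a scorer like this can triage thousands of new papers per hour, leaving human reviewers to focus on the borderline cases.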

Professor Oprea has outlined his model in the journal Nature Biotechnology, in a paper titled “Artificial intelligence, drug repurposing and peer review.”
