New research from Case Western Reserve University shows computer programmes outperforming human doctors in diagnosing common diseases.
The new tech relies on ‘deep learning’ and should push down the cost of healthcare while improving both the quality and accuracy of patients’ diagnoses.
Pictured – Anant Madabhushi
The ‘diagnostic imaging algorithms’ were developed at Case Western Reserve University by Anant Madabhushi, F. Alex Nason Professor II of biomedical engineering, and his team. Currently, the technology focuses on diagnosing heart failure and detecting various cancers.
Madabhushi believes his research will create a new set of diagnostic tools that could help identify patients with less aggressive forms of disease who do not need aggressive therapies.
Madabhushi’s computational imaging lab has received nearly $10 million from the US National Cancer Institute to create similar tools to analyse digital images of tissue from breast, lung, and head and neck cancers, to try to better identify patients who would respond well to less aggressive treatments.
“By providing [the pathologist or the radiologist] with decision support, we can help them become more efficient. For instance, the tools could help reduce the amount of time spent on cases with no obvious disease or obviously benign conditions and instead help them focus on the more confounding cases.” Anant Madabhushi – the F. Alex Nason Professor II of biomedical engineering at the Case School of Engineering
So far, the tools the team have demonstrated have produced exceptionally accurate results.
In one example, the computational-imaging system predicted with 97% accuracy which patients were showing evidence of heart failure. Two human doctors presented with the same information managed only 74% and 73% accuracy respectively.
How does the new radiomics and pathomics technology beat human experience?
The big advantage that computers have over humans is speed. The increased speed of analysis means that machines can study a far larger volume of information than their human counterparts.
The computer programmes developed by Madabhushi and his team can log, read, and compare hundreds of slides of tissue samples in the time it takes a human pathologist to view one slide.
The algorithms can then ‘learn’ and catalogue a sample’s shape and texture, and examine how it relates to the structure of surrounding glands and tissue, to work out how aggressive a medical issue is and its associated risks to a patient.
‘Deep learning’ means the programme can then create new algorithms which better compare and contrast all this data. In the long term, the team hope this means the system might be able to predict everything from whether a scanned nodule could be cancerous to how aggressive a disease could be.
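To give a flavour of the feature-extraction step described above, here is a toy sketch in Python. The patch values, the feature names, and the simple ‘roughness’ measure are illustrative assumptions for this article, not the team’s actual algorithms, which use far richer features and learned models:

```python
# Toy sketch: summarising the 'texture' of a grayscale tissue-image patch.
# A real pathomics pipeline extracts hundreds of such features per patch.

import statistics

def texture_features(patch):
    """Return simple texture statistics for a patch (rows of 0-255 ints)."""
    pixels = [p for row in patch for p in row]
    mean = statistics.mean(pixels)
    spread = statistics.pstdev(pixels)  # intensity variation across the patch
    # Crude edge measure: average absolute difference between horizontal neighbours.
    edges = [abs(row[i] - row[i + 1])
             for row in patch for i in range(len(row) - 1)]
    roughness = statistics.mean(edges)
    return {"mean": mean, "spread": spread, "roughness": roughness}

# Two hypothetical patches: a uniform (benign-looking) region and an
# irregular (suspicious-looking) one. Values are made up for illustration.
uniform = [[120, 122, 121], [119, 120, 122], [121, 120, 119]]
irregular = [[40, 200, 60], [220, 30, 190], [50, 210, 45]]

print(texture_features(uniform)["roughness"]
      < texture_features(irregular)["roughness"])  # True
```

A learning system would feed features like these, computed over thousands of patches, into a classifier that is trained against known outcomes, which is where the ‘deep learning’ the article describes comes in.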
Does this put doctors out of work?
The short version is no. Madabhushi sees the technology as a tool to help pathologists and radiologists interpret the data they are presented with as well as helping them make better decisions about treatment.
“I always use the example of Botswana, where they have a population of 2 million people—and only one pathologist that we are aware of,” he said. “From that one example alone, you can see that this technology can help that one pathologist be more efficient and help many more people.” Anant Madabhushi – the F. Alex Nason Professor II of biomedical engineering at the Case School of Engineering
The team hope that ultimately this technology will help make people’s lives better by improving their diagnosis and treatment options. The technology has so far developed rapidly, and it is sensible to assume that in the long term it will become commonplace.