The Robot Will See You Now, Even If It Won’t Replace Your Doc

A robot takeover of medicine? Not so fast, says Stanford University’s Daniel Rubin.

While AI can be a reliable medical tool, it’s not going to replace your doctor, said Rubin, an associate professor of biomedical data science, radiology, medicine and ophthalmology, speaking at the GPU Technology Conference this week.

The challenges researchers face — interpreting medical images, assessing how disease presents in patients, and monitoring patient responses to treatment — can all be supported by applications trained with deep learning, which reduce diagnostic error rates and improve clinical decision making.

“Physicians are caregivers, not computers. They don’t need to be replaced; what they need is help in taking care of patients,” said Rubin, director of biomedical informatics at the Stanford Cancer Institute and director of the Scholarly Concentration in Informatics at the Stanford School of Medicine.

His research focuses on the intersection of biomedical informatics and imaging science. This involves developing machine learning methods and applications to use information in images, combined with clinical and molecular data, to characterize disease.

Rubin’s group translates these methods into practice through applications to improve diagnostic accuracy and clinical effectiveness, with an end goal of identifying the best treatments. “Ideally, we would ‘profile’ disease for personalized medicine,” he said.

Image Assessments

What physicians care about in medical imaging is often quite subtle. That’s especially true when looking at black, white and gray images with dense areas that could signal a cancerous tumor is developing, he said.

Radiologists need to know the characteristics of a tumor: whether the edges of the affected area are straight or faceted, for example, and whether the shape is contained or has anemone-like fronds growing from it. Multiple images must also be taken using a variety of methods, from different angles, and over a period of time during the review process.

“Disease is dynamic, it changes over time,” Rubin said. Comparing images taken at different times lets you assess whether the disease is responding to treatment, he said.

Deep learning with convolutional neural networks is being used to detect breast masses in mammograms, identify early signs of diabetic retinopathy in ophthalmology, and look for signs of cancer during bladder exams.
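To make the idea concrete, here is a minimal sketch (not Stanford’s actual models, and no real patient data) of the core operation inside a convolutional network: sliding a small filter over an image. A hand-written “density” filter is convolved over a toy grayscale scan to highlight a dense, bright patch — the kind of subtle pattern a trained CNN learns to flag in mammograms. All names and values here are illustrative assumptions.

```python
def convolve2d(image, kernel):
    """Valid-mode 2D convolution over a grayscale image (nested lists of floats)."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            # Sum of element-wise products between the filter and the
            # image patch currently under it.
            acc = 0.0
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

# Toy 6x6 "scan": mostly dark background (0.1) with a dense bright 2x2 patch.
scan = [[0.1] * 6 for _ in range(6)]
for i in (2, 3):
    for j in (2, 3):
        scan[i][j] = 0.9

# A simple averaging filter: it responds strongly wherever tissue is dense.
# In a real CNN these weights are learned from labeled scans, not hand-set.
density_filter = [[0.25, 0.25],
                  [0.25, 0.25]]

response = convolve2d(scan, density_filter)
peak = max(max(row) for row in response)
print(round(peak, 2))  # strongest response sits over the dense patch
```

A production system stacks many such filter layers and learns their weights from thousands of labeled images, but the sliding-window computation above is the building block.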

These deep learning approaches are besting state-of-the-art computerized diagnosis algorithms, Rubin said. That’s boosting physicians’ confidence in using this information to guide decision making, making health care safer and more accurate, and improving patient outcomes.

The post The Robot Will See You Now, Even If It Won’t Replace Your Doc appeared first on The Official NVIDIA Blog.