Currently, a visit to the dermatologist is your best chance of accurately identifying whether a mole could be cancerous. However, researchers at Stanford University have just developed a computer program which, using pictures of moles, can diagnose skin cancer as accurately as doctors.
How does it work?
The computer program was developed using a type of machine learning called deep learning. Machine learning is different to normal programming: rather than the programmer writing all the rules that the computer follows, we show the computer lots of data and let it work out the rules for itself.
Imagine for a second that you wanted to get the computer to tell you if a picture contained a cat or a dog. This may seem easy for us, but that is only because our brain is absolutely amazing at interpreting images. For the computer, this is a rather difficult problem. However, using machine learning we can show the computer thousands of images, and tell it which ones have cats and which ones have dogs. At the end of this process, we can show the computer images that it has never seen before and it can tell us whether it thinks that image has a dog or a cat in it based on the examples it has previously seen.
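The idea of learning a rule from labelled examples can be sketched in a few lines of code. This is a deliberately simple, hypothetical illustration (made-up four-"pixel" images and a basic nearest-neighbour rule), not the deep neural network the researchers actually used:

```python
# Toy supervised classification: learn from labelled examples, then
# label a new, unseen example by finding its closest match.

def distance(a, b):
    # Sum of squared pixel differences between two images.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify(image, training_set):
    # Predict the label of the nearest training example.
    label, _ = min(
        ((lbl, distance(image, example)) for example, lbl in training_set),
        key=lambda pair: pair[1],
    )
    return label

# Tiny made-up "images": four brightness values stand in for real photos.
training_set = [
    ([0.9, 0.8, 0.9, 0.7], "cat"),
    ([0.8, 0.9, 0.8, 0.9], "cat"),
    ([0.1, 0.2, 0.1, 0.3], "dog"),
    ([0.2, 0.1, 0.3, 0.2], "dog"),
]

# A new image the program has never seen before.
print(classify([0.85, 0.8, 0.9, 0.8], training_set))  # prints "cat"
```

A real deep learning system works on millions of pixels and learns far richer rules than "pick the closest example", but the shape of the process is the same: labelled examples in, a labelling rule out.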
The researchers who published this paper have done something similar to the example above; however, instead of cats and dogs, they showed the computer lots of examples of different skin conditions and told it whether each was benign (not dangerous) or malignant (cancerous).
Why this research is significant
The researchers used a lot of examples to teach the program how to identify cancerous moles (1.41 million in total!). The exciting thing about this research is that they’ve developed a program which is able to deal with pictures from different cameras and of varying quality. This is significant (and rather difficult), as previous attempts at automatically identifying cancerous moles had depended upon high quality images taken by experts.
This means that it may be possible for the program to be turned into an app that people could use on their phone from home in the future.
While this research is exciting from both a technical and clinical perspective, there are potentially a few issues we should be aware of:
- Blurry and far-away images were not used when testing the program, so pictures will have to be up-close and in focus.
- If such an app is developed, people may not bother to check all the moles on their body, or may miss some moles even with someone helping them.
- Since the detection of cancerous moles is based on probability, the program will need to balance caution (flagging moles as worth a visit to the doctor) against the inconvenience of so many false alarms that you might as well visit the doctor anyway.
- Since the texture of a mole can help with diagnosis, doctors will often rely on their sense of touch whilst diagnosing a mole, which obviously is not possible with an image.
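The probability trade-off in the list above boils down to where you set a decision threshold. Here is a minimal sketch with made-up numbers (the probabilities and thresholds are hypothetical, not from the paper):

```python
# A model outputs a probability that each mole is malignant; an app must
# pick a cut-off above which it tells you to see a doctor.

def flag_for_doctor(prob_malignant, threshold):
    # Refer the mole to a doctor if its probability crosses the threshold.
    return prob_malignant >= threshold

# Hypothetical model outputs for six moles.
probs = [0.05, 0.15, 0.35, 0.55, 0.75, 0.95]

cautious = [p for p in probs if flag_for_doctor(p, 0.3)]  # low threshold
lenient = [p for p in probs if flag_for_doctor(p, 0.7)]   # high threshold

print(len(cautious))  # prints 4 - more referrals, more false alarms
print(len(lenient))   # prints 2 - fewer referrals, more missed cancers
```

A cautious threshold catches more real cancers but sends more people to the doctor unnecessarily; a lenient one does the opposite. Choosing that cut-off is a clinical decision as much as a technical one.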
Aside from those issues, this is an exciting breakthrough in both the fields of computer science and medical research, and it will be interesting to see whether we will start to see software like this appearing in our phones over the next few years.
Link to the study: http://www.nature.com/nature/journal/vaop/ncurrent/full/nature21056.html