Engineers participating in a hackathon last weekend demonstrated an artificial intelligence that they say could someday detect cancerous moles. Although the program is currently in its infancy, the team hopes that enough user submissions could allow Doctor Hazel to predict skin cancer with at least 90 percent accuracy.
After one day and thousands of image downloads, the AI is identifying cancer at an 85 percent success rate, the team said during a presentation at the TechCrunch Disrupt San Francisco 2017 hackathon. The team is now inviting users to submit their own photos to further improve Doctor Hazel's performance.
“There’s a huge problem in getting AI data for medicine … no one wants to share,” Mike Borozdin, developer of Doctor Hazel, told TechCrunch. “But amazing results are possible. The more people share, the more accurate the system becomes.”
Doctor Hazel gauges 8,000 variables when viewing a sample to determine whether the image shows a mole, melanoma, another type of cancer, or nothing. The team plans to have an app accompany the platform, as well as an image-capturing device that could someday be made available for sale.
A fraught, crowded field
Apps, mobile platforms, and camera devices designed to evaluate moles and estimate skin cancer risk have a long history filled with successes and failures.
In 2011, the Skin Scan app launched with claims of melanoma detection using only the iPhone's camera, and rebranded as SkinVision in 2012. That same year, University of Michigan Health System physicians launched UMSkinCheck, featuring reminders and instructions for patients to self-examine their moles and skin lesions over time. Then, in 2014, experts built upon nearly a decade of image classification research to develop DermoScreen, an app that, paired with a $500 dermoscope, was able to detect 85 percent of melanoma cases.
However, the need for rigorous validation became more apparent in early 2015, when the Federal Trade Commission took action against the melanoma detection apps MelApp and Mole Detective. The FTC alleged that the marketers of both mole photography-based apps “deceptively claimed the apps accurately analyzed melanoma risk,” and that the marketers had insufficient evidence to make these claims.
Regardless, several apps and technologies targeting skin cancer detection have come to the forefront in the years since. In 2015, Oregon Health and Science University researchers developed a ResearchKit study that encourages users to track the growth of any moles using their smartphone camera and a dime for scale. Last year, iDoc24, which does business as First Derm, launched a smartphone-connected dermatoscope that, through a companion app, sends pictures of users' moles to a dermatologist for clinical evaluation.
A similar format is also employed by SkinVision, which, since its rebranding, has received funding from Leo Innovation Lab as well as CE certification. Weeks ago, VisualDx announced that its app, designed to support non-dermatologist physicians by quickly categorizing skin images, would debut on Apple's iOS 11. But perhaps Doctor Hazel's most direct and recent competition comes from Stanford's Artificial Intelligence Laboratory, which earlier this year released a study describing a convolutional neural network that matched the performance of 21 board-certified dermatologists. The Stanford researchers said they hope to bring their platform to smartphones and the general population.