A new artificial intelligence (AI) program can effectively identify potential melanoma in wide-field photos, researchers say.
The system could use photographs of large areas of patients’ bodies taken with ordinary cameras in primary care or by the patients themselves to screen for early-stage melanoma, said Luis R. Soenksen, PhD, a postdoctoral associate and venture builder at Massachusetts Institute of Technology in Cambridge, Mass.
“We believe we’re providing technology for that to happen at a massive scale, which is what is needed to reduce mortality rates,” he said in an interview.
He and his colleagues published their findings in Science Translational Medicine.
Diagnosing skin lesions has already proved one of the most promising medical applications of AI. In a 2017 paper, researchers reported that a deep neural network had classified skin lesions more accurately than did dermatologists. But so far, most such programs depend on experts to preselect the lesions worthy of analysis. And they use images from dermoscopy or single-lesion near-field photography.
Soenksen and colleagues wanted a system that could assess lesions over wide areas of anatomy using various cameras, such as those in smartphones, under a range of conditions.
So they programmed their convolutional neural network to use two screening approaches simultaneously. Like the earlier systems, theirs looks for characteristics of individual lesions, such as asymmetry, border unevenness, color distribution, diameter, and evolution (ABCDE). But it also assesses lesion saliency, comparing all of a patient’s lesions with one another to identify the “ugly ducklings” that stand out from the rest.
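The ugly-duckling idea lends itself to a simple illustration. The sketch below is not the authors’ method (their system is a convolutional neural network trained on wide-field images); it only assumes that each lesion has already been reduced to a numeric feature vector and shows how an outlier score could be computed by comparing each lesion with the patient’s other lesions. The feature names and values are hypothetical.

```python
import numpy as np

def ugly_duckling_scores(features: np.ndarray) -> np.ndarray:
    """Score how much each lesion stands out from the same patient's other lesions.

    features: (n_lesions, n_features) array of per-lesion descriptors
    (for example, ABCDE-style measurements or learned embeddings).
    Higher scores mark lesions that differ most from the patient's typical lesion.
    """
    centroid = features.mean(axis=0)                 # the patient's "typical" lesion
    distances = np.linalg.norm(features - centroid, axis=1)
    spread = distances.std() + 1e-8                  # guard against division by zero
    return (distances - distances.mean()) / spread   # z-score of each lesion's oddness

# Hypothetical example: five lesions, each described by three illustrative
# features (asymmetry, border irregularity, color deviation).
lesions = np.array([
    [0.10, 0.12, 0.08],
    [0.11, 0.10, 0.09],
    [0.09, 0.13, 0.07],
    [0.12, 0.11, 0.10],
    [0.55, 0.60, 0.48],   # clearly different from the rest
])
print(ugly_duckling_scores(lesions))  # the last lesion gets by far the highest score
```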
They trained the system using 20,388 wide-field images from 133 patients at the Hospital Gregorio Marañón in Madrid, as well as publicly available images. The images were taken with a variety of consumer-grade cameras, about half of them nondermoscopy, and included backgrounds, skin edges, bare skin sections, nonsuspicious pigmented lesions, and suspicious pigmented lesions. The lesions in the images were visually classified by a consensus of three board-certified dermatologists.
Once they trained the system, the researchers tested it on another 6796 images from the same patients, using the dermatologists’ classification as the gold standard. The system distinguished the suspicious lesions with 90.3% sensitivity (true-positive rate), 89.9% specificity (true-negative rate), and 86.56% accuracy.
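For readers less familiar with these metrics, the short sketch below shows how sensitivity, specificity, and accuracy are derived from the four cells of a confusion matrix. The counts in the example are invented for illustration and are not the study’s data.

```python
def screening_metrics(tp: int, fn: int, tn: int, fp: int) -> dict:
    """Derive the reported metrics from raw counts.

    tp: suspicious lesions correctly flagged
    fn: suspicious lesions missed
    tn: nonsuspicious lesions correctly left unflagged
    fp: nonsuspicious lesions flagged by mistake
    """
    return {
        "sensitivity": tp / (tp + fn),                # true-positive rate
        "specificity": tn / (tn + fp),                # true-negative rate
        "accuracy": (tp + tn) / (tp + fn + tn + fp),
    }

# Invented counts for illustration only:
print(screening_metrics(tp=90, fn=10, tn=90, fp=10))
# -> {'sensitivity': 0.9, 'specificity': 0.9, 'accuracy': 0.9}
```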
Soenksen said he could envision photos being acquired for screening in three scenarios. First, people could photograph themselves, or have someone else at home photograph them. These photos could even include whole nude bodies.
Second, clinicians could photograph patients’ body parts during medical visits for other purposes. “It makes sense to do these evaluations in the point of care where a referral can actually happen, like the primary care office,” said Soenksen.
Third, photos could be taken at places where people show up in bathing suits.
In each scenario, the system would then tell patients whether any lesions needed evaluation by a dermatologist.
To ensure privacy, Soenksen envisions using devices that do not transmit all the data to the cloud but instead perform at least some of the computation locally. High-end smartphones have sufficient computing capacity for that, he said.
In the next phase of this work, the researchers would like to test the system on more cases involving skin of color and under more varied conditions, said Soenksen. And they would like to put it through randomized clinical trials, potentially using biopsies to validate the results.
That’s a key step, said Veronica Rotemberg, MD, PhD, director of the dermatology imaging informatics program at Memorial Sloan Kettering Cancer Center, New York.
“Usually when we think about melanoma, we think of histology as the gold standard, or specific subtypes of melanoma as a gold standard,” she said in an interview.
The technology also raises the question of excessive screening, she said. “Identifying the ugly duckling could be extremely important in finding more melanoma,” she said. “But in a patient who doesn’t have melanoma, it could lead to a lot of unnecessary biopsies.”
The sheer number of referrals generated by such a system could overwhelm the dermatologists assigned to follow up on them, she added.
Still, Rotemberg said, the study is “a good proof of concept.” Ugly duckling analysis is a very active area of AI research with thousands of teams of researchers worldwide working on systems similar to this one, she added. “I’m so excited for the authors.”
Neither Soenksen nor Rotemberg disclosed any relevant financial interests.
Sci Transl Med. 2021;13:eabb3652.
This article originally appeared on MDedge.com, part of the Medscape Professional Network.