Artificial intelligence identifies ‘kissing bugs’ that spread Chagas disease
New research from the University of Kansas shows machine learning can identify, with high precision, the insects that spread the incurable disease called Chagas, based on ordinary digital photos. The idea is to give public health officials in regions where Chagas is prevalent a new tool to stem the spread of the disease and eventually to offer identification services directly to the general public.
Chagas is particularly nasty because most people who have it don't know they've been infected. But according to the Centers for Disease Control and Prevention, some 20 to 30 percent of the 8 million people with Chagas worldwide later develop heart rhythm abnormalities that can bring on sudden death, a dilated heart that doesn't pump blood efficiently, or a dilated esophagus or colon.
The disease is caused most often when triatomine bugs—more commonly known as “kissing bugs”—bite people and transmit the parasite Trypanosoma cruzi into their bloodstreams. Chagas is most prevalent in rural areas of Mexico, Central America and South America.
A recent undertaking at KU, called the Virtual Vector Project, sought to enable public health officials to identify triatomine bugs that carry the Chagas parasite with their smartphones, using a kind of portable photo studio for taking pictures of the bugs.
Now, a graduate student at KU has built on that project with proof-of-concept research showing artificial intelligence can recognize 12 Mexican and 39 Brazilian species of kissing bugs with high accuracy by analyzing ordinary photos—an advantage for officials looking to cut the spread of Chagas disease.
Ali Khalighifar, a KU doctoral student at the Biodiversity Institute and the Department of Ecology and Evolutionary Biology, headed a team that just published findings in the Journal of Medical Entomology. To identify the kissing bugs from regular photos, Khalighifar and his colleagues worked with open-source, deep-learning software from Google called TensorFlow, which is similar to the technology underpinning Google's reverse image search.
“Because this model is able to understand, based on pixel tones and colors, what is involved in one image, it can take out the information and analyze it in a way the model can understand—and then you give them other images to test and it can identify them with a really good identification rate,” Khalighifar said. “That’s without preprocessing—you just start with raw images, which is awesome. That was the goal. Previously, it was impossible to do the same thing as accurately and certainly not without preprocessing the images.”
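The article does not spell out the team's exact network configuration, but a common TensorFlow pattern for this kind of task is to retrain a general-purpose image model on labeled photos, feeding it the raw pictures directly. The sketch below is a minimal, illustrative example of that pattern only; the directory layout, the MobileNetV2 backbone, the image size, the species count, and the training settings are assumptions, not the study's published pipeline.

```python
# Minimal sketch of a transfer-learning image classifier in TensorFlow/Keras.
# Paths, backbone, and hyperparameters are illustrative assumptions.
import tensorflow as tf

NUM_SPECIES = 12          # e.g., the number of Mexican triatomine species in the study
IMAGE_SIZE = (224, 224)

# Load raw photos straight from folders named by species (no manual preprocessing).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "triatomine_photos/train",   # hypothetical directory of labeled photos
    image_size=IMAGE_SIZE,
    batch_size=32,
)

# Reuse a network pretrained on everyday photos and train only a new classifier head.
backbone = tf.keras.applications.MobileNetV2(
    input_shape=IMAGE_SIZE + (3,), include_top=False, weights="imagenet"
)
backbone.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # scale pixels to [-1, 1] for MobileNetV2
    backbone,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_SPECIES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)
```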
Khalighifar and his coauthors—Ed Komp, researcher at KU’s Information and Telecommunication Technology Center, Janine M. Ramsey of Mexico’s Instituto Nacional de Salud Publica, Rodrigo Gurgel-Gonçalves of Brazil’s Universidade de Brasília, and A. Townsend Peterson, KU Distinguished Professor of Ecology and Evolutionary Biology and senior curator with the KU Biodiversity Institute—trained their algorithm with 405 images of Mexican triatomine species and 1,584 images of Brazilian triatomine species.
At first, the team was able to achieve "83.0 and 86.7 percent correct identification rates across all Mexican and Brazilian species, respectively, an improvement over comparable rates from statistical classifiers," they write. But after adding information about kissing bugs' geographic distributions to the algorithm, the researchers boosted identification accuracy to 95.8 percent for Mexican species and 98.9 percent for Brazilian species.
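The article does not describe how the distribution data were folded in, but one simple way to combine an image classifier's output with knowledge of where each species occurs is to reweight the per-species probabilities by how plausible each species is at the submission location and renormalize. The function below is an illustrative sketch of that idea only; the example numbers are made up.

```python
# Illustrative reweighting of image-classifier scores by geographic plausibility.
import numpy as np

def reweight_by_geography(image_probs: np.ndarray,
                          occurrence_weights: np.ndarray) -> np.ndarray:
    """Combine per-species image scores with per-species geographic plausibility.

    image_probs        -- softmax output from the photo classifier, shape (n_species,)
    occurrence_weights -- relative likelihood that each species occurs at the
                          submission location, shape (n_species,)
    """
    combined = image_probs * occurrence_weights
    total = combined.sum()
    if total == 0:                 # location rules out every candidate species
        return image_probs         # fall back to the image-only scores
    return combined / total

# Example: a photo that is ambiguous between two species becomes much clearer
# once we know the second species is rarely found where the photo was taken.
image_probs = np.array([0.48, 0.46, 0.06])
occurrence_weights = np.array([1.0, 0.1, 0.8])
print(reweight_by_geography(image_probs, occurrence_weights))
```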
According to Khalighifar, the algorithm-based technology could allow public health officials and others to identify triatomine species with an unprecedented level of accuracy, to better understand disease vectors on the ground.
“In the future, we’re hoping to develop an application or a web platform of this model that is constantly trained based on the new images, so it’s always being updated, that provides high-quality identifications to any interested user in real time,” he said.
Khalighifar is now applying a similar approach with TensorFlow to instantly identify mosquitoes from the sounds of their wings and frogs from their calls.
“I’m working right now on mosquito recordings,” he said. “I’ve shifted from image processing to signal processing of recordings of the wing beats of mosquitoes. We get the recordings of mosquitoes using an ordinary cell phone, and then we convert them from recordings to images of signals. Then we use TensorFlow to identify the mosquito species. The other project that I’m working on right now is frogs, with Dr. Rafe Brown at the Biodiversity Institute. And we are designing the same system to identify those species based on the calls given by each species.”
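The article does not detail the team's signal-processing pipeline, but the usual route from a phone recording to an "image of the signal" that an image classifier can consume is a spectrogram. The snippet below sketches that conversion under assumptions: the file name, the WAV format, and the short-time Fourier transform parameters are illustrative, not the study's settings.

```python
# Sketch of turning a mosquito wing-beat recording into a spectrogram "image"
# that the same kind of TensorFlow image classifier can consume.
import tensorflow as tf

def wav_to_spectrogram(path: str) -> tf.Tensor:
    audio_bytes = tf.io.read_file(path)
    waveform, _sample_rate = tf.audio.decode_wav(audio_bytes, desired_channels=1)
    waveform = tf.squeeze(waveform, axis=-1)            # shape: (samples,)

    # Short-time Fourier transform: time along one axis, frequency along the other.
    stft = tf.signal.stft(waveform, frame_length=1024, frame_step=256)
    spectrogram = tf.abs(stft)                          # keep the magnitude only

    # Add a channel dimension so the result looks like a grayscale image to a CNN.
    return spectrogram[..., tf.newaxis]

spec = wav_to_spectrogram("mosquito_recording.wav")     # hypothetical recording
print(spec.shape)   # (time_frames, frequency_bins, 1)
```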
While artificial intelligence is often popularly portrayed as a job-killing threat or even an existential threat to humanity, Khalighifar said his research showed how AI could be a boon to scientists studying biodiversity.