Evidence shows that the AI can spot differences in coughing that can't be heard by the human ear. If the detection system can be incorporated into a device like a smartphone, the research team thinks it could become a useful early screening tool. The work builds on existing research into detecting Alzheimer's through coughing and talking.
Once the pandemic started to spread, the team turned its attention to COVID-19 instead, tapping into what had already been learned about how disease can cause very small changes to speech and the other noises we make.
The Alzheimer's research repurposed for COVID-19 involved a neural network known as ResNet50. It was trained on a thousand hours of human speech, then on a dataset of words spoken in different emotional states, and then on a database of coughs to spot changes in lung and respiratory performance.
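The staged training described above is a form of transfer learning: the same ResNet50 backbone is fine-tuned on successive datasets, and the resulting models then serve as feature extractors for different vocal biomarkers. The study's actual code isn't reproduced here; the following is a minimal illustrative sketch in NumPy, where the three `extract_*` functions are hypothetical stand-ins for the fine-tuned networks.

```python
import numpy as np

# Hypothetical stand-ins for the three fine-tuned ResNet50 models.
# In the real system each would be a deep network; here each simply
# returns a fixed-length embedding of an input cough spectrogram.
def extract_vocal_cord_features(spec):   # model trained on ~1,000 h of speech
    return spec.mean(axis=0)             # placeholder embedding

def extract_sentiment_features(spec):    # model trained on emotional speech
    return spec.std(axis=0)              # placeholder embedding

def extract_respiratory_features(spec):  # model trained on a cough database
    return spec.max(axis=0)              # placeholder embedding

def combined_embedding(spec):
    """Concatenate the three biomarker embeddings into a single vector
    that a final classifier layer could score."""
    return np.concatenate([
        extract_vocal_cord_features(spec),
        extract_sentiment_features(spec),
        extract_respiratory_features(spec),
    ])

# Usage: a fake 128-frame x 64-bin spectrogram of one recorded cough.
spec = np.random.rand(128, 64)
emb = combined_embedding(spec)
print(emb.shape)  # (192,)
```

The concatenation step is only a plausible way to combine the three models; the paper's exact fusion mechanism may differ.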
When the three models were combined, a layer of noise was used to filter out stronger coughs from weaker ones. Across roughly 2,500 recordings of coughs from people confirmed to have COVID-19, the AI correctly flagged 97.1 percent -- and 100 percent of the asymptomatic cases.
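That 97.1 percent figure is a sensitivity, or true-positive rate: the share of recordings from confirmed cases that the model flags. As a quick illustration of how such a number is computed (with made-up predictions, not the study's data):

```python
def sensitivity(labels, predictions):
    """Fraction of truly positive recordings the model flags.
    labels/predictions are parallel lists of booleans."""
    positives = [p for lab, p in zip(labels, predictions) if lab]
    return sum(positives) / len(positives)

# Made-up example: 10 confirmed-COVID recordings, 9 flagged positive.
labels      = [True] * 10
predictions = [True] * 9 + [False]
print(sensitivity(labels, predictions))  # 0.9
```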
That's an impressive result, but there's more work to do yet. The researchers emphasize that its main value lies in spotting the difference between healthy coughs and unhealthy coughs in asymptomatic people -- not in actually diagnosing COVID-19, for which a proper test would still be required. In other words, it's an early warning system.
The boffins now want to test the engine on a more diverse dataset, and to see whether other factors played a part in such an impressively high detection rate. If it does make it to the phone app stage, there are obviously going to be privacy implications too, as few of us will want our devices constantly listening out for signs of ill health.
The research has been published in the IEEE Open Journal of Engineering in Medicine and Biology.