The power of the new artificial intelligence (AI) large language models, and of AI in general, is still being sorted out as use skyrockets. Separating the hype from reality with any new technology takes time, and requires actual use in real-world settings. I wrote previously about using ChatGPT-type applications as a medical expert system, and a recent study highlights another potential use of AI in medicine.
The study – "Development of digital voice biomarkers and associations with cognition, cerebrospinal biomarkers, and neural representation in early Alzheimer's disease" – uses AI to analyze voice patterns and correlate them with either mild cognitive impairment (MCI) or early Alzheimer's disease (AD). This approach leverages the ability of AI systems to detect subtle patterns in large sets of data.
This approach is based on previous research showing that speech patterns are altered in early AD. A 2015 review concluded:
Based on our research, we are inclined to claim that AD can be more sensitively detected with the help of a linguistic analysis than with other cognitive examinations. The temporal characteristics of spontaneous speech, such as speech tempo, number of pauses in speech, and their length are sensitive detectors of the early stage of the disease, which enables an early simple linguistic screening for AD.
The research is still somewhat preliminary, however, and the exact changes in speech are still being examined. In addition to the temporal characteristics of speech, there are lexical-semantic features, such as grammatical complexity, word variety, and sentence structure. As the researchers summarize: “Semantic degradation occurs early in AD resulting in a reduction in the amount of specific content information, while changes in features such as syntax and grammar occur later in the disease.” The present study also looks at acoustic characteristics of speech, such as pitch, the way sounds are formed, and the rate of speech. Acoustic features relate to motor control of speech rather than cognitive function. These are features unlikely to be noticed by a casual listener in the early or mild stages of dementia.
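To make the temporal features concrete, here is a minimal sketch – my own illustration, not the study's actual method – of how two of them, pause count and mean pause length, could be extracted from the energy envelope of a speech recording. The silence threshold and frame rate are assumed values for the example; a real pipeline would use proper voice-activity detection.

```python
import numpy as np

def pause_features(envelope, threshold=0.1, frame_rate=100):
    """Count pauses (runs of low energy) in a per-frame energy envelope.

    envelope: 1-D array of per-frame energy values
    threshold: energy below this counts as silence (assumed value)
    frame_rate: envelope frames per second (assumed value)
    Returns (number_of_pauses, mean_pause_length_in_seconds).
    """
    silent = envelope < threshold
    # Locate the start and end of each silent run via sign changes
    edges = np.diff(silent.astype(int))
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    if silent[0]:
        starts = np.concatenate(([0], starts))
    if silent[-1]:
        ends = np.concatenate((ends, [len(silent)]))
    lengths = (ends - starts) / frame_rate  # run lengths in seconds
    if len(lengths) == 0:
        return 0, 0.0
    return len(lengths), float(lengths.mean())
```

Features like these, computed over a whole speech sample, become one row of the input that a classifier is trained on.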
The question for the current study is how well AI can classify subjects as MCI, early AD, or healthy controls based entirely on their speech patterns. The researchers also correlated the AI's assessments of speech with measures of amyloid beta in the CSF, a biological marker for AD. They found:
This study found that, not only did digital voice biomarkers differ by cognitive status, but they were also accurate in detecting AD biomarker status. These derived voice biomarkers also tracked disease progression, measured by changes in the 2-year follow-up CDR-SOB score.
While this is a small study (206 subjects total), the results are encouraging. It will need to be replicated in larger studies, and the tool will have to be tracked for clinical sensitivity, specificity, and overall utility. Also, like most clinical studies, subjects with common confounding factors, such as prior stroke, thyroid disease, or B12 deficiency, were excluded from analysis. This makes the analysis cleaner, but complicates application to the real world. This is essentially a proof-of-concept study – it confirms there are detectable speech changes in early dementia and that the team's AI tool can detect them.
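Why does overall utility depend on more than raw accuracy? For any screening test, the positive predictive value – the chance that a positive result reflects real disease – falls sharply when the condition is rare in the screened population. A quick sketch using Bayes' rule (the numbers here are illustrative, not from the study):

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """PPV = true positives / all positives, for a screened population.

    PPV = (sens * prev) / (sens * prev + (1 - spec) * (1 - prev))
    All three inputs are proportions between 0 and 1.
    """
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A test with 90% sensitivity and 90% specificity, screening a
# population where only 5% truly have the disease:
ppv = positive_predictive_value(0.9, 0.9, 0.05)
```

With these illustrative numbers the PPV is only about 32% – most positives would be false alarms – which is why screening tools need validation in the populations where they will actually be used.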
The advantage of this tool is that it is much quicker than existing tests, such as neuropsychological testing or detailed language analysis. The AI can evaluate a 10-minute sample of speech. This is also something that can easily be done in the office with immediate results. It could therefore be used as a screening tool, either in the general population (as part of a wellness checkup after a certain age) or in patients with complaints that might relate to cognitive function.
One common clinical question is how to differentiate normal aging (people having a “senior moment”) from early dementia. Also, so-called “pseudodementia” is very common – patients with cognitive complaints but without brain disease, most often caused by chronic poor sleep, depression, anxiety, or a drug side effect. One question is, could AI speech analysis replace more expensive diagnostic testing, such as an MRI scan of the brain?
The primary utility, however, will be in early detection. The authors emphasize that this can help patients and families plan better for their future. They are being cautious because we do not yet have any proven treatments that alter the course of the disease. But it is very likely that we soon will. There are a number of monoclonal antibody and drug treatments in the pipeline that very plausibly can be disease modifying. For these types of treatments, early diagnosis and intervention is key. Imagine treating AD years before it would have become detectably symptomatic.
Also, using AI for early disease diagnosis, or just to enhance the accuracy of diagnosis, has tremendous potential. Physicians demonstrably become better at diagnosis as they gain more experience, because of improved pattern recognition. However, AIs are better at pattern recognition than humans. The diagnostic potential here is therefore tremendous, and this is a very active area of research.
AI, for example, is quickly becoming as good as or better than radiologists at detecting abnormalities. This does not mean they will replace radiologists anytime soon. Rather, these systems can be used as an expert aid, making interpretation faster and more accurate. Humans and AI have very complementary strengths and weaknesses, and the combination can be very powerful.
AI pattern recognition can also be used to analyze any clinical data: not only how a patient talks, but how they move, the characteristics of their skin, a list of their symptoms, and of course the results of any diagnostic test.
Modern medicine is becoming dizzyingly complex and ruinously expensive. Patients often have multiple chronic illnesses, and may be on many medications simultaneously. Further still, care may be spread out among many specialists, each focusing on only one piece of the puzzle. Primary care doctors who are supposed to put it all together can get overwhelmed.
AIs are good at handling lots of data, of seeing how all those bits of data interact, detecting overall patterns, and then using all of that information to make predictions. It seems like the exact tool we need to wrangle this beast of modern medicine that we have created. The potential for improved outcomes and reduced healthcare costs is tremendous. Research is happening, but not fast enough and not in a coordinated fashion. This is the kind of project that should have its own institute at the NIH. Any government investment would likely be repaid many times over in reduced Medicare and Medicaid costs.
The current study looking at early diagnosis of AD is one tiny slice of the potential for AI to aid in medical diagnosis and decision-making. This is likely to transform medicine in the coming decades. I would like to see it happen faster.