AI could enhance rural healthcare delivery, but concerns linger over the emerging technology
- Experts express concern over a lack of transparency in the data used to develop the technology, as well as the need for unbiased algorithms.
Experts expect artificial intelligence to play a part in the future of healthcare in the United States, but there is concern that a lack of transparency and biased algorithms may taint the creation of those AI models.
Healthcare advocates often contend that Mississippi faces a number of hurdles in ensuring residents have access to care, especially in rural areas. In addition, many of the state’s hospitals are reportedly facing shortages of funding, staff and facilities. Telemedicine has been proposed as one way to address at least some of these issues, and advocates say AI could play a part in ensuring people live healthier lives.
Healthcare disparities already exist within the United States, particularly across demographic groups. A person’s health outcomes are largely shaped by their genetics, place of residence, income, exercise regimen, diet, education and access to healthcare, said Dr. Joseph Betancourt, president of the Commonwealth Fund.
“And we know within the context of racial disparities that minority communities are more likely to be on the kind of the negative end of all those social drivers, and tend to be disadvantaged in many ways,” Betancourt said, also noting that minorities are less likely to be insured for a variety of reasons.
Two decades of medical data and thousands of papers show differences in the care people receive based on the color of their skin. For example, Betancourt said researchers have found that pulse oximeters, the small devices placed on fingertips to read blood oxygen levels, have a hard time reading those levels in patients with darker skin tones.
Dr. Tina Hernandez-Boussard, professor of medicine at Stanford University, described another example involving the eGFR calculator, short for estimated glomerular filtration rate, which measures kidney function. In the past, the calculation applied a different standard to Black Americans, based on the previously held belief that they have more muscle mass and therefore higher creatinine levels.
“And so, because of this race adjustment in the old calculator, Black patients had to have significantly worse kidney function compared to non-Black patients to receive the same level of care or qualify for advanced treatments,” Hernandez-Boussard explained.
She said those factors led to delays in treatment and in placement on kidney transplant wait lists. Today, race has been removed from the calculation, which instead relies on biological markers, addressing that racial disparity in kidney care.
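The shift she describes can be made concrete with the published CKD-EPI creatinine equations that underlie the eGFR calculator. The sketch below is illustrative only: the coefficients come from the published 2009 (race-adjusted) and 2021 (race-free) equations, the function names and example patient are hypothetical, and none of it is intended for clinical use.

```python
# Illustrative sketch of the published CKD-EPI creatinine equations.
# Not for clinical use; consult the original publications.

def egfr_ckd_epi_2009(scr_mg_dl, age, female, black):
    """2009 CKD-EPI: includes a 1.159 multiplier for Black patients."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1) ** alpha
            * max(scr_mg_dl / kappa, 1) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159   # race adjustment, since removed
    return egfr

def egfr_ckd_epi_2021(scr_mg_dl, age, female):
    """2021 CKD-EPI refit: race term removed, coefficients re-estimated."""
    kappa = 0.7 if female else 0.9
    alpha = -0.241 if female else -0.302
    egfr = (142
            * min(scr_mg_dl / kappa, 1) ** alpha
            * max(scr_mg_dl / kappa, 1) ** -1.200
            * 0.9938 ** age)
    if female:
        egfr *= 1.012
    return egfr

# Hypothetical patient: 60-year-old man, serum creatinine 1.8 mg/dL.
print(round(egfr_ckd_epi_2009(1.8, 60, female=False, black=True)))   # ~46
print(round(egfr_ckd_epi_2009(1.8, 60, female=False, black=False)))  # ~40
print(round(egfr_ckd_epi_2021(1.8, 60, female=False)))               # ~43
```

For the same lab values, the old equation’s race multiplier reported better-looking kidney function for patients recorded as Black, pushing them further from referral and transplant thresholds.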
Hernandez-Boussard added that because “mixed race” was the fastest-growing category in the 2020 Census, there is even less of an argument for using race as an indicator of health risk.
Betancourt acknowledged that while there has been promising movement toward equal access to care over the last two decades, problems remain that could prevent the creation of unbiased algorithms in current and future AI healthcare models. Recently developed wearable technology may help address some of the disparities.
“I fundamentally believe that while we need to make sure that these biases that we’ve already begun [to implement] in algorithms to try to unroot them. While we do that, we also need to think about the use cases for which these technologies could actually significantly help us address disparities,” Betancourt added. “And I would argue not just for people of color. A lot of these challenges that lead to disparities impact all people, just at different rates.”
Advances in medical technology hold significant potential benefits. If implemented properly, technology can help caregivers overcome language barriers, address social isolation for mental health patients, and assist in chronic disease management, Betancourt added.
AI also creates the potential for nurse avatars, or virtual nurses, displayed on a screen that can interact with people at their health literacy level. Unlike some human caregivers, these avatars never get tired of answering a patient’s questions.
“I think there’s also an opportunity to say, ‘What are the great ways in which these technologies can be deployed to meet care gaps today?'” Betancourt said. “And I think both of those things at the same time will really get us to a better place. And they really will make sure that we’re not leaving anyone behind, but by the same token advancing our ability to address disparities with all the technology has to offer.”
If these technologies are implemented properly, they could benefit Mississippians, particularly in rural areas.
“Across our country, at the end of the day, we do know that we have financing challenges with rural hospitals and rural healthcare centers. We have workforce challenges in those areas and these are all getting worse, just to be clear,” Betancourt continued. “And so, yes, people have held up the promise of technology in these spaces, and I do believe that technology will play a very significant role going forward.”
The COVID-19 pandemic demonstrated how beneficial virtual care can be. However, even the best virtual care cannot address every health need. There will always be situations that require a physician to physically examine a patient to provide effective care.
“There are going to be certain situations where a person needs a surgeon or needs to be airlifted out of some place,” Betancourt said. “And that’s not going to be solved with just technology.”
There is speculation that at some point robotic surgeons could provide care in rural areas, but that is still years away, he noted.
Another area where AI could provide assistance is mental health, but Hernandez-Boussard said the technology is still being developed. There is concern about using chatbots for mental health care because they are not created with the safety of the user in mind.
“It’s dire to understand the harms that can happen from these tools,” Hernandez-Boussard said.
Because chatbots are not developed to provide safe mental health advice, and do not fall under FDA regulation as therapeutic tools, tragic events have played out in several states, including a recent incident in California in April. In the wake of the suicide of a 16-year-old California boy, his parents discovered he had been using ChatGPT. According to news reports, the conversations his parents found showed the bot not only discouraged him from talking with his parents but even offered to write his suicide note.
“But it’s not just in California where we’ve seen these tragic cases of chat bots that end in very serious harm… if not death,” Hernandez-Boussard said. “It’s coming across the nation, and so understanding how we put safeguards on these is going to be essential as we move forward.”
There is also concern that the data being used to create AI models is selective for varying reasons. Katie Palmer, health tech correspondent at STAT, has noted that large swaths of data are not being included in the creation of AI models and algorithms, either because images from X-rays and CT scans have too low a resolution or because of variations in the way healthcare is delivered around the world.
Palmer noted a lack of transparency in the types of data used to create these models, saying it can be hard to determine where bad outcomes from the use of AI are occurring. There are also instances where AI may be used without the patient’s knowledge. One example is the use of recording devices during patient-doctor interactions: AI is sometimes used to create a transcript of the interaction to complete reports, and that data is automatically inserted into a person’s health record.
“Patients aren’t always aware when AI is impacting their care,” Palmer said.