The Negative Impacts of Choosing AI Over Humans in Healthcare


May 18, 2023


The use of AI in healthcare is growing rapidly, with the potential to revolutionize patient care and health outcomes. However, as AI becomes more commonplace, the voices warning against its potential negative consequences are growing louder. From errors that can cause patient harm, to data privacy concerns, to the risk of bias and inequality, several threats come with the increasing dominance of AI in healthcare. Let us discuss some of the risks and challenges of choosing AI over humans in healthcare.

Injuries and errors

Increased medical error is one of the most obvious risks of poorly designed AI in medicine. An AI system could recommend the wrong medication for a patient, overlook a tumor on a radiological scan, or assign a hospital bed to the wrong patient based on erroneous predictions about which patient would benefit more, resulting in patient injury. AI-related injuries differ from conventional ones because patients and physicians react differently to errors caused by software. Moreover, an underlying flaw in a single AI system can injure thousands of patients, whereas a single physician's mistake typically harms only a small number.

Data availability

AI models usually require huge amounts of data to train and test algorithms before they begin to produce useful results. Data is gathered from various sources such as electronic health records, pharmacy records, insurance claims, and consumer-generated information like fitness trackers or purchase history. However, this data is fragmented across multiple systems. For example, if a patient sees multiple physicians or changes insurance companies, their data may be scattered across different systems and formats. This fragmentation increases the possibility of errors, raises the cost of data collection, and reduces the comprehensiveness of datasets, all of which can limit the development of effective healthcare AI.

Data privacy concerns

AI, like any other healthcare technology, poses risks to patient data security and privacy. Because AI models require a staggering volume of data, that data is constantly at risk unless robust security strategies and appropriate safeguards are in place. AI can also violate patient privacy in another way: it can infer private patient information even though the algorithm was never given that information. For example, an AI system may be able to detect Parkinson's disease from a person's cursor movements alone, without ever receiving any health data.

Bias and inequality

If the data used to train an AI model contains even the faintest hint of bias, that bias will be present in the resulting AI as well. For example, if the training data is gathered from academic medical centers, the resulting AI system will be less effective in treating patients who do not typically visit academic medical centers. Similarly, AI algorithms may fail to account for a patient's socioeconomic background. Speech-recognition AI systems may perform poorly in transcribing encounter notes if a physician's gender or race is inadequately represented in the training data. This is not the case when human scribes document patient visits.

Professional realignment

One long-term risk of implementing AI technology in healthcare is that it could lead to shifts in the medical profession. In some specialties, such as radiology, AI has automated parts of the work of evaluating imaging; an AI tool can read chest x-rays without a radiologist's oversight. The concern is that widespread use of AI will erode human knowledge and capacity over time, causing the radiologist's role to become automated and eventually obsolete.

The Nirvana Fallacy

The Nirvana Fallacy is the mistake of rejecting an option because it is imperfect, rather than asking whether it performs better than the status quo. Demanding perfection from AI may already have slowed its adoption in healthcare. At the same time, the opposite tendency, assuming that a new option must be better than the existing one, may be one reason for the AI boom in healthcare. Both extremes are misleading: healthcare AI can bring numerous benefits, but it also poses real dangers and challenges, and its long-term impact is still up for debate.

In recent years, artificial intelligence has been applauded for the tremendous promise it holds, but it has also been the subject of intense controversy. Healthcare AI systems could give rise to a host of unwanted, and sometimes serious, consequences. The potential solutions are complex: investing in infrastructure for high-quality, representative data; coordinating FDA oversight with other health actors; and modifying medical education to better prepare doctors for shifting roles in an evolving system. Hence, it is important to be cautious with the use of AI. The wisest course is a balance between humans and AI, which can happen only if both work together to improve the delivery of care.

