There's been a lot of discussion about the downsides of AI in healthcare, and much of it focuses on the wrong things. AI is a tool for assistance; the problem arises when a doctor becomes over-reliant on it. Health practitioners should undergo regular training and practice without AI to maintain their core endoscopic skills. Continuing education is key, and there should be protocols in place that still give doctors room to perform these invasive procedures without the help of artificial intelligence.
Based on this colonoscopy study, I can draw some parallels to what I've observed over the last decade in software engineering. When GitHub Copilot was released, many developers began over-relying on auto-generated code without understanding its logic. Within about six months, I saw junior engineers on my team unable to debug very basic problems because they had lost the fundamentals of problem solving. The answer is not to avoid AI but to use it strategically. AI in medical practice ought to work as an advanced pattern-recognition system that flags possible areas of concern but leaves the diagnosis to the physician. It's like advanced linting in software development: it can tell you there are problems, but it doesn't fix them for you. In my experience training more than 10,000 developers, the secret is active engagement. Physicians should follow a protocol in which they explain their reasoning first and only then see the AI's suggestions, just as I have students solve algorithm problems step by step before checking a solution; a rough sketch of that workflow follows below. Scheduled skill drills with no AI would maintain diagnostic instinct. In our field we call this "coding without autocomplete": deliberately practicing core skills on their own keeps them from atrophying, while you still get the benefit of the technology when it's needed.
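To make that commit-then-reveal protocol concrete, here is a minimal sketch in Python; every name in it (ReviewSession, commit_assessment, and so on) is hypothetical and for illustration only, not taken from any real clinical or code-review system.

```python
# Hypothetical "commit-then-reveal" review protocol: the reviewer must
# record their own assessment before the AI's suggestion is unlocked.

from dataclasses import dataclass, field
from typing import Optional, List

@dataclass
class ReviewSession:
    case_id: str
    ai_suggestion: str                      # computed up front, but hidden
    human_assessment: Optional[str] = None  # must be committed first
    notes: List[str] = field(default_factory=list)

    def commit_assessment(self, assessment: str) -> None:
        """Record the reviewer's independent read of the case."""
        self.human_assessment = assessment

    def reveal_ai_suggestion(self) -> str:
        """Only unlock the AI's flag after the human has committed."""
        if self.human_assessment is None:
            raise RuntimeError("Commit your own assessment before viewing the AI's.")
        if self.human_assessment != self.ai_suggestion:
            self.notes.append("Disagreement: review this case in a no-AI drill.")
        return self.ai_suggestion

# Usage: the order of calls enforces the protocol.
session = ReviewSession(case_id="case-042", ai_suggestion="polyp: likely adenoma")
session.commit_assessment("polyp: likely hyperplastic")
print(session.reveal_ai_suggestion())  # allowed only after the commit
print(session.notes)                   # the disagreement is logged for training
```

The design point is simply that the interface makes the lazy path impossible: the AI's answer exists from the start, but it cannot be read until the human has committed their own.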
AI in colonoscopies has great potential, but it should assist, not replace, a doctor's judgment. In my work with a large nursing network, I've found that continuous training and practical experience are key, even with new tech. AI can quickly point out issues, but medical staff need to confirm the results to keep their skills up. Using AI alongside routine skill checks keeps medical staff sharp while still capturing the upside of quick, precise diagnoses.
Using AI to get a second opinion rather than as a first-line diagnostic seems like the solution if we want to preserve our own skills. My worry is that economic realities may make this difficult. If AI can do a good enough job of diagnosing potential cancers, providers looking to save money are going to use it, undercutting human expertise.
Neuroscientist | Scientific Consultant in Physics & Theoretical Biology | Author & Co-founder at VMeDx
What roles can AI play safely, and what needs to be done to see that those roles are played well? AI can be put to use in colonoscopy in a supportive role that augments, rather than replaces, the doctor's diagnostic tools. For example, AI can serve as a second pair of eyes, flagging possible polyps or lesions in real time, which the endoscopist can then examine through the lens of clinical experience. In this model the doctor keeps the primary role in diagnosis, which preserves their full participation in the process.

How do we keep doctors from falling into the trap of over-dependence on AI? To avoid over-reliance and the breakdown of basic diagnostic skills, an ongoing professional development program is key. Doctors should be put through regular performance assessments, simulation-based learning, and case studies without the use of AI to keep their skills sharp. It is also a good idea to integrate AI into training as a tool for feedback instead of a crutch, which in turn supports their autonomy.

What healthcare-wide strategies will support the safe use of AI? Health care systems must put in place clinical review boards and standard operating procedures for the use of AI in endoscopy. They should also track results with and without AI support, which will help identify deskilling at an early stage and allow for corrective action; a rough sketch of that kind of monitoring follows below. Policy should make very clear that AI is there to add value to clinical decision-making, not to take its place. Ethical issues, transparency in the design of the AI, and accountability structures should also go into this, to see to it that AI is integrated into patient care in a safe, effective, and fair way.
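As one illustration of what that tracking could look like, here is a small Python sketch that compares adenoma detection rates (ADR) with and without AI assistance and flags a widening gap; the threshold and the numbers are made up for illustration and are not drawn from any study or guideline.

```python
# Hypothetical deskilling monitor: compare adenoma detection rates (ADR)
# in AI-assisted vs. unassisted procedures and flag a widening gap.

def adr(detections: int, procedures: int) -> float:
    """Adenoma detection rate as a fraction of procedures."""
    return detections / procedures

def deskilling_alert(adr_with_ai: float, adr_without_ai: float,
                     max_gap: float = 0.05) -> bool:
    """Flag when unassisted performance trails assisted by more than max_gap."""
    return (adr_with_ai - adr_without_ai) > max_gap

# Illustrative numbers only.
with_ai = adr(detections=142, procedures=500)     # 28.4%
without_ai = adr(detections=111, procedures=500)  # 22.2%

if deskilling_alert(with_ai, without_ai):
    print(f"ADR gap of {with_ai - without_ai:.1%}: schedule no-AI skill drills.")
```

A review board would of course look at far richer data, but even a single tracked metric like this makes deskilling visible before it becomes entrenched.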
AI in healthcare should be treated as an augmentation, not a replacement: it is meant to support, not supplant, the knowledge and expertise of the medical professional. In the case of colonoscopies, AI can act as a second pair of eyes that helps the physician catch what they may have missed, but it should not carry out the procedure. AI could flag likely polyps in real time, but ultimately the endoscopist decides whether to pursue any finding, which keeps the physician's clinical judgment at the forefront. In the future, AI can also be engineered for more active interaction and transparency. Rather than simply signaling a detection, the system could provide its rationale: when it flags a polyp, for example, it could report the qualitative features behind the call, such as the shape or size that suggest an adenoma. That way, doctors learn how the AI reaches its detections while continuing to refine their own diagnostics; a rough sketch of such output follows below. Used like this, AI becomes a teaching tool that sharpens the physician's eye for subtle signs of disease without overriding their decision-making. Treating AI as an interactive partner for doctors reduces the risk of dependence and deskilling in their clinical expertise.
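To show what such transparent output might look like, here is a minimal sketch of a structured, explainable detection record; every field and value is an assumption for illustration and is not modeled on any actual computer-aided detection product.

```python
# Hypothetical explainable detection record: instead of a bare alert,
# the system returns the qualitative features that drove the call, so
# the physician can learn from it, and challenge it.

from dataclasses import dataclass

@dataclass
class PolypFinding:
    location: str          # anatomical segment, e.g. "sigmoid colon"
    size_mm: float
    shape: str             # e.g. "sessile", "pedunculated"
    surface_pattern: str   # e.g. "tubular pits"
    suspected_type: str    # the model's best guess
    confidence: float      # 0.0 to 1.0

    def rationale(self) -> str:
        """Human-readable explanation of why this was flagged."""
        return (
            f"Flagged a {self.size_mm:.0f} mm {self.shape} polyp in the "
            f"{self.location} with a {self.surface_pattern} surface; these "
            f"features are consistent with {self.suspected_type} "
            f"(confidence {self.confidence:.0%}). The final call is the endoscopist's."
        )

finding = PolypFinding(
    location="sigmoid colon",
    size_mm=7,
    shape="sessile",
    surface_pattern="tubular pits",
    suspected_type="adenoma",
    confidence=0.82,
)
print(finding.rationale())
```

The point of the rationale method is pedagogical: a stated reason can be agreed with, corrected, or learned from, whereas a bare box on the screen can only be obeyed or ignored.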