We’re all familiar with the cinematic portrayal of artificial intelligence (AI) – humans fighting against human-like machines.
But despite this familiar representation, where technology mimics every aspect of human consciousness (sometimes even superseding human intellect), contemporary AI, and what may feasibly be developed in the near future, is a far cry from this Hollywood portrayal.
Even though the promise of artificial general intelligence fuels the hype that drives AI development, at present AI can only replicate a limited range of human tasks.
Think about the last time you asked a question and “googled” it. Google performed a massive search in less than a second and was likely able to identify key resources to answer your query.
Machine learning models, currently the most influential AI development approach, can reproduce these limited, but exceptionally well-executed tasks with unwavering consistency.
Because of these benefits, healthcare is increasingly turning to AI to make patient care more effective, safe and efficient, but the question remains: does reality match intentions?
AI in healthcare
In 2019, leading medical education researcher Ken Masters, associate professor of medical informatics at Sultan Qaboos University, Oman, declared this:
“To be a competent doctor, an AI (artificial intelligence) system does not need to be the best doctor in the world. AI [only] must be better than the worst grad student in your class… If AI is better than your average student, that’s [already] better than 50% of all doctors.”
Although the idea of having AI healthcare providers might seem futuristic, AI is already embedded in the healthcare sector.
In radiology, AI has the potential to standardize protocols across institutions, increase the speed of reporting, and improve diagnosis.
Wearable biometric sensors can provide ambulatory monitoring, and smartphones are well placed to help fight COVID-19, with one proposed app able to distinguish COVID-19 coughs from other types of cough.
Given the role of healthcare education in preparing students for future clinical work, what role does AI play in healthcare education?
The role of AI in education
In the 20th century, author Arthur C Clarke said:
“Any teacher who can be replaced by a machine should be!”
Clarke’s prediction is already coming true. Teachers and students routinely use YouTube and other digital technologies, and it is now increasingly common for healthcare educators to work with technology to deliver world-class education.
Anatomy, a very human science, often paves the way for these tech-infused approaches to learning.
Every day, healthcare students can interact with 3D prints, virtual reality or augmented reality to learn the structures of the human body.
Newer AI technologies are also in use, such as chatbots that help students answer questions they might traditionally have “googled”, like: “Which nerve supplies this structure?”
These innovations are being rolled out across the university sector, from education to administration, with promises of more learner-centred programs, reduced educator burnout, and personalized learning.
However, a series of unintended consequences are emerging, particularly for students already marginalized by the higher education system.
Our interdisciplinary team, comprising a medical educator, a public health researcher, a medical ethicist and an educational technology expert, drew on the broader literature to identify five tensions between the promises and the perils of health education infused with AI tools.
A common feature of all five tensions is AI’s inability to detect novelty and tolerate ambiguity.
Given that uncertainty tolerance is a key attribute of effective healthcare practice, there is a clear need to focus on uncertainty tolerance in healthcare education. This leaves it to human educators to develop and deploy AI tools in a way that balances AI’s strengths in accuracy and consistency against its weaknesses, so as to foster learner tolerance for uncertainty.
Our article, Artificial intelligence and clinical anatomical teaching: promises and perils, highlights the tensions healthcare educators need to consider when integrating AI into the classroom.
These tensions stem from AI’s typical inflexibility in the face of the variability, uncertainty and ambiguity inherent in healthcare practice and healthcare education.
The human form is variable and diverse
AI depends on a set of assumptions that can be relatively static.
Our previous article on the spectrum of gender and sex in the human population illustrates the natural anatomical diversity people present, yet health education technology continues to depict binary representations of gender and, primarily, a singular phenotype (for example, fit, healthy young adults, mostly with fair skin tones).
How do you build an AI for healthcare education that improves opportunities for students to become aware of the diversity and variability of human anatomy?
Healthcare practice is uncertain
Today’s healthcare curricula increasingly focus on merging essential knowledge with the skills required for healthcare practice, including professional virtues, interpersonal skills, and ethical awareness and reasoning.
While AI can be very effective in presenting foundational healthcare knowledge, it is less helpful (and potentially detrimental) in developing healthcare practice skills.
When will the integration of AI tip the scales from the humanness of healthcare to the reproducibility and inflexibility of AI?
AI focuses on standardization and relies on biased datasets
Many AI tools are developed from datasets that reflect existing biases in healthcare. There is growing evidence that AI is not just perpetuating these inequities but exacerbating them.
Depending on how AI technology is integrated into the classroom, AI tools can perpetuate biases, such as sounding like an affable and friendly “white woman”, which can foster in our learners the perception that “serving others” is the role of the female healthcare provider.
How can AI technology be integrated effectively into healthcare education in an inclusive and unbiased way?
Student support can be highly variable and individualistic
While AI can perform repetitive tasks such as grading, answering routine questions about the curriculum, or tracking absences, these seemingly repetitive tasks can sometimes be critical markers for educators to “step in.”
The challenge here is deciding to what extent we “trust” AI to make these decisions instead of human educators.
What is the right balance between AI-led education and human-led healthcare education?
How students engage with learning varies
The limited variables that AI tools rely on can lead to inaccurate reporting of learner engagement. AI depends on student interactions with platforms so that data can be collected for analysis.
Many students like to use “offline” learning approaches, or approaches that differ from what the AI is programmed to recognize. This could lead to the AI system reporting learner engagement in a biased way.
How will AI, which depends on a limited and standardized set of variables for reporting, be used effectively to assess student learning?
Balancing the machine with the human
While these tensions may seem insurmountable, we offer some solutions for integrating AI into higher education in thoughtful and deliberate ways.
- Increase transparency about when and how AI is used, and the limits of its use in the given context.
- Ensure diversity in developers’ backgrounds and expertise, so that those who will use the technology in their healthcare education or practice, and those whose decisions the AI tool influences (such as students and patients), are involved.
- Build AI-based health education tools that embrace and convey the natural (and clinically relevant!) uncertainty, diversity, and variability represented in human populations.
- Educate educators and build their expertise in AI through professional development, helping them know when to follow and when to challenge AI recommendations.
- Build curricula with deliberate non-AI time so students can develop their people skills.
- Develop AI educational tools that support, not replace, the human educator.
We encourage all educational institutions to provide courses and support for educators, and to challenge the rush toward “new and shiny” mass adoption of AI educational tools in favour of more thoughtful and realistic adoption.
This will ensure that all students benefit from the educational tools of AI, not just a select group.