Will you and I be replaced by AI?
Medical knowledge is estimated to double roughly every two months. A tsunami of technological advances such as generative AI and quantum computing is mounting and will reshape every aspect of work and life, and what it means to be a productive human. Will you and I be replaced by AI?
IN BRIEF:
To thrive in the rapidly evolving world of healthcare, you will need to build an effective relationship with AI. I recommend you start now by developing and leveling up your key skills (data literacy, statistics, prompting, patient assessments, motivational interviewing, etc.) and meta skills such as creative thinking, emotional intelligence, collaboration, and the ability to learn and apply new knowledge quickly.
AI is like learning a language, and as a healthcare professional you will need to be proficient; otherwise you may be replaced by those who are. Being able to communicate in a foreign language is a skill, but knowing how to develop the skills to learn the language is the meta skill. For that you will need skills that support the four Cs: content, context, complexity, and critique.
My first job, after immigrating to the US as a third-year pharmacy student, was as an interpreter and translator. It was more like three jobs wrapped in one: I worked for a local newspaper, the police, and a hospital. I was in high demand, and not because of my fluent English: I spoke broken British English with a Slavic accent, heavily peppered with German words. Luckily, it was the Midwest, and people were polite and patient. The reason I was in demand is that I was one of the few in my community who could read, write, and communicate in English. I was also able to create connections, offer calming words in times of need, anticipate needs, go a step beyond what was expected, pivot, and learn quickly.
Navigating AI is like learning a language, and as a healthcare professional (HCP) you will need to be proficient; otherwise you may be replaced by those who are. To speak it, you will need to build skills that focus on the four Cs: content, context, complexity, and critique.
CONTENT
Your role as a healthcare professional is similar to that of a translator and interpreter. Like interpreting, pharmacy and medicine are about decoding complex information and translating it into clinically actionable insights. Your words, or content, are everything that relates to health conditions, their prevention, and their treatment. Think of it as your dictionary, built of words like “autoimmune” or “zinc deficiency”.
SKILL NEEDED: To stay up to date and relevant, you must continue to expand your professional vocabulary. Lifelong learning is a must.
CONTEXT
When you study a new language, a dictionary alone is not enough. Understanding the context is the key to accurate interpretation. Language is about pattern recognition and applying rules like grammar. The same applies to AI models: machine learning (ML), deep learning, large language models (LLMs), and multimodal models (MMMs). They follow rules for pattern recognition to give us an output: deep insights into things we cannot deduce ourselves or with traditional tools. They are trained on content, the data we feed them: words, images, numbers, sounds, etc. However, the context in healthcare is complex and nuanced. This is why we need not only technical teams but teams of HCPs and field experts who understand context and can feed models diverse and clinically relevant data. This is where you and I come in. Our boots-on-the-ground experience, interacting with patients, providers, and payers, enables us to ask better clinical questions and ensure that the data and the models trained on them reflect our patients, all of our patients.
UPSKILL: Communication and teamwork. Effective use of AI in healthcare requires the ability to speak a multidisciplinary language. How to do that: develop data literacy skills (data collection, processing, interpretation, ethical use, etc.), clinical literacy skills (learn something new, like pharmacogenomics, the study of how genes affect drug response), and the ability to communicate clearly. One simple trick I use is this: how would I explain this to my grandma? Being clear is more important than sounding smart.
COMPLEXITY
Pharmacogenomics (PGx) is the most widely used tool of precision (personalized) medicine, which leverages genetic data for diagnosis and treatment. Since the completion of the Human Genome Project about 20 years ago, we have accumulated oceans of genomic data. This is why we use AI/ML computational tools to help us make sense of it and find connections, such as identifying disease-causing variants. We also use deep learning to predict how cancers and other diseases or conditions will respond to drug treatments.
Adding to the complexity of genetic data are data from the EHR: medical records such as current and past medical history, medication lists, drug allergies, demographics, labs, social determinants of health (SDOH), sleep patterns, insurance data, imaging data, etc. But there are also many nuanced things that come up in a conversation with your patient: how they felt on the day they came in for a visit, food aversions such as cilantro, difficulty getting numb at the dentist, having flexible joints, having a red beard and a poor response to pain medication. We refer to these as multiomics biomarkers: all the measurable and descriptive, quantitative and qualitative, elements of self. For example, AI scientists and clinicians have trained an AI model to understand how genetic and clinical biomarkers can better predict whether a patient with major depressive disorder (MDD) will respond to an SSRI. MDD is a leading cause of disability worldwide; it affects one in three women and one in five men by the time they are 65. However, four in ten of us do not respond to the first-line therapy, which is an SSRI. One of the reasons is that MDD is complex due to the heterogeneity of its symptoms. The ability to predict whether an SSRI is the way to go cuts down on trial-and-error prescribing and matches the patient with the right therapy from the get-go. This matters because early treatment is the key to successful outcomes.
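To make that concrete, here is a minimal, purely illustrative sketch of the general pattern: a classifier trained on combined genetic and clinical features to predict treatment response. The feature names, the synthetic data, and the model choice are all made up for demonstration; this is not the published MDD/SSRI model.

```python
# Illustrative sketch only: synthetic data, hypothetical biomarker features.
# It simply shows the pattern of training a classifier on genetic + clinical inputs.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 500  # pretend patients

# Hypothetical features: a pharmacogene variant plus a few clinical measures
df = pd.DataFrame({
    "cyp2d6_poor_metabolizer": rng.integers(0, 2, n),  # genetic biomarker (made up)
    "baseline_phq9": rng.integers(10, 27, n),          # clinical symptom score
    "age": rng.integers(18, 66, n),
    "avg_sleep_hours": rng.normal(6.5, 1.2, n),
})

# Synthetic outcome: did the patient respond to an SSRI? (toy rule plus noise)
logit = (
    -0.8 * df["cyp2d6_poor_metabolizer"]
    - 0.05 * (df["baseline_phq9"] - 18)
    + 0.1 * (df["avg_sleep_hours"] - 6.5)
)
df["ssri_response"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = df.drop(columns="ssri_response")
y = df["ssri_response"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Toy model AUC: {auc:.2f}")  # meaningless on synthetic data; shown only for the workflow
```

In a real study, the feature set would be far richer and the model validated against held-out patients and prospectively; the point is simply that structured biomarkers plus an outcome label are what these models learn from.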
UPSKILL: Don’t settle into clinical inertia. It is crucial that you put your critical thinking hat on daily and practice effective collaboration to explore the unknown together. Be guided by your practice. What gets overlooked because “we have always done it this way”? Get creative. Others may follow.
CRITIQUE
Many of us did not experience the content, context, and complexity of speaking data until November 30, 2022. That was the pivotal moment the AI experience was democratized: suddenly anyone could interact with and use ChatGPT, an LLM.
LLMs are fed and trained on textual information. They are like a search engine on steroids, already used to help with clinical documentation, prior authorizations, etc. Last year, the Mayo Clinic partnered with Google Cloud to improve the efficiency of clinical workflows through generative AI, which unifies data across dispersed sources (PDFs, labs, clinical notes, and more). This cuts down on the time spent finding information and increases the time spent with the patient.
If you have prompted an LLM only to get plausible but inaccurate or made-up “facts”, you have encountered hallucinations. They happen when the model tries to generate content beyond its training data. This is not the only issue; there are also problems with coherence, factuality, etc. Reported hallucination rates range from 3% to 27%, depending on the model. A single piece of misinformation can lead to a misguided treatment or a flawed scientific conclusion, cause patient harm, and pose legal risks.
If AI models are only as good as what we feed them, then genomics and precision medicine have a problem: they are predominantly white, with about 85% of genomic database data coming from people of European and Caucasian descent. That is what many algorithms are trained on.
We all need not only to be aware of this but also to practice our final C: critique.
HCPs will need to critically evaluate and validate AI outputs, for example against reliable, peer-reviewed medical sources.
Good news for those of us who are non-technical: there are tools for detecting and mitigating biases and hallucinations, such as guardrails, fact checkers, and retrieval-augmented generation (RAG).
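For intuition, here is a minimal sketch of the RAG idea: retrieve a passage from a trusted source first, then hand it to the model with instructions to answer only from that passage. The tiny corpus, the question, and the final LLM call are placeholders; real systems use curated clinical sources and embedding-based search.

```python
# Minimal sketch of retrieval-augmented generation (RAG):
# before asking an LLM a question, retrieve text from a trusted source
# and include it in the prompt so the answer is grounded in that text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# A tiny, hypothetical "trusted corpus" (in practice: curated guidelines, monographs, etc.)
corpus = [
    "SSRIs are a first-line pharmacologic treatment option for major depressive disorder.",
    "CYP2D6 poor metabolizers may require dose adjustments for some antidepressants.",
    "Grapefruit juice can inhibit CYP3A4 and raise levels of certain drugs.",
]
question = "Do CYP2D6 poor metabolizers need antidepressant dose adjustments?"

# Step 1: retrieve the most relevant passage with a simple TF-IDF similarity search
vectorizer = TfidfVectorizer().fit(corpus + [question])
scores = cosine_similarity(
    vectorizer.transform([question]), vectorizer.transform(corpus)
)[0]
best_passage = corpus[scores.argmax()]

# Step 2: build a grounded prompt; the model is told to answer only from the passage
grounded_prompt = (
    "Answer the question using ONLY the source text below. "
    "If the source does not contain the answer, say so.\n\n"
    f"Source: {best_passage}\n\nQuestion: {question}"
)
print(grounded_prompt)
# Step 3 (not shown): send grounded_prompt to the LLM of your choice and keep
# the retrieved source alongside the answer so a human can fact-check it.
```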
It is also important to understand where the data come from and how the model is trained. Federated, secure architectures, such as Mayo Clinic Platform_Connect, enable clinicians and researchers to learn from the data of past patients in a diverse, secure, and validated playground. Industry professionals, policymakers, and regulatory authorities are collaborating to guide AI regulation in the right direction, for example through the White House Blueprint for an AI Bill of Rights and the Executive Order on Trustworthy AI. This week the National Academy of Medicine announced its AI Code of Conduct initiative to detail the national architecture needed to support the responsible and equitable use of AI in healthcare.
UPSKILL: Learn effective prompting. This means asking better questions (content) and giving direction toward a desired output (context). We are already moving from text-to-text LLMs to multimodal models that are fed text, images, sounds, etc.
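As a simple illustration of putting content and context together in one prompt (the patient details below are fabricated for demonstration):

```python
# Illustrative structured prompt: the clinical question supplies the content,
# and the patient details plus output instructions supply the context.
content = "Which SSRIs are reasonable first-line options for major depressive disorder?"
context = (
    "Patient: 42-year-old with moderate MDD, no prior antidepressant trials, "
    "currently taking tamoxifen (avoid strong CYP2D6 inhibitors). "
    "Answer for a pharmacist audience, name the guideline or source you rely on, "
    "and flag anything that needs human verification."
)
prompt = f"{content}\n\nContext: {context}"
print(prompt)  # paste into, or send to, the LLM tool of your choice
```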
AI is like learning a language, and as a healthcare professional you will need to be proficient; otherwise you may be replaced by those who are. Being able to communicate in a foreign language is a skill, but knowing how to develop the skills to learn the language is the meta skill. For that you will need skills that focus on the four Cs: content, context, complexity, and critique.
The future of healthcare is bright: we are not going to be replaced, as long as we are involved in building it.