AI is already changing health care, but how it is implemented will determine whether it becomes a meaningful workforce solution or another source of burden for clinicians and patients.
That was a key theme during a WisPolitics-State Affairs and Wisconsin Technology Council luncheon on May 5 at the Medical College of Wisconsin focused on AI and health care workforce shortages. The panel featured WHA Senior Vice President of Workforce and Clinical Practice Ann Zenk; outgoing MCW President and CEO Dr. John Raymond; UW-Milwaukee Assistant Professor of Health Care Informatics Lu He; and Recovery.com Director of AI Innovation Nick Myers.
Pictured, L to R: UW-Milwaukee Assistant Professor of Health Care Informatics Lu He, WHA Senior Vice President of Workforce and Clinical Practice Ann Zenk, outgoing MCW President and CEO Dr. John Raymond, and Recovery.com Director of AI Innovation Nick Myers.
Zenk said AI has real potential to help hospitals and health systems respond to Wisconsin’s growing workforce challenges, but only if it is applied with a clear focus on reducing pressure on health care workers and improving care for patients.
“Everyone’s having a hard time finding workers, but health care has a double [challenge]: shrinking workforce, increasing demand,” Zenk said. “So, we need to utilize AI to take burden away from our health care workforce.”
But Zenk warned that AI could have the opposite effect if it is used to push clinicians to do more rather than relieve administrative and operational strain.
“That’s one of the risks,” she said. “What if we use it and it ends up increasing burden?”
Raymond echoed that concern, noting that tools such as ambient documentation could reduce after-hours charting and other administrative work, but he cautioned against the assumption these would automatically lead to increased patient access or higher productivity.
“If that’s taken care of by ambient AI and some other processes, there’s always a temptation for an employer to say, ‘Well, you can see five more patients from your day or seven,’” Raymond said, “or for a physician to try and increase their productivity by adding more patients.”
Panelists discussed AI’s potential to streamline processes, improve communication and support clinical decision-making, thereby improving access and enhancing the patient experience. At the same time, they emphasized the need to address privacy, trust, equity and data quality.
Raymond said some of the earliest concerns around AI involved the quality of data used to train AI systems and the risk of those systems “hallucinating,” or making things up. While progress has been made on those fronts, he said new issues are now taking center stage.
“We’re now pivoting to issues of privacy, the ethics of the use of AI and disclosure to your patients about how you’re using their data, how it’s being processed, whether it is HIPAA-compliant,” Raymond said. “Those are real concerns, and we need to monitor those.”
Professor Lu He said AI is already being deployed in health care settings faster than many may realize, including ambient scribes and tools that help respond to patient messages. But she cautioned against assuming AI will automatically solve workforce challenges.
“There are so many critical questions that we need to address before we just say, ‘Oh, it’s the solution,’” she said. “For example, is it exacerbating existing clinician burnout? Is it exacerbating existing inequities? Do patients trust them at all?”
Zenk said hospitals and health systems should start by identifying the problem they want AI to help solve, rather than adopting the technology for its own sake.
“You have to look at what problem you want AI to help with, what will best serve the workforce and patients,” Zenk said.
She pointed to a recent visit to a rural hospital lab as an example of AI being used in a practical, workforce-supportive way. AI was assisting with microscopic cell counting, allowing staff to focus their expertise on verifying abnormal results and contacting providers.
“We’re always going to need people to care for people, and those lab techs can stay in the workforce longer if we take some of the physical and mental burdens away,” Zenk said.
Panelists also addressed patient trust and the need for appropriate standards. Zenk said patients may worry that technology will come between them and their providers, making explanations, information and communication essential.
She shared a personal example from a recent neurology appointment in which ambient listening technology allowed her physician to focus more directly on her instead of the computer.
“He said, ‘I’m going to put the computer away because I have ambient listening, and I can pay attention to you,’” Zenk said. “How would that feel as a patient and family? Reassuring.”
For hospitals and health systems, the panelists said, AI’s promise is real—but so are the risks. Used well, AI could reduce burden, improve access, support clinicians and help sustain Wisconsin’s health care workforce. Used poorly, it could deepen burnout, create new barriers for patients and add complexity to an already strained system.