Health Journalism Glossary

Shadow AI

  • AI and Patient Safety
  • Health IT

Shadow AI is the use of artificial intelligence tools or applications by employees or end users without the formal approval or oversight of a company's or health system's information technology or security department — for example, using ChatGPT or similar programs to automate job tasks and speed up work. Because IT teams are unaware these apps are in use, employees can unknowingly expose the organization to data security and compliance risks, according to a blog post from IBM.

Examples of shadow AI include AI-powered chatbots, machine learning models used to find patterns in data, and data visualization tools used to create charts.


Deeper Dive

From 2023 to 2024, employee adoption of generative AI applications grew from 74% to 96%, the IBM post said. Shadow AI can expose companies and health systems to risks such as data leaks, fines for noncompliance and reputational damage.

Most employees don't use these programs with malicious intent, but simply to work faster. A 2026 Wolters Kluwer survey found that 41% of health care professionals have encountered unauthorized AI tools in their organization, and nearly 20% have used them. About half said they did so for a faster workflow, and about 30% said it was because approved tools were unavailable or lacked the functionality they needed. Some 26% of providers who reported using unsanctioned AI tools did so out of curiosity and experimentation.
