Shadow AI in Banking:
The Visibility Problem
What Is Shadow AI in Banking?
“Shadow AI” refers to AI tools that employees use within an organization without the knowledge, review, or approval of IT, compliance, or leadership. In banking, this happens more often than most institutions realize. Employees adopt AI tools to work faster, solve problems, and improve output. The tools work. The problem is that no one in a position of oversight knows they are in use, what data flows through them, or whether they meet the institution's standards for risk, privacy, or compliance.
Visibility is the starting point for governance.
Why Shadow AI Is Growing in Financial Institutions
AI adoption among employees is accelerating across industries, and financial institutions are not exempt. According to Microsoft's 2026 Data Security Index, more than 70 percent of surveyed global knowledge workers report using AI, and more than 70 percent say they are bringing their own AI tools to work.¹ The same report found that 32 percent of surveyed organizations' data security incidents involve GenAI tools.¹
Employees using personal credentials to access AI tools for work grew five percentage points year over year, and use of personal devices to access AI for work grew nine percentage points over the same period.¹ These behaviors move sensitive data into external systems that the institution has no visibility into and no controls over.
One information security director quoted in the report put it plainly: data that flows through unsanctioned AI tools leaves the firewall.¹
AI agents expand the shadow AI problem in ways that are worth understanding. According to the American Bankers Association (ABA) Banking Journal, financial institutions are increasingly adopting agentic AI. Agentic AI describes systems that can work toward goals, make decisions, and complete multi-step tasks with limited human intervention. Agents are already embedded in vendor platforms, productivity tools, and workflow software that staff use daily. Unlike consumer tools that employees download to their personal devices, an agent can operate inside systems the institution has already approved, accessing data and taking action without a human initiating each step. The ABA Banking Journal noted in December 2025 that agentic AI “introduces layers of unpredictability” and that institutions must “start with fundamentals”: an inventory of every AI tool in use, whether embedded in vendor platforms or introduced informally by staff.⁴ Without that inventory, the visibility gap widens regardless of whether the agent was formally sanctioned.
What Makes Shadow AI a Specific Risk in Regulated Environments
Banks and credit unions operate under data governance requirements tied to customer financial information. When an employee processes that information through an unreviewed AI tool, the institution may have limited ability to account for where the data went, what it was used for, or whether it was retained by a third party. That gap can create exposure under data security and customer information protection frameworks.
The Financial Stability Board has noted that monitoring AI-related vulnerabilities in financial institutions is still at an early stage, and that challenges such as a lack of transparency and the evolving nature of AI systems make them particularly difficult to track.² Regulators are working to close that monitoring gap, and institutions that have mapped their own AI activity may be better positioned to respond when questions about AI use arise.
Shadow AI can create a data exposure problem that compounds as the number of unreviewed tools grows: the more tools in use without review, the harder it becomes to map where sensitive information has traveled.
What a Governance Committee Can Do About It
The National Institute of Standards and Technology AI Risk Management Framework (NIST AI RMF) describes the MAP function as the process of gathering information to establish visibility over AI activity, capabilities, risks, and impacts across an organization.³ For banks and credit unions, the MAP function translates directly to the shadow AI problem.
A governance committee focused on MAP activities could take steps such as conducting an internal inventory of AI tools currently in use across departments, establishing a simple intake process so employees have a visible and accessible path to surface AI tools for review, and creating a documented record of what AI activity is taking place and under what conditions.
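To make those MAP activities concrete, here is a minimal sketch of what an AI tool inventory record and a simple "needs review" check might look like. The field names (tool_name, review_status, data_categories, and so on) are illustrative assumptions, not part of the NIST AI RMF or any regulatory requirement; an institution's actual inventory would reflect its own risk taxonomy and review workflow.

```python
# A minimal sketch of an AI tool inventory record. All field names are
# hypothetical examples, not prescribed by the NIST AI RMF or any regulator.
from dataclasses import dataclass
from datetime import date
from typing import List, Optional


@dataclass
class AIToolRecord:
    tool_name: str                      # e.g., an embedded vendor agent or a standalone tool
    business_unit: str                  # department or team using the tool
    data_categories: List[str]          # kinds of data the tool touches, e.g., "customer PII"
    vendor_embedded: bool               # True if the tool ships inside an approved vendor platform
    review_status: str = "unreviewed"   # "unreviewed", "under_review", or "approved"
    reviewed_on: Optional[date] = None  # date of the most recent governance review
    owner: Optional[str] = None         # accountable owner assigned by the committee


def unreviewed_tools(inventory: List[AIToolRecord]) -> List[AIToolRecord]:
    """Return inventory entries that have not yet been through review."""
    return [r for r in inventory if r.review_status == "unreviewed"]


# Example entries surfaced through the intake process or discovered during the inventory.
inventory = [
    AIToolRecord("drafting-assistant", "Retail Lending", ["customer PII"], vendor_embedded=False),
    AIToolRecord(
        "workflow-agent", "Operations", ["transaction data"], vendor_embedded=True,
        review_status="approved", reviewed_on=date(2025, 11, 1), owner="AI Governance Committee",
    ),
]

for record in unreviewed_tools(inventory):
    print(f"Needs review: {record.tool_name} ({record.business_unit})")
```

Even a simple record like this gives the committee a documented answer to the questions regulators and examiners are most likely to ask: what is in use, who owns it, and what data it touches.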
Visibility comes before reduction. An institution that has mapped its AI activity is in a better position to assess risk, assign ownership, and decide which tools warrant additional controls.
Sources
1. Microsoft, 2026 Data Security Index - https://info.microsoft.com/ww-landing-data-sec-index-2026.html?lcid=en-us
2. Financial Stability Board, Monitoring Adoption of AI in the Financial Sector, October 2025 - https://www.fsb.org/2025/10/monitoring-adoption-of-ai-in-the-financial-sector
3. NIST, AI Risk Management Framework 1.0 - https://www.nist.gov/itl/ai-risk-management-framework
4. ABA Banking Journal, Are We Sleepwalking Into an Agentic AI Crisis?, December 2025 - https://bankingjournal.aba.com/2025/12/are-we-sleepwalking-into-an-agentic-ai-crisis
Related pages in this series:
For a plain language overview of how the NIST AI RMF applies to banks and credit unions, see NIST AI RMF for Banks and Credit Unions
For a detailed statutory overview of Texas HB 149, see Texas HB 149 and Financial Institutions
For context on federal AI strategy and its connection to state-level governance requirements, see Federal Strategy to State Law
This page is for informational purposes only. It provides a general factual overview of publicly available laws, regulatory guidance, and frameworks. It does not constitute legal advice, regulatory interpretation, compliance guidance, or a recommendation of any specific course of action. Laws and guidance referenced here may be subject to change. Qualified legal and compliance professionals can help organizations assess their specific circumstances and obligations.

