JPMorgan Chase has seen remarkable voluntary adoption of its internal AI tools, with over 60% of its workforce now actively using the platform. This success wasn’t driven by mandates but rather by organic growth fueled by employee-led innovation. The key? Prioritizing seamless connectivity to existing business systems instead of focusing solely on the AI models themselves.
The Unexpected Viral Growth
Just two-and-a-half years after launching its LLM suite, JPMorgan found itself with 250,000 employees using the internal platform. This rapid adoption surprised even the company’s leadership, but it highlights a crucial trend: when AI tools provide tangible value, employees will adopt them willingly. The firm’s Chief Analytics Officer, Derek Waldron, observed that workers weren’t just using the AI; they were actively building, customizing, and sharing their own AI assistants tailored to specific roles.
This bottom-up enthusiasm created an “innovation flywheel” where early adopters demonstrated practical use cases, encouraging wider adoption. JPMorgan recognized that the models themselves would eventually become a commodity, so it focused on making connectivity itself the core advantage.
Connectivity as the Core Strategy
JPMorgan’s strategy stands out because it treats AI as core infrastructure, not just a novelty. The firm invested heavily in retrieval-augmented generation (RAG) technology, now in its fourth generation, and multimodal integration. The AI suite isn’t isolated; it’s deeply embedded within the company’s existing systems.
Employees can directly access and interact with data from CRM, HR, trading, finance, and risk systems. The company is continuously adding new connections, making AI tools an indispensable part of daily workflows. According to Waldron, the real value lies in access: without meaningful connections to critical data and tools, even advanced AI remains underutilized.
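To make the idea of connectivity concrete, here is a minimal sketch of one way an assistant platform might expose business systems as pluggable connectors. The connector names, schemas, and placeholder lookups are illustrative assumptions, not JPMorgan’s actual architecture.

```python
# Hypothetical sketch: business systems exposed to an assistant as named connectors.
# Names and return shapes are illustrative, not JPMorgan's real interfaces.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Connector:
    """A named bridge between the assistant and an internal system."""
    name: str
    description: str
    fetch: Callable[[str], dict]  # query -> structured result


def crm_lookup(query: str) -> dict:
    # Placeholder: a real connector would call the CRM's internal API.
    return {"system": "crm", "query": query, "records": []}


def risk_lookup(query: str) -> dict:
    # Placeholder: a real connector would query the risk platform.
    return {"system": "risk", "query": query, "exposures": []}


REGISTRY: Dict[str, Connector] = {
    c.name: c
    for c in [
        Connector("crm", "Client relationship records", crm_lookup),
        Connector("risk", "Positions and exposure summaries", risk_lookup),
    ]
}


def answer(question: str, systems: List[str]) -> dict:
    """Gather grounding context from the requested systems before the model sees the question."""
    context = {name: REGISTRY[name].fetch(question) for name in systems if name in REGISTRY}
    return {"question": question, "context": context}  # this bundle would be passed to the LLM


if __name__ == "__main__":
    print(answer("What is our exposure to client X?", ["crm", "risk"]))
```

The point of the pattern is that adding a new system is just registering another connector; the assistant itself does not change.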
The Power of Reusable Building Blocks
JPMorgan emphasizes a “one platform, many jobs” approach. Recognizing that every role is unique, the company provides reusable building blocks (RAG, document intelligence, structured data querying) that employees can combine to create role-specific AI tools. This flexible system empowers workers to customize AI to their exact needs, rather than forcing them into pre-defined solutions.
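The sketch below illustrates the general “building blocks” pattern: a handful of shared capabilities combined into role-specific assistants. The block names and the composition function are hypothetical stand-ins, not the firm’s real platform.

```python
# Illustrative only: composing shared "building blocks" into role-specific assistants.
from typing import Callable, List

Block = Callable[[str], str]  # each block turns a question into some context text


def rag_block(question: str) -> str:
    return f"[retrieved passages relevant to: {question}]"


def document_block(question: str) -> str:
    return f"[key fields extracted from uploaded documents for: {question}]"


def sql_block(question: str) -> str:
    return f"[rows returned by a structured query for: {question}]"


def build_assistant(name: str, blocks: List[Block]) -> Callable[[str], str]:
    """Combine shared blocks into a role-specific assistant."""
    def run(question: str) -> str:
        context = "\n".join(block(question) for block in blocks)
        # In a real system, the context plus the question would be sent to the LLM.
        return f"{name} answering with context:\n{context}"
    return run


# Two roles, one platform: different combinations of the same shared blocks.
banker_assistant = build_assistant("Banker assistant", [rag_block, document_block])
analyst_assistant = build_assistant("Analyst assistant", [rag_block, sql_block])

print(analyst_assistant("Quarterly revenue by desk?"))
```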
The firm’s progression through multiple RAG generations, from basic vector search to hierarchical, multimodal knowledge pipelines, reflects a commitment to continuous improvement. Waldron even suggests that before asking a colleague a question, employees should pause to consider whether an AI assistant can answer it first.
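For readers unfamiliar with the starting point of that progression, here is a minimal, self-contained illustration of basic vector search. A toy bag-of-words embedding stands in for a real embedding model, and the documents are invented examples.

```python
# Minimal sketch of basic vector-search retrieval, the starting point of a RAG pipeline.
# The bag-of-words "embedding" is a stand-in for a real embedding model.
import math
from collections import Counter
from typing import List

DOCS = [
    "Quarterly risk limits are reviewed by the trading desk every Monday.",
    "New client onboarding requires a KYC check recorded in the CRM.",
    "Expense reports are submitted through the HR and finance portal.",
]


def embed(text: str) -> Counter:
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def retrieve(query: str, k: int = 1) -> List[str]:
    """Rank documents by similarity to the query; top results would be fed to the LLM."""
    q = embed(query)
    ranked = sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]


print(retrieve("How do I onboard a new client?"))
```

Later generations layer hierarchy, document structure, and multiple modalities on top of this basic retrieve-then-generate loop.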
Ultimately, JPMorgan’s success highlights a critical lesson: The true potential of AI isn’t just about powerful models; it’s about seamless, ubiquitous connectivity that unlocks real-world value. If AI can’t connect to the systems where work actually happens, it remains little more than a sophisticated, expensive toy.