The Business & Technology Network
Helping Business Interpret and Use Technology
McKinsey: Open-source AI tools are quietly winning in the enterprise

Date posted: April 9, 2025

Open‑source AI is quietly reshaping enterprise tech stacks—and a new McKinsey, Mozilla Foundation, and Patrick J. McGovern Foundation survey of 700 technology leaders shows why cost, control, and community are tipping the scales. When a global bank needed full transparency for a risk‑scoring model, it skipped the closed APIs and fine‑tuned Llama 3 on its own servers. That story is no longer the outlier; it is the emerging pattern.

Open source meets the AI gold rush

The past two years have produced a surge of awareness, experimentation, and funding around generative AI and large language models. Enterprises want results fast, but they also need room to tinker. Open repositories deliver both: a single git pull can spin up a working model in minutes, letting engineers explore architectures, tweak parameters, and benchmark without procurement delays. That do‑it‑yourself velocity explains why projects such as Meta’s Llama, Google’s Gemma, DeepSeek‑R1, and Alibaba’s Qwen families have landed in production pipelines despite their “research only” disclaimers. Performance gaps with proprietary titans like GPT‑4 are shrinking, and the freedom to inspect and modify the code remains priceless.

What the survey reveals

To quantify the shift, McKinsey partnered with the Mozilla Foundation and the Patrick J. McGovern Foundation, canvassing 41 countries and more than 700 senior developers, CIOs, and CTOs. The study, titled “Open Source in the Age of AI”, is the largest snapshot yet of how enterprises blend open and closed solutions as they move from pilot projects to value capture at scale. Respondents span industries from finance to healthcare, manufacturing to public sector, giving the data broad relevance. While the full report arrives in March, the preview numbers already upend a few assumptions about what “enterprise‑grade” AI looks like in 2025.

Across multiple layers of the AI technology stack, more than half of surveyed organizations use at least one open‑source component—often right next to a commercial API key.

  • Data & feature engineering: 58% rely on open libraries for ingestion, labeling, or vectorization.
  • Model layer: 63% run an open model such as Llama 2, Gemma, or Mistral in production; the figure jumps to 72% inside tech companies.
  • Orchestration & tooling: 55% employ open frameworks like LangChain, Ray, or KServe for routing and scaling.
  • Application layer: 51% embed open components in chatbots, copilots, or analytic dashboards.

These percentages climb even higher inside organizations that rate AI as “critical to competitive advantage.” In that cohort, leaders are 40% more likely than peers to integrate open models and libraries, underscoring a simple fact: when AI is strategic, control and flexibility matter. The survey points to three drivers behind that pull toward openness:

  • Lower total cost of ownership. Sixty percent of decision makers say implementation costs are lower with open tools than with comparable proprietary services. Running a fine‑tuned 7‑billion‑parameter model on commodity GPUs can undercut API pricing by orders of magnitude when usage is steady.
  • Deeper transparency and customization. Teams working on regulated workloads—think healthcare diagnostics or trading algorithms—value the ability to audit weights, trace data lineage, and patch vulnerabilities without waiting for a vendor release cycle. Open weights make that feasible.
  • Talent magnetism. Eighty‑one percent of developers and technologists report that open‑source fluency is highly valued in their field. Engineers want to contribute upstream, showcase GitHub portfolios, and avoid black‑box dead ends. Enterprises courting that talent pool lean into permissive licenses rather than walled gardens.
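
The cost argument above can be made concrete with a rough back‑of‑envelope sketch. All prices and volumes below are hypothetical assumptions for illustration, not figures from the survey:

```python
# Illustrative comparison of hosted-API vs. self-hosted inference costs.
# All rates are hypothetical assumptions, not survey or vendor figures.

def api_monthly_cost(tokens_per_month: int, price_per_million_tokens: float) -> float:
    """Pay-per-token pricing, typical of hosted model APIs."""
    return tokens_per_month / 1_000_000 * price_per_million_tokens

def self_hosted_monthly_cost(gpu_hourly_rate: float, hours_per_month: float = 730) -> float:
    """Flat cost of keeping a commodity GPU instance up all month."""
    return gpu_hourly_rate * hours_per_month

# Assumed steady workload: 2 billion tokens/month at $1 per million tokens,
# versus one always-on GPU at $1.50/hour.
api = api_monthly_cost(2_000_000_000, price_per_million_tokens=1.0)
hosted = self_hosted_monthly_cost(gpu_hourly_rate=1.5)

# Break-even: monthly token volume above which self-hosting is cheaper.
break_even_tokens = hosted / 1.0 * 1_000_000
print(f"API: ${api:,.0f}  Self-hosted: ${hosted:,.0f}  Break-even: {break_even_tokens:,.0f} tokens")
```

Under these assumed numbers, the flat GPU cost beats per‑token pricing once steady volume passes the break‑even point; at higher volumes (or cheaper hardware) the gap widens, which is where the “orders of magnitude” savings come from.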

Open tools are not a panacea. When asked about adoption barriers, 56% of respondents cite security and compliance concerns, while 45% worry about long‑term support. Proprietary vendors score higher on “time to value” and “ease of use” because they bundle hosting, monitoring, and guardrails. And when executives prefer closed systems, they do so for one dominant reason: 72% say proprietary solutions offer tighter control over risk and governance. In other words, enterprises weigh openness against operational certainty on a case‑by‑case basis.


Multimodel is the new normal

McKinsey’s data echoes a trend seen in cloud computing a decade ago: hybrid architectures win. Few companies will go all‑in on open or proprietary; most will mix and match. A closed foundation model might power a customer‑facing chatbot, while an open Llama variant handles internal document search. The choice often hinges on latency, privacy, or domain specificity. Expect “bring your own model” menus to become as standard as multi‑cloud dashboards.
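
The hybrid pattern can be sketched as a thin routing layer. The backend names and the policy below are illustrative assumptions, not anything described in the survey:

```python
# Minimal "bring your own model" router: sensitive or internal traffic goes to
# a self-hosted open model, everything else to a managed proprietary API.
# Backend labels and routing rules are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class InferenceRequest:
    task: str
    contains_sensitive_data: bool = False
    internal_only: bool = False

def route(request: InferenceRequest) -> str:
    """Pick a backend based on data sensitivity and audience."""
    if request.contains_sensitive_data or request.internal_only:
        return "self-hosted-open-model"
    return "managed-closed-api"

print(route(InferenceRequest(task="customer_chatbot")))                # managed-closed-api
print(route(InferenceRequest(task="doc_search", internal_only=True)))  # self-hosted-open-model
```

Real routers would also weigh latency budgets and per‑token cost, but the design point is the same: the model behind each workload becomes a swappable configuration choice rather than a platform commitment.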

Playbook for CTOs

Based on survey insights and expert interviews, McKinsey outlines a pragmatic decision matrix for technology leaders:

  • Choose open when you need full weight transparency, aggressive cost optimization, or deep domain fine‑tuning.
  • Choose proprietary when speed to market, managed security, or global language coverage outweigh customization needs.
  • Blend both when workloads vary: keep public‑facing experiences on a managed API, run sensitive or high‑volume inference on self‑hosted open models.
  • Invest in talent & tooling: open success depends on MLOps pipelines, model security scans, and engineers fluent in the evolving ecosystem.
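
The matrix above can be read as a small policy function. This is a toy paraphrase of the bullets, assuming each criterion carries equal weight; it is not anything McKinsey publishes:

```python
# Toy encoding of the open/proprietary/blend decision matrix.
# Criteria are paraphrased from the playbook bullets; equal weighting is assumed.

def choose_stack(weight_transparency: bool = False,
                 cost_optimization: bool = False,
                 domain_fine_tuning: bool = False,
                 speed_to_market: bool = False,
                 managed_security: bool = False) -> str:
    """Return 'open', 'proprietary', or 'blend' from the stated criteria."""
    open_score = sum([weight_transparency, cost_optimization, domain_fine_tuning])
    closed_score = sum([speed_to_market, managed_security])
    if open_score and closed_score:
        return "blend"
    if open_score:
        return "open"
    if closed_score:
        return "proprietary"
    return "blend"  # no strong signal either way: keep optionality

print(choose_stack(weight_transparency=True, domain_fine_tuning=True))  # open
print(choose_stack(speed_to_market=True))                               # proprietary
print(choose_stack(cost_optimization=True, managed_security=True))      # blend
```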

One interviewed CIO summed it up: “We treat models like microservices. Some we build, some we buy, all must interoperate.”

Why it matters now

Enterprises face a strategic crossroads. Betting solely on proprietary platforms risks vendor lock‑in and opaque model behavior. Going fully open demands in‑house expertise and rigorous security posture. The survey data suggests the winning play is optionality: build a stack that lets teams swap models as fast as the market evolves. Open source is no longer the rebel choice; it is becoming a first‑class citizen in enterprise AI. Ignore it, and you may find your competition iterating faster, hiring better talent, and paying less for inference. Embrace it thoughtfully, and you gain leverage, insight, and a community that never stops shipping improvements.

According to “Open Source in the Age of AI,” 76% of leaders plan to expand open‑source AI use in the next few years. The age of multimodel pragmatism has arrived—code is open, the playing field is wide, and the smartest organizations will learn to thrive in both worlds.
