Most AI Strategy Documents Are Solving the Wrong Question
The default 2026 enterprise AI strategy compares vendors. The question that actually determines whether it works is dependence. Hugging Face's $4.5 billion bet shows what the alternative looks like.

Most AI strategy documents written in 2026 take the same shape: a maturity model, a list of tools, a vendor comparison, and a budget. The implicit question they answer is: which AI platform should we standardize on?
The question is interesting. It is not the question that determines whether the strategy works.
The question that does—buried on slide 27 if it appears at all—is dependence. Specifically: how much of the company’s future capability is being routed through a single external provider, and what does the exit option look like if that provider raises prices, gets acquired, or drifts in capability?
Hugging Face is the cleanest example of an organization that built its entire strategy around the answer to that second question. The Harvard Business School cases on the company make the bet legible.
What Hugging Face is betting against
CEO Clément Delangue has been explicit, in interviews and in testimony before the U.S. House Science Committee in June 2023, about what motivates the company’s open-source position. The risk he names is concentration:
“I’m incredibly scared of non-decentralized AGI. If only one organization gets to AGI, I think that’s when the risk is highest.”
That’s the systemic version. The commercial version is what most enterprises face: not AGI risk, but the gradual concentration of mission-critical capability inside one or two foundation-model providers, accessed through a single API contract the customer doesn’t fully control.
Delangue’s framing of the alternative: “Open-source AI is the tides that lift all boats, that enables everyone to build, to get transparency on how AI is working, not working, and ultimately leads to a safer future.”
For a CEO writing a strategy doc, “safer” reads as ideological. The translation into operational terms is more concrete: a model layer the company can host, fine-tune, and swap out without renegotiating its core capability.
What the bet looks like in practice
Hugging Face is not anti-cloud. It runs on AWS (its preferred cloud provider since February 2023) and Google Cloud (partnership early 2024). It collaborates with Cloudflare on serverless GPU deployment. It works closely with Meta, IBM, NASA, ServiceNow, Salesforce, and Nvidia.
The structural choice is at the model layer. Hugging Face’s stance: infrastructure is a market with many providers, models are not—yet—and the model layer is where independence has to be defended.
The result, as of October 2024:
- More than 3 million models, datasets, and applications hosted
- More than 1 million free public models
- More than 5 million daily users
- 50,000 organizations served
- Profitability with 220 employees, on a path to 250 by year-end
And—the part most enterprise strategists ignore—a $4.5 billion valuation set in August 2023 by investors including Salesforce, Google, Nvidia, and Amazon. Three of those investors compete with Hugging Face at the model layer. They put money in anyway, because the strategic position the company occupies is more durable than any individual model.
That’s the bet. Independence at the layer that matters, partnership at the layer that doesn’t.
What most enterprise AI strategies miss
The default enterprise AI strategy in 2026 inverts this. It treats the model as a commodity (we’ll use whatever’s cheapest and best) and the infrastructure as a strategic relationship (we’re an Azure shop, or an AWS shop, or a Google Cloud shop). The bill for that inversion gets paid two or three years out, when the model provider raises rates, deprecates a capability, or restructures pricing to capture more of the value that was previously the customer’s.
The Hugging Face case suggests three operating principles for a strategy that doesn’t bake in that bill.
First, treat model dependence as a board-level risk, not a procurement decision. The exposure is comparable to single-vendor ERP exposure in the 2000s, except that the rate of change is faster and the lock-in is harder to see until it’s deep.
Second, build the muscle to host and fine-tune open-source models internally, even if 80 percent of production usage stays on a closed-API provider. The 20 percent of capability you can host in-house is the leverage that disciplines the 80 percent. Without it, the contract negotiation is one-sided.
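One way to make that leverage concrete in the codebase is a thin provider-agnostic interface, so that routing a workload to a self-hosted open model is a configuration change rather than a rewrite. A minimal sketch follows; the class and function names are hypothetical placeholders, not any vendor's actual SDK.

```python
from typing import Protocol


class TextModel(Protocol):
    """The minimal interface every provider adapter implements,
    so application code never imports a vendor SDK directly."""

    def complete(self, prompt: str) -> str: ...


class ClosedAPIModel:
    """Hypothetical adapter wrapping a closed-API provider client."""

    def complete(self, prompt: str) -> str:
        # In practice this would call the vendor's API.
        return f"[closed-api] {prompt}"


class SelfHostedModel:
    """Hypothetical adapter wrapping an in-house open-weights model."""

    def complete(self, prompt: str) -> str:
        # In practice this would call internally hosted inference.
        return f"[self-hosted] {prompt}"


def build_model(use_self_hosted: bool) -> TextModel:
    # Swapping providers is a one-line config decision, not a migration.
    return SelfHostedModel() if use_self_hosted else ClosedAPIModel()
```

The point of the sketch is the seam, not the stubs: as long as callers depend only on the interface, the 20 percent of traffic served in-house keeps the option to move the other 80 percent credible.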
Third, evaluate vendors on exit cost, not feature parity. The right question in a vendor review is “what does our six-month migration plan look like if we leave you?” If the answer is “we couldn’t, realistically,” that’s the answer to the strategy question.
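The six-month test can be made mechanical in a vendor review. The sketch below scores a vendor by projected exit time; the field names and the assumption that data export and re-integration run sequentially while the contractual notice period runs in parallel are illustrative, not a standard methodology.

```python
from dataclasses import dataclass


@dataclass
class VendorExitProfile:
    """Hypothetical inputs to an exit-cost review (all fields illustrative)."""

    name: str
    data_export_months: float      # time to extract prompts, logs, fine-tune data
    migration_eng_months: float    # engineering effort to re-integrate elsewhere
    contract_notice_months: float  # contractual lock-in / notice period

    def exit_months(self) -> float:
        # Assumption: export then migration happen in sequence; the
        # notice period elapses in parallel with both.
        return max(
            self.data_export_months + self.migration_eng_months,
            self.contract_notice_months,
        )


def within_six_month_exit(vendor: VendorExitProfile) -> bool:
    """The 'what does our six-month migration plan look like' test."""
    return vendor.exit_months() <= 6.0
```

A vendor with two months of export work, seven months of re-integration, and a twelve-month notice period scores a twelve-month exit and fails the test; the number, not the feature matrix, is what the review should surface.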
The shape of the strategy doc
A strategy doc that takes the dependence question seriously looks different from the standard 2026 template. It has fewer tool comparisons. It has more language about portability, fine-tuning rights, data ownership, and the cost of switching. It treats the AI vendor short list as a credit-risk portfolio, not a vendor RFP.
The Hugging Face case isn’t a recommendation that every company become an open-source platform. It’s a demonstration that the strategic question—open or closed, concentrated or distributed—has a measurable financial answer. A $4.5 billion answer, in this specific case.
Companies writing strategy docs in 2026 that don’t ask the dependence question will have one of two experiences in 2028. Either nothing significant changes, and their chosen provider stays reasonably priced, capable, and aligned with their needs. Or it doesn’t, and they’ll wish they’d asked.