
Most AI Projects Just Burn Cash. Here's How to Make Them Profitable.

An expert analysis on AI unit economics, the "Evergreen Ratio", and calculating the "AI Volatility Tax" to stop bleeding cash on inference.

By Richard Ewing

The AI Profitability Crisis

In 2026, the era of deploying AI simply to boast about having "AI inside" is functionally dead. Executive boards are no longer accepting pure R&D burn without measurable unit economics. We have transitioned from the awe of capability to the brutal reality of the P&L.

Two compounding liabilities drag most AI projects into "Negative Carry" (costing more to run than they generate): the AI Volatility Tax and an abysmal Evergreen Ratio.

The AI Volatility Tax

Every non-deterministic AI inference carries a Volatility Tax. This is the hidden cost of human-in-the-loop verification required because probabilistic systems hallucinate. If your AI writes code 10x faster but requires your senior engineer to spend 4 hours auditing its outputs to prevent a production vulnerability, your AI feature has negative unit economics.
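The arithmetic above can be sketched as a toy unit-economics model. All hours, rates, and costs here are illustrative assumptions, not benchmarks:

```python
# Toy model of the Volatility Tax: verification labor folded into AI COGS.
# Rates and hours are illustrative assumptions, not measured data.

def net_value_per_task(raw_hours_saved: float,
                       verification_hours: float,
                       hourly_rate: float,
                       inference_cost: float) -> float:
    """Value created per task after subtracting the Volatility Tax."""
    gross_value = raw_hours_saved * hourly_rate        # labor the AI displaced
    volatility_tax = verification_hours * hourly_rate  # human audit of outputs
    return gross_value - volatility_tax - inference_cost

# The AI drafts code in minutes, but a senior engineer audits for 4 hours:
print(net_value_per_task(raw_hours_saved=5, verification_hours=4,
                         hourly_rate=120, inference_cost=2))  # -> 118.0
```

Once verification hours approach the hours saved, the feature flips to negative unit economics regardless of how cheap the tokens are.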

You must factor the labor cost of verification directly into the Cost of Goods Sold (COGS) for your AI features. The solution is the Execution Layer—a deterministic boundary that verifies and enforces strict schemas on LLM outputs before they are processed by your business logic. By shifting validation to code rather than humans, you minimize the Volatility Tax.
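A minimal sketch of such an Execution Layer, assuming the LLM is asked to emit JSON for a hypothetical refund workflow (the field names and rules are illustrative, not from the article):

```python
# Deterministic boundary: parse and schema-check LLM output in code
# before any business logic runs. Fields and rules are hypothetical.
import json

SCHEMA = {"customer_id": int, "refund_amount": float, "reason": str}

def enforce_schema(raw_llm_output: str) -> dict:
    """Validate strictly; raise instead of letting a hallucination through."""
    data = json.loads(raw_llm_output)
    if set(data) != set(SCHEMA):
        raise ValueError(f"unexpected fields: {sorted(set(data) ^ set(SCHEMA))}")
    for field, expected in SCHEMA.items():
        if not isinstance(data[field], expected):
            raise TypeError(f"{field} must be {expected.__name__}")
    if data["refund_amount"] < 0:
        raise ValueError("refund_amount must be non-negative")
    return data  # now safe to hand to business logic

ok = enforce_schema('{"customer_id": 42, "refund_amount": 19.99, "reason": "damaged"}')
print(ok["refund_amount"])  # -> 19.99
```

Rejected outputs can be retried or escalated automatically; either way, a human never has to eyeball the payload, which is exactly the labor the Volatility Tax was charging you for.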

The Evergreen Ratio

If every query to your system requires a live inference from a frontier model (such as GPT-4 or Claude Opus), you will bankrupt yourself at scale. Profitability is determined by the Evergreen Ratio: the ratio of cached or pre-calculated responses to live inferences.
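The effect on blended cost per query is straightforward to model. The per-request costs below are illustrative assumptions, not vendor pricing:

```python
# Blended cost per query as the Evergreen Ratio (cached : live) improves.
# cache_cost and live_cost are illustrative assumptions, not vendor pricing.

def blended_cost_per_query(cached: int, live: int,
                           cache_cost: float = 0.0001,
                           live_cost: float = 0.02) -> float:
    """Average serving cost when `cached` hits avoid `live` inferences."""
    total = cached + live
    return (cached * cache_cost + live * live_cost) / total

# Evergreen Ratio of 19:1 (95 cached hits per 5 live inferences):
print(blended_cost_per_query(95, 5))   # -> 0.001095
# Ratio of 0 (every query hits the frontier model):
print(blended_cost_per_query(0, 100))  # -> 0.02
```

At a 19:1 ratio, the blended cost is roughly 1/18th of all-live serving, which is the gap between a feature that scales and one that burns margin with every new user.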

High-profitability AI systems rely heavily on semantic caching, embeddings, and pre-computation. The most profitable AI systems are the ones that use the least live AI execution in production. By driving your Evergreen Ratio up, you decouple your revenue scaling from your compute scaling.
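A semantic cache can be sketched in a few lines. A real system would use learned embeddings from a model; the bag-of-words `embed` function and the 0.8 threshold below are toy stand-ins to keep the sketch self-contained:

```python
# Toy semantic cache: match incoming queries against stored ones by cosine
# similarity and reuse the answer on a hit. The bag-of-words "embedding"
# is a stand-in for a real embedding model; the threshold is illustrative.
import math
from collections import Counter

def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class SemanticCache:
    def __init__(self, threshold: float = 0.8):
        self.threshold = threshold
        self.entries: list[tuple[Counter, str]] = []

    def get(self, query: str):
        vec = embed(query)
        for stored_vec, answer in self.entries:
            if cosine(vec, stored_vec) >= self.threshold:
                return answer  # Evergreen hit: no live inference needed
        return None  # cache miss: fall through to a live inference

    def put(self, query: str, answer: str):
        self.entries.append((embed(query), answer))

cache = SemanticCache()
cache.put("how do I reset my password", "Use the reset link on the login page.")
print(cache.get("How do I reset my password"))  # near-duplicate -> cache hit
print(cache.get("why is my invoice overdue"))   # unrelated -> None, go live
```

Every hit is a query served at near-zero marginal cost, which is precisely what pushes the Evergreen Ratio up.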


To measure your feature's economic viability, use the AI Unit Economics Benchmark (AUEB) or the Volatility Tax Auditor (VTA) tools. Read the full post on Built In.


Published Work

This article expands on ideas from my published work in CIO.com, Built In, Mind the Product, and HackerNoon. View published articles →


Richard Ewing

The Product Economist — Quantifying engineering economics for technology leaders, PE firms, and boards.