What is Model Distillation?
Model distillation (also called knowledge distillation) is a technique for creating smaller, faster AI models by training them to mimic the behavior of larger, more capable models. The large model is called the "teacher" and the small model is called the "student."
The student model learns to replicate the teacher's output distribution rather than learning from hard labels alone. This is more efficient because the teacher's outputs contain "dark knowledge": information about the relationships between classes and the confidence levels of its predictions.
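To make this concrete, here is a minimal sketch of the classic distillation loss in PyTorch. It assumes you already have student and teacher logits for a batch; the temperature `T` and blending weight `alpha` are illustrative hyperparameters, not values from this article.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend hard-label cross-entropy with a soft-target KL term.

    Softening both distributions with temperature T exposes the
    teacher's "dark knowledge": the relative probabilities it
    assigns to the incorrect classes.
    """
    # Soft targets: KL divergence between temperature-scaled distributions.
    # The T*T factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In a full training loop the teacher's logits would be computed under `torch.no_grad()` so that only the student's weights are updated; both `T` and `alpha` are tuned per task.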
Distillation is one of the most impactful cost optimization strategies for AI applications. A distilled model can achieve 90-95% of the teacher model's quality at 10-50x lower inference cost. For high-volume applications, this can mean the difference between positive and negative unit economics.
Example: instead of calling GPT-4 ($0.03/query) for every customer support question, you can distill GPT-4's responses into a fine-tuned GPT-3.5 ($0.001/query), a 30x cost reduction with minimal quality loss.
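For LLMs, the "soft target" step above is usually replaced by fine-tuning the student on the teacher's generated responses. Here is a sketch of that data-collection step using the OpenAI Python SDK; the model identifier, the `support_questions` list, the system prompt, and the output filename are all illustrative placeholders, not prescribed values.

```python
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical sample of real customer-support questions.
support_questions = [
    "How do I reset my password?",
    "Can I change my billing date?",
]

SYSTEM_PROMPT = "You are a helpful customer support agent."

# Step 1: have the expensive teacher answer each question once.
# Step 2: write (question, teacher answer) pairs in the chat-format
# JSONL that the fine-tuning API expects, so the cheaper student
# model can be trained to reproduce the teacher's behavior.
with open("distillation_train.jsonl", "w") as f:
    for question in support_questions:
        teacher = client.chat.completions.create(
            model="gpt-4",  # teacher model, per the article's example
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": question},
            ],
        )
        answer = teacher.choices[0].message.content
        f.write(json.dumps({
            "messages": [
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": question},
                {"role": "assistant", "content": answer},
            ]
        }) + "\n")
```

Uploading the resulting file to the fine-tuning API produces the student model. At the per-query prices quoted above, a million support queries per month would drop from roughly $30,000 to $1,000 in inference spend.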
Why It Matters
Model distillation is the key to making AI features economically viable at scale. It directly addresses the Cost of Predictivity problem by reducing inference costs while preserving quality.
Frequently Asked Questions
What is model distillation?
Model distillation creates smaller, cheaper AI models by training them to mimic larger models. The small "student" model learns from the large "teacher" model's outputs.
How much does distillation save?
Distilled models typically achieve 90-95% of the original quality at 10-50x lower inference cost. This can turn negative unit economics positive.
Need Expert Help?
Richard Ewing is a Product Economist and AI Capital Auditor. He helps companies translate technical complexity into financial clarity.
Book Advisory Call →