What is AI distillation and what does it mean for OpenAI?

DeepSeek's development of a low-cost chatbot has shaken markets

RYOHTAROH SATOH

TOKYO -- The tech industry and stock markets have spent much of this week trying to grasp how a small, relatively unknown Chinese company was able to develop a sophisticated artificial intelligence chatbot on par with OpenAI's ChatGPT at a fraction of the cost.

One possible answer being floated in tech circles is distillation, an AI training method that uses bigger "teacher" models to train smaller but faster-operating "student" models.
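The teacher-student idea can be illustrated with a toy sketch: the student is trained to mimic the teacher's softened output distribution rather than raw labels. This is a minimal, hypothetical example of the general distillation technique, not a description of DeepSeek's or OpenAI's actual training pipeline; the temperature value and logits here are illustrative.

```python
import math

def softmax(logits, temperature=1.0):
    # Dividing logits by a temperature > 1 softens the distribution,
    # exposing the teacher's relative preferences among all answers.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence between the teacher's softened distribution and the
    # student's: the quantity a student model would minimize in training.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# A student that matches the teacher's logits incurs zero loss;
# a mismatched student incurs a positive loss.
teacher = [3.0, 1.0, 0.2]
aligned = distillation_loss(teacher, [3.0, 1.0, 0.2])
mismatched = distillation_loss(teacher, [0.2, 1.0, 3.0])
```

In practice this loss is computed over a large model's outputs on many prompts, letting a much smaller model approximate the larger one's behavior cheaply.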
