TOKYO -- The tech industry and stock markets have spent much of this week trying to grasp how a small, relatively unknown Chinese company was able to develop a sophisticated artificial intelligence chatbot on par with OpenAI's ChatGPT at a fraction of the cost.
One possible answer being floated in tech circles is distillation, an AI training method that uses bigger "teacher" models to train smaller but faster-operating "student" models.
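In a typical distillation setup (a general illustration, not a description of any specific company's pipeline), the student model is trained to match the teacher's softened output distribution rather than hard labels. A minimal sketch with hypothetical toy logits:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Softened probabilities: a higher temperature flattens the distribution,
    # exposing more of the teacher's "dark knowledge" about non-top classes.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence between the teacher's softened outputs and the student's --
    # the core objective a "student" model minimizes during distillation.
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    return float(np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student))))

# Toy example: a student whose logits track the teacher's incurs a lower loss.
teacher = np.array([2.0, 1.0, 0.1])
close_student = np.array([1.9, 1.1, 0.2])
far_student = np.array([0.1, 1.0, 2.0])
print(distillation_loss(close_student, teacher) < distillation_loss(far_student, teacher))
```

In practice this loss is usually blended with a standard cross-entropy term on ground-truth labels, letting the smaller student inherit the larger teacher's behavior at much lower training and inference cost.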


