The transfer of model knowledge, known as knowledge distillation, is a technique from the field of artificial intelligence that is particularly important in areas such as automation and Industry 4.0.
Imagine a large, highly capable artificial intelligence (AI) that solves complex tasks but needs a lot of computing power, much like an experienced human expert. Sometimes, however, a smaller, faster AI is needed for the same tasks, for example because it has to run on a simple device. This is where the transfer of model knowledge comes into play: the knowledge or "experience" of the large AI model (the teacher model) is deliberately transferred to a smaller, more efficient model (the student model). The small model thus picks up the large model's tricks and shortcuts, delivering similar results while using far fewer resources.
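In practice, this transfer is often implemented by training the student not only on the true labels but also on the teacher's softened output probabilities, so the student learns which classes the teacher considers similar. The following sketch shows this idea in PyTorch; the function name, the temperature of 4.0, and the weighting factor alpha are illustrative assumptions, not values prescribed by the text above.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend a soft-target loss (from the teacher) with a hard-label loss."""
    # Soften both output distributions with a temperature so the student
    # can learn from the teacher's relative class probabilities.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between the teacher's and student's distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft_loss = F.kl_div(soft_student, soft_targets,
                         reduction="batchmean") * temperature ** 2
    # Standard cross-entropy against the true labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

# Hypothetical training step: the teacher stays frozen, only the student learns.
# `teacher`, `student`, `inputs`, `labels`, and `optimizer` are assumed to exist.
# teacher.eval()
# with torch.no_grad():
#     teacher_logits = teacher(inputs)
# student_logits = student(inputs)
# loss = distillation_loss(student_logits, teacher_logits, labels)
# loss.backward()
# optimizer.step()
```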
One example: in a factory, a large AI system monitors all the machines. So that even small devices or robots on the shop floor can act intelligently, knowledge distillation passes on only the knowledge that is actually relevant to them. This lets them work quickly and save energy while still benefiting from the insights of the main AI.