Self-Knowledge Distillation
Self-Knowledge Distillation is a machine learning technique in which a model learns from itself, acting as both teacher and student rather than distilling from a separate, larger teacher network. It aims to compress large models, improving efficiency and performance. In computer vision, the goal is for the student to extract the teacher's key knowledge while maintaining high accuracy, reducing computational cost, and making model deployment more practical.
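As a concrete illustration, here is a minimal sketch in plain Python of one common formulation: the training loss combines ordinary cross-entropy on the hard label with a temperature-scaled KL-divergence term that pulls the model's current predictions toward its own earlier (softened) predictions. The function names, the choice of temperature, and the weighting factor `alpha` are illustrative assumptions, not a reference implementation.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature yields a
    # softer (more uniform) distribution over classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    # KL(p || q) between two discrete probability distributions.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def self_distillation_loss(student_logits, teacher_logits, label,
                           temperature=3.0, alpha=0.5):
    # Hard-label term: standard cross-entropy on the true class.
    probs = softmax(student_logits)
    ce = -math.log(probs[label])
    # Soft-target term: KL between the model's own earlier
    # (teacher) softened predictions and its current (student) ones.
    # The T^2 factor keeps gradient magnitudes comparable across
    # temperatures, as in standard knowledge distillation.
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kd = kl_divergence(p_teacher, p_student) * temperature ** 2
    return (1 - alpha) * ce + alpha * kd

# Example: current logits vs. the model's own earlier logits.
loss = self_distillation_loss([2.0, 1.0, 0.1], [1.8, 1.1, 0.2], label=0)
```

In practice the "teacher" logits may come from an earlier training snapshot, an exponential moving average of the weights, or auxiliary branches of the same network; the loss structure above stays the same.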