News

Knowledge Distillation (KD) aims to improve a lightweight student network supervised by a large teacher network. The core idea of KD is to extract valuable knowledge from the teacher. Previous works ...
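As background for readers unfamiliar with KD, a minimal sketch of the classic softened-softmax distillation loss (Hinton et al.) is shown below. This is a generic illustration, not the method of any paper announced here; the function names and the choice of temperature are illustrative assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL(teacher || student) between temperature-softened distributions,
    scaled by T^2 so gradients keep a comparable magnitude across T."""
    p = softmax(teacher_logits, temperature)  # soft targets from the teacher
    q = softmax(student_logits, temperature)  # student's soft predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# A student matching the teacher exactly incurs zero distillation loss;
# a mismatched student incurs a positive loss.
loss_same = distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1])
loss_diff = distillation_loss([0.1, 1.0, 2.0], [2.0, 1.0, 0.1])
```

In practice this term is combined with the usual cross-entropy on ground-truth labels, and the temperature controls how much of the teacher's "dark knowledge" (relative probabilities of wrong classes) the student sees.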
Nowadays, deep learning has significantly advanced motor imagery research, one of the major topics in brain-computer interfaces (BCIs). Traditional approaches primarily ...