
Frontiers of Information Technology & Electronic Engineering, 2024, Volume 25, Issue 4. doi: 10.1631/FITEE.2200644

Multi-exit self-distillation with appropriate teachers

College of Computer Science and Technology, Zhejiang University, Hangzhou 310000, China

Received: 2022-12-16 Accepted: 2024-05-06 Available online: 2024-05-06


Abstract

Multi-exit architecture allows early-stop inference to reduce computational cost, which can be used in resource-constrained circumstances. Recent works combine the multi-exit architecture with self-distillation to simultaneously achieve high efficiency and decent performance at different network depths. However, existing methods mainly transfer knowledge from deep exits or a single ensemble to guide all exits, without considering that inappropriate gaps between students and teachers may degrade the model performance, especially in shallow exits. To address this issue, we propose Multi-exit self-distillation with Appropriate TEachers (MATE) to provide diverse and appropriate teacher knowledge for each exit. In MATE, multiple ensemble teachers are obtained from all exits with different trainable weights. Each exit subsequently receives knowledge from all teachers, while focusing mainly on its primary teacher to keep an appropriate gap for efficient knowledge transfer. In this way, MATE achieves diversity in knowledge transfer while ensuring learning efficiency. Experimental results on CIFAR-100, TinyImageNet, and three fine-grained datasets demonstrate that MATE consistently outperforms state-of-the-art multi-exit self-distillation methods with various network architectures.
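The following is a minimal PyTorch-style sketch of the kind of training objective the abstract describes, assuming a backbone that returns logits from K exits. The names (mate_loss, ensemble_weights, primary_coef, secondary_coef) and the pairing of each exit with a same-index primary teacher are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F

def mate_loss(exit_logits, targets, ensemble_weights,
              primary_coef=1.0, secondary_coef=0.1, temperature=4.0):
    """Hypothetical sketch: cross-entropy plus distillation from K
    ensemble teachers, each a learned mixture of all exits.

    exit_logits:      list of K tensors, each (batch, classes), one per exit.
    ensemble_weights: trainable (K, K) tensor; row t holds the mixing
                      weights of teacher t over all K exits.
    """
    K = len(exit_logits)
    logits = torch.stack(exit_logits, dim=0)           # (K, batch, classes)

    # Build K ensemble teachers as learned convex combinations of all exits.
    # Exit logits are detached here so that the teacher supervision below
    # trains only the mixing weights, not the exits themselves.
    w = torch.softmax(ensemble_weights, dim=1)         # rows sum to 1
    teacher_logits = torch.einsum('tk,kbc->tbc', w, logits.detach())

    # Hard-label supervision for every exit.
    loss = sum(F.cross_entropy(z, targets) for z in exit_logits)

    # Supervise each ensemble teacher directly to train its mixing weights.
    loss = loss + sum(F.cross_entropy(teacher_logits[t], targets)
                      for t in range(K))

    # Each exit distills from all teachers but focuses on its primary one
    # (assumed here to be the teacher with the matching index), keeping an
    # appropriate student-teacher gap for efficient knowledge transfer.
    teachers = teacher_logits.detach()
    for s in range(K):
        log_p = F.log_softmax(exit_logits[s] / temperature, dim=1)
        for t in range(K):
            q = F.softmax(teachers[t] / temperature, dim=1)
            coef = primary_coef if t == s else secondary_coef
            loss = loss + coef * (temperature ** 2) * F.kl_div(
                log_p, q, reduction='batchmean')
    return loss

In practice, ensemble_weights would be registered as a trainable nn.Parameter and optimized jointly with the network, so that each teacher's mixture over the exits is learned rather than fixed.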
