Paper
19 July 2024 Class-wise restructured knowledge distillation for medical image segmentation
Jiaqing Shen, Xiangchun Yu
Author Affiliations +
Proceedings Volume 13213, International Conference on Image Processing and Artificial Intelligence (ICIPAl 2024); 1321312 (2024) https://doi.org/10.1117/12.3035297
Event: International Conference on Image Processing and Artificial Intelligence (ICIPAl2024), 2024, Suzhou, China
Abstract
Knowledge distillation is an effective method for model compression and performance enhancement, in which a teacher network guides a student by transferring soft labels or intermediate features. However, different organ classes in medical images may share similarities in shape, size, texture, etc., which can cause mutual interference between classes. To address this problem, we propose Class-Wise Restructured Knowledge Distillation (CWRKD). CWRKD generates class-wise features by combining the coarse segmentation prediction of an auxiliary segmentation head with the features of the backbone, reconstructs them using our proposed Res module, and guides them with the teacher's features; it also guides the soft labels produced by the auxiliary segmentation head with the teacher's soft labels, thereby reducing the interference caused by inter-class similarity. Experiments on different datasets show that CWRKD is effective.
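The class-wise step described above can be sketched as masked average pooling: the coarse per-class probabilities from the auxiliary head weight the backbone feature map to yield one feature vector per class, which is then matched to the teacher's, while the soft labels are matched with a temperature-scaled KL term. The function names, the pooling formulation, and the loss choices below are illustrative assumptions, not the paper's exact Res-module design:

```python
import numpy as np

def softmax(x, axis=0):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def class_wise_features(feat, coarse_logits):
    """Masked average pooling (illustrative): weight backbone features
    feat (C, H, W) by the coarse class probabilities (K, H, W) from the
    auxiliary segmentation head, giving one (C,)-vector per class."""
    K = coarse_logits.shape[0]
    prob = softmax(coarse_logits, axis=0)        # (K, H, W)
    w = prob.reshape(K, -1)                      # (K, HW) per-class pixel weights
    f = feat.reshape(feat.shape[0], -1)          # (C, HW) flattened features
    # Weighted average of feature vectors for each class.
    return (w @ f.T) / (w.sum(axis=1, keepdims=True) + 1e-8)  # (K, C)

def distill_losses(stu_feat_k, tea_feat_k, stu_logits, tea_logits, T=2.0):
    """Two guidance terms (hypothetical choices): MSE between student and
    teacher class-wise features, and temperature-scaled KL between the
    auxiliary head's soft labels and the teacher's soft labels."""
    feat_loss = np.mean((stu_feat_k - tea_feat_k) ** 2)
    p_t = softmax(tea_logits / T, axis=0)
    log_p_s = np.log(softmax(stu_logits / T, axis=0) + 1e-8)
    kd_loss = np.mean(np.sum(p_t * (np.log(p_t + 1e-8) - log_p_s), axis=0)) * T * T
    return feat_loss, kd_loss
```

With identical student and teacher inputs both losses vanish, which is a quick sanity check that the guidance terms pull the student toward the teacher's class-wise representation.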
(2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Jiaqing Shen and Xiangchun Yu "Class-wise restructured knowledge distillation for medical image segmentation", Proc. SPIE 13213, International Conference on Image Processing and Artificial Intelligence (ICIPAl 2024), 1321312 (19 July 2024); https://doi.org/10.1117/12.3035297
KEYWORDS
Image segmentation
Medical imaging
Head
Convolution
Education and training
Visualization
Ablation