Reduce the Cost of Training Data Preparation by Transferring Knowledge from Pre-trained Models

Technical Sharing Session #7

Title of Talk: Reduce the Cost of Training Data Preparation by Transferring Knowledge from Pre-trained Models
Presenter: Henry Hengyuan Zhao (1st-year PhD student), supervised by Associate Professor Mike Shou
Dept of Electrical and Computer Engineering, NUS
WP2 and WP4, SIA-NUS Digital Aviation Corp Lab
Date: 27th October 2022, Thursday
Synopsis: Data and network architecture are two essential ingredients for achieving strong prediction results. On the one hand, collecting visual data for tasks such as video action recognition or fatigue detection is labour-intensive and time-consuming. On the other hand, large vision transformers (ViTs) are among the most powerful neural network architectures and have achieved tremendous success across a wide range of computer vision tasks. To inherit the strong representations of large ViT models while alleviating the overfitting caused by the shortage of training data, this work proposes a parameter-efficient tuning (PET) method, dubbed Important Channel Tuning (ICT), which transfers the knowledge learned by large ViT models to downstream recognition tasks.
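The synopsis does not spell out how ICT selects channels, so the following is only a minimal sketch of the general parameter-efficient tuning idea it builds on: freeze a pre-trained ViT backbone and leave just a small subset of parameters trainable. The magnitude-based channel selection, the `tune_fraction` parameter, and the use of torchvision's `vit_b_16` are illustrative assumptions, not the method presented in the talk.

```python
# Illustrative parameter-efficient tuning (PET) sketch on a ViT backbone.
# NOTE: the channel-importance criterion below (weight-row norm) is a
# placeholder assumption, not the ICT method described in this talk.
import torch
import torch.nn as nn
from torchvision.models import vit_b_16

num_classes = 10       # hypothetical downstream task size
tune_fraction = 0.05   # assumed fraction of channels kept trainable per layer

model = vit_b_16(weights=None)  # in practice, load pre-trained weights here
model.heads = nn.Linear(768, num_classes)  # new task head (768 = ViT-B width)

# Freeze the entire backbone, then re-enable only the new task head.
for p in model.parameters():
    p.requires_grad = False
for p in model.heads.parameters():
    p.requires_grad = True

# Placeholder "important channel" selection: per linear layer, keep the
# few output channels with the largest weight norm trainable by masking
# the gradients of all other channels to zero.
for module in model.encoder.modules():
    if isinstance(module, nn.Linear):
        w = module.weight                    # shape: (out_channels, in_channels)
        k = max(1, int(tune_fraction * w.shape[0]))
        idx = w.norm(dim=1).topk(k).indices  # top-k channels by weight norm
        mask = torch.zeros(w.shape[0], 1)
        mask[idx] = 1.0
        w.requires_grad = True
        w.register_hook(lambda g, m=mask: g * m)  # zero grads of other rows

# Rough count of trainable parameters (upper bound: masked rows of the
# selected weight matrices receive zero gradient and never change).
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable params: {trainable}/{total} ({100 * trainable / total:.2f}%)")
```

Under this kind of scheme, only the task head and a small set of selected channels are updated during fine-tuning, which is what makes PET attractive when downstream training data is scarce.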