Decoupled optimisation for long-tailed visual recognition

Cong Cong, Shihyu Xuan, Sidong Liu, Shiliang Zhang, Maurice Pagnucco, Yang Song

Research output: Contribution to journal › Conference paper › peer-review

1 Citation (Scopus)

Abstract

When training on a long-tailed dataset, conventional learning algorithms tend to exhibit a bias towards classes with a larger sample size. Our investigation has revealed that this biased learning tendency originates from the model parameters, which are trained to contribute disproportionately to the classes characterised by their sample size (e.g., many, medium, and few classes). To balance the overall parameter contribution across all classes, we investigate the importance of each model parameter to the learning of different class groups, and propose a multi-stage parameter Decouple and Optimisation (DO) framework that decouples parameters into different groups, with each group learning a specific portion of classes. To optimise the parameter learning, we apply different training objectives with a collaborative optimisation step to learn complementary information about each class group. Extensive experiments on long-tailed datasets, including CIFAR100, Places-LT, ImageNet-LT, and iNaturalist 2018, show that our framework achieves competitive performance compared to the state-of-the-art.
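The sketch below is a rough, illustrative reading of the abstract only, not the authors' released implementation: it (1) estimates each parameter's importance to a class group, (2) decouples the parameters into per-group partitions, and (3) lets each group's objective update only its own partition. The importance measure (gradient magnitude), the toy model, the synthetic data, and the single-step combination of masked gradients are all assumptions made for this example.

```python
# Illustrative sketch only (not the paper's code). Assumptions: gradient-magnitude
# importance, a toy 10-class linear model, and synthetic per-group batches.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy long-tailed setting: 10 classes split into many/medium/few-shot groups.
class_groups = {"many": range(0, 4), "medium": range(4, 7), "few": range(7, 10)}
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))

# Synthetic batches per group (stand-in for a long-tailed training split).
data = {g: (torch.randn(64, 32), torch.randint(min(c), max(c) + 1, (64,)))
        for g, c in class_groups.items()}

def importance(x, y):
    """Per-parameter importance for one class group: |gradient| of its loss."""
    model.zero_grad()
    F.cross_entropy(model(x), y).backward()
    return [p.grad.detach().abs().clone() for p in model.parameters()]

# Stage 1 (decouple): assign each parameter entry to the group it matters to most.
imp = {g: importance(x, y) for g, (x, y) in data.items()}
masks = {g: [] for g in class_groups}
for i, _ in enumerate(model.parameters()):
    winner = torch.stack([imp[g][i] for g in class_groups]).argmax(dim=0)
    for gi, g in enumerate(class_groups):
        masks[g].append((winner == gi).float())

# Stage 2 (optimise): each group's loss updates only its own parameter partition;
# summing the masked gradients is a stand-in for the collaborative optimisation step.
opt = torch.optim.SGD(model.parameters(), lr=0.1)
for step in range(100):
    grads = [torch.zeros_like(p) for p in model.parameters()]
    for g, (x, y) in data.items():
        model.zero_grad()
        F.cross_entropy(model(x), y).backward()
        for i, p in enumerate(model.parameters()):
            grads[i] += p.grad * masks[g][i]
    for p, gr in zip(model.parameters(), grads):
        p.grad = gr
    opt.step()
```

In the paper the decoupling proceeds over multiple stages and each class group is trained with its own objective before the collaborative step; the single masked-gradient pass above is only meant to convey the partition-and-update idea.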
Original language: English
Pages (from-to): 1380-1388
Number of pages: 9
Journal: Proceedings of the AAAI Conference on Artificial Intelligence
Volume: 38
Issue number: 2
DOIs
Publication status: Published - 24 Mar 2024
Event: 38th AAAI Conference on Artificial Intelligence, AAAI 2024 - Vancouver, Canada
Duration: 20 Feb 2024 - 27 Feb 2024
