Learning to Continually Learn Rapidly from Few and Noisy Data
| dc.contributor.author | I-Hsien Kuo, Nicholas | en |
| dc.contributor.author | Harandi, Mehrtash | en |
| dc.contributor.author | Fourrier, Nicolas | en |
| dc.contributor.author | Walder, Christian | en |
| dc.contributor.author | Ferraro, Gabriela | en |
| dc.contributor.author | Suominen, Hanna | en |
| dc.date.accessioned | 2026-01-01T08:41:36Z | |
| dc.date.available | 2026-01-01T08:41:36Z | |
| dc.date.issued | 2021 | en |
| dc.description.abstract | Neural networks suffer from catastrophic forgetting and are unable to sequentially learn new tasks without guaranteed stationarity in the data distribution. Continual learning could be achieved via replay, by concurrently training on externally stored old data while learning a new task. However, replay becomes less effective when each past task is allocated less memory. To overcome this difficulty, we supplemented the replay mechanics with meta-learning for rapid knowledge acquisition. By employing a meta-learner, which learns a learning rate per parameter per past task, we found that base learners produced strong results when less memory was available. Additionally, our approach inherited several meta-learning advantages for continual learning: it demonstrated strong robustness when continually learning in the presence of noise and brought base learners to higher accuracy in fewer updates. | en |
| dc.description.sponsorship | This research was supported by the Australian Government Research Training Program (AGRTP) Scholarship. We also thank our reviewers for the constructive feedback. | en |
| dc.description.status | Peer-reviewed | en |
| dc.format.extent | 12 | en |
| dc.identifier.other | ORCID:/0000-0003-3652-9689/work/162782447 | en |
| dc.identifier.scopus | 85171442685 | en |
| dc.identifier.uri | https://hdl.handle.net/1885/733799144 | |
| dc.language.iso | en | en |
| dc.relation.ispartofseries | 2021 AAAI Workshop on Meta-Learning and MetaDL Challenge | en |
| dc.rights | Publisher Copyright: Copyright © The authors and PMLR 2023. | en |
| dc.source | Proceedings of Machine Learning Research | en |
| dc.title | Learning to Continually Learn Rapidly from Few and Noisy Data | en |
| dc.type | Conference paper | en |
| dspace.entity.type | Publication | en |
| local.bibliographicCitation.lastpage | 76 | en |
| local.bibliographicCitation.startpage | 65 | en |
| local.contributor.affiliation | I-Hsien Kuo, Nicholas; AGRTP Stipend Scholar - CECS, The Australian National University | en |
| local.contributor.affiliation | Harandi, Mehrtash; Monash University | en |
| local.contributor.affiliation | Fourrier, Nicolas; Pôle Universitaire Léonard de Vinci | en |
| local.contributor.affiliation | Walder, Christian; School of Computing, ANU College of Systems and Society, The Australian National University | en |
| local.contributor.affiliation | Ferraro, Gabriela; School of Cybernetics, ANU College of Systems and Society, The Australian National University | en |
| local.contributor.affiliation | Suominen, Hanna; School of Computing, ANU College of Systems and Society, The Australian National University | en |
| local.identifier.ariespublication | a383154xPUB45155 | en |
| local.identifier.citationvolume | 140 | en |
| local.identifier.pure | bc5c0c7e-6f13-4fe0-a657-eed13094a330 | en |
| local.identifier.url | https://www.scopus.com/pages/publications/85171442685 | en |
| local.type.status | Published | en |
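
As a rough illustration of the approach summarised in the abstract, the following is a minimal sketch, not the authors' implementation, of replay combined with a meta-learned learning rate per parameter per past task. The network shape, the names `meta_lrs` and `replay_step`, and the sigmoid parameterisation of the step sizes are assumptions made for illustration; the outer meta-optimisation that trains the rates themselves is omitted.

```python
# Minimal sketch (not the authors' released code): a base learner is updated
# on replayed batches from past tasks, with a meta-learned step size for
# every parameter of every past task.
import torch
import torch.nn as nn
import torch.nn.functional as F

base = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
params = list(base.parameters())

# One learnable step-size tensor per parameter, per past task (assumed 5 tasks).
num_past_tasks = 5
meta_lrs = [
    [nn.Parameter(torch.full_like(p, -4.0)) for p in params]
    for _ in range(num_past_tasks)
]

def replay_step(task_id, x_old, y_old):
    """One replay update on a stored batch from `task_id`, stepping every
    parameter with its own meta-learned rate for that task."""
    loss = F.cross_entropy(base(x_old), y_old)
    grads = torch.autograd.grad(loss, params)
    with torch.no_grad():
        for p, g, lr in zip(params, grads, meta_lrs[task_id]):
            # Sigmoid keeps each per-parameter step size in (0, 1).
            p -= torch.sigmoid(lr) * g
    return loss.item()

# Usage: interleave replay updates on small stored batches from old tasks
# with ordinary updates on the current task's data.
x_old, y_old = torch.randn(8, 784), torch.randint(0, 10, (8,))
print(replay_step(task_id=0, x_old=x_old, y_old=y_old))
```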