July 17, 2020
Continual learning aims to learn new tasks without forgetting previously learned ones. We hypothesize that the representations learned to solve each task in a sequence share a common structure while also containing some task-specific properties. We show that shared features are significantly less prone to forgetting, and we propose a novel hybrid continual learning framework that learns disjoint representations for the task-invariant and task-specific features required to solve a sequence of tasks. Our model combines architecture growth, to prevent forgetting of task-specific skills, with an experience replay approach, to preserve shared skills. We demonstrate that our hybrid approach is effective at avoiding forgetting and that it is superior to both architecture-based and memory-based approaches on class-incremental learning of a single dataset as well as on a sequence of multiple datasets for image classification. Our code is available at https://github.com/facebookresearch/Adversarial-Continual-Learning
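To make the disjoint representation concrete, the sketch below shows one plausible reading of the abstract in plain PyTorch: a single shared module carries task-invariant features (the part experience replay would protect), while architecture growth adds a fresh private branch and classifier head per task. All names, layer sizes, and the add_task helper are illustrative assumptions, not the repository's actual API.

```python
import torch
import torch.nn as nn

class HybridContinualModel(nn.Module):
    """Sketch of a disjoint shared/private representation for continual learning.

    One shared module is reused across all tasks; a new private module and
    head are grown for each task. Module names and sizes are illustrative.
    """

    def __init__(self, in_dim=784, feat_dim=128):
        super().__init__()
        # Task-invariant features: a single module shared by every task.
        self.shared = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        # Task-specific features and heads: grown as new tasks arrive.
        self.private = nn.ModuleList()
        self.heads = nn.ModuleList()
        self.in_dim, self.feat_dim = in_dim, feat_dim

    def add_task(self, num_classes):
        # Architecture growth: a fresh private branch and classifier head,
        # so earlier tasks' task-specific parameters are never overwritten.
        self.private.append(
            nn.Sequential(nn.Linear(self.in_dim, self.feat_dim), nn.ReLU())
        )
        self.heads.append(nn.Linear(2 * self.feat_dim, num_classes))

    def forward(self, x, task_id):
        # Concatenate shared (task-invariant) and private (task-specific)
        # features before the task's own classifier head.
        z = torch.cat([self.shared(x), self.private[task_id](x)], dim=1)
        return self.heads[task_id](z)


# Usage: grow the model for two tasks, then classify a batch for task 0.
model = HybridContinualModel()
model.add_task(num_classes=10)
model.add_task(num_classes=10)
logits = model(torch.randn(4, 784), task_id=0)
print(logits.shape)  # torch.Size([4, 10])
```

Under this reading, only the shared module is exposed to forgetting as tasks accumulate, which is why the paper pairs it with experience replay: replayed samples from earlier tasks keep the task-invariant features stable, while each task's private branch remains untouched after that task is learned.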
Publisher: ECCV