September 09, 2018
Humans can learn in a continuous manner. Old, rarely utilized knowledge can be overwritten by new incoming information, while important, frequently used knowledge is prevented from being erased. In artificial learning systems, lifelong learning has so far focused mainly on accumulating knowledge over tasks and overcoming catastrophic forgetting. In this paper, we argue that, given the limited model capacity and the unlimited new information to be learned, knowledge has to be preserved or erased selectively. Inspired by neuroplasticity and earlier work on weight regularization for lifelong learning, we propose an online method to compute the importance of the parameters of a neural network, based on the data that the network is actively applied to, in an unsupervised manner. To this end, after learning a task and whenever a new sample is fed to the network, we accumulate an importance measure for each parameter of the network, based on how sensitive the predicted output is to a change in this parameter. This results in importance weights that are data or context dependent. When learning a new task, changes to important parameters can then be penalized, effectively preventing knowledge important for previous tasks from being overwritten. Further, we show an interesting connection between a local version of our method and Hebb's rule.
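The following is a minimal PyTorch sketch of the idea described in the abstract: accumulate, for each parameter, how sensitive the network output is to changes in that parameter, then penalize changes to important parameters when training on a new task. The specific choices here (using the squared L2 norm of the output as the scalar to differentiate, a batch-level gradient approximation, and the function names `estimate_importance` and `penalty`) are illustrative assumptions, not the authors' code.

```python
import torch


def estimate_importance(model, data_loader, device="cpu"):
    """Accumulate, per parameter, the average absolute gradient of the
    squared L2 norm of the model output w.r.t. that parameter.
    Labels are not needed, so this can run on unlabeled data."""
    importance = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    n_samples = 0
    model.eval()
    for batch in data_loader:
        x = batch[0] if isinstance(batch, (list, tuple)) else batch
        x = x.to(device)
        model.zero_grad()
        out = model(x)
        # Sensitivity of the squared output norm to each parameter.
        # Differentiating the batch sum is a batch-level approximation
        # of accumulating per-sample gradients.
        out.pow(2).sum().backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                importance[n] += p.grad.abs()
        n_samples += x.size(0)
    for n in importance:
        importance[n] /= max(n_samples, 1)
    return importance


def penalty(model, importance, old_params, lam=1.0):
    """Quadratic penalty discouraging changes to important parameters."""
    device = next(model.parameters()).device
    loss = torch.zeros(1, device=device)
    for n, p in model.named_parameters():
        loss = loss + (importance[n] * (p - old_params[n]) ** 2).sum()
    return lam * loss
```

A possible usage pattern under these assumptions: after training on a task, run `estimate_importance` on (possibly unlabeled) data from the deployment context, store detached copies of the current parameters as `old_params`, and when training the next task add `penalty(model, importance, old_params)` to that task's loss.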
Written by
Mohamed Elhoseiny
Marcus Rohrbach
Francesca Babiloni
Rahaf Aljundi
Tinne Tuytelaars
Publisher
ECCV