This AI Research Addresses the Problem of ‘Loss of Plasticity’ in Deep Learning Systems When Used in Continual Learning Settings


Image Recognition

Recent deep learning systems such as GPT-3 and DALL-E were likewise trained on large batches of data. When exposed to fresh data, deep learning systems frequently lose most of what they have previously learned, a phenomenon known as "catastrophic forgetting." In other words, deep learning methods do not retain stability in continual learning problems. Catastrophic forgetting has attracted renewed interest with the rise of deep learning, and several papers have addressed preserving stability in deep continual learning. Nikishin et al. showed that early learning in reinforcement learning problems can negatively impact later learning. The present work demonstrates that continual supervised learning problems cause deep learning methods to lose plasticity, and that this loss of plasticity can be severe.
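The forgetting effect described above can be illustrated with a deliberately minimal toy (this is an illustrative sketch, not the paper's experimental setup): a single-parameter linear model is trained by gradient descent on one task, then on a conflicting second task, and its error on the first task is measured before and after. Because the same parameter must serve both tasks, learning the second task overwrites what was learned on the first.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(w, X, y, lr=0.1, steps=200):
    # Plain gradient descent on squared error for a scalar linear model y_hat = w * x.
    for _ in range(steps):
        grad = 2 * np.mean((X * w - y) * X)
        w -= lr * grad
    return w

def mse(w, X, y):
    return float(np.mean((X * w - y) ** 2))

X = rng.uniform(-1, 1, 100)
y_a = X    # task A: y = x   (optimal w = +1)
y_b = -X   # task B: y = -x  (optimal w = -1)

w = 0.0
w = train(w, X, y_a)
err_a_before = mse(w, X, y_a)  # near zero: task A has been learned

w = train(w, X, y_b)           # continue training on the conflicting task B
err_a_after = mse(w, X, y_a)   # large: task A has been "forgotten"

print(f"error on task A before: {err_a_before:.6f}")
print(f"error on task A after training on B: {err_a_after:.6f}")
```

In real networks the effect is subtler, since many parameters are shared across tasks, but the mechanism is the same: gradient updates for new data are free to destroy the weights that encoded old data.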