
Measuring Asymmetric Gradient Discrepancy in Parallel Continual Learning

By Fan Lyu and others
In Parallel Continual Learning (PCL), multiple parallel tasks start and end training at unpredictable times, which gives rise to training conflict and catastrophic forgetting. Both issues arise because the gradients from the parallel tasks differ in direction and magnitude. Thus, in this paper, we formulate PCL as a minimum...
December 31, 2022
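
The abstract attributes both training conflict and catastrophic forgetting to differences in the direction and magnitude of gradients from parallel tasks. The sketch below illustrates one common way such discrepancy can be quantified, using cosine similarity for direction and a norm ratio for magnitude; the function name and the exact measures are illustrative assumptions, not the paper's formulation.

```python
# Minimal sketch (not the authors' implementation) of measuring the
# directional and magnitude discrepancy between two task gradients.
import numpy as np

def gradient_discrepancy(g_a: np.ndarray, g_b: np.ndarray) -> dict:
    """Return direction and magnitude discrepancy between two task gradients."""
    eps = 1e-12  # numerical safeguard against division by zero
    norm_a = np.linalg.norm(g_a)
    norm_b = np.linalg.norm(g_b)
    cos_sim = float(g_a @ g_b / (norm_a * norm_b + eps))
    mag_ratio = float(norm_a / (norm_b + eps))
    return {
        "cosine_similarity": cos_sim,  # < 0 indicates conflicting directions
        "magnitude_ratio": mag_ratio,  # far from 1 indicates imbalanced magnitudes
    }

# Example: two gradients with partially conflicting directions
g_task_a = np.array([1.0, 0.5, -0.2])
g_task_b = np.array([-0.8, 0.1, 0.3])
print(gradient_discrepancy(g_task_a, g_task_b))
```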