Accelerated CNN Training Through Gradient Approximation

By Ziheng Wang and Sree Harsha Nelaturu

Training deep convolutional neural networks such as VGG and ResNet by gradient descent is an expensive exercise requiring specialized hardware such as GPUs. Recent works have examined the possibility of approximating the gradient computation while maintaining the same convergence properties. While promising, the approximations only work on relatively small datasets...
August 15, 2019
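The full text is not shown here, but the abstract's central idea, trading exact gradient computation for cheaper approximations during training, can be illustrated with a short sketch. The PyTorch code below is a hypothetical stand-in, not the authors' method: it simply skips the convolutional layers' backward pass on most iterations and computes their exact gradients only every few steps. The tiny model, the `approx_period` value, and the random data are all assumptions made for illustration.

```python
# Hypothetical sketch of gradient approximation by skipping the conv backward
# pass on most steps. Not the paper's actual algorithm.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Small stand-in CNN; a VGG/ResNet-style model would be handled the same way.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 10),
)
conv_params = [p for m in model.modules() if isinstance(m, nn.Conv2d)
               for p in m.parameters()]
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
loss_fn = nn.CrossEntropyLoss()

approx_period = 4  # exact conv gradients only every 4th step (assumed value)

for step in range(20):
    x = torch.randn(8, 3, 32, 32)        # stand-in for a real data loader
    y = torch.randint(0, 10, (8,))
    exact = (step % approx_period == 0)
    for p in conv_params:
        # Crude "approximation": disable grad tracking for conv weights on
        # non-exact steps, so autograd never runs their backward kernels.
        p.requires_grad_(exact)
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    print(f"step {step:2d}  exact_conv_grads={exact}  loss={loss.item():.4f}")
```

In this sketch the fully connected head is still trained exactly on every step; only the convolutional weights are updated intermittently, which is one simple way the backward-pass cost of a CNN can be reduced without changing the forward computation.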