DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
COMMITS for docs/code-docs/source/optimizers.rst

March 11, 2022    0/1 Adam optimizer (#1790), Yucheng Lu committed
April 21, 2021    1-bit LAMB optimizer (#970), Conglong Li committed
April 19, 2021    assert no Z2/Z3 with pipeline and fix some docs links (#980), Shaden Smith committed
April 19, 2021    ZeRO-Infinity (#976), Jeff Rasley committed
March 16, 2021    1-bit Adam v2 (#817), Conglong Li committed
October 30, 2020  Add CPUAdam optimizer for zero-offload in deepspeed engine (#484), Reza Yazdani committed
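Several of the optimizers named in these commits (for example 1-bit Adam and 1-bit LAMB) are selected through the DeepSpeed JSON/dict configuration rather than constructed by hand. The sketch below is a minimal, illustrative example of how one of them might be enabled via deepspeed.initialize; the toy model, batch size, and parameter values are assumptions for illustration, not settings taken from the documentation, and a real run would normally be started with the deepspeed launcher.

    import torch
    import deepspeed

    # Toy model standing in for a real network (hypothetical).
    model = torch.nn.Linear(1024, 1024)

    # Illustrative config selecting the 1-bit Adam optimizer; values are
    # placeholders, not tuned recommendations.
    ds_config = {
        "train_batch_size": 8,
        "optimizer": {
            "type": "OneBitAdam",
            "params": {
                "lr": 1e-3,
                "freeze_step": 400,
                "comm_backend_name": "nccl"
            }
        },
        "fp16": {"enabled": True}
    }

    # deepspeed.initialize builds the distributed engine and the configured
    # optimizer from the model and config, returning the wrapped engine.
    model_engine, optimizer, _, _ = deepspeed.initialize(
        model=model,
        model_parameters=model.parameters(),
        config=ds_config,
    )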