DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Commit history for docs/_tutorials/zero-one-adam.md:

- February 5, 2025: Update GH org references (#6998), committed by Olatunji Ruwase
- July 29, 2024: Add doc of compressed backend in Onebit optimizers (#5782), committed by Liangliang Ma
- July 27, 2023: fix: Remove duplicate word the (#4051), committed by digger yu
- March 11, 2022: 01 adam optimizer (#1790), committed by Yucheng Lu