DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Commit history for docs/code-docs/source/zero3.rst (as of May 20, 2025)

- Enable ZeRO set/get APIs for NVMe offload (#7046), committed by Olatunji Ruwase

November 25, 2024
- Cleanup code docs warnings (#6783), committed by Logan Adams
- Fix Doc Error: ZeRO Stage 2 gradient partitioning (#6775), committed by Wentao Ye

October 14, 2024
- Add API for updating ZeRO gradients (#6590), committed by Olatunji Ruwase

October 10, 2024
- Add API to get devices of offload states (#6586), committed by Masahiro Tanaka

September 27, 2024
- Add APIs to offload states of model, optimizer, and engine (#6011), committed by Masahiro Tanaka
September 1, 2023
- Allow modification of zero partitioned parameters (#4192), committed by Olatunji Ruwase

July 28, 2023
- Multiple zero stage 3 related fixes (#3886), committed by Olatunji Ruwase

July 10, 2023
- fix some typo docs/ (#3917), committed by digger yu

April 26, 2023
- [DRAFT] Tentative implementation of MiCS (#2964), committed by Zhen Zhang

March 24, 2023
- Empty ZeRO3 partition cache (#3060), committed by Olatunji Ruwase

February 28, 2023
- Enable tensor fragments for zero 2 & 3 (#2727), committed by Olatunji Ruwase

November 15, 2022
- Update docs to autogenerate pydantic config model docs (#2509), committed by Michael Wyatt

April 19, 2021
- ZeRO-Infinity docs (#979), committed by Shaden Smith
- ZeRO-Infinity (#976), committed by Jeff Rasley

March 11, 2021
- small tweaks (#839), committed by Stas Bekman

March 8, 2021
- ZeRO 3 Offload (#834), committed by Samyam Rajbhandari
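
Several of the commits above (#2727, #4192, #6590, #7046) concern ZeRO's set/get APIs for reading and modifying partitioned states. As a rough illustration of the pattern these APIs support, here is a minimal sketch using the fragment accessors that zero3.rst documents in `deepspeed.utils` (`safe_get_full_fp32_param`, `safe_get_full_grad`, `safe_set_full_fp32_param`); the `engine` object, the `clip_weights` helper, and the `max_abs` parameter are placeholders, not part of the library.

```python
from deepspeed.utils import (
    safe_get_full_fp32_param,  # gather a full fp32 copy of a partitioned parameter
    safe_get_full_grad,        # gather the full fp32 gradient (valid after backward())
    safe_set_full_fp32_param,  # scatter a new value back into the partitions
)

def clip_weights(engine, max_abs=1.0):
    # Illustrative only: `engine` is assumed to be a DeepSpeed engine
    # returned by deepspeed.initialize(...) with ZeRO enabled.
    # Clamp every weight to [-max_abs, max_abs] and report gradient norms.
    for name, param in engine.module.named_parameters():
        grad = safe_get_full_grad(param)
        if grad is not None:  # gradients exist only between backward() and step()
            print(f"{name}: grad norm = {grad.norm().item():.4f}")
        full = safe_get_full_fp32_param(param)
        if full is not None:
            safe_set_full_fp32_param(param, full.clamp(-max_abs, max_abs))
```

The point of the `safe_*` helpers is that the same loop works regardless of how ZeRO has sharded the underlying tensors across ranks.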
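
Likewise, #6011 and #6586 added engine-level APIs for moving ZeRO-managed states (parameters, gradients, optimizer states) off the GPU and back. The sketch below shows only the bare, argument-free usage pattern, assuming an initialized `engine`; the `run_with_states_offloaded` wrapper and `heavy_fn` are hypothetical names, and the methods' keyword arguments vary by DeepSpeed version, so consult the docs for the exact signatures.

```python
def run_with_states_offloaded(engine, heavy_fn):
    # Illustrative only: temporarily free GPU memory held by ZeRO states
    # so that some other memory-hungry job (one that does not need this
    # model) can run, then restore the states before training resumes.
    engine.offload_states()      # move ZeRO-managed states off the GPU
    try:
        result = heavy_fn()      # e.g., evaluate a different model
    finally:
        engine.reload_states()   # bring the states back onto the GPU
    return result
```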