DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Commit history for deepspeed/runtime/base_optimizer.py:

- March 5, 2026: Fix hook count performance regression from v0.18.5 (#7886) (Masahiro Tanaka)
- January 20, 2026: Fix issue with BF16 optimizer selection (#7788) (Masahiro Tanaka)
- January 17, 2026: Fix gradient checkpointing with use_reentrant=True / PyTorch-style backward / ZeRO-3 (#7780) (Masahiro Tanaka)
- December 4, 2025: Low-precision master params/grads/optimizer states (#7700) (Masahiro Tanaka)
- November 19, 2025: PyTorch-compatible backward API (#7665) (Masahiro Tanaka)
- September 3, 2025: Fix scaling and allgather with `torch.autocast` (#7534) (Masahiro Tanaka)
- June 19, 2025: Enable torch.autocast with ZeRO (#6993) (Masahiro Tanaka)
- January 15, 2025: `warn` to `warning` (#6952) (Quentin Gallouédec)
- November 19, 2024: Add explicit parameters for torch.load (#6751) (Logan Adams)
- March 28, 2024: Improve universal checkpoint (#5289) (Masahiro Tanaka)
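The pattern behind "Add explicit parameters for torch.load" (#6751) can be sketched as follows. This is a hedged illustration, not DeepSpeed's actual call sites: the file name and payload below are made up, but `map_location` and `weights_only` are real `torch.load` parameters, and `weights_only=True` restricts unpickling to tensors and plain Python containers, which hardens checkpoint loading against arbitrary-code execution.

```python
import torch

# Illustrative checkpoint: a dict of primitives (allowed under
# weights_only=True, which rejects arbitrary pickled objects).
path = "ckpt_demo.pt"
torch.save({"step": 3, "lr": 0.001}, path)

# Pass parameters explicitly rather than relying on defaults:
# map_location="cpu" avoids device-placement surprises,
# weights_only=True limits what the unpickler will construct.
state = torch.load(path, map_location="cpu", weights_only=True)
print(state["step"])  # 3
```

Passing these explicitly also makes the intended behavior survive upstream default changes (PyTorch later flipped the `weights_only` default to `True`).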