DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
COMMITS for docs/code-docs/source/conf.py
June 9, 2025 · Felix Gondwe · Fix docs that are rendering Incorrectly (#7344)
March 31, 2023 · Michael Wyatt · Update DeepSpeed copyright license to Apache 2.0 (#3111)
February 27, 2023 · Jeff Rasley · add missing license info to top of all source code (#2889)
November 15, 2022 · Michael Wyatt · Update docs to autogenerate pydantic config model docs (#2509)
July 25, 2022 · Alex Hedges · Add flake8 to pre-commit checks (#2051)
July 6, 2022 · Jeff Rasley · [docs] fix broken read-the-docs build (#2075)
December 1, 2021 · Alex Hedges · Improve pre-commit hooks (#1602)
November 19, 2021 · Jeff Rasley · Several fixes for our read-the-docs build (#1579)
December 15, 2020 · Jeff Rasley · Fixes for RTD build errors (#606)
September 17, 2020 · Shaden Smith · readthedocs yaml configuration (#410)
September 10, 2020 · Shaden Smith · readthedocs upgrade (#402)
Shaden Smith · Pipeline parallel training engine. (#392)
May 29, 2020 · Jeff Rasley · Transformer kernel release (#242)
May 19, 2020 · Jeff Rasley · ZeRO-2 (#217)
April 22, 2020 · Shaden Smith · README and RTD improvements. (#198)
March 17, 2020 · Shaden Smith · drafting Jekyll webpage (#143)