DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
COMMITS: docs/_tutorials/getting-started.md

March 11, 2022 · Website posts and tutorial improvements (#1799) · Cheng Li
March 9, 2022 · [docker] simplify and update rocm dockerfile (#1819) · Jeff Rasley
January 3, 2022 · Various small documentation text improvements (#1665) · Manuel R. Ciosici
June 8, 2021 · adjust azureml-examples links (#1015) · Cody
April 27, 2021 · add ds integrations (#963) · Cheng Li
April 14, 2021 · update lr scheduler doc for doing per step or epoch update (#913) · Cheng Li
April 2, 2021 · Add link to AML examples. (#916) · Ammar Ahmad Awan
March 25, 2021 · [debug utils] see_memory_usage fixes (#890) · Stas Bekman
March 18, 2021 · [doc] launcher (#868) · Stas Bekman
February 26, 2021 · document the requirement to call for all ranks (#801) · Stas Bekman
January 8, 2021 · document deepspeed.initialize() (#644) · Stas Bekman
December 18, 2020 · Ability to initialize distributed backend outside deepspeed runtime (#608) · Jeff Rasley
November 28, 2020 · [doc] typo fix and clarification (#563) · Stas Bekman
November 12, 2020 · DeepSpeed JIT op + PyPI support (#496) · Jeff Rasley
September 9, 2020 · Add 1-bit Adam support to DeepSpeed (#380) · Ammar Ahmad Awan
July 28, 2020 · Fixing a typo (#303) · Emmanuel Kahembwe
May 19, 2020 · ZeRO-2 (#217) · Jeff Rasley
April 27, 2020 · Moved environment variable docs. (#203) · Shaden Smith
March 18, 2020 · JSON configuration cleanup. (#151) · Shaden Smith
March 18, 2020 · Web edits (#147) · Shaden Smith
March 18, 2020 · Web edits (#146) · Shaden Smith
March 17, 2020 · drafting Jekyll webpage (#143) · Shaden Smith