DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
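As an illustration of what using the library looks like (not taken from this page), a minimal DeepSpeed JSON configuration might enable fp16 training and ZeRO stage 2 optimizer-state/gradient partitioning; the keys below are standard DeepSpeed config fields, and the batch size is an arbitrary example value:

```json
{
  "train_batch_size": 32,
  "fp16": { "enabled": true },
  "zero_optimization": { "stage": 2 }
}
```

Such a file is typically passed to `deepspeed.initialize` or to the `deepspeed` launcher via `--deepspeed_config`.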
COMMITS: 20 in the last month
CONTRIBUTORS: 13 active
STARS: 41918 total
FORKS: 0 total
TOP CONTRIBUTORS
- Masahiro Tanaka
- Zhipeng Wang
- Ma, Guokai
- Alexander Grund
- nathon
- Krishna Chaitanya
- instantraaamen
- Logan Adams
- Michael Royzen
- Huang Yifan

RECENT COMMITS
- Respect `$TRITON_HOME` (#7907) by Alexander Grund
- Update version (#7903) by Logan Adams
- Fix Evoformer's multi-arch dispatch root cause (#7881) by Masahiro Tanaka
- double reduction user-friendly error (#7895) by Stas Bekman
- [Bloom] Fix hangs of bloom test (#7890) by Artem Kuzmitckii
- Suppress see_memory_usage logs (#7891) by Olatunji Ruwase
- Fix hook count performance regression from v0.18.5 (#7886) by Masahiro Tanaka
- Add document section explaining autocast nesting (#7883) by Masahiro Tanaka