DeepSpeed

Original author(s): Microsoft Research
Developer(s): Microsoft
Initial release: May 18, 2020
Stable release: v0.12.3 / November 10, 2023
Repository: github.com/microsoft/DeepSpeed
Written in: Python, CUDA, C++
Type: Software library
License: Apache License 2.0
Website: deepspeed.ai

DeepSpeed is an open-source deep learning optimization library for PyTorch.[1] The library is designed to reduce computing power and memory use and to train large distributed models with better parallelism on existing computer hardware.[2][3] DeepSpeed is optimized for low-latency, high-throughput training. It includes the Zero Redundancy Optimizer (ZeRO) for training models with 1 trillion or more parameters.[4] Features include mixed-precision training; single-GPU, multi-GPU, and multi-node training; and custom model parallelism. The DeepSpeed source code is licensed under the Apache License 2.0 and available on GitHub.[5]
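In practice, a PyTorch model is wrapped with the library's deepspeed.initialize function, which returns a training engine that handles data parallelism, mixed precision, and ZeRO partitioning according to a configuration dictionary. The following is a minimal sketch, not taken from the article; the placeholder model and the configuration values (batch size, ZeRO stage, learning rate) are illustrative assumptions rather than recommended settings.

```python
import torch
import deepspeed

# Placeholder model; in practice this would be a large transformer.
model = torch.nn.Linear(1024, 1024)

# Illustrative DeepSpeed configuration (values are assumptions).
ds_config = {
    "train_batch_size": 32,
    "fp16": {"enabled": True},          # mixed-precision training
    "zero_optimization": {"stage": 2},  # ZeRO: partition optimizer state and gradients
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
}

# Wrap the model in a DeepSpeed engine that manages distributed
# training, mixed precision, and ZeRO memory partitioning.
model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)
```

Training then proceeds through the engine (a forward pass followed by model_engine.backward(loss) and model_engine.step()), and such a script is typically launched across GPUs and nodes with the deepspeed command-line launcher.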

The DeepSpeed team reported achieving up to 6.2x higher throughput, 2.8x faster convergence, and 4.6x less communication.[6]

References

  1. ^ "Microsoft Updates Windows, Azure Tools with an Eye on The Future". PCMag UK. May 22, 2020.
  2. ^ Yegulalp, Serdar (February 10, 2020). "Microsoft speeds up PyTorch with DeepSpeed". InfoWorld.
  3. ^ "Microsoft unveils "fifth most powerful" supercomputer in the world". Neowin. 18 June 2023.
  4. ^ "Microsoft trains world's largest Transformer language model". February 10, 2020.
  5. ^ "microsoft/DeepSpeed". July 10, 2020 – via GitHub.
  6. ^ "DeepSpeed: Accelerating large-scale model inference and training via system optimizations and compression". Microsoft Research. 2021-05-24. Retrieved 2021-06-19.

Further reading

  • Rajbhandari, Samyam; Rasley, Jeff; Ruwase, Olatunji; He, Yuxiong (2019). "ZeRO: Memory Optimization Towards Training A Trillion Parameter Models". arXiv:1910.02054 [cs.LG].

External links

  • AI at Scale - Microsoft Research
  • GitHub - microsoft/DeepSpeed
  • ZeRO & DeepSpeed: New system optimizations enable training models with over 100 billion parameters - Microsoft Research

