DeepSpeed
Overview: module code

All modules for which code is available

  • deepspeed
    • deepspeed.moe.layer
    • deepspeed.ops.adam.cpu_adam
    • deepspeed.ops.adam.fused_adam
    • deepspeed.ops.lamb.fused_lamb
    • deepspeed.ops.transformer.transformer
    • deepspeed.profiling.flops_profiler.profiler
    • deepspeed.runtime.fp16.onebit.adam
    • deepspeed.runtime.fp16.onebit.lamb
    • deepspeed.runtime.fp16.onebit.zoadam
    • deepspeed.runtime.lr_schedules
    • deepspeed.runtime.pipe.engine
    • deepspeed.runtime.pipe.module
    • deepspeed.runtime.pipe.schedule
    • deepspeed.runtime.pipe.topology
    • deepspeed.runtime.zero.stage3
    • deepspeed.utils.distributed
    • deepspeed.utils.zero_to_fp32
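The modules listed above are all reached through the top-level `deepspeed` package. As a rough orientation only, the sketch below shows how a PyTorch model is typically wrapped by `deepspeed.initialize`, whose returned engine is what the runtime, ZeRO, fp16, and pipeline modules above operate on. The tiny model, the inline config values, and the training loop are illustrative assumptions, not taken from this page, and the exact keyword for passing the config (`config` vs. an `args`/`config_params` object) varies across DeepSpeed versions.

```python
import torch
import deepspeed

# A tiny stand-in model so the sketch is self-contained (assumed, not from this page).
model = torch.nn.Linear(10, 1)

# Illustrative DeepSpeed config; normally kept in a JSON file passed to the engine.
ds_config = {
    "train_batch_size": 8,
    "fp16": {"enabled": False},
    "optimizer": {"type": "Adam", "params": {"lr": 1e-3}},
}

# deepspeed.initialize returns (engine, optimizer, dataloader, lr_scheduler);
# the engine applies the configured optimizer, fp16, and ZeRO settings.
model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

# Toy training loop on random data; real runs are usually started with the
# `deepspeed` launcher so that distributed state is set up for multiple GPUs.
for _ in range(10):
    x = torch.randn(8, 10).to(model_engine.device)
    y = torch.randn(8, 1).to(model_engine.device)
    loss = torch.nn.functional.mse_loss(model_engine(x), y)
    model_engine.backward(loss)  # engine-managed backward (handles loss scaling)
    model_engine.step()          # optimizer step, LR schedule, gradient clearing
```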
