
More rapid iteration with Lightning Bolts

A collection of well-established, SOTA models and components.

Visit Bolts

Lightning Bolts

PyTorch Lightning Bolts is a community-built deep learning research and production toolbox, featuring a collection of well-established and SOTA models and components, pre-trained weights, callbacks, loss functions, datasets, and data modules.

Rigorously Tested

Everything is implemented in Lightning, tested daily, benchmarked, and documented, and runs on CPUs, GPUs, and TPUs, as well as with 16-bit precision.
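Because every Bolt is a standard LightningModule, hardware and precision are chosen with ordinary Trainer flags rather than model changes. A minimal sketch, assuming the VAE import path and its input_height argument in pl_bolts; Trainer flag names follow recent PyTorch Lightning releases:

```python
import pytorch_lightning as pl
from pl_bolts.models.autoencoders import VAE  # import path assumed; any Bolt works here

model = VAE(input_height=32)

# Hardware and precision are Trainer flags, not changes to the model code.
trainer = pl.Trainer(accelerator="gpu", devices=2, precision=16)   # 2 GPUs, 16-bit
# trainer = pl.Trainer(accelerator="tpu", devices=8)               # 8 TPU cores
# trainer = pl.Trainer(accelerator="cpu")                          # plain CPU
# trainer.fit(model, datamodule=your_datamodule)                   # supply your own data
```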

Modular

What separates Bolts from other libraries is that it is built and used by AI researchers. Every Bolt component is modularized so it can be easily extended or mixed with arbitrary parts of the rest of the codebase.

Better baselines

Bolts ships rigorously tested and benchmarked baselines, from VAEs and GANs to GPT and self-supervised models, so you don't have to spend months implementing baselines before trying new ideas. Instead, subclass one of ours and try your idea!

Subclass, override, and train!
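The workflow is exactly that: pick a Bolt, subclass it, override the LightningModule hooks you care about, and hand it to a Trainer. A minimal sketch, assuming the VAE model from pl_bolts; configure_optimizers is a standard Lightning hook:

```python
import torch
import pytorch_lightning as pl
from pl_bolts.models.autoencoders import VAE  # assumed import path


class MyVAE(VAE):
    # Override any LightningModule hook; here we swap the optimizer while
    # keeping the rest of the Bolts implementation untouched.
    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=1e-3)


model = MyVAE(input_height=32)  # e.g. 32x32 CIFAR-10 images
trainer = pl.Trainer(max_epochs=1)
# trainer.fit(model, datamodule=your_datamodule)  # supply your own data
```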

Logistic regression

Lightning Bolts includes a collection of non-deep learning algorithms that can train on multiple GPUs and TPUs.
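A minimal sketch of what that looks like, assuming the LogisticRegression model and the SklearnDataModule helper exposed by pl_bolts (names and arguments may differ between versions):

```python
import pytorch_lightning as pl
from sklearn.datasets import load_iris
from pl_bolts.datamodules import SklearnDataModule          # assumed helper
from pl_bolts.models.regression import LogisticRegression   # assumed import path

# Wrap a plain numpy dataset so Lightning can batch and distribute it.
X, y = load_iris(return_X_y=True)
dm = SklearnDataModule(X, y, batch_size=32)

model = LogisticRegression(input_dim=4, num_classes=3)

# The same module scales to multiple GPUs or TPUs via Trainer flags.
trainer = pl.Trainer(max_epochs=10, accelerator="auto", devices=1)
trainer.fit(model, datamodule=dm)
```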

Self-Supervised Learning

Bolts houses many of the current state-of-the-art self-supervised algorithms: SimCLR, SwAV, AMDIM, BYOL, CPC v2, MoCo v2, and more.
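Because these are ordinary LightningModules, a pretrained model can be pulled in and reused as a frozen backbone. A hedged sketch, assuming the SimCLR import path; the checkpoint path is a placeholder for one of the pretrained weights listed in the Bolts docs:

```python
import torch
from pl_bolts.models.self_supervised import SimCLR  # assumed import path

# Placeholder: substitute a pretrained checkpoint path/URL from the Bolts docs.
weight_path = "path/to/simclr_imagenet.ckpt"
simclr = SimCLR.load_from_checkpoint(weight_path, strict=False)
simclr.freeze()  # standard LightningModule method: eval mode, no gradients

# Use the frozen model to produce representations for a downstream task.
with torch.no_grad():
    feats = simclr(torch.randn(4, 3, 224, 224))  # batch of 4 RGB images
```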

Reinforcement Learning

Bolts contains a variety of DQN models you can extend to build your own reinforcement learning models.
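A hedged sketch of training one, assuming the DQN model in pl_bolts accepts a Gym environment id as its first argument; subclass it and override the pieces you want (network, loss, replay logic) to build your own agent:

```python
import pytorch_lightning as pl
from pl_bolts.models.rl import DQN  # assumed import path

# Build a DQN agent for an Atari environment (requires gym[atari]) and
# train it like any other LightningModule.
dqn = DQN("PongNoFrameskip-v4")
trainer = pl.Trainer(max_steps=1_000, accelerator="auto", devices=1)
trainer.fit(dqn)
```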