PyTorch
| PyTorch | |
|---|---|
| Original authors | |
| Developer | Meta AI |
| Initial release | September 2016 |
| Stable release | 2.10.0 / 21 January 2026 |
| Written in | |
| Operating system | |
| Platform | IA-32, x86-64, ARM64 |
| Available in | English |
| Type | Library for deep learning |
| License | BSD-3 |
| Website | pytorch |
| Repository | github |
PyTorch is an open-source deep learning library, originally developed by Meta Platforms and currently developed with support from the Linux Foundation. The successor to Torch, PyTorch provides a high-level API that builds upon optimised, low-level implementations of deep learning algorithms and architectures, such as the Transformer architecture or stochastic gradient descent (SGD). Notably, this API reduces model training and inference to a few lines of code. PyTorch allows for automatic parallelization of training and, internally, implements CUDA bindings that speed up training further by leveraging GPU resources.
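The brevity of this API can be illustrated with a minimal sketch of a training loop. The model, data, and hyperparameters below are illustrative choices, not part of any particular application: a one-layer linear model is fitted to synthetic data drawn from y = 3x + 1 using SGD and a mean-squared-error loss.

```python
import torch
from torch import nn

torch.manual_seed(0)  # make the synthetic data and init reproducible

# Illustrative model and optimiser: a single linear layer trained with SGD.
model = nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

# Synthetic regression data for the target function y = 3x + 1.
inputs = torch.randn(64, 1)
targets = 3 * inputs + 1

for _ in range(200):
    optimizer.zero_grad()                   # clear gradients from the previous step
    loss = loss_fn(model(inputs), targets)  # forward pass and loss
    loss.backward()                         # backpropagate gradients
    optimizer.step()                        # update parameters
```

After a few hundred steps the loss approaches zero, since the model family contains the target function exactly. The same four-line loop body (zero gradients, forward, backward, step) scales unchanged to large models and datasets.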
PyTorch utilises the tensor as a fundamental data type, similarly to NumPy. Training is facilitated by a reverse-mode automatic differentiation system, Autograd, which constructs a directed acyclic graph of the operations (and their arguments) executed by a model during its forward pass. Given a scalar loss, gradients are then computed by backpropagation, traversing this graph in reverse.
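The forward-recording and reverse-traversal behaviour can be seen on a toy scalar function, where the gradient is easy to check by hand:

```python
import torch

# A tensor that tracks operations for reverse-mode autodiff.
x = torch.tensor(3.0, requires_grad=True)

# Forward pass: Autograd records each operation (power, multiply, add)
# as a node in a directed acyclic graph.
y = x ** 2 + 2 * x  # y = x^2 + 2x

# Backward pass: traverse the recorded graph in reverse, accumulating
# gradients into the leaf tensor's .grad attribute.
y.backward()

print(x.grad)  # dy/dx = 2x + 2 = 8 at x = 3
```

Each intermediate tensor carries a reference to the operation that produced it, which is how `backward()` knows the graph to walk.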
As of 2025, PyTorch remains one of the most popular deep learning libraries, alongside others such as TensorFlow and Keras. A number of prominent deep learning systems are built on top of PyTorch, including ChatGPT, Tesla Autopilot, Uber's Pyro, Hugging Face's Transformers, and Catalyst.