The AWS Deep Learning AMIs equip data scientists, machine learning practitioners, and research scientists with the infrastructure and tools to accelerate deep learning work in the cloud, at any scale. You can quickly launch Amazon EC2 instances on Amazon Linux or Ubuntu, pre-installed with popular deep learning frameworks, to train sophisticated, custom AI models, experiment with new algorithms, or learn new skills and techniques. The Deep Learning AMIs let you create managed, auto-scaling clusters of GPUs for large-scale training, or run inference on trained models with compute-optimized or general purpose CPU instances, using Apache MXNet, TensorFlow, the Microsoft Cognitive Toolkit (CNTK), Caffe, Caffe2, Theano, Torch, and Keras.
The Deep Learning AMIs are available in the AWS Marketplace
Get Started Quickly
To help you get started quickly with your deep learning Amazon EC2 instance in the cloud, the Deep Learning AMIs are provisioned with many popular deep learning frameworks, each of which provides an easy-to-launch tutorial demonstrating proper installation, configuration, and model accuracy.
Hassle-free Setup and Configuration
The Deep Learning AMIs install dependencies, track library versions, and validate code compatibility. And with updates to the AMIs every month, you'll always have the latest versions of the frameworks and data science libraries.
Pay Only for What You Use
Whether you need Amazon EC2 GPU or CPU instances, there is no additional charge for the Deep Learning AMIs. You pay only for the AWS resources needed to store and run your applications.
Deep learning can often be technically challenging, requiring you both to understand the math behind the models and to have experience scaling training and inference across large distributed systems. As a result, several deep learning frameworks have emerged that let you define models and then train them at scale. Built for Amazon Linux and Ubuntu, the AWS Deep Learning AMIs come pre-configured with Apache MXNet, TensorFlow, the Microsoft Cognitive Toolkit (CNTK), Caffe, Caffe2, Theano, Torch, and Keras, enabling you to quickly deploy and run any of these frameworks at scale.
Deep learning frameworks are built on neural networks, whose training and inference reduce largely to multiplying very large matrices. To speed up your model training and deep learning research and development, the AWS Deep Learning AMIs offer GPU acceleration through pre-configured CUDA and cuDNN drivers, as well as the Intel Math Kernel Library (MKL), in addition to installing popular Python packages and the Anaconda platform.
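To make the "multiplying matrices" point concrete, here is a minimal sketch of the forward pass of one fully connected layer in plain NumPy. The function name, shapes, and data are illustrative only; this is the kind of single matrix multiplication that the pre-configured CUDA, cuDNN, and MKL libraries accelerate inside the frameworks.

```python
import numpy as np

def dense_forward(x, W, b):
    """Forward pass of a fully connected layer: y = x @ W + b.
    One matrix multiplication per layer, repeated millions of times
    during training, is where GPU and MKL acceleration pays off."""
    return x @ W + b

# Hypothetical sizes: a batch of 32 flattened 28x28 images feeding
# a 128-unit hidden layer.
rng = np.random.default_rng(0)
x = rng.standard_normal((32, 784))   # input batch
W = rng.standard_normal((784, 128))  # layer weights
b = np.zeros(128)                    # layer biases

y = dense_forward(x, W, b)
print(y.shape)  # (32, 128)
```

A deep network is simply many such layers composed, with a nonlinearity between them, which is why matrix-multiply throughput dominates training time.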
The AWS Deep Learning AMIs running on Amazon EC2 P2 instances are pre-installed with NVIDIA CUDA and cuDNN drivers for all supported deep learning frameworks, substantially reducing the time needed to complete your computations.
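One quick way to confirm the NVIDIA driver stack is in place on an instance is to probe for the `nvidia-smi` utility that ships with the CUDA drivers. This helper function is our own sketch, not part of any AMI tooling:

```python
import shutil
import subprocess

def cuda_driver_available():
    """Rough check for a working NVIDIA driver: on a GPU instance with
    the CUDA drivers installed, `nvidia-smi` should be on the PATH and
    exit cleanly. Returns False on CPU-only machines."""
    nvidia_smi = shutil.which("nvidia-smi")
    if nvidia_smi is None:
        return False
    try:
        subprocess.run([nvidia_smi], check=True, capture_output=True)
        return True
    except (subprocess.CalledProcessError, OSError):
        return False

print(cuda_driver_available())
```

On a P2 instance with the Deep Learning AMI this should print `True`; each framework also exposes its own GPU-detection API, which is the more reliable check in practice.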
When you use Intel-based processors for your Amazon EC2 instances, the Deep Learning AMIs come integrated with the Intel Math Kernel Library (MKL) to accelerate math processing and neural network routines.
The AWS Deep Learning AMIs come installed with Jupyter (formerly IPython) notebooks loaded with Python 2.7 and Python 3.4 kernels, along with popular Python packages, including the AWS SDK for Python (Boto3).
To simplify package management and deployment, the AWS Deep Learning AMIs install the Anaconda2 and Anaconda3 data science platforms for large-scale data processing, predictive analytics, and scientific computing.