Streamline deep learning workflows on AWS with pre-built Docker images
Top 34.6% on SourcePulse
AWS Deep Learning Containers (DLCs) provide pre-built, optimized Docker images for popular deep learning frameworks like TensorFlow, PyTorch, and MXNet. Designed for users deploying machine learning workloads on AWS, these containers simplify the setup and execution of training and inference tasks across services such as Amazon SageMaker, EC2, ECS, and EKS, offering a streamlined path to production.
How It Works
The project offers a set of Docker images pre-configured with deep learning frameworks, NVIDIA CUDA for GPU acceleration, and Intel MKL for CPU optimization. These images are hosted on Amazon ECR, ensuring readily available, tested, and optimized environments. The DLCs serve as the default execution environments for Amazon SageMaker jobs, abstracting away complex dependency management and providing a consistent, high-performance runtime.
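For example, pulling and running one of these images outside SageMaker might look like the following (account 763104351884 is AWS's documented public DLC registry, but the repository name and tag here are illustrative; the repository's available_images.md lists the current ones):

# Authenticate Docker against the public DLC registry in us-east-1
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 763104351884.dkr.ecr.us-east-1.amazonaws.com

# Pull a PyTorch training image (tag is illustrative)
docker pull 763104351884.dkr.ecr.us-east-1.amazonaws.com/pytorch-training:2.3.0-gpu-py311-cu121-ubuntu20.04-ec2

# Run it with GPU access (requires the NVIDIA container toolkit on the host)
docker run --gpus all -it --rm 763104351884.dkr.ecr.us-east-1.amazonaws.com/pytorch-training:2.3.0-gpu-py311-cu121-ubuntu20.04-ec2

On SageMaker itself no manual pull is needed; the service resolves the appropriate DLC image for the chosen framework and version when a job is created.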
Quick Start & Requirements
Getting started involves installing the build tooling's dependencies (pip install -r src/requirements.txt) and running the build scripts (python src/main.py ...). Building and pushing images requires an AWS account with IAM permissions (e.g., AmazonEC2ContainerRegistryFullAccess, AmazonSageMakerFullAccess), a Docker client, and Python 3. A minimal build sketch follows.
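As a rough sketch of that build flow (the environment variables, account ID, region, and src/main.py flags below are assumptions based on the repository's documented build process; check the current README for the authoritative commands):

# Placeholders: point the build tooling at your own ECR registry
export ACCOUNT_ID=111122223333
export REGION=us-west-2
export REPOSITORY_NAME=pytorch-training

# Install the build tooling's dependencies
pip install -r src/requirements.txt

# Build images for one framework from its buildspec (flag names assumed from the documented build flow)
python src/main.py --buildspec pytorch/buildspec.yml --framework pytorch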
Highlighted Details
The repository includes pytest-based test suites covering various AWS deployment scenarios (EC2, ECS, EKS, SageMaker local/remote); a hedged invocation sketch follows.
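A rough sketch of such a test run (the DLC_IMAGES variable, test directory, and image URI are assumptions; see the repository's testing documentation for the exact setup):

# Point the tests at the image(s) under test (placeholder URI)
export DLC_IMAGES="111122223333.dkr.ecr.us-west-2.amazonaws.com/pytorch-training:latest"

# Run the EC2-scenario tests (directory and flags are illustrative)
cd test/dlc_tests
pytest -s -rA ec2/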
Licensing & Compatibility
The core project is licensed under the Apache-2.0 License. However, specific components like smdistributed.dataparallel and smdistributed.modelparallel are released under the AWS Customer Agreement. The Apache-2.0 license is generally permissive for commercial use.
Limitations & Caveats
Amazon SageMaker does not support tensorflow_inference py2 images. Building images for the first time requires downloading base layers, which can be time-intensive. Setting up the environment necessitates significant AWS IAM permissions.