Deep Learning on AWS

You can use Amazon SageMaker components with this stack for tasks such as data labeling and model optimization. Amazon SageMaker can support an end-to-end deep learning workflow or be used in parts. With the Kubernetes and Kubeflow stack, you can still use Amazon SageMaker Ground Truth for data labeling and annotation and Amazon SageMaker Neo for model optimization.

Additional Considerations for DIY Solution

Amazon EKS is a managed service that is not specifically tuned and optimized for deep learning, and Kubeflow is not a managed service offered by AWS. You must fine-tune and optimize the Amazon EKS and Kubeflow stack for deep learning by implementing best practices. For more information, see Best Practices for Optimizing Distributed Deep Learning Performance on Amazon EKS on the AWS Open Source Blog.

Optionally, you can use the Amazon EKS Deep Learning Benchmark Utility, an automated tool for machine learning benchmarking on Kubernetes clusters.

You can also use AWS Deep Learning Containers (AWS DL Containers), a set of Docker images for training and serving models in TensorFlow and MXNet on Amazon EKS. AWS DL Containers provide optimized environments with TensorFlow and MXNet, NVIDIA CUDA (for GPU instances), and Intel MKL (for CPU instances) libraries, and they are available in Amazon Elastic Container Registry (Amazon ECR).

There are numerous initiatives to enable more native integration between Kubeflow and the AWS platform for deep learning. For the complete list of native integrations between Kubeflow and AWS, see Kubeflow on AWS Features.

DIY Self-Managed Solution: Use Amazon EC2

Some organizations and deep learning engineers and scientists may not adopt a container strategy for building, training, and deploying models, or may not have the skills required to operate in a containerized environment. Others may want to use the latest drivers and libraries from the research community, for which installation and integration guidance in a containerized environment may not yet be available or easy to follow. In these scenarios, you can set up a custom DIY cluster on top of Amazon EC2 to develop and scale your experiments.
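A DIY cluster on Amazon EC2 is typically built from GPU instances launched from a Deep Learning AMI. The following is a minimal sketch using boto3, assuming you provision nodes programmatically; the AMI ID, key pair, Region, and tag name are illustrative placeholders, not values from this document, and must be replaced with values valid in your account and Region.

```python
# Minimal sketch: launch one GPU instance as a DIY deep learning cluster node.
# The AMI ID, key pair name, Region, and tags are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")  # assumed Region

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder: a Deep Learning AMI ID for your Region
    InstanceType="p3.2xlarge",        # single-GPU instance; scale up or out as needed
    KeyName="my-key-pair",            # placeholder key pair for SSH access
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "dl-diy-node"}],
    }],
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}")
```

Launching additional nodes with the same call and wiring them into a distributed training setup is left to your own cluster tooling.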
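Earlier on this page, Amazon SageMaker Neo is called out as a component you can use for model optimization even when training runs on the Kubernetes and Kubeflow stack. The sketch below shows how a compilation job could be started with boto3; the bucket paths, role ARN, job name, framework, input shape, and target device are illustrative placeholders, not values from this document.

```python
# Minimal sketch: compile a trained model artifact with Amazon SageMaker Neo.
# Bucket paths, role ARN, input shape, and target device are placeholders.
import boto3

sm = boto3.client("sagemaker", region_name="eu-west-1")  # assumed Region

sm.create_compilation_job(
    CompilationJobName="my-neo-compilation-job",          # placeholder job name
    RoleArn="arn:aws:iam::123456789012:role/MyNeoRole",   # placeholder IAM role
    InputConfig={
        "S3Uri": "s3://my-bucket/model/model.tar.gz",     # trained model artifact
        "DataInputConfig": '{"data": [1, 3, 224, 224]}',  # example input shape
        "Framework": "MXNET",                             # or TENSORFLOW, etc.
    },
    OutputConfig={
        "S3OutputLocation": "s3://my-bucket/compiled/",
        "TargetDevice": "ml_c5",                          # example compilation target
    },
    StoppingCondition={"MaxRuntimeInSeconds": 900},
)
```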
