Deep Learning on AWS

… steps involved in retraining and redeployment. Discussed below are the solutions and services that can be used to automate your deep learning production pipeline on AWS.

AWS Step Functions for Amazon SageMaker

AWS Step Functions lets you orchestrate the multiple steps of an ML workflow for seamless model deployment in production. Step Functions translates your workflow into a state machine diagram that is easy to understand, easy to explain to others, and easy to change, and you can monitor each step of the execution as it happens. Today, Amazon SageMaker supports two different patterns for service integration (a minimal state-machine sketch appears at the end of this section):

• Call an Amazon SageMaker API action and let AWS Step Functions progress to the next state immediately after it receives an HTTP response.

• Call an Amazon SageMaker API action and have AWS Step Functions wait for the job to complete.

Apache Airflow for Amazon SageMaker

Apache Airflow is an open-source platform that enables you to programmatically author, schedule, and monitor workflows. Using Apache Airflow, you can build a workflow for Amazon SageMaker training, hyperparameter tuning, batch transform, and endpoint deployment, and you can use any Amazon SageMaker deep learning framework or Amazon SageMaker algorithm to perform these operations. You can build an Amazon SageMaker workflow with the Airflow SageMaker operators or with the Airflow Python operator (see the DAG sketch at the end of this section). You can also use Turbine, an open-source AWS CloudFormation template, to create an Airflow resource stack on AWS.

Kubeflow Pipelines on Kubernetes

If you are a DIY customer who is not using Amazon SageMaker and want to leverage your current investment in Kubernetes on AWS, you can use Kubeflow Pipelines. Kubeflow Pipelines is a platform for building and deploying portable, scalable machine learning (ML) workflows based on Docker containers. A pipeline is a description of an ML workflow, including all of the components in the workflow and how they combine in the form of a graph. It is a popular tool among practitioners who use Kubernetes to build, train, and deploy models, and it integrates natively with Kubernetes (a pipeline sketch also appears at the end of this section).
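To make the two Step Functions integration patterns concrete, below is a minimal sketch using boto3 and the Amazon States Language; the account IDs, role ARNs, training image, and S3 path are placeholders, not values from this paper. The ".sync" suffix on the resource ARN selects the second pattern (wait for the training job to complete); dropping the suffix gives the first pattern (progress as soon as the HTTP response arrives).

```python
import json
import boto3

sfn = boto3.client("stepfunctions")

# Amazon States Language definition with a single training step.
# ".sync" makes Step Functions wait for the job to finish; without
# it, the workflow moves on once the HTTP response is received.
definition = {
    "StartAt": "TrainModel",
    "States": {
        "TrainModel": {
            "Type": "Task",
            "Resource": "arn:aws:states:::sagemaker:createTrainingJob.sync",
            "Parameters": {
                "TrainingJobName.$": "$.job_name",  # taken from the execution input
                "AlgorithmSpecification": {
                    "TrainingImage": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-image:latest",  # placeholder
                    "TrainingInputMode": "File",
                },
                "RoleArn": "arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder
                "OutputDataConfig": {"S3OutputPath": "s3://my-bucket/models/"},  # placeholder
                "ResourceConfig": {
                    "InstanceType": "ml.m5.xlarge",
                    "InstanceCount": 1,
                    "VolumeSizeInGB": 30,
                },
                "StoppingCondition": {"MaxRuntimeInSeconds": 3600},
            },
            "End": True,
        }
    },
}

sfn.create_state_machine(
    name="sagemaker-training-pipeline",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::123456789012:role/StepFunctionsRole",  # placeholder
)
```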
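For Apache Airflow, here is a minimal DAG sketch using the SageMakerTrainingOperator, assuming a recent Airflow 2.x with the Amazon provider package installed; the job configuration mirrors the SageMaker CreateTrainingJob API, and all names, ARNs, and paths are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.sagemaker import SageMakerTrainingOperator

# Training job configuration in the shape expected by the SageMaker
# CreateTrainingJob API; every name, ARN, and path here is a placeholder.
training_config = {
    "TrainingJobName": "airflow-training-{{ ds_nodash }}",  # templated per DAG run
    "AlgorithmSpecification": {
        "TrainingImage": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-image:latest",
        "TrainingInputMode": "File",
    },
    "RoleArn": "arn:aws:iam::123456789012:role/SageMakerRole",
    "OutputDataConfig": {"S3OutputPath": "s3://my-bucket/models/"},
    "ResourceConfig": {
        "InstanceType": "ml.m5.xlarge",
        "InstanceCount": 1,
        "VolumeSizeInGB": 30,
    },
    "StoppingCondition": {"MaxRuntimeInSeconds": 3600},
}

with DAG(
    dag_id="sagemaker_training",
    start_date=datetime(2024, 1, 1),
    schedule=None,  # trigger manually or from a retraining event
    catchup=False,
) as dag:
    train = SageMakerTrainingOperator(
        task_id="train_model",
        config=training_config,
        wait_for_completion=True,  # block until the training job finishes
    )
```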
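Finally, a minimal Kubeflow Pipelines sketch showing how steps combine into a graph, written against the kfp SDK's v1 DSL (kfp v2 replaces ContainerOp with component decorators); the container images and commands are placeholders.

```python
import kfp
from kfp import dsl


def preprocess_op():
    # Each pipeline step runs as a Docker container; image and
    # command are placeholders for your own step implementations.
    return dsl.ContainerOp(
        name="preprocess",
        image="my-registry/preprocess:latest",
        command=["python", "preprocess.py"],
    )


def train_op():
    return dsl.ContainerOp(
        name="train",
        image="my-registry/train:latest",
        command=["python", "train.py"],
    )


@dsl.pipeline(
    name="dl-training-pipeline",
    description="Preprocess data, then train a model.",
)
def pipeline():
    preprocess = preprocess_op()
    train = train_op()
    train.after(preprocess)  # graph edge: train depends on preprocess


if __name__ == "__main__":
    # Compile to an archive that can be uploaded to a Kubeflow Pipelines cluster.
    kfp.compiler.Compiler().compile(pipeline, "pipeline.yaml")
```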
