Run Azure Machine Learning Anywhere: Multi-Cloud or On-Premises

Abstract: 

In this session, you will learn how to run machine learning workloads with a seamless Azure Machine Learning experience anywhere, including on-premises, in multi-cloud environments, and at the edge. Use any Kubernetes cluster and extend it to run MLOps, model training, real-time inference, or batch inference. You can manage all of these resources through a single pane of glass, with consistent management and reliability.
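The "any Kubernetes cluster" workflow described above can be sketched with the Azure CLI: install the Azure ML extension on an Arc-enabled cluster, then attach it to a workspace as a compute target. This is a minimal sketch, not the session's exact demo; the cluster, resource group, and workspace names are placeholders, and it assumes the `k8s-extension` and `ml` CLI extensions are installed.

```shell
# Deploy the Azure ML extension onto an Arc-enabled Kubernetes cluster
# (placeholder names: my-arc-cluster, my-rg)
az k8s-extension create \
  --name azureml-extension \
  --extension-type Microsoft.AzureML.Kubernetes \
  --cluster-type connectedClusters \
  --cluster-name my-arc-cluster \
  --resource-group my-rg \
  --scope cluster \
  --config enableTraining=True enableInference=True

# Attach the cluster to an Azure ML workspace as a Kubernetes compute target
# (placeholder workspace: my-workspace; <sub-id> left unfilled on purpose)
az ml compute attach \
  --resource-group my-rg \
  --workspace-name my-workspace \
  --type Kubernetes \
  --name k8s-compute \
  --resource-id "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Kubernetes/connectedClusters/my-arc-cluster"
```

Once attached, the cluster appears alongside managed compute in the workspace, so training jobs and inference endpoints can target it by the compute name.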

Bio: 

Doris Zhong is a Product Manager in the Azure AI Platform organization at Microsoft, focusing on machine learning in hybrid cloud. She loves to communicate with customers to gain deep insights and help solve real problems. Earlier in her career, she worked on building Microsoft's internal GPU training platform, which managed tens of thousands of GPUs and served thousands of users.

Open Data Science
One Broadway
Cambridge, MA 02142
info@odsc.com
