Abstract: You can move from experimentation to production faster by operationalizing data science tasks with Jobs in OCI Data Science. Jobs automate repeatable tasks like retraining and redeploying models. You can apply Jobs to any use case you have, such as data preparation, model training, hyperparameter tuning, and batch inference. In this demo, you’ll see how to:
· Run machine learning (ML) or data science tasks outside of your notebook sessions in JupyterLab
· Operationalize discrete data science and machine learning tasks as reusable, runnable operations
· Automate your typical MLOps or CI/CD pipeline
· Execute batch workloads triggered by events or actions
· Run batch, mini-batch, or distributed batch inference jobs
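A Job typically executes a standalone script rather than notebook cells, with configuration passed in through environment variables. As a rough illustration of what such an entry script might look like (the variable names and file layout here are hypothetical, not from the demo), a minimal sketch:

```python
import json
import os


def main() -> None:
    # A Job passes parameters to the script as environment variables,
    # so the same script can be reused across runs with different settings.
    epochs = int(os.environ.get("EPOCHS", "5"))
    output_dir = os.environ.get("OUTPUT_DIR", ".")

    # ... training, retraining, or batch-inference work would go here ...

    # Persist a small result artifact so downstream steps can pick it up.
    result = {"epochs": epochs, "status": "done"}
    with open(os.path.join(output_dir, "result.json"), "w") as f:
        json.dump(result, f)
    print(result)


if __name__ == "__main__":
    main()
```

Because the script is self-contained, the same file can be developed in a notebook session and then handed to a Job for scheduled or event-triggered execution.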
Bio: Lyudmil Pelov is Senior Principal Product Manager for Oracle AI, which includes services for creating, managing, and deploying machine learning models, and for delivering prebuilt AI models to users with less data science experience. Lyudmil joined Oracle 14 years ago and has extensive experience building and leading successful engineering projects that deliver highly scalable cloud-based and on-premises solutions across a variety of domains.