Data Science Model Deployments and Cloud Computing on GCP

Are you interested in learning to deploy applications at scale on Google Cloud Platform? Do you lack hands-on exposure when it comes to deploying applications and seeing them in action? Then this course is for you. You will also learn microservice and event-driven architectures through real-world use case implementations.

  • 86 on-demand videos and exercises
  • Level: Beginner
  • Language: English
  • Duration: 6hrs 55mins
  • Access on mobile, web, and TV

What to know about this course

Google Cloud Platform is one of the fastest-growing cloud providers in the market today, making proficiency with it an essential skill for aspiring cloud engineers and data scientists.

This comprehensive course covers the major serverless components on GCP, with in-depth implementations of machine learning pipelines using Vertex AI with Kubeflow, serverless PySpark using Dataproc, App Engine, and Cloud Run. It offers hands-on experience with GCP services such as Cloud Functions, Cloud Run, Google App Engine, Vertex AI for custom model training and development, Kubeflow for workflow orchestration, and Dataproc Serverless for PySpark batch jobs.
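
To give a flavour of the pipeline work described above, here is a minimal sketch (not taken from the course) of a one-step Kubeflow (KFP v2) pipeline compiled and submitted to Vertex AI Pipelines. The project ID, region, and staging bucket are placeholder values, and a real pipeline would replace the trivial component with preprocessing and training steps.

    # Minimal sketch: a one-step Kubeflow (KFP v2) pipeline submitted to Vertex AI Pipelines.
    # The project ID, region, and bucket below are placeholders, not values from the course.
    from kfp import dsl, compiler
    from google.cloud import aiplatform

    @dsl.component(base_image="python:3.10")
    def greet(name: str) -> str:
        """Trivial component standing in for a real preprocessing or training step."""
        return f"Hello, {name}"

    @dsl.pipeline(name="demo-pipeline")
    def demo_pipeline(name: str = "Vertex AI"):
        greet(name=name)

    if __name__ == "__main__":
        # Compile the pipeline graph into a spec file that Vertex AI Pipelines can run.
        compiler.Compiler().compile(demo_pipeline, "demo_pipeline.json")

        # Submit the compiled spec as a Vertex AI PipelineJob.
        aiplatform.init(project="my-project-id", location="us-central1",
                        staging_bucket="gs://my-staging-bucket")
        job = aiplatform.PipelineJob(
            display_name="demo-pipeline",
            template_path="demo_pipeline.json",
            parameter_values={"name": "GCP"},
        )
        job.run()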

The course starts with modern-day cloud concepts, followed by GCP trial account setup and Google Cloud CLI setup. You will then look at Cloud Run for serverless containerized applications and Google App Engine for serverless applications. Next, you will study Cloud Functions for serverless, event-driven applications. After that, you will look at deploying data science models with Google App Engine and Dataproc Serverless PySpark. Finally, you will explore Vertex AI as the machine learning framework, along with Cloud Scheduler and application monitoring.
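
As a small illustration of the serverless, event-driven style mentioned above, the following sketch (not course material) shows a Python Cloud Function written against the Functions Framework that reacts to a Pub/Sub message delivered as a CloudEvent; the function name and payload handling are illustrative only.

    # Minimal sketch of an event-driven Cloud Function (Python) triggered by a
    # Pub/Sub message delivered as a CloudEvent. Names here are illustrative placeholders.
    import base64

    import functions_framework

    @functions_framework.cloud_event
    def handle_message(cloud_event):
        """Decode and log the Pub/Sub payload carried inside the CloudEvent."""
        payload = base64.b64decode(cloud_event.data["message"]["data"]).decode("utf-8")
        print(f"Received event: {payload}")

Such a function is typically deployed with the gcloud CLI (for example, gcloud functions deploy) and wired to a Pub/Sub topic as its trigger.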

By the end of the course, you will be confident in deploying and implementing applications at scale using Kubeflow, Spark, and serverless components on Google Cloud.

Who's this course for?

This intermediate course is designed for aspiring data scientists, machine learning engineers, data engineers, and architects, as well as anyone with decent exposure to IT who is looking to start their cloud journey.
It is ideally suited to individuals who have a fair idea of how the cloud works and prior experience with basic programming in Python and SQL.
Basic familiarity with the Bash command line will further help you fast-track your learning.

What you'll learn

  • Deploy serverless applications using Google App Engine, Cloud Functions, and Cloud Run
  • Learn how to use Datastore (a NoSQL database) in realistic use cases (see the sketch after this list)
  • Understand microservice and event-driven architecture with practical examples
  • Deploy production-level machine learning workflows on the cloud
  • Use Kubeflow for machine learning orchestration using Python
  • Deploy Serverless PySpark Jobs to Dataproc Serverless and schedule them using Airflow/Composer
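
As a concrete example of the Datastore item above, here is a minimal sketch assuming the google-cloud-datastore client library and default application credentials; the kind and property names are made up for illustration.

    # Minimal sketch: write and read an entity with Cloud Datastore
    # (google-cloud-datastore client). Kind and property names are illustrative.
    from google.cloud import datastore

    client = datastore.Client()  # picks up the project from the active credentials

    # Write: create an entity of kind "Task" with a couple of properties.
    key = client.key("Task", "sample-task")
    task = datastore.Entity(key=key)
    task.update({"description": "Deploy the model", "done": False})
    client.put(task)

    # Read it back by key.
    fetched = client.get(key)
    print(fetched["description"], fetched["done"])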

Key Features

  • Learn from an easy-to-follow, step-by-step course, along with resource materials
  • Highly practical explanations and lab exercises to help you get the most out of the course
  • Design and deploy Python applications across various services using the gcloud command-line interface

About the Author

Siddharth Raghunath

Siddharth Raghunath is a business-oriented engineering manager with vast experience in software development, distributed processing, and cloud data engineering. He has worked on different cloud platforms such as AWS and GCP, as well as on-premise Hadoop clusters. He conducts seminars on distributed processing using Spark, real-time streaming and analytics, and best practices for ETL and data governance. He is passionate about coding and building optimal data pipelines for robust data processing and streaming solutions.