Snowflake - Build and Architect Data Pipelines Using AWS

The course helps you learn Snowflake from scratch and explore a few of its important features. You will build automated pipelines with Snowflake and use the AWS cloud with Snowflake as a data warehouse. You will also explore Snowpark and use it to work with data pipelines.

96 on-demand videos & exercises
Level: Beginner
English
8hrs 39mins
Access on mobile, web and TV

What to know about this course

Snowflake is the next big thing, and it is growing into a full-blown data ecosystem. Given its scalability, its efficiency in handling massive volumes of data, and the several new concepts it introduces, this is the right time to wrap your head around Snowflake and add it to your toolkit.
This course not only covers the core features of Snowflake but also teaches you how to deploy Python/PySpark jobs in AWS Glue and Airflow that communicate with Snowflake, which is one of the most important aspects of building pipelines.

In this course, you will start with an overview of Snowflake and then work through its most crucial aspects in an efficient manner. You will write Python/Spark jobs in AWS Glue for data transformation and see real-time streaming using Kafka and Snowflake. You will interact with external functions, work through their use cases, and see the security features in Snowflake. Finally, you will look at Snowpark and explore how it can be used for data pipelines and data science.

By the end of this course, you will have learned about Snowflake and Snowpark and how to build and architect data pipelines using AWS. You need an active AWS account in order to complete the sections related to Python and PySpark. For the rest of the course, a free trial Snowflake account should suffice.

Who's this course for?

This course is ideal for software engineers, aspiring data engineers or data analysts, and data scientists who want to advance their careers in the IT domain.
It is also a good fit for programmers and database administrators with experience in writing SQL queries.

Prior programming experience with SQL, or at least some knowledge of writing queries, along with Python, is a must.

You should have basic experience with, or an understanding of, cloud services such as AWS, along with an active AWS account.


What you'll learn

  • Learn about Snowflake and its basics before getting started with labs
  • Cover the crucial aspects of Snowflake in a very practical manner
  • Write Python/Spark jobs in AWS Glue for data transformation
  • Execute real-time streaming using Kafka and Snowflake
  • Interact with external functions and use cases
  • Learn and explore the Snowpark library

Key Features

  • Learn from an easy-to-understand and step-by-step course, divided into 85+ videos along with detailed resource files
  • Integrate real-time streaming data and data orchestration with Airflow and Snowflake
  • Highly practical explanations and lab exercises to help you get the most out of the course

About the Author

Siddharth Raghunath

Siddharth Raghunath is a business-oriented engineering manager with vast experience in software development, distributed processing, and cloud data engineering. He has worked on different cloud platforms such as AWS and GCP as well as on-premise Hadoop clusters. He conducts seminars on distributed processing using Spark, real-time streaming and analytics, and best practices for ETL and data governance. He is passionate about coding and building optimal data pipelines for robust data processing and streaming solutions.
