GOOGLE CLOUD

Serverless Data Processing with Dataflow

This training is intended for big data practitioners who want to deepen their understanding of Dataflow and advance their data processing applications. Beginning with foundations, it explains how Apache Beam and Dataflow work together to meet your data processing needs without the risk of vendor lock-in. The section on developing pipelines covers how to convert your business logic into data processing applications that can run on Dataflow. The training culminates with a focus on operations, reviewing the most important lessons for operating a data application on Dataflow, including monitoring, troubleshooting, testing, and reliability.

What you will learn

  • Demonstrate how Apache Beam and Dataflow work together to fulfill your organization’s data processing needs.

  • Summarize the benefits of the Beam Portability Framework and enable it for your Dataflow pipelines.

  • Enable the Dataflow Shuffle service and Streaming Engine, for batch and streaming pipelines respectively, for maximum performance.

  • Enable Flexible Resource Scheduling for more cost-efficient performance.

  • Select the right combination of IAM permissions for your Dataflow job.

  • Implement best practices for a secure data processing environment.

  • Select and tune the I/O of your choice for your Dataflow pipeline.

  • Use schemas to simplify your Beam code and improve the performance of your pipeline.

  • Develop a Beam pipeline using SQL and DataFrames.

  • Perform monitoring, troubleshooting, testing, and CI/CD on Dataflow pipelines.

Who this course is for

  • Data Engineers

  • Data Analysts and Data Scientists aspiring to develop Data Engineering skills

Level

  • Advanced

Duration

  • 3 x 8-hour sessions

Language

  • Delivered in English

Prerequisites

  • To benefit from this course, participants should have completed “Google Cloud Big Data and Machine Learning Fundamentals” or have equivalent experience.

  • Basic proficiency with a common query language such as SQL.

  • Experience with data modeling and ETL (extract, transform, load) activities.

  • Experience with developing applications using a common programming language such as Python.

  • Familiarity with machine learning and/or statistics.

Course topics

Module 1: Introduction

  • Course Introduction

  • Beam and Dataflow Refresher

Module 2: Beam Portability

  • Beam Portability

  • Runner v2

  • Container Environments

  • Cross-Language Transforms
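
For a flavor of how these features are switched on, here is a sketch of the relevant pipeline options in the Python SDK (the container image path is a hypothetical placeholder; verify flag availability against your SDK version):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        experiments=["use_runner_v2"],  # opt in to the portable Runner v2
        # Custom container for the worker environment (hypothetical image):
        sdk_container_image="gcr.io/my-project/my-beam-sdk:latest",
    )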

Module 3: Separating Compute and Storage with Dataflow

  • Dataflow Shuffle Service

  • Dataflow Streaming Engine

  • Flexible Resource Scheduling
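
A sketch of how these services are enabled through pipeline options in the Python SDK (Dataflow Shuffle and FlexRS apply to batch pipelines, Streaming Engine to streaming ones; in some regions the Shuffle service is already the default):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Batch pipeline: Dataflow Shuffle service plus Flexible Resource Scheduling.
    batch_options = PipelineOptions(
        experiments=["shuffle_mode=service"],
        flexrs_goal="COST_OPTIMIZED",
    )

    # Streaming pipeline: Streaming Engine.
    streaming_options = PipelineOptions(
        streaming=True,
        enable_streaming_engine=True,
    )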

Module 4: IAM, Quotas, and Permissions

  • IAM

  • Quota

Module 5: Security

  • Data Locality

  • Shared VPC

  • Private IPs

  • CMEK
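
Much of this is configured through pipeline options; a sketch in the Python SDK (all resource names are hypothetical placeholders):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        # Shared VPC subnetwork for the workers:
        subnetwork="https://www.googleapis.com/compute/v1/projects/"
                   "host-project/regions/us-central1/subnetworks/my-subnet",
        use_public_ips=False,  # workers get private IPs only
        # Customer-managed encryption key (CMEK):
        dataflow_kms_key="projects/my-project/locations/us-central1/"
                         "keyRings/my-ring/cryptoKeys/my-key",
    )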

Module 6: Beam Concepts Review

  • Beam Basics

  • Utility Transforms

  • DoFn Lifecycle
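
As a refresher, here is a minimal Beam pipeline in Python touching the concepts above (sample data and transform names are illustrative):

    import apache_beam as beam

    class SplitWords(beam.DoFn):
        # DoFn lifecycle: setup/teardown run once per DoFn instance;
        # start_bundle/finish_bundle wrap each bundle of elements.
        def setup(self):
            pass  # e.g., open an expensive client connection here

        def process(self, line):
            yield from line.split()

    with beam.Pipeline() as p:  # runs on the DirectRunner by default
        (p
         | "Create" >> beam.Create(["hello beam", "hello dataflow"])
         | "Split" >> beam.ParDo(SplitWords())
         | "Count" >> beam.combiners.Count.PerElement()  # a utility transform
         | "Print" >> beam.Map(print))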

Module 7: Windows, Watermarks, Triggers

  • Windows

  • Watermarks

  • Triggers
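
A sketch of these three concepts together in the Python SDK (window sizes, lateness, and timestamps are arbitrary):

    import apache_beam as beam
    from apache_beam import window
    from apache_beam.transforms.trigger import (
        AccumulationMode, AfterProcessingTime, AfterWatermark)

    with beam.Pipeline() as p:
        (p
         | beam.Create([("user1", 1), ("user1", 2), ("user2", 5)])
         # Attach (fake) event timestamps derived from the values:
         | beam.Map(lambda kv: window.TimestampedValue(kv, kv[1] * 30))
         | beam.WindowInto(
             window.FixedWindows(60),  # 1-minute fixed windows
             # Fire when the watermark passes the window end, then
             # periodically for late data:
             trigger=AfterWatermark(late=AfterProcessingTime(60)),
             accumulation_mode=AccumulationMode.ACCUMULATING,
             allowed_lateness=600)  # accept data up to 10 minutes late
         | beam.CombinePerKey(sum)
         | beam.Map(print))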

Module 8: Sources and Sinks

  • Sources and Sinks

  • Text IO and File IO

  • BigQuery IO

  • PubSub IO

  • Kafka IO

  • Bigtable IO

  • Avro IO

  • Splittable DoFn
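
A sketch wiring two of these connectors together in a streaming pipeline (topic and table names are hypothetical placeholders):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    TOPIC = "projects/my-project/topics/my-topic"
    TABLE = "my-project:my_dataset.messages"

    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (p
         | "Read" >> beam.io.ReadFromPubSub(topic=TOPIC)  # emits bytes
         | "Parse" >> beam.Map(lambda b: {"message": b.decode("utf-8")})
         | "Write" >> beam.io.WriteToBigQuery(
             TABLE,
             schema="message:STRING",
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))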

Module 9: Schemas

  • Beam Schemas

  • Code Examples
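
In the spirit of that lesson, a minimal schema example in the Python SDK, where a typed NamedTuple gives a PCollection a schema that schema-aware transforms such as GroupBy can use (sample data is illustrative):

    import typing

    import apache_beam as beam

    class Purchase(typing.NamedTuple):
        user_id: str
        amount: float

    # Registering a RowCoder lets Beam treat Purchase as a schema'd type.
    beam.coders.registry.register_coder(Purchase, beam.coders.RowCoder)

    with beam.Pipeline() as p:
        (p
         | beam.Create([Purchase("alice", 9.99),
                        Purchase("alice", 1.25),
                        Purchase("bob", 3.50)]).with_output_types(Purchase)
         | beam.GroupBy("user_id").aggregate_field("amount", sum, "total")
         | beam.Map(print))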

Module 10: State and Timers

  • State API

  • Timer API

  • Summary
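
A sketch of per-key state paired with a processing-time timer in the Python SDK (the ten-second delay is arbitrary, and the input must be a keyed PCollection):

    import time

    import apache_beam as beam
    from apache_beam.coders import VarIntCoder
    from apache_beam.transforms.timeutil import TimeDomain
    from apache_beam.transforms.userstate import (
        BagStateSpec, TimerSpec, on_timer)

    class BatchingDoFn(beam.DoFn):
        BUFFER = BagStateSpec("buffer", VarIntCoder())    # per-key bag state
        FLUSH = TimerSpec("flush", TimeDomain.REAL_TIME)  # processing-time timer

        def process(self, element,
                    buffer=beam.DoFn.StateParam(BUFFER),
                    flush=beam.DoFn.TimerParam(FLUSH)):
            _, value = element
            buffer.add(value)
            flush.set(time.time() + 10)  # fire roughly 10 seconds from now

        @on_timer(FLUSH)
        def on_flush(self, buffer=beam.DoFn.StateParam(BUFFER)):
            yield sum(buffer.read())  # emit the batch, then reset the state
            buffer.clear()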

Module 11: Best Practices

  • Schemas

  • Handling Unprocessable Data

  • Error Handling

  • AutoValue Code Generator

  • JSON Data Handling

  • Utilize DoFn Lifecycle

  • Pipeline Optimizations

Module 12: Dataflow SQL and DataFrames

  • Dataflow and Beam SQL

  • Windowing in SQL

  • Beam DataFrames
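
A sketch of Beam SQL from the Python SDK; SqlTransform is a cross-language transform that expands through a Java expansion service Beam starts for you, so a local Java runtime is required (the data is illustrative):

    import typing

    import apache_beam as beam
    from apache_beam.transforms.sql import SqlTransform

    class Sale(typing.NamedTuple):
        item: str
        amount: float

    with beam.Pipeline() as p:
        (p
         | beam.Create([Sale("book", 12.0), Sale("book", 8.0),
                        Sale("pen", 2.0)]).with_output_types(Sale)
         | SqlTransform("SELECT item, SUM(amount) AS total "
                        "FROM PCOLLECTION GROUP BY item")
         | beam.Map(print))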

Module 13: Beam Notebooks

  • Beam Notebooks
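
A sketch of the interactive runner that powers Beam notebooks (Python SDK; run inside a notebook cell):

    import apache_beam as beam
    from apache_beam.runners.interactive import interactive_beam as ib
    from apache_beam.runners.interactive.interactive_runner import (
        InteractiveRunner)

    p = beam.Pipeline(InteractiveRunner())
    words = p | beam.Create(["a", "b", "a"])
    counts = words | beam.combiners.Count.PerElement()

    ib.show(counts)  # materializes and displays the PCollection inline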

Module 14: Monitoring

  • Job List

  • Job Info

  • Job Graph

  • Job Metrics

  • Metrics Explorer

Module 15: Logging and Error Reporting

  • Logging

  • Error Reporting

Module 16: Troubleshooting and Debug

  • Troubleshooting Workflow

  • Types of Troubles

Module 17: Performance

  • Pipeline Design

  • Data Shape

  • Sources, Sinks, and External Systems

  • Shuffle and Streaming Engine

Module 18: Testing and CI/CD

  • Testing and CI/CD Overview

  • Unit Testing

  • Integration Testing

  • Artifact Building

  • Deployment
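
A sketch of a Beam unit test using the testing utilities shipped with the Python SDK:

    import unittest

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    class WordCountTest(unittest.TestCase):
        def test_counts_words(self):
            with TestPipeline() as p:
                output = (p
                          | beam.Create(["a b a"])
                          | beam.FlatMap(str.split)
                          | beam.combiners.Count.PerElement())
                assert_that(output, equal_to([("a", 2), ("b", 1)]))

    if __name__ == "__main__":
        unittest.main()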

Module 19: Reliability

  • Introduction to Reliability

  • Monitoring

  • Geolocation

  • Disaster Recovery

Module 20: Flex Templates

  • Classic Templates

  • Flex Templates

  • Using Flex Templates

  • Google-provided Templates

Module 21: Summary

  • Summary

Ref: T-SDPDF-A-01

Have questions?

No worries. Send us a quick message and we'll be happy to answer them.

© Copyright 2023. Axalon. All rights reserved.
