Event Concluded



DataBricks: Build and Deploy ETL pipelines with Delta Live Tables

By Databricks


About Event

As data volumes grow and become more complex, the tools to process that data have not kept pace. The burden falls primarily on data engineering teams, who must stitch together and maintain many separate pieces of code and supporting infrastructure to move data from source to destination, work that is repeated across dozens of data sources and multiple stages of processing. Because engineers typically develop ETL pipelines as individual tasks, it is difficult to chain the stages of a pipeline together. With Delta Live Tables (DLT), data teams get an ETL framework that uses a simple declarative approach to building reliable data pipelines. DLT automatically manages your infrastructure at scale, so you can spend less time on tooling and focus on getting value from data.

In this session, you will learn how to build and deploy a declarative streaming ETL pipeline at scale with DLT, and how DLT automates complex, time-consuming tasks such as task orchestration, error handling, recovery, and auto-scaling with performance optimizations. Finally, we will show how DLT enables data teams to deliver fresh, up-to-date data with built-in quality controls and monitoring, ensuring accurate and useful BI, data science, and ML.
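As a rough illustration of the declarative approach described above, the sketch below defines a two-stage streaming pipeline with the DLT Python API. It assumes the Databricks DLT runtime (which provides the `dlt` module and the `spark` session); the storage path, table names, and column names are hypothetical placeholders.

```python
# Minimal DLT pipeline sketch -- runs only inside a Databricks DLT pipeline,
# not as a standalone script. Path and column names are illustrative only.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw JSON events ingested incrementally with Auto Loader")
def raw_events():
    return (
        spark.readStream.format("cloudFiles")   # Auto Loader streaming source
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/events")            # hypothetical landing path
    )

# A built-in quality control: rows that fail the expectation are dropped,
# and violation counts surface in the pipeline's monitoring metrics.
@dlt.table(comment="Validated events ready for BI, data science, and ML")
@dlt.expect_or_drop("valid_user", "user_id IS NOT NULL")
def clean_events():
    # Referencing raw_events lets DLT infer the dependency graph and
    # orchestrate the stages automatically.
    return dlt.read_stream("raw_events").select(
        "user_id",
        "event_type",
        col("ts").cast("timestamp").alias("event_time"),
    )
```

Because the table definitions only declare *what* each dataset should contain, DLT derives the execution order from the `dlt.read_stream("raw_events")` reference and handles orchestration, retries, and scaling itself, rather than the engineer wiring those stages together by hand.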
