Event Concluded

Databricks: Build and Deploy ETL Pipelines with Delta Live Tables
By Databricks

About Event
As data volumes and complexity continue to grow rapidly, the tools to process that data have not kept pace. The burden falls primarily on data engineering teams, who must stitch together and maintain many different pieces of code and supporting infrastructure to move data from source to destination, work that is repeated across dozens of data sources and multiple stages of processing. Because engineers typically develop ETL pipelines as individual tasks, it is difficult to chain the stages of a pipeline together.

With Delta Live Tables (DLT), data teams get an ETL framework that uses a simple declarative approach to building reliable data pipelines. DLT automatically manages your infrastructure at scale, so you can spend less time on tooling and more time getting value from your data.

In this session, you will learn how to build and deploy a declarative streaming ETL pipeline at scale with DLT, and how DLT automates complex, time-consuming tasks such as task orchestration, error handling, recovery, and auto-scaling with performance optimizations. Finally, we will show how DLT enables data teams to deliver fresh, up-to-date data with built-in quality controls and monitoring, ensuring accurate and useful BI, data science, and ML.
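A declarative DLT pipeline of the kind described above can be sketched in a few lines of Python. This is a minimal illustration, not material from the session itself: the source path, table names, columns, and expectation rule are hypothetical, and the code runs only inside a Databricks Delta Live Tables pipeline, where the `dlt` module and the `spark` session are provided by the runtime.

```python
import dlt
from pyspark.sql.functions import col

# Bronze layer: ingest raw JSON files incrementally with Auto Loader.
# The storage path below is a placeholder for illustration.
@dlt.table(comment="Raw orders ingested from cloud storage")
def raw_orders():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/data/orders/raw")  # hypothetical source path
    )

# Silver layer: declare a data-quality expectation. Rows that fail it
# are dropped, and the violation counts appear in DLT's built-in
# pipeline monitoring.
@dlt.table(comment="Cleaned orders with a valid ID and positive amount")
@dlt.expect_or_drop("valid_order", "order_id IS NOT NULL AND amount > 0")
def clean_orders():
    return dlt.read_stream("raw_orders").select(
        col("order_id"), col("customer_id"), col("amount"), col("order_ts")
    )
```

Because the pipeline is declared as tables rather than wired together as imperative tasks, DLT infers the dependency graph (raw_orders -> clean_orders) and handles orchestration, retries, recovery, and scaling itself, which is the contrast with hand-stitched ETL jobs that the description draws.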