Implement data engineering solutions using Azure Databricks (DP-750T00) Coming Soon

Master end-to-end data engineering with Azure Databricks and Unity Catalog. This course moves from foundational setup to production deployment, covering environment configuration and enterprise-grade governance. Learn to build robust ingestion pipelines, implement security with Unity Catalog, and deploy optimized workloads. By the end, you will have the practical skills to implement, secure, and maintain scalable lakehouse solutions that meet rigorous enterprise requirements.


Audience Profile

The target audience is data engineers who have fundamental knowledge of data analytics concepts, a basic understanding of cloud storage, and familiarity with data organization principles. They should be comfortable working with SQL and have experience using Python, including notebooks, for data engineering tasks. Learners are expected to have a good understanding of Azure Databricks workspaces and Unity Catalog, along with familiarity with data access patterns and core data engineering and data warehouse concepts. In addition, they should have foundational knowledge of Azure security, including Microsoft Entra ID, and be familiar with Git version control fundamentals.


Prerequisites

  • Fundamental knowledge of data analytics concepts
  • Basic understanding of cloud storage concepts
  • Familiarity with SQL and data organization principles
  • Experience using Python and notebooks for data engineering tasks
  • Working knowledge of Azure Databricks workspaces and Unity Catalog
  • Foundational knowledge of Azure security, including Microsoft Entra ID
  • Familiarity with Git version control fundamentals


Course Syllabus

Set up and configure an Azure Databricks environment

  • Explore Azure Databricks
  • Understand Azure Databricks architecture
  • Understand Azure Databricks integrations
  • Select and configure compute in Azure Databricks
  • Create and organize objects in Unity Catalog
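Unity Catalog organizes securable objects in a three-level namespace: catalog → schema → table (or view/volume). As a rough illustration of the DDL this module covers, the sketch below composes the SQL statements you would run from a notebook. The catalog, schema, and table names are hypothetical examples, and the `spark.sql(...)` calls that would actually execute them are left as comments, since they require a Databricks session.

```python
# Sketch: composing Unity Catalog DDL for a catalog -> schema -> table
# hierarchy. Names (dev_catalog, sales, orders) are hypothetical examples.

def qualified_name(catalog: str, schema: str, obj: str) -> str:
    """Return the three-level Unity Catalog name: catalog.schema.object."""
    return f"{catalog}.{schema}.{obj}"

def create_hierarchy_ddl(catalog: str, schema: str, table: str) -> list[str]:
    """DDL statements, in dependency order, that create the hierarchy."""
    return [
        f"CREATE CATALOG IF NOT EXISTS {catalog}",
        f"CREATE SCHEMA IF NOT EXISTS {catalog}.{schema}",
        f"CREATE TABLE IF NOT EXISTS {qualified_name(catalog, schema, table)} "
        "(order_id BIGINT, amount DECIMAL(10, 2))",
    ]

statements = create_hierarchy_ddl("dev_catalog", "sales", "orders")
for stmt in statements:
    print(stmt)
    # In a Databricks notebook you would run: spark.sql(stmt)
```

Creating parents before children matters: a schema cannot exist outside a catalog, and a table must be created inside an existing schema.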


Secure and govern Unity Catalog objects in Azure Databricks

  • Secure Unity Catalog objects
  • Govern Unity Catalog objects
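Securing Unity Catalog objects centers on granting privileges (such as USE CATALOG, USE SCHEMA, and SELECT) on securables to principals. As a minimal sketch of the pattern, the code below builds the GRANT statements for a read-only group; the group and object names are hypothetical, and in a notebook each statement would be executed with `spark.sql`.

```python
# Sketch: building Unity Catalog GRANT statements for a read-only group.
# The principal `data_analysts` and the object names are hypothetical.

def grant(privilege: str, securable_type: str, name: str, principal: str) -> str:
    """Compose a Unity Catalog GRANT statement for one securable."""
    return f"GRANT {privilege} ON {securable_type} {name} TO `{principal}`"

# Reading a table requires USE CATALOG and USE SCHEMA on the parent
# objects, plus SELECT on the table itself.
read_only_grants = [
    grant("USE CATALOG", "CATALOG", "dev_catalog", "data_analysts"),
    grant("USE SCHEMA", "SCHEMA", "dev_catalog.sales", "data_analysts"),
    grant("SELECT", "TABLE", "dev_catalog.sales.orders", "data_analysts"),
]
for stmt in read_only_grants:
    print(stmt)  # would be executed with spark.sql(stmt) in a notebook
```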


Prepare and process data with Azure Databricks

  • Design and implement data modeling with Azure Databricks
  • Ingest data into Unity Catalog
  • Cleanse, transform, and load data into Unity Catalog
  • Implement and manage data quality constraints with Azure Databricks
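Data quality constraints in Databricks pipelines follow an "expect or quarantine" pattern: each constraint is a named predicate, and rows that violate one are dropped or routed aside rather than loaded. The plain-Python sketch below (no Spark, so it runs anywhere) illustrates the pattern conceptually; the column names and expectations are hypothetical.

```python
# Conceptual sketch (plain Python, no Spark) of the "expect or drop"
# data-quality pattern: named predicates are applied per row, and rows
# failing any expectation are quarantined instead of loaded.

from typing import Callable

Row = dict
expectations: dict[str, Callable[[Row], bool]] = {
    "valid_order_id": lambda r: r.get("order_id") is not None,
    "non_negative_amount": lambda r: r.get("amount", 0) >= 0,
}

def apply_expectations(rows, checks):
    """Split rows into (kept, quarantined) based on the expectations."""
    kept, quarantined = [], []
    for row in rows:
        failed = [name for name, check in checks.items() if not check(row)]
        (quarantined if failed else kept).append(row)
    return kept, quarantined

rows = [
    {"order_id": 1, "amount": 25.0},
    {"order_id": None, "amount": 10.0},  # fails valid_order_id
    {"order_id": 2, "amount": -5.0},     # fails non_negative_amount
]
kept, quarantined = apply_expectations(rows, expectations)
```

In a real pipeline the same idea is expressed declaratively as table-level expectations, and the quarantined rows feed monitoring rather than silently disappearing.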


Deploy and maintain data pipelines and workloads with Azure Databricks

  • Design and implement data pipelines with Azure Databricks
  • Implement Lakeflow Jobs with Azure Databricks
  • Implement development lifecycle processes in Azure Databricks
  • Monitor, troubleshoot, and optimize workloads in Azure Databricks
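A multi-task job is a directed acyclic graph: each task declares the tasks it depends on, and the scheduler starts a task only after its dependencies succeed. As a minimal sketch of that ordering logic (with hypothetical task names, and Python's standard-library `graphlib` standing in for the real scheduler):

```python
# Sketch: a job's tasks form a DAG via "depends on" edges; a task runs
# only after all of its dependencies have completed successfully.

from graphlib import TopologicalSorter

# task -> set of tasks it depends on (task names are hypothetical)
tasks = {
    "ingest_orders": set(),
    "ingest_customers": set(),
    "clean_orders": {"ingest_orders"},
    "clean_customers": {"ingest_customers"},
    "build_gold": {"clean_orders", "clean_customers"},
}

# One valid sequential run order; a real scheduler would also run
# independent tasks (the two ingest tasks) in parallel.
run_order = list(TopologicalSorter(tasks).static_order())
print(run_order)
```

The same dependency structure is what you declare when configuring a job's tasks, and it is also what you read when troubleshooting why a downstream task never started.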