Intermediate · 2 days

Azure Event Hubs

Cloud-scale event streaming and data ingestion for big data pipelines

Overview

Azure Event Hubs is a fully managed, real-time data ingestion service capable of receiving and processing millions of events per second from any source. Acting as the "front door" for big data pipelines, Event Hubs decouples event producers from consumers and integrates natively with Azure Stream Analytics, Azure Databricks, and Azure Functions. This training covers the full Event Hubs ecosystem — namespace architecture, Apache Kafka protocol support, Schema Registry, Capture to Data Lake, and enterprise security — preparing you to design resilient, high-throughput event streaming architectures.

What you'll learn

  • Understand the Event Hubs architecture: namespaces, event hubs, partitions, and consumer groups
  • Produce and consume events using Python and .NET with the Azure SDK
  • Use Event Hubs as a drop-in Apache Kafka replacement without code changes
  • Configure Event Hubs Capture to land events automatically in Azure Data Lake Storage
  • Use Schema Registry to enforce Avro and JSON schema contracts across teams
  • Integrate Event Hubs with Stream Analytics, Databricks, and Azure Functions
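
The Kafka compatibility listed above typically requires only configuration changes, not code changes: an existing Kafka client points its bootstrap server at the Event Hubs namespace and authenticates over SASL/PLAIN. A minimal sketch of the client properties, assuming a namespace named `mynamespace` (the Kafka-compatible endpoint listens on port 9093; the literal username `$ConnectionString` is how Event Hubs accepts a connection string as the SASL password):

```properties
# Kafka client configuration pointed at Event Hubs instead of a Kafka broker.
# "mynamespace" is a placeholder; replace with your Event Hubs namespace.
bootstrap.servers=mynamespace.servicebus.windows.net:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="$ConnectionString" \
  password="<your Event Hubs namespace connection string>";
```

With this in place, existing producers and consumers address event hubs as Kafka topics without touching application code.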

Programme

Day 1 — Architecture, producers & consumers
  • Event Hubs architecture: namespaces, partitions, consumer groups, and offsets explained
  • Standard, Premium, and Dedicated tiers: choosing the right tier for your workload
  • Producing events with the Azure SDK in Python and .NET
  • Consuming events: EventProcessorClient, checkpointing, and load-balanced consumers
  • Apache Kafka on Event Hubs: migrate Kafka applications without changing application code
  • Hands-on: build a producer-consumer pipeline with partition-level consumer groups
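
As a flavour of the Day 1 hands-on, the snippet below sketches a batched producer using the `azure-eventhub` Python SDK. The connection string, hub name (`telemetry`), and payload shape are illustrative placeholders, not part of the course material:

```python
import json


def make_payload(device_id: str, temperature_c: float) -> str:
    # Serialise one telemetry reading as the event body (shape is illustrative).
    return json.dumps({"deviceId": device_id, "temperatureC": temperature_c})


def send_readings(conn_str: str, readings) -> None:
    # Requires the azure-eventhub package (pip install azure-eventhub).
    from azure.eventhub import EventHubProducerClient, EventData

    producer = EventHubProducerClient.from_connection_string(
        conn_str, eventhub_name="telemetry"  # hub name is a placeholder
    )
    with producer:
        # create_batch() enforces the service's batch size limit;
        # events sent in one batch without a partition key may land
        # on any partition (round-robin).
        batch = producer.create_batch()
        for device_id, temp in readings:
            batch.add(EventData(make_payload(device_id, temp)))
        producer.send_batch(batch)
```

Sending a batch rather than individual events amortises the network round trip, which matters at the millions-of-events-per-second scale Event Hubs targets.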
Day 2 — Capture, integrations & production readiness
  • Event Hubs Capture: automatically land event streams to ADLS Gen2 in Avro or Parquet format
  • Schema Registry: defining and enforcing Avro and JSON schemas across producer teams
  • Integrating Event Hubs with Azure Stream Analytics, Azure Databricks, and Azure Functions
  • Security: managed identities, private endpoints, IP filtering, and RBAC
  • Monitoring: Azure Monitor metrics, diagnostic logs, and throughput unit alerts
  • Hands-on: build a complete streaming pipeline — ingest, capture, process with Stream Analytics, and visualise in Power BI
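
The consumer side of the programme (load-balanced consumers with checkpointing) can be sketched with `EventHubConsumerClient` from the same SDK. Everything named here is a placeholder: the hub name, the `$Default` consumer group, and the checkpoint-every-N-events policy are illustrative choices, and durable checkpoints would additionally need a blob checkpoint store rather than the in-memory default used below:

```python
import json

CHECKPOINT_INTERVAL = 50  # events per partition between checkpoints (illustrative)


def should_checkpoint(events_seen: int, interval: int = CHECKPOINT_INTERVAL) -> bool:
    # Checkpoint periodically rather than per event: a little replay
    # on restart in exchange for far fewer checkpoint writes.
    return events_seen > 0 and events_seen % interval == 0


def run_consumer(conn_str: str) -> None:
    # Requires azure-eventhub; for checkpoints that survive restarts,
    # pair this with a blob-backed checkpoint store.
    from azure.eventhub import EventHubConsumerClient

    counts: dict[str, int] = {}

    def on_event(partition_context, event):
        pid = partition_context.partition_id
        counts[pid] = counts.get(pid, 0) + 1
        record = json.loads(event.body_as_str())
        print(f"partition {pid}: {record}")
        if should_checkpoint(counts[pid]):
            partition_context.update_checkpoint(event)

    client = EventHubConsumerClient.from_connection_string(
        conn_str, consumer_group="$Default", eventhub_name="telemetry"
    )
    with client:
        # "-1" starts from the beginning of each partition's retained stream.
        client.receive(on_event=on_event, starting_position="-1")
```

Running several instances of this consumer in the same consumer group causes the client to balance partitions across them, which is the load-balancing behaviour covered on Day 1.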

Who is this for?

  • Data engineers designing real-time ingestion layers for analytics platforms
  • Backend developers building event-driven, distributed applications
  • Teams migrating Apache Kafka workloads to Azure
  • Architects designing IoT and telemetry data platforms at scale

Prerequisites

  • Basic programming experience (Python or .NET)
  • Familiarity with cloud concepts and messaging or queuing systems
  • Basic Azure experience recommended

Tools & technologies covered

Azure Event Hubs · Apache Kafka · Azure SDK for Python · Azure SDK for .NET · Event Hubs Capture · Schema Registry · Azure Stream Analytics · Azure Monitor

Not sure which course fits your team?
Talk to us — we'll match you to the right training path.
Get in touch