Right Now!!!⚡⚡⚡

Building Real-Time DevOps Intelligence with Event-Driven Architecture

The Problem: Waiting for Data

We were building dashboards, but they were always showing yesterday’s data. A work item gets updated in Azure DevOps, and we’d wait hours—sometimes days—before it appeared in our analytics. By the time we saw the trends, the sprint was over.

We needed to know what was happening right now.

The Solution: Real-Time Event Streaming

We built a real-time pipeline that captures DevOps events the moment they happen and streams them directly into our analytics platform. No waiting. No batching. Just instant visibility.

Here’s how we did it.

The Architecture: From Event to Insight in Seconds

Our real-time flow looks like this:

Azure DevOps Event → Service Hook → Event Hub → Fabric Streaming → Eventhouse → KQL Database

Each component plays a crucial role, and together they create a seamless real-time data pipeline.

Step 1: Capturing Events with Service Hooks

When someone creates, updates, or deletes a work item in Azure DevOps, we need to catch that event immediately. Service Hooks are our webhooks—they listen for these events and forward them instantly.

We created service hooks for all work item events using PowerShell:

# Create Service Hook for Work Item Updates
# (assumes $projectId, $sasToken, $subscriptionsUrl, and $headers are defined earlier in the script)
$subscriptionBody = @{
    publisherId = "tfs"
    eventType = "workitem.updated"
    resourceVersion = "1.0"
    consumerId = "webHooks"
    consumerActionId = "httpRequest"
    publisherInputs = @{
        projectId = $projectId
    }
    consumerInputs = @{
        url = "https://{namespace}.servicebus.windows.net/{eventhub}/messages"
        httpHeaders = @{
            "Content-Type" = "application/json"
            "Authorization" = $sasToken
        }
    }
}

$response = Invoke-RestMethod -Uri $subscriptionsUrl -Method Post -Headers $headers -ContentType "application/json" -Body ($subscriptionBody | ConvertTo-Json -Depth 10)

This script creates hooks for:

  • workitem.created – When a new work item is created
  • workitem.updated – When a work item changes
  • workitem.deleted – When a work item is removed
  • workitem.restored – When a deleted item comes back

The moment any of these events fire, Azure DevOps sends a JSON payload directly to our Event Hub endpoint.
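The subscription body is the same for each of these event types; only the eventType field changes. A minimal Python sketch of how such bodies could be built in a loop (in the spirit of the Python alternative script — `build_subscription` and the placeholder IDs/URLs here are illustrative, not part of any SDK):

```python
# Event types covered by the service hooks described above
EVENT_TYPES = [
    "workitem.created",
    "workitem.updated",
    "workitem.deleted",
    "workitem.restored",
]

def build_subscription(event_type: str, project_id: str,
                       event_hub_url: str, sas_token: str) -> dict:
    """Return the JSON body for one service hook subscription.

    Field names follow the Azure DevOps service hook subscriptions API;
    the values passed in below are placeholders."""
    return {
        "publisherId": "tfs",
        "eventType": event_type,
        "resourceVersion": "1.0",
        "consumerId": "webHooks",
        "consumerActionId": "httpRequest",
        "publisherInputs": {"projectId": project_id},
        "consumerInputs": {
            "url": event_hub_url,
            "httpHeaders": {
                "Content-Type": "application/json",
                "Authorization": sas_token,
            },
        },
    }

# One subscription body per event type, all pointing at the same Event Hub
bodies = [
    build_subscription(
        et,
        "my-project-id",
        "https://myns.servicebus.windows.net/myhub/messages",
        "SharedAccessSignature ...",
    )
    for et in EVENT_TYPES
]
```

Each body would then be POSTed to the organization's `_apis/hooks/subscriptions` endpoint, exactly as the PowerShell above does with Invoke-RestMethod.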

Step 2: Streaming to Event Hub

Event Hub is our high-throughput message broker. It receives thousands of events per second and queues them for processing. The beauty of Event Hub is its HTTP endpoint—Service Hooks can POST directly to it without any intermediate services.
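Service Hooks authenticate to that endpoint with a Shared Access Signature in the Authorization header. A sketch of how such a token is generated, following the standard Event Hubs SAS scheme (the namespace, hub, policy, and key values are placeholders):

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def make_sas_token(namespace: str, event_hub: str,
                   key_name: str, key: str, ttl_seconds: int = 3600) -> str:
    """Build a Shared Access Signature for an Event Hub HTTP endpoint."""
    uri = f"https://{namespace}.servicebus.windows.net/{event_hub}"
    encoded_uri = urllib.parse.quote_plus(uri)
    expiry = str(int(time.time()) + ttl_seconds)
    # Sign "<encoded URI>\n<expiry>" with the shared access key (HMAC-SHA256)
    signature = base64.b64encode(
        hmac.new(key.encode(), f"{encoded_uri}\n{expiry}".encode(),
                 hashlib.sha256).digest()
    ).decode()
    return (f"SharedAccessSignature sr={encoded_uri}"
            f"&sig={urllib.parse.quote_plus(signature)}"
            f"&se={expiry}&skn={key_name}")

# Placeholder values; use your namespace, hub, and send policy key
token = make_sas_token("myns", "myhub", "send-policy", "my-shared-key")
```

The resulting token string is what goes into the service hook's Authorization header.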

The Event Hub receives events like this:

{
  "id": "27646e0e-b520-4d2b-9411-bba7524947cd",
  "eventType": "workitem.updated",
  "publisherId": "tfs",
  "message": {
    "text": "Bug #5 (Some great new idea!) updated by Jamal Hartnett.",
    "html": "...",
    "markdown": "..."
  },
  "resource": {
    "id": 2,
    "workItemId": 0,
    "rev": 2,
    "revisedBy": {
      "displayName": "Jamal Hartnett"
    }
  },
  "createdDate": "1/22/2026 5:53:48 PM"
}

Events flow into Event Hub in real-time, ready for the next step.

Step 3: Fabric Real-Time Intelligence Streaming

Fabric Real-Time Intelligence connects our Event Hub to our Eventhouse database. We configured a streaming pipeline that ingests data continuously:

{
  "sources": [
    {
      "name": "CustomEndpoint-Source",
      "type": "CustomEndpoint",
      "properties": {}
    }
  ],
  "destinations": [
    {
      "name": "Eventhouse",
      "type": "Eventhouse",
      "properties": {
        "dataIngestionMode": "ProcessedIngestion",
        "databaseName": "DevOpsEventHouse",
        "tableName": "Logs",
        "inputSerialization": {
          "type": "Json",
          "properties": {
            "encoding": "UTF8"
          }
        }
      }
    }
  ],
  "streams": [
    {
      "name": "teststream-stream",
      "type": "DefaultStream",
      "inputNodes": [
        {
          "name": "CustomEndpoint-Source"
        }
      ]
    }
  ]
}

This configuration creates a live stream: Event Hub → Fabric Streaming → Eventhouse. Data flows continuously, not in batches.

Step 4: Storing in Eventhouse

Eventhouse is Fabric’s real-time analytics database. It’s built on the Kusto engine, queried with KQL (Kusto Query Language), and designed for high-velocity data ingestion. As events arrive, they’re immediately available for querying.

We created our table schema:

.create-merge table Logs (
    id:string,
    eventType:string,
    publisherId:string,
    message_text:string,
    message_html:string,
    message_markdown:string,
    detailedMessage:string,
    resource_id:string,
    resource_url:string,
    resource_workItemId:string,
    resourceVersion:string,
    resourceContainers_collection_id:string,
    resourceContainers_account_id:string,
    resourceContainers_project_id:string,
    createdDate:string,
    EventProcessedUtcTime:datetime,
    PartitionId:long,
    EventEnqueuedUtcTime:datetime
)

Events land in this table within seconds of happening in Azure DevOps.
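The flat column names map straight onto the nested webhook payload. A minimal Python sketch of that flattening, including parsing the US-style createdDate shown in the sample payload earlier (only a subset of the columns is shown; adjust the format string if your payloads use ISO 8601 timestamps):

```python
from datetime import datetime

def flatten_event(event: dict) -> dict:
    """Map a nested service hook payload onto (a subset of) the flat Logs columns."""
    message = event.get("message", {})
    resource = event.get("resource", {})
    return {
        "id": event.get("id", ""),
        "eventType": event.get("eventType", ""),
        "publisherId": event.get("publisherId", ""),
        "message_text": message.get("text", ""),
        "resource_id": str(resource.get("id", "")),
        "resource_workItemId": str(resource.get("workItemId", "")),
        "createdDate": event.get("createdDate", ""),
    }

def parse_created(created: str) -> datetime:
    """Parse the US-style timestamp used in the sample payload."""
    return datetime.strptime(created, "%m/%d/%Y %I:%M:%S %p")

# The sample payload from earlier, trimmed to the fields used here
sample = {
    "id": "27646e0e-b520-4d2b-9411-bba7524947cd",
    "eventType": "workitem.updated",
    "publisherId": "tfs",
    "message": {"text": "Bug #5 (Some great new idea!) updated by Jamal Hartnett."},
    "resource": {"id": 2, "workItemId": 0, "rev": 2},
    "createdDate": "1/22/2026 5:53:48 PM",
}
row = flatten_event(sample)
```

In practice the Fabric streaming pipeline handles this mapping; the sketch just makes the payload-to-column correspondence explicit.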

Step 5: Querying with KQL

Now comes the magic: we can query this data in real-time using KQL. Want to see all work items updated in the last hour?

Logs
| where eventType == "workitem.updated"
| where EventProcessedUtcTime > ago(1h)
| project EventProcessedUtcTime, message_text, resource_workItemId
| order by EventProcessedUtcTime desc

Want to see which users are most active?

Logs
| where eventType startswith "workitem"
| extend UserName = extract(@"updated by ([^.]+)", 1, message_text)
| summarize Count = count() by UserName
| order by Count desc
| take 10

These queries run against live data. No refresh needed. No waiting.

The Real-Time Experience

Here’s what happens when someone updates a work item:

  1. 0 seconds: Developer changes work item state from “Active” to “Resolved”
  2. < 1 second: Service Hook fires, sends event to Event Hub
  3. < 2 seconds: Event Hub receives and queues the event
  4. < 3 seconds: Fabric Streaming ingests event into Eventhouse
  5. < 4 seconds: Event appears in KQL database, ready for querying
  6. < 5 seconds: Dashboard refreshes, showing updated metrics

From action to insight in under five seconds. That’s real-time.

The Code That Makes It Work

Our PowerShell script automates the entire setup:

# create-service-hooks-eventhub.ps1
param(
    [string]$OrganizationUrl,        # e.g. https://dev.azure.com/{organization}
    [string]$ProjectId,
    [string]$PersonalAccessToken,
    [string]$EventHubNamespace,
    [string]$EventHubName,
    [string]$EventHubSasToken        # SAS token (not the full connection string)
)

# Azure DevOps REST endpoint for service hook subscriptions
$subscriptionsUrl = "$OrganizationUrl/_apis/hooks/subscriptions?api-version=7.1"

# Authenticate with a Personal Access Token (Basic auth, empty username)
$encodedPat = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$PersonalAccessToken"))
$headers = @{ Authorization = "Basic $encodedPat" }

# Build Event Hub HTTP endpoint
$eventHubUrl = "https://$EventHubNamespace.servicebus.windows.net/$EventHubName/messages"

# Create service hook subscription
$subscriptionBody = @{
    publisherId = "tfs"
    eventType = "workitem.updated"
    resourceVersion = "1.0"
    consumerId = "webHooks"
    consumerActionId = "httpRequest"
    publisherInputs = @{
        projectId = $ProjectId
    }
    consumerInputs = @{
        url = $eventHubUrl
        httpHeaders = @{
            "Content-Type" = "application/json"
            "Authorization" = $EventHubSasToken
        }
    }
}

# Create the hook (-Depth 10 keeps the nested consumerInputs from being truncated)
Invoke-RestMethod -Uri $subscriptionsUrl -Method Post -Headers $headers -ContentType "application/json" -Body ($subscriptionBody | ConvertTo-Json -Depth 10)

Run this script once, and you have real-time event streaming configured. No manual setup. No complex infrastructure.

What We Can Do Now

With real-time data flowing, we can:

Monitor Activity Live: See work items being created and updated as they happen, not hours later.

Detect Issues Immediately: When a work item gets stuck in a state for too long, we know right away.

Track Velocity in Real-Time: Watch sprint progress update second by second, not day by day.

Respond Faster: If a critical bug gets created, we’re notified instantly, not when the next batch job runs.

Build Live Dashboards: Power BI dashboards that refresh automatically, showing current state, not historical snapshots.

The Business Impact

Before real-time streaming:

  • Status meetings: “Let me check yesterday’s data…”
  • Sprint planning: “Based on last week’s metrics…”
  • Issue detection: “We noticed this problem two days ago…”

After real-time streaming:

  • Status meetings: “As of right now, we have 12 active work items…”
  • Sprint planning: “Our current velocity shows…”
  • Issue detection: “This just happened—let’s address it now.”

We went from reactive to proactive. From historical to current. From waiting to knowing.

The Technical Win

This architecture is elegant because it’s simple:

  • No polling: Events push to us, we don’t pull
  • No batching: Data flows continuously
  • No delays: Each component is optimized for speed
  • No complexity: Standard Azure services, well-integrated

We’re using Service Hooks (webhooks), Event Hub (message broker), Fabric Streaming (data pipeline), and Eventhouse (analytics database). Each is purpose-built for real-time scenarios.

What’s Next

Real-time data opens new possibilities:

  • Live sprint boards that update as work happens
  • Instant notifications when critical events occur
  • Predictive analytics based on current patterns
  • Automated responses to specific event triggers

But the foundation is set: we have real-time data flowing from Azure DevOps to our analytics platform. Everything else builds on this.

The Code Repository

All our scripts are version-controlled and reusable:

  • create-service-hooks-eventhub.ps1 – Sets up Service Hooks for Event Hub
  • create_service_hooks_eventhub.py – Python alternative
  • Event stream configuration in Fabric-git/Streaming/
  • KQL table schemas in Fabric-git/Streaming/testEH.Eventhouse/

We can spin up real-time streaming for any Azure DevOps project in minutes.


This real-time pipeline demonstrates the power of event-driven architecture. By connecting Azure DevOps Service Hooks → Event Hub → Fabric Real-Time Intelligence → Eventhouse, we’ve created a system that knows what’s happening right now, not what happened yesterday.