FABRIC CATEGORY: Data Activator and Power Automate AI Builder Usage

Anyone can run Power Automate flows from Dataverse. But only the Fabricator can trigger them from Fabric lakehouse data changes, using Data Activator in the workspace.

Here we combined the power of Fabric with the flexibility of Power Automate to manage our image collection process.

We first created a Data Activator and selected our data sources, so the activator knows when and where to trigger.

We configured it to trigger only when an image file is added to our Images folder for students.

Next, we define an action and create a flow for it to run.

Here we define an action, which gives us an endpoint to use as the flow trigger. We will come back to this in the last step.

We need to create a rule to call the action we defined. This rule lets us add extra conditions to our filter if needed and choose which action to call. We can also add extra parameters to be sent to the Power Automate flow.
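To make the extra parameters concrete, they can be thought of as a small JSON payload posted to the flow's endpoint. This is an illustration only: the field names below are assumptions, since the real schema is whatever the Activator action and the flow's HTTP trigger define in your tenant.

```python
import json

# Hypothetical payload only; every field name here is an assumption for
# illustration, not the actual Data Activator schema.
payload = {
    "fileName": "student_42.png",      # assumed: file that triggered the rule
    "folderPath": "/Images/Students",  # assumed: monitored folder
    "triggeredBy": "ImageAddedRule",   # assumed: name of the rule
}

# This is roughly what the flow's HTTP trigger would receive as its body.
body = json.dumps(payload)
```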

And lastly, our Power Automate flow: the endpoint we received earlier needs to be set as the connection for the trigger.

We use Power Platform's AI Builder to recognize the data and categorize it for further usage.

We send our response to sharepoint for further operations.
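As a sketch of the flow's branching logic (plain Python, not Power Automate): AI Builder returns a category for each image, and the flow files the result into a SharePoint library accordingly. The category and library names below are invented for illustration.

```python
# Sketch of the categorize-and-route step; category and library names are
# assumptions for illustration, not values returned by AI Builder.
def route_to_library(category: str) -> str:
    """Map an AI Builder category to a target SharePoint library."""
    libraries = {
        "student_photo": "StudentImages",
        "document": "ScannedDocuments",
    }
    # Anything AI Builder cannot classify goes to a fallback library.
    return libraries.get(category, "Uncategorized")

route_to_library("student_photo")  # -> "StudentImages"
```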

As the Fabricator, it is important to automate our business and keep it tidy and neat. This is the way of the Fabricator.

Automating Fabric Deployment with DevOps Scheduled Pipelines

In this article, we are being very serious and will discuss how to set up a DevOps schedule that calls a PowerShell script to deploy our Fabric application.

Previously, we created a pipeline to promote changes across Development, Test, and Production workspaces. However, a robust Application Lifecycle Management (ALM) process requires more automation. This time, we’ll use scheduled PowerShell scripts in Azure DevOps to streamline deployment tasks.



DevOps YAML Pipeline for Scheduled Deployment

trigger: none

schedules:
  - cron: "0 2 * * *"  # Schedule to run at 2 AM daily
    displayName: Nightly Deployment
    branches:
      include:
        - main
    always: true

pool:
  vmImage: "windows-latest"

variables:
  ClusterEndpoint: "<your-cluster-endpoint>"  # Endpoint of the Service Fabric cluster
  AppPackagePath: "<path-to-your-application-package>"  # Path to the application package
  ApplicationName: "<application-instance-name>"  # Name of the application instance
  ApplicationTypeName: "<application-type-name>"  # Application type name
  ApplicationTypeVersion: "<application-type-version>"  # Version of the application type
  DeploymentPipelineName: "FabricDeployment"  # Name of the deployment pipeline
  SourceStageName: "Development"  # Source environment for deployment
  TargetStageName: "Test"  # Target environment for deployment
  DeploymentNote: "Daily Deployment"  # Description or note for the deployment

steps:
  - task: PowerShell@2
    displayName: "Deploy Azure Service Fabric Application"
    inputs:
      filePath: "$(System.DefaultWorkingDirectory)/scripts/Deploy-ServiceFabricApp.ps1"
      arguments: >
        -ClusterEndpoint $(ClusterEndpoint)
        -AppPackagePath $(AppPackagePath)
        -ApplicationName $(ApplicationName)
        -ApplicationTypeName $(ApplicationTypeName)
        -ApplicationTypeVersion $(ApplicationTypeVersion)
      failOnStderr: true

The YAML above calls the PowerShell script at the following link.

https://github.com/microsoft/fabric-samples/blob/main/features-samples/fabric-apis/DeploymentPipelines-DeployAll.ps1
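The five-field cron expression in the schedule ("0 2 * * *") reads minute, hour, day of month, month, day of week: minute 0 of hour 2, every day. A minimal sketch of that matching logic (plain numbers and "*" only; real cron also supports ranges, lists, and steps):

```python
from datetime import datetime

def matches_cron(expr: str, ts: datetime) -> bool:
    """Check whether a timestamp matches a simplified five-field cron
    expression (plain numbers and '*' only)."""
    minute, hour, dom, month, dow = expr.split()
    fields = [
        (minute, ts.minute),
        (hour, ts.hour),
        (dom, ts.day),
        (month, ts.month),
        (dow, ts.isoweekday() % 7),  # cron convention: 0 = Sunday
    ]
    return all(f == "*" or int(f) == v for f, v in fields)

matches_cron("0 2 * * *", datetime(2024, 1, 15, 2, 0))  # True: 02:00
matches_cron("0 2 * * *", datetime(2024, 1, 15, 3, 0))  # False: 03:00
```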

Power BI alerts to warm you up!

Faruk the Fabricator, who travelled all the way from sunny Istanbul, got really frozen outdoors. So he came back unusually alerted and went straight to work, to warm up.

Data Activator is up. When failure is close, our Fabricator will know.

An alert has been set on the KPI column. When it falls below a certain value, we send an email and alert the user.
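Conceptually, the rule Data Activator evaluates here is just a threshold check. The threshold value below is an assumed example, not the actual configured value:

```python
# Conceptual sketch of the alert rule; the threshold is an assumed example
# value, not the real configuration in our workspace.
KPI_THRESHOLD = 70.0

def should_alert(kpi_value: float, threshold: float = KPI_THRESHOLD) -> bool:
    """Return True when the KPI falls below the threshold, i.e. when the
    email notification should be sent."""
    return kpi_value < threshold
```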

More will come with table-based Data Activators. This is just the beginning.

Faruk the Fabricator is just warming up…

PS: We claim the PlugNPlay badge for sending notifications to Teams.

Have you heard of, or maybe even experienced, an existential crisis?

Our Faruk the Fabricator is living one right now. Since yesterday, he has been challenging himself on how to display the results in the best possible way… but he never questioned his Loooove for Fabric. He also knows that Fabric is the Marauder's Map of data: you always know where everything is, even if it's trying to hide.

Here are the results of our data in a Power BI report. We can see different metrics for the student data.

We can also add a KPI and compare ourselves with the so-called Sorting Hat.

It is only our first year since Wayfinder Academy was created and we introduced our Logiquill portal, and we already show the same performance. Wait a few more years and we will leave him in the dust, as if it wasn't dusty enough already.

Here are our components inside Fabric:

Our DAX code in Power BI:

Here is why we also claim the Chameleon badge, in addition to Dash It Out:

The solution is responsive and adapts to all devices and screen sizes.