All together now

Badge claimed: Stairway to Heaven

Our solution is a hybrid, combining Power Platform and other cloud resources for the best of both worlds, so we use multiple APIs to tie it all together.

Dataverse Web API

Data going back and forth between our Azure Functions backend-for-frontend and Dataverse is handled by the Dataverse Web API. We query and update our Dataverse tables using OData; the responses are parsed and transformed into DTOs shared between our frontend and backend, all with proper authorization logic to keep the data secure.

...

var resourceTypeRequestUrl = $"{dataverseBaseUrl}/api/data/v9.2/creep_resourcetypes?$filter=creep_resourcetypeid eq {resourceTypeId}";

var resourceTypeResponse = await client.GetAsync(resourceTypeRequestUrl);

...

var result = new ChunkWithResourceTypeDto(
	Id: chunk.creep_spawnchunkid.ToString(),
	Name: chunk.creep_spawnchunk ?? string.Empty,
	ResourceTypeId: chunk._creep_resourcetype_value,
	LocationId: chunk._creep_spawnlocationid_value,
	MinerId: chunk._creep_miner_value,
	Amount: chunk.creep_amount,
	StatusCode: chunk.statuscode,
	ResourceType: resourceTypeDto);

return new OkObjectResult(result);

...

Entra ID

To make sure our different resources only have access to the necessary data, we use Managed Identities for authentication. In our Azure Functions app we request properly scoped access tokens for use with the Dataverse Web API.

...

TokenCredential credential = new DefaultAzureCredential();
var scope = $"{dataverseBaseUrl}/.default";
var accessToken = await credential.GetTokenAsync(
	new TokenRequestContext(new[] { scope }),
	CancellationToken.None);

...

Copilot Studio

To allow our Power Automate flows to communicate with Copilot Studio, we have configured Dataverse connectors, using flow actions to trigger our Copilot agents.

Azure Resource Manager

Our CI/CD configuration handles building and deployment from our repositories in Azure DevOps to our Azure resources. Azure Pipelines deploys using an Azure Resource Manager service connection and tasks/CLI that call Azure’s management (ARM) APIs.

Master of endpoints

Our helpful proactive Teams bot uses multiple APIs to make magic happen.

The two most obvious ones are the Microsoft Graph API (for user details) and the Microsoft Teams API (for the bot itself).

We also use the Azure Maps API to render a helpful map that pinpoints where a report was generated, and most importantly we use the Dataverse Web API to create Opportunities in CRM:
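As a minimal sketch of that last step, this is roughly how the bot can create an Opportunity through the Dataverse Web API; the environment URL and token acquisition are assumptions, and any fields beyond the name depend on the actual schema.

// Hypothetical sketch: create an Opportunity via the Dataverse Web API.
// The org URL and access token are assumed to come from configuration.
const dataverseBaseUrl = "https://yourorg.crm.dynamics.com";

async function createOpportunity(accessToken: string, topic: string, description: string): Promise<string> {
	const response = await fetch(`${dataverseBaseUrl}/api/data/v9.2/opportunities`, {
		method: "POST",
		headers: {
			Authorization: `Bearer ${accessToken}`,
			"Content-Type": "application/json",
			"OData-MaxVersion": "4.0",
			"OData-Version": "4.0",
		},
		body: JSON.stringify({ name: topic, description }),
	});
	if (!response.ok) throw new Error(`Dataverse returned ${response.status}`);
	// The URL of the created record is returned in the OData-EntityId header.
	return response.headers.get("OData-EntityId") ?? "";
}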


Our bot’s Adaptive Card not only saves users time, it delivers a clean, modern, and engaging experience out of the box.
And since it runs inside Microsoft Teams, it automatically adapts to every device and screen size.
That leaves us free to focus on digging for gold (or dirt) instead of fighting with breakpoints.

OneFlow and LINK Mobility Sponsor Badge and more

We hereby claim the badges:

  • Stairway to Heaven
  • Thieving Bastards
  • Power User Love
  • OneFlow Sponsor Badge
  • LINK Mobility Sponsor Badge

The components in the current solution:

We have created a solution containing both pro code and low code, AI, Microsoft APIs, and third-party tools.

  • Code App for collecting Dream Project requests
  • Power Automate flow that uses Dataverse, OneFlow, AI Builder, Outlook, and LINK Mobility actions
  • Model-driven app as a back-office app for project follow-up – a perfect low-code app for managing Dream Project requests
  • OneFlow Portal for managing contracts

A walkthrough of the solution

The Code App is vibe coded using TypeScript and React and is used for collecting Dream Project requests. Submitting a request creates a Dataverse record that triggers a Power Automate flow that creates a contract in OneFlow. Pure pro code as far as the eye can see 👀

Further, we created a Power Automate flow that is triggered when a new Dream Project is created. It automates our contract process by using OneFlow's actions to create the contract and add the customer to it. LINK Mobility and Outlook actions are used to send an SMS and an email to the person responsible for the customer, enabling quick follow-up and contract validation before it is sent. This is a perfect combination of pro code and low code.

The flow looks like this.

The AI prompt in Power Automate looks like this, and this is what is sent to the AI prompt.

The “Run a prompt” action sends the data to the prompt below and gets the outline for the contract details back, which we then use in the contract generation.

We used Template Groups in OneFlow to populate documents from Power Automate.

The fields are created in the OneFlow portal. They must be created and added to the template before they are available in the Power Automate flow.

The fields are added to the Template in OneFlow like this.

After the contract is sent out from the OneFlow portal, the customer gets an email and can sign it.

The contract looks like this after being generated by the Power Automate flow. The description is generated using AI prompts fed with data from the Dream Project request in Dataverse.

LINK Mobility

We are using the LINK Mobility action in Power Automate, which lets us send an SMS with ease using a source number, platform ID, partner ID, and the phone number to send the SMS to. This works like a charm.

The SMS that is sent looks like this:

Business value:
These tools help us stitch our solution together and increase the value for our customers by providing faster contract creation and automation, plus SMS notifications for quick follow-up by the customer success manager.

Claims:

Stairway to Heaven

In our contract generation flow we are using three Microsoft APIs to solve our business need:

  • Dataverse API
  • AI Prompt API
  • Outlook API

Thieving bastards

We use OneFlow and LINK Mobility as paid third-party solutions to improve our business process, building on existing tools on the market that are developed by market experts and that keep being updated and improved.

OneFlow

We use OneFlow for contract generation and have posted about it on LinkedIn.

LINK Mobility

We use LINK Mobility for notifications to customer success managers, with more SMS notifications planned for an upcoming sprint. Also posted on LinkedIn.

Power User Love

We combine the power of pro code, for a more customizable look, feel, and usability when collecting data, with low-code apps for back-office project follow-up, which have a more fixed, drag-and-drop look and feel.

Code App using TypeScript and React for more control over the user interface and interactivity:

We use a Model-driven app as a back office with a more “strict” look and feel, but it is perfect for following up and adding data.

Proof of LinkedIn post

https://www.linkedin.com/posts/fredrikengseth_oneflow-and-link-mobility-sponsor-badge-and-activity-7420762390587166720-9Gl4?utm_source=share&utm_medium=member_desktop&rcm=ACoAACAQyBoBq-xtaRrmS1pVkkIip0jNA_TMbIo

Automating Solutions ALM with GitHub Actions and AI

Our developers should spend as little time as possible on repetitive tasks: deployments, release notes, updating technical documentation, tracking who did what and when, and the list goes on. It's important work, but it keeps eating our valuable time…

Our answer is a solution designed to streamline Dataverse solution management while enforcing ALM best practices. It combines Power Platform Pipelines, GitHub Actions, AI-powered documentation, and Teams notifications to deliver fully automated, auditable, and governed deployments.

Deployment Stage record in Power Platform Pipelines

Automated Solution Management

  • Power Platform Pipelines – the developer triggers the deployment for the respective solution.
  • Cloud flows – triggered on the pre-deployment step, integrating with GitHub and Microsoft Teams.
  • GitHub Actions export, unpack, commit, and create pull requests for the solution.
  • The PR outcome triggers a cloud flow to notify users and continue or stop the deployment.
Triggered on pre-deployment step
#StairwayToHeaven

Governance & Deployment Control

  • GitHub PR review acts as a pre-deployment approval step, giving teams control over which solutions can reach the target environments.
  • Deployment outcomes are sent back to Dataverse and Teams, providing real-time feedback.
  • Branch strategy (dev for development, main for production) keeps production stable and auditable.
Triggered from GitHub Action (pr-feedback-to-dataverse.yml)
Deployment Stage Run is updated with a link to the GitHub PR for more details

AI-Powered Documentation

  • GitHub Copilot analyzes every PR and generates technical documentation automatically.
  • Changelogs, impact analysis, and test recommendations are included, making knowledge transparent and up-to-date.
  • Documentation is versioned and stored alongside solutions for easy reference.


Benefits

  • Faster Deployments: Automation reduces manual steps and errors.
  • Full Governance: PR workflow enforces approvals and branch protection.
  • Better Transparency: Teams see real-time deployment status and AI-generated documentation.
  • Audit-Ready: Every change, approval, and deployment is logged and version-controlled.

Azure API: minimal

Leveraging existing services effectively is key to successful implementation. By building on proven Azure capabilities, we can both reduce costs and introduce telemetry that enables proactive monitoring of our web application’s health.

Within the Model Builder portal, this approach is realized with the following Azure services:

  • Azure Blob Storage is used to store generated projections and models, providing a scalable and cost-efficient storage foundation (see the sketch after this list):
  • Azure OpenAI is used to extract tags from the initial image uploaded by the user, enabling automated enrichment and downstream processing:
  • Azure Application Insights is used to collect telemetry from the application, giving visibility into performance, usage patterns, and potential issues so they can be identified and addressed proactively.
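As a small illustration of the Blob Storage part, here is a minimal sketch of uploading a generated model, assuming the @azure/storage-blob SDK with a connection string from configuration and a hypothetical “models” container:

import { BlobServiceClient } from "@azure/storage-blob";

// Minimal sketch: persist a generated model/projection to Blob Storage.
// The connection string variable and container name are assumptions.
async function uploadModel(fileName: string, data: Buffer): Promise<string> {
	const service = BlobServiceClient.fromConnectionString(process.env.STORAGE_CONNECTION_STRING!);
	const container = service.getContainerClient("models");
	const blob = container.getBlockBlobClient(fileName);
	await blob.uploadData(data); // uploads the bytes of the generated artifact
	return blob.url; // location of the stored model
}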

Stairway to Heaven: Microsoft Cloud Integration in Action

Our solution demonstrates how multiple Microsoft Cloud APIs can be combined into a single, event-driven architecture, with Dynamics 365 Finance & Operations (FO) acting as the production system of record.

The flow starts when a production order is created in D365 FO. FO exposes the order through its OData / Business Events APIs, which are consumed by an Azure Function. This serverless component validates and transforms the data, ensuring the core ERP system remains isolated from downstream processing.

From the Azure Function (Dataverse API), the production event is published to Azure Event Hubs using Azure's native messaging APIs. Event Hubs enables reliable, scalable streaming and decouples FO from consumers, allowing the architecture to grow without impacting the ERP system.
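A minimal sketch of that publishing step, assuming the @azure/event-hubs SDK with a connection string and hub name from configuration (the event shape is illustrative):

import { EventHubProducerClient } from "@azure/event-hubs";

// Minimal sketch: publish a validated FO production event to Event Hubs.
async function publishProductionEvent(order: { orderId: string; status: string }): Promise<void> {
	const producer = new EventHubProducerClient(process.env.EVENTHUB_CONNECTION_STRING!, "production-events");
	try {
		const batch = await producer.createBatch();
		batch.tryAdd({ body: order }); // the transformed production order
		await producer.sendBatch(batch);
	} finally {
		await producer.close();
	}
}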

The streamed events are then ingested by Microsoft Fabric, where data pipelines process the production data in near real time. Fabric acts as the central data and analytics layer, feeding insights to Power BI dashboards and triggering actions in the Power Platform. In some scenarios, enriched data can also be written back to D365 FO through its APIs, closing the loop.

By combining 1) Dynamics 365 FO APIs, 2) the Dataverse API and Event Hubs APIs, and 3) Microsoft Fabric and Power BI APIs, this architecture shows how Microsoft cloud services can work together, earning its place on the Stairway to Heaven.

Thieving Ssslytherins on a stairway to HELL

Thieving bastards

As true Slytherins we of course have to steal, and steal we did!

  • ngrok
    We used the third-party tool ngrok to expose our local development server to the Internet!
  • SASS + REACT = <3
    Clever clogs utilize existing frameworks to simplify styling: Sass – CSS with superpowers – combined with React makes a perfect match.
  • LINK Mobility
    It’s good to know that your message has been delivered. Therefore you get a confirmation when your message has been delivered to the Howler!

Stairway to HELL

In our solution we combine these three Cloud APIs to make our solution floooow through the cloud!

  • Azure Speech Services through SpeechSDK
  • Azure OpenAI API
  • Dataverse Web API

Wizard Tracking: 3D geolocation in Canvas App

In our solution, users gather ingredients using object detection in a Canvas App. The AI model used for this has been trained on objects around the conference venue, so we wanted to enhance the connection between the app and the real world. Since we already had access to the user's geolocation through the geolocation web API inside the Canvas App and any PCF components, we decided to use these data to place the active users on a 3D representation of the venue, expressing our power user love by merging 3D graphics with the OOB Canvas App elements.

We were able to find a simple volume model of the buildings on the map service Kommunekart 3D, but these data seem to be provided by Norkart and are not freely available.

Like the thieving bastards we are, we decided to scrape the 3D model off the site by fetching all the resources that looked like binary 3D data. The data turned out to be in B3DM format, and we found the buildings in one of these tiles. We used Blender to clean up the model, removing the surrounding buildings and exporting it to the glTF 3D file format for use in a WebGL 3D context.

We decided to render the 3D model with Three.js, which let us create an HTML canvas element inside the PCF component and use its WebGL context to render the model in 3D. The canvas is continuously redrawn using requestAnimationFrame under the hood, making it efficient in a browser context. The glTF model was loaded using a data URI as a workaround for the web resource file format restrictions.
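The setup looks roughly like the sketch below; the camera placement and lighting are simplified assumptions, but the loader and render loop follow the approach described above.

import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader";

// Minimal sketch: render the venue model on a canvas created by the PCF component.
function renderVenue(canvas: HTMLCanvasElement, gltfDataUri: string): void {
	const renderer = new THREE.WebGLRenderer({ canvas, antialias: true });
	const scene = new THREE.Scene();
	const camera = new THREE.PerspectiveCamera(50, canvas.width / canvas.height, 0.1, 1000);
	camera.position.set(0, 50, 120); // arbitrary vantage point above the model
	camera.lookAt(0, 0, 0);
	scene.add(new THREE.AmbientLight(0xffffff, 1));

	// The glTF model is loaded from a data URI to work around web resource restrictions.
	new GLTFLoader().load(gltfDataUri, (gltf) => scene.add(gltf.scene));

	const animate = () => {
		requestAnimationFrame(animate); // browser-synced redraw loop
		renderer.render(scene, camera);
	};
	animate();
}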

The coordinates from the user's mobile device come in as geographic coordinates: longitude, latitude, and altitude. The next step was to map these values relative to a known coordinate in the building, which we chose to be the main entrance. Using the main entrance's geographic coordinates, we could convert them to Cartesian coordinates (X, Y, and Z), do the same with the realtime coordinates from the user, and subtract the origin to get the offset in meters. The conversion from geographic to geocentric coordinates was done like so:

// eslint-disable-next-line @typescript-eslint/consistent-type-definitions
export type CartesianCoordinates = { x: number; y: number; z: number };

// eslint-disable-next-line @typescript-eslint/consistent-type-definitions
export type GeographicCoordinates = { lat: number; lon: number; alt: number };

// Conversion factor from degrees to radians
const DEG_TO_RAD = Math.PI / 180;

// Constants for WGS84 Ellipsoid
const WGS84_A = 6378137.0; // Semi-major axis in meters
const WGS84_E2 = 0.00669437999014; // Square of eccentricity

// Function to convert geographic coordinates (lat, lon, alt) to ECEF (x, y, z)
export function geographicToECEF(coords: GeographicCoordinates): { x: number; y: number; z: number } {
	// Convert degrees to radians
	const latRad = coords.lat * DEG_TO_RAD;
	const lonRad = coords.lon * DEG_TO_RAD;

	// Calculate the radius of curvature in the prime vertical
	const N = WGS84_A / Math.sqrt(1 - WGS84_E2 * Math.sin(latRad) * Math.sin(latRad));

	// ECEF coordinates
	const x = (N + coords.alt) * Math.cos(latRad) * Math.cos(lonRad);
	const y = (N + coords.alt) * Math.cos(latRad) * Math.sin(lonRad);
	const z = (N * (1 - WGS84_E2) + coords.alt) * Math.sin(latRad);

	return { x, y, z };
}

This gave us fairly good precision, but not without the expected inaccuracy caused by being indoors.
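To make the subtraction step concrete, a minimal sketch using the function above could look like this; the entrance coordinates are placeholders, not the real venue values.

// Hypothetical entrance position used as the local origin.
const entrance: GeographicCoordinates = { lat: 59.0, lon: 10.0, alt: 0 };
const origin = geographicToECEF(entrance);

// Offset of the user from the main entrance, in meters.
export function offsetFromEntrance(user: GeographicCoordinates): CartesianCoordinates {
	const p = geographicToECEF(user);
	return { x: p.x - origin.x, y: p.y - origin.y, z: p.z - origin.z };
}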

In our solution the current position is then represented by an icon moving around the 3D model based on the current GPS data from the device.

To connect this representation to realtime data from all the currently active users, we decided to set up an Azure SignalR Service, with an accompanying Azure Storage account and Azure Function App for the backend, bringing it all to the cloud, almost like a stairway to heaven. With this setup, we could use the @microsoft/signalr package inside the PCF component, receiving connection, disconnection, and location update messages broadcast from all other users, showing where they are right now.
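On the client side, a minimal sketch of that wiring could look like this; the negotiate endpoint and method name are assumptions for illustration.

import * as signalR from "@microsoft/signalr";

// Minimal sketch: connect to the Azure Functions negotiate endpoint and
// listen for location updates broadcast from the other users.
const connection = new signalR.HubConnectionBuilder()
	.withUrl("https://your-function-app.azurewebsites.net/api") // hypothetical endpoint
	.withAutomaticReconnect()
	.build();

connection.on("locationUpdate", (userId: string, coords: { lat: number; lon: number; alt: number }) => {
	// Move this user's icon on the 3D model to the new position.
});

connection.start().catch(console.error);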

Stairway to Heaven

We are using the following APIs in our solution:

  • Azure Cognitive Services to implement speech-to-text (see the sketch after this list):
  • SharePoint API to invite the users to the portal and work with the SharePoint lists
  • Relevance Search API to implement search of upcoming events on the portal, helping the students complete their onboarding by including socialization activities at the new school
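As a minimal sketch of the speech-to-text piece, assuming the Speech SDK with a key and region from configuration:

import * as sdk from "microsoft-cognitiveservices-speech-sdk";

// Minimal sketch: one-shot speech recognition from the default microphone.
const speechConfig = sdk.SpeechConfig.fromSubscription(process.env.SPEECH_KEY!, process.env.SPEECH_REGION!);
const audioConfig = sdk.AudioConfig.fromDefaultMicrophoneInput();
const recognizer = new sdk.SpeechRecognizer(speechConfig, audioConfig);

recognizer.recognizeOnceAsync((result) => {
	console.log(`Recognized: ${result.text}`);
	recognizer.close();
});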

Giving businesses a (Power) platform

Since many of the business owners of Diagon Alley spend most of their days on their feet in the store, they require a working solution that is easy to use and accessible wherever they might be. Because of this, we have decided to give them several solutions within the Power Platform: Power Pages for customer interaction, Power Automate to help reduce time-demanding tasks, and a data-driven Power App that fits perfectly as a small Teams application on a store owner's cellphone.

Automating the office work

The day-to-day operation of any business is to make sure there is always enough of their products, and that they are always stocked. To help the owners keep the supplies under control, we developed a Power Automate flow that helps them generate a Supplier Agreement contract in SharePoint.

The initial thought was to create a content type on a document library that would inherit metadata properties from the SharePoint columns and automatically fill in the SharePoint property fields, thereby creating a valid contract.

Unfortunately, this requires editing the template locally, and we are working on computers with a security policy that doesn't allow us to connect to these fields when they are in another tenant. Still, we found a solution: downloading an empty document from the library, populating the Quick Parts fields connected to the document properties, and re-uploading it to the library. This allowed us to generate the agreement anyway, using a Power Automate flow that populated the document's Quick Parts fields.