Final Delivery

Our solution is built on a physical Minecraft idea. Players go around the area scanning QR codes. Each QR scan represents a user mining resources – but they should be careful, because some locations may hold resources you don't want. We used both low-code and pro-code in our solution.

Beyond the gameplay, the application shows how real-time data can be captured and used through simple physical interactions combined with digital systems. By turning data collection into a game, we make participation more engaging while still maintaining full visibility and control of what is happening in real time. The same approach can be reused in real-world scenarios such as asset tracking, live monitoring, or event-based systems, which makes the solution relevant beyond the game and easy to relate to practical use cases.

Category: Redstone Realm

Our solution is a real business application disguised as a game. Built on a physical, Minecraft-inspired concept, it uses QR codes and real-world movement to generate live operational data, similar to how IoT sensors, assets, or events work in real scenarios. Players interact through simple and accessible interfaces on mobile and web, while administrators manage and monitor everything through structured back-office tools.

By turning data collection into gameplay, the solution makes participation engaging while still maintaining full control and visibility of what is happening in real time. This approach demonstrates how physical interaction and digital systems can work together to create meaningful data, support decision-making, and mirror real operational challenges in a way that is easy to understand and fun to experience.

Category: Governance & Best Practices

In our development, we aimed to follow best practices to keep our solution clean and structured.

Pipelines in Power Platform 

We created three environments in Power Platform (Dev, Test, Prod) and set up a pipeline for deployments.

DevOps  

We also set up Azure DevOps for the team. It was used to share documentation for our solution and keep everybody on track, and to host our repos and the pipelines for our frontend and backend applications.

Security Group – Entra ID 

A security group was created for the team to provide appropriate access to the components in our solution, both in Power Platform and Azure.

AI Governance 

Users should be clearly informed when they are exposed to AI-generated content. We therefore clearly marked all AI-generated content to avoid misleading our users. As part of the solution, we activated Frontier to access the M365 Agent Overview. Neither of us had used this before, so this provided valuable insight into how we can control our agents, especially as the number of agents grows. 

Secure Access 

All resources are connected using Azure Managed Identity to eliminate hardcoded secrets and credentials. 

Category: Data, AI & Analytics

Our solution focuses on real-time operational data, where immediate feedback and live data are essential. Players generate data while playing, and they expect to see the results and statistics directly on the score screen, while we as admins keep track of everything that is going on. Because of this, our primary goal was to reduce latency in reading and writing.

Data Platform Choice

We use Dataverse as our operational data platform, where we store player data, team data, spawn locations, spawn chunks, and resource types. Dataverse also allowed us to use a Copilot Studio agent with very little setup required.

We also considered Fabric as a solution with better scaling, but since we are not expecting enormous amounts of data, we landed on Dataverse as the better option.
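As a sketch of what these tables hold, here is our data model expressed as TypeScript types. The field names are illustrative assumptions for this sketch; the real Dataverse columns carry the creep_ publisher prefix seen in our backend code.

```typescript
// Illustrative model of our Dataverse tables. Field names are
// assumptions for the sketch; the real columns use the creep_ prefix.
interface ResourceType {
  id: string;
  name: string;        // e.g. "Diamond", "TNT"
}

interface SpawnLocation {
  id: string;
  name: string;
  available: boolean;  // true when the location can receive a new chunk
}

interface SpawnChunk {
  id: string;
  resourceTypeId: string;
  locationId: string;
  minerId?: string;    // set once a player claims the chunk
  amount: number;
  createdOn: Date;
}

// Example: an unclaimed chunk waiting at a spawn location.
const chunk: SpawnChunk = {
  id: "chunk-1",
  resourceTypeId: "diamond",
  locationId: "loc-7",
  amount: 3,
  createdOn: new Date(),
};
```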

Data Visualization

In the beginning, we discussed using Power BI as our visualization layer, but since we had experience with Power BI not being optimal for live data (in terms of cost), we decided to hand this task over to the pro-code side!

AI and Analytics Layer

To give our data more depth and additional insights, we decided to add AI agents to our solution. These are not used to modify the data the players collect, but to enhance the gameplay by providing creative puns and jokes about the latest stats! We also wanted to gain more insight from the data without manually going through the tables, which is why we created the second agent, used to give us admins more insight into the data. We let the AI mine the gems from our data!

Category: Low-Code

Our solution is built with a low-code-first approach. That includes continuously evaluating all parts of our solution, and whether a low-code tool is actually the right choice for each specific need. In this case, with Dataverse as the system of record and single source of truth, we found a natural fit in a combination of Power Apps, Power Automate and Copilot Studio.

Dataverse

We have used Dataverse out-of-the-box, with standard tables, extended with custom columns to fit our data-model.

Power Platform Environments

We set up dev, test and prod environments so we can test before shipping our solutions to prod.

Power Platform pipelines are used with a host environment – the low-code way to push solutions between environments.

We have also divided up our solution, so that it is easy to work on one area without having to think about dependency issues – all under the creep publisher.

Power Apps

Power Platform solution: 02 – Itera Scope Creepers Back office

For our main administration tool, we created a model-driven app. We use it to:

  • Control and keep an overview of the data.
  • Add new teams and players to a team.
  • Add new resources when they are published.
  • Control when the agent should do commentary or not.

Power Automate

Solution: 04 – Itera Scope Creepers – Flows and Power FX 

To make our data flow with little effort, we are using Power Automate.

Here is more about our flows:

A good reason for us to use Power Automate is that we also use Dataverse for our data model and a model-driven app on top of those tables. This keeps everything in the same universe, and it is also very simple to expand on our automation needs.

Using Power Automate, we create spawn chunks which can be collected by miners. When a miner collects a spawn chunk, we want to know that it has been claimed.

Our miners' interaction with the QR code is all pro code, using the Web API towards Dataverse. We could probably do all the automation the same way, but we have a low-code-first approach. So once our pro-code cousin has told Dataverse who claimed a spawn chunk, we let Power Automate do the rest: using an automatic trigger, Power Automate reacts when a row is modified with a new miner, marks the spawn chunk as claimed, and releases the spawn location for new spawns.

A scheduled flow goes off every 3 minutes:

  • It collects all available spawn locations (available = yes).
  • It triggers a child flow which generates a spawn chunk.
  • It also triggers a flow which checks locations for TNT.
  • Spawn chunks that have not been picked in the last 15 minutes are deleted and can be replaced by a new resource.
  • When a resource is claimed, a flow sets the status for the chunk and opens the location for a new resource.
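The expiry rule in the scheduled flow can be sketched in code. This is a TypeScript sketch of the logic only, not the actual Power Automate flow; the 15-minute cutoff matches the rule above.

```typescript
interface Chunk {
  id: string;
  claimed: boolean;
  createdOn: Date;
}

const EXPIRY_MINUTES = 15;

// Mirrors the flow's rule: unclaimed chunks older than 15 minutes are
// deleted so their spawn location can receive a new resource.
function expiredChunks(chunks: Chunk[], now: Date): Chunk[] {
  const cutoff = now.getTime() - EXPIRY_MINUTES * 60_000;
  return chunks.filter((c) => !c.claimed && c.createdOn.getTime() < cutoff);
}
```

Claimed chunks are never expired here, which matches the flow: a claim triggers its own status update instead.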

Copilot Studio Agents

Solution: 03 – Itera Scope Creepers – Copilot studio Agents

We are using the Microsoft Dataverse MCP Server as a Tool in both our agents:

We gave both agents clear instructions and guardrails, and use generative AI to get valuable and fun insights on the data, both for players and for the back-office workers, with low code as the motor.

Category: Code Connoisseur

Although our solution has been built with a low-code-first approach, some of our components were clearly better off built with other techniques. For our desktop-targeted dashboard, our mobile-targeted QR code gathering tool and its dedicated backend, a pro-code solution was a natural choice.

Backend (for frontend)

Our backend API is a neatly organized group of Azure Functions, written in C#, running on .NET 10 in a Linux environment. Easily readable endpoints ensure a clean flow of data and requests, with emphasis on clear separation of concerns. 

Here is an example of one of our endpoints:

The updating of our database is handled by calling the Dataverse Web API. Authentication and authorization are handled through Managed Identity.

The backend API follows solid programming principles like proper error handling for each endpoint, efficient transfers with DTOs, and data classes that accurately model the elements of the solution like Miners, Teams, Resources, etc. 

The QR codes we have placed around the venue connect directly to a route in the backend, which then internally manages the gathering of a resource.

Frontend (dashboard + QR code gathering)

The frontend is built as a modern React SPA, hosted as an Azure Static Web App. The backend-for-frontend is linked in and proxied, which gives us both the benefit of not having to deal with CORS issues and a clean separation of concerns between these two parts of the solution.

See more details about our frontend in these blog posts:

Category: Digital Transformation

Now, getting the data is not the problem when we have the possibility to generate mock data with AI or be sneaky by gamifying data production to our advantage. The real value is when we can use our data to tweak, optimize, and make better decisions moving forward.

We meet customers who have a lot of data, such as IoT devices feeding data in real time. While many face the challenge of data living in many different systems, another major problem is using this data efficiently to optimize current production or make the workday more efficient through better insights.

In our solution, we tried to mimic this challenge, but with Minecraft gamification to make it more fun to develop. We use AI as live commentary in our solution, motivating players by turning Dataverse tables into fun, catchy statements that add more depth to the gameplay. We also use another AI agent to help us make sense of the data and optimize gameplay. This allows us to talk to our data, understand which players we should engage, identify low-performing locations, and detect abnormalities in the data, such as someone writing scripts to win the game, and use these insights to handle things better.
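As a hedged sketch of the kind of abnormality check mentioned above (in our solution we ask the agent; the window and threshold here are made-up values for illustration):

```typescript
interface Scan {
  minerId: string;
  at: Date;
}

// Flag miners with more scans in the window than a person walking
// between QR codes could plausibly manage, e.g. an overnight script.
// Threshold and window are assumptions, not values from our solution.
function suspiciousMiners(
  scans: Scan[],
  windowStart: Date,
  windowEnd: Date,
  maxScans: number
): string[] {
  const counts = new Map<string, number>();
  for (const s of scans) {
    if (s.at >= windowStart && s.at <= windowEnd) {
      counts.set(s.minerId, (counts.get(s.minerId) ?? 0) + 1);
    }
  }
  return Array.from(counts.entries())
    .filter(([, n]) => n > maxScans)
    .map(([miner]) => miner);
}
```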

Minecraft Concept – Real-World Equivalent

  • QR code scan – IoT sensor event
  • Spawn location – Physical asset/machine
  • Resource mined – Asset state, usage, or reading → errors
  • Player – Employee or operator
  • Dashboard – Operations control center
  • AI commentary – AI-supported operational insights/alerts
  • AI agent insights – AI-supported operational insights/alerts

All together now

Badge claimed: Stairway to Heaven

With our solution being a hybrid solution, combining both Power Platform and other resources for the best of both worlds, we use multiple APIs to tie it all together.

Dataverse Web API

Data going back and forth between our Azure Functions backend-for-frontend and Dataverse is handled by the Dataverse Web API. We query and update our Dataverse tables using OData, which is parsed and transformed into DTOs used between our frontend and backend, all with proper authorization logic, making sure the data is secure.

...

var resourceTypeRequestUrl = $"{dataverseBaseUrl}/api/data/v9.2/creep_resourcetypes?$filter=creep_resourcetypeid eq {resourceTypeId}";

var resourceTypeResponse = await client.GetAsync(resourceTypeRequestUrl);

...

var result = new ChunkWithResourceTypeDto(
	Id: chunk.creep_spawnchunkid.ToString(),
	Name: chunk.creep_spawnchunk ?? string.Empty,
	ResourceTypeId: chunk._creep_resourcetype_value,
	LocationId: chunk._creep_spawnlocationid_value,
	MinerId: chunk._creep_miner_value,
	Amount: chunk.creep_amount,
	StatusCode: chunk.statuscode,
	ResourceType: resourceTypeDto);

return new OkObjectResult(result);

...
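The query pattern in the snippet above can be captured in a small helper. This is a TypeScript sketch (our actual backend does this in C#, and a real helper should also URL-encode the filter); the org URL and id below are placeholders:

```typescript
// Build a Dataverse Web API query URL with an OData $filter, matching
// the pattern used in our backend. Encoding is omitted for brevity.
function dataverseQueryUrl(
  baseUrl: string,
  table: string,
  filter: string
): string {
  return `${baseUrl}/api/data/v9.2/${table}?$filter=${filter}`;
}

// Example: look up a resource type by id, as in the snippet above.
const url = dataverseQueryUrl(
  "https://example.crm.dynamics.com",
  "creep_resourcetypes",
  "creep_resourcetypeid eq 00000000-0000-0000-0000-000000000001"
);
```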

Entra ID

To make sure our different resources only have access to the necessary data, we use Managed Identities for authentication. In our Azure Functions app, we request properly scoped access tokens for use with the Dataverse Web API.

...

TokenCredential credential = new DefaultAzureCredential();
var scope = $"{dataverseBaseUrl}/.default";
var accessToken = await credential.GetTokenAsync(
	new TokenRequestContext(new[] { scope }),
	CancellationToken.None);

...

Copilot Studio

To allow our Power Automate flows to communicate with Copilot Studio, we have configured Dataverse connectors, using flow actions to trigger our Copilot agents.

Azure Resource Manager

Our CI/CD configuration handles building and deployment from our repositories in Azure DevOps to our Azure resources. Azure Pipelines deploys using an Azure Resource Manager service connection and tasks/CLI that call Azure’s management (ARM) APIs.

Our base is for you!

Hacking is important, but your mental health is importanter!
Make sure to have a stroll through our Japanese inspired gardens.

Enjoy a coffee around the camp fire:

Have a day at the local marketplace:

If you need to stop by the Nether, this is readily available!

Have a break from the real hacker world and escape into our base. All are welcome!

UX for all

Badge claimed: Client Side Salsa

Desktop real-time data Dashboard

Our dashboard and QR code gathering interfaces were both built into the same modern React SPA web app, hosted as an Azure Static Web App, with a dedicated backend-for-frontend (Azure Functions) to provide an added layer of security with regards to accessing our data in Dataverse. The backend uses the Dataverse Web API through tokens retrieved using a Managed Identity, all abstracted away from the frontend itself.

The frontend uses SWR for data fetching with caching, revalidation, and request deduplication, ensuring performance and stability.
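What SWR's request deduplication buys us can be illustrated with a minimal sketch (this is not SWR's implementation, just the idea):

```typescript
// Concurrent requests for the same key share one in-flight promise,
// so rapid re-renders don't fire duplicate network calls.
const inflight = new Map<string, Promise<unknown>>();

function dedupedFetch<T>(key: string, fetcher: () => Promise<T>): Promise<T> {
  const existing = inflight.get(key);
  if (existing) return existing as Promise<T>;
  const request = fetcher().finally(() => inflight.delete(key));
  inflight.set(key, request);
  return request;
}
```

Once the request settles, the key is cleared so the next call revalidates with a fresh fetch.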

Web app routing is handled with React Router, efficiently utilizing the browser’s own History API for snappy navigation.
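The kind of dynamic route matching React Router gives us can be sketched like this (illustrative only, not React Router's API):

```typescript
// Match a "/team/:id"-style pattern against a pathname and extract the
// params, similar in spirit to how our app's routes resolve.
function matchPath(
  pattern: string,
  pathname: string
): Record<string, string> | null {
  const patParts = pattern.split("/").filter(Boolean);
  const pathParts = pathname.split("/").filter(Boolean);
  if (patParts.length !== pathParts.length) return null;
  const params: Record<string, string> = {};
  for (let i = 0; i < patParts.length; i++) {
    if (patParts[i].startsWith(":")) {
      params[patParts[i].slice(1)] = pathParts[i];
    } else if (patParts[i] !== pathParts[i]) {
      return null;
    }
  }
  return params;
}
```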

Best practices for code structure (and source control) ensure maintainability and allow for efficient collaboration.

Using Tailwind for CSS styling and a mobile-first approach, a holistic and performant UX provides responsiveness and cross-device support:

Mobile QR code gathering interface
Mobile version of the dashboard interface

Crawl through the data.

So our Creeper Commentator Copilot Agent is connected to Dataverse with the Microsoft Dataverse MCP Server.

We want it to generate a lot of different things, based on the data inside the tables we have there.

So it crawls and crawls,

And it adds valuable insights to the data for the participants.

Don't Reinvent the Wheel

Claimed Badge: Thieving Bastards

We don't want to reinvent the wheel, but rather borrow, rent and perhaps scrape resources for our solution. So, let's wrap up the resources we have gathered and collected for our application!

First off, we of course had to leverage Minecraft-themed assets for our solution, to make the experience as Minecraft-like as possible. We scraped the fonts from Minecraft and downloaded the skin types we used.

We also used LinkMobility's API, which we incorporated in our solution. For more information, go to this blog post! -> https://acdc.blog/itera26/on-the-clock-24-7/10842/

CI/CD OCD

Badge claim: Power of the Shell

With a dashboard (and QR code gathering) web app running as an Azure Static Web App, linked to a separate Azure Functions app for the backend that handles all communication with Dataverse, we needed a proper setup for efficient development, collaboration and deployment.

Using Azure DevOps to host both repos, we used Azure Pipelines for CI/CD.

The backend runs .NET 10 in Linux, using the DotNetCoreCLI@2 and AzureFunctionApp@2 tasks for build and deployment.

trigger:
  branches:
    include:
      - main

pr:
  branches:
    include:
      - main

variables:
  buildConfiguration: "Release"
  azureSubscription: "Microsoft Azure Sponsorship(30b24b6e-ef03-42c4-bba5-20a33afd68e4)"
  functionAppName: "itera-scope-creepers-api"

stages:
  - stage: BuildAndDeploy
    displayName: "Build & Deploy"
    jobs:
      - job: Build
        displayName: "Build API Function App"
        pool:
          vmImage: "ubuntu-latest"

        steps:
          - checkout: self

          - task: UseDotNet@2
            displayName: "Use .NET SDK 10.x"
            inputs:
              packageType: "sdk"
              version: "10.x"

          - task: DotNetCoreCLI@2
            displayName: "Restore NuGet packages"
            inputs:
              command: "restore"
              projects: "API.csproj"

          - task: DotNetCoreCLI@2
            displayName: "Build"
            inputs:
              command: "build"
              projects: "API.csproj"
              arguments: "--configuration $(buildConfiguration)"
              publishWebProjects: false

          - task: DotNetCoreCLI@2
            displayName: "Test (all *Tests.csproj projects if present)"
            inputs:
              command: "test"
              projects: "**/*Tests.csproj"
              arguments: "--configuration $(buildConfiguration)"
              publishTestResults: true

          - task: DotNetCoreCLI@2
            displayName: "Publish function app"
            inputs:
              command: "publish"
              projects: "API.csproj"
              arguments: "--configuration $(buildConfiguration) --output $(Build.ArtifactStagingDirectory)/publish"
              publishWebProjects: false
              zipAfterPublish: true

          - task: AzureFunctionApp@2
            displayName: "Deploy Azure Function App"
            inputs:
              azureSubscription: "$(azureSubscription)"
              appType: "functionAppLinux"
              appName: "$(functionAppName)"
              package: "$(Pipeline.Workspace)/**/*.zip"

The frontend is a client-side rendered React app, using Vite as the bundler and pnpm as the package manager for increased security, and is both built and deployed using the AzureStaticWebApp@0 task.

trigger:
  branches:
    include:
      - main

pool:
  vmImage: ubuntu-latest

variables:
  NODE_VERSION: '22.22.0'

steps:
  - task: NodeTool@0
    displayName: 'Use Node.js $(NODE_VERSION)'
    inputs:
      versionSpec: '$(NODE_VERSION)'

  - script: |
      corepack enable
      pnpm config set node-linker hoisted
      pnpm install --frozen-lockfile
    displayName: 'Install dependencies with pnpm'

  - script: pnpm build
    displayName: 'Build app'

  - task: AzureStaticWebApp@0
    displayName: 'Deploy to Azure Static Web App (itera-scope-creepers)'
    inputs:
      azure_static_web_apps_api_token: '$(AZURE_STATIC_WEB_APPS_API_TOKEN)'
      app_location: '/'
      api_location: ''
      output_location: 'dist'

For local development, the SWA CLI is used to emulate the linked backend.

CI/CD FTW! 🤓

Let's get them to play.

How do you gather data from IoT stations without having them? Using retro ways!

The QR code!

Gathering of data – The game – Oh, so retro:

This is a way for the hackers to get up and move during the hackathon, so this is the most retro thing of all. No sitting on a chair to play a game – in the really RETRO way, you have to go to a location to play; you have to physically move!

So we really feel that we are Community Champions: we have encouraged and helped all the hackers, judges and the committee to get up and move and get their step count up.

Health is number one!

Goooooood morning!

Before the sun comes up

Early bird catches the worm!

Competition

We have a competition going

BUT DOH!

The committee has run a script that mines the whole night. Thank you for giving us so much data 😀

Let's see what will happen.

Teamspirit

We have kept our team spirit throughout the hackathon! And we will continue to spread good vibes with our team and table setup!

Two of us won the 5-kamp (pentathlon)! So we put in some good effort!