Craftsman – Automated Plugin Deployment for Power Platform with Azure DevOps

For the Arctic Cloud Developer Challenge, we implemented a fully automated CI/CD pipeline that builds, tests, and deploys our 404ACDC Plugin to Power Platform without any manual intervention. This eliminates the traditional pain points of plugin deployment—no more opening the Plugin Registration Tool, manually uploading DLLs, or risking configuration drift between environments.

Pipeline Architecture

Our pipeline consists of three sequential stages, each depending on the successful completion of the previous one.

The Build Stage restores NuGet packages (including the Dataverse SDK), compiles the solution in Release configuration, runs unit tests, and publishes the plugin assembly as a build artifact. The pipeline triggers automatically whenever code changes are pushed to the main branch in the plugin or test project directories.

As shown in our pipeline configuration, we have set up smart triggering that only runs the pipeline when relevant files change:

trigger:
  branches:
    include:
    - master
    - main
  paths:
    include:
    - 404ACDC_Plugin/**
    - PluginTests/**

The Deploy Stage downloads the build artifacts and uses Microsoft’s official Power Platform Build Tools to push the updated assembly to Dataverse. Before deployment, we run a WhoAmI check to verify our service connection is valid. Here is the core deployment configuration from our pipeline:

- task: PowerPlatformWhoAmI@2
  displayName: 'Verify Power Platform Connection'
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: '$(ServiceConnectionName)'

- task: PowerPlatformUpdatePluginAssembly@2
  displayName: 'Update Plugin Assembly'
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: '$(ServiceConnectionName)'
    AssemblyPath: '$(Pipeline.Workspace)/drop/plugin/404ACDC_Plugin.dll'
    UpdateAssemblyVersion: true

The Register Steps Stage uses PowerShell and the Microsoft.Xrm.Data.PowerShell module to register or update plugin steps, wiring our code to the appropriate Dataverse messages and entities.
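The registration itself runs through Microsoft.Xrm.Data.PowerShell, but the shape of a step registration is easy to illustrate. The Python sketch below builds the kind of body a Dataverse Web API registration of a plugin step carries. The stage and mode values (40 = PostOperation, 0 = synchronous) are standard Dataverse step settings; the GUID is a placeholder, and a real registration would also bind the SDK message and message filter, which is omitted here for brevity.

```python
# Hedged sketch: the JSON body for registering one plugin step against the
# Dataverse sdkmessageprocessingstep table. Simplified for illustration --
# a real registration also binds sdkmessageid and sdkmessagefilterid.

def build_step_payload(plugin_type_id: str, message: str, entity: str) -> dict:
    """Build the body for registering one plugin step."""
    return {
        "name": f"{message} of {entity}",
        "mode": 0,        # 0 = synchronous execution
        "rank": 1,        # execution order among steps on the same message
        "stage": 40,      # 40 = PostOperation
        "plugintypeid@odata.bind": f"/plugintypes({plugin_type_id})",
    }

payload = build_step_payload("00000000-0000-0000-0000-000000000000",
                             "Update", "account")
print(payload["stage"])  # 40
```

Keeping this wiring in a dedicated stage means the step configuration is versioned alongside the plugin code rather than living only in the Plugin Registration Tool.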

Security Approach

We never store credentials in our pipeline YAML. Instead, we use an Azure DevOps variable group that contains our environment configuration, as referenced in the pipeline:

variables:
- group: PowerPlatform-Variables # Contains ServiceConnectionName, EnvironmentUrl, etc.
- name: solution
  value: '**/*.sln'
- name: buildPlatform
  value: 'Any CPU'
- name: buildConfiguration
  value: 'Release'
- name: pluginProjectPath
  value: '404ACDC_Plugin/404ACDC_Plugin.csproj'

The service principal credentials are stored as encrypted secret variables, masked in all pipeline logs. The service principal itself has only the minimum permissions needed to register plugins.

Build and Test Configuration

Our build stage ensures code quality by compiling and running tests before any deployment occurs:

- task: VSBuild@1
  displayName: 'Build Solution'
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
    msbuildArgs: '/p:DeployOnBuild=false'

- task: VSTest@2
  displayName: 'Run Unit Tests'
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
    testSelector: 'testAssemblies'
    testAssemblyVer2: |
      **\*test*.dll
      !**\*TestAdapter.dll
      !**\obj\**

Why This Matters

This approach gives us repeatability (every deployment follows the identical process), auditability (complete history of every pipeline run), speed (deployments that took 15-20 minutes manually now complete in a few minutes), and safety (no exposed credentials, no deploying broken builds).

How we did it

The pipeline YAML in our repository is the actual configuration driving our deployments. Our Azure DevOps project contains the corresponding service connection, variable group, and deployment environment (PowerPlatform-Production) referenced in the pipeline. The stage dependencies ensure we never deploy untested code:

- stage: Deploy
  displayName: 'Deploy to Power Platform'
  dependsOn: Build
  condition: succeeded()

This automated deployment foundation demonstrates that we have applied professional DevOps practices to our Power Platform solution, ensuring reliable and consistent deployments throughout the challenge.

It’s a wrap — Final delivery at ACDC 2026 🚀

This is our final delivery for the ACDC2026 Hackathon.
We made a video of the end-to-end solution, so enjoy the video.

The image below describes the overall design of the solution.

🧱 Redstone Realm

Showing Jetson Nano Edge AI LLM providing guidance for the customer off-grid.
  • Built a real, working solution while actively exploring new Microsoft platform capabilities
  • Used in-box AI such as Prompt Columns and Copilot to embed AI directly into the data model and user experience
  • Grounded AI output in structured data to keep interactions predictable and explainable
  • Used Code Apps to experiment with new ways of building user-friendly experiences and validate ideas quickly
  • Experimented with Edge AI using an NVIDIA Jetson Nano to run LLMs closer to execution
  • Explored trade-offs between edge-based and cloud-based AI through hands-on experimentation

Redstone Realm, for us, was about building, testing, learning — and pushing understanding forward using real tools on real platforms.

Relevant blogpost:
Existential Risk: intelligence without agency, Nvidia Jetson Nano, Glossy Pixels | Arctic Cloud Developer Challenge Submissions

🛡️ Governance & Best Practices

  • Stored all secrets in Azure Key Vault, accessed at runtime by Azure Functions and Power Automate via environment variables
  • Used a clear DEV / TEST / PROD environment strategy with a structured ALM setup for predictable deployments
  • Maintained clear architectural separation between UI, integration, and execution, with deterministic and testable backend logic
  • Applied consistent naming conventions across fields, flows, and assets
  • Used a medallion data structure (raw, refined, curated) to ensure data quality and traceability
  • Used Copilot as an assistive, explanatory layer — not an autonomous decision-maker

Governance was built in from the start to ensure the solution is secure, maintainable, and trustworthy beyond the hackathon.

Relevant blogposts: ALM implemented

🧠 Data, AI & Analytics

Even in a hackathon setting, we designed with structure and responsibility in mind.

From raw blocks to blazing insights: use Microsoft Fabric to take messy data through a structured refinement process, model it into trusted semantic layers, unlock visual storytelling with Power BI, and build a foundation with Fabric IQ that helps both AI agents and data scientists uncover the real value in your datasets. If something doesn’t add value, keep polishing until it sparkles! 💎

Relevant blogposts: From raw blocks to data diamonds

⚡ Low-Code

  • Used low-code to move fast while keeping structure and maintainability intact
  • Built a back-office Model-Driven App for governance, search, and operational overview
  • Used Prompt Columns to embed AI directly in the data model and enable predictable Copilot behavior
  • Leveraged new Power Platform capabilities to deliver advanced functionality quickly and securely
  • Established an analytics foundation using Microsoft Fabric with a medallion architecture (raw, refined, curated)

    Relevant blogposts: OneFlow and LINK Mobility Sponsor Badge and more, Go With The Flow | Arctic Cloud Developer Challenge Submissions


🧑‍💻 Code Connoisseur

  • Built a Code App using vibe coding, outside traditional Model-Driven and Canvas patterns
  • Implemented Azure Functions in C# (.NET 8) with RCON API integration to a Minecraft server
  • Ensured backend logic is deterministic, testable, and decoupled from UI and AI
  • Experimented with Edge AI using NVIDIA Jetson Nano, leveraging Linux shell tooling and low-level configuration
  • Explored trade-offs between edge-based LLMs and cloud-hosted AI services
  • Kept business logic in code, with clear separation between experience, AI, and execution

Code was used deliberately — where control and predictability mattered most.

Relevant blogposts: OneFlow and LINK Mobility Sponsor Badge and more

    🌍 Digital Transformation

    • Built a solution that starts from user intent rather than technical specifications
    • Transformed intent into structured data that can be reasoned about, adjusted, and reused
    • Used AI and Copilot to explain consequences and trade-offs instead of automating decisions
    • Connected business logic, data, and visualization into a continuous feedback loop
    • Used Minecraft as a visualization engine to make outcomes tangible and easy to understand
    • Demonstrated how low-code, pro-code, and AI can work together to support better decisions

    The transformation was not just technical — it changed how users understand and act on complex decisions.

    Relevant blogposts: OneFlow and LINK Mobility Sponsor Badge and more, NASTY! If it doesn’t work, expose it to the world | Arctic Cloud Developer Challenge Submissions

    Dataminer

    Automatically Syncing Minecraft Materials from the Wiki into Dataverse

    Keeping material availability up to date in a Minecraft building workflow sounds simple… until you decide to automate it properly.

    Instead of manually maintaining a list of materials every time Minecraft introduces new blocks, I built a workflow that reacts automatically when the Minecraft Wiki publishes updates.

    The result?
    New materials become available for building approvals without anyone lifting a finger.


    The Problem

    Our building flow relies on a Dataverse table that defines which materials are allowed to be used when players submit Minecraft house builds for approval.

    Minecraft updates frequently, and new materials are introduced all the time.
    Manually tracking these changes would be:

    • Easy to forget
    • Error-prone
    • Extremely un-fun

    So naturally, automation was the only reasonable answer.


    The Solution Overview

    The solution is built around a simple idea:

    If the Minecraft Wiki announces a new material, Dataverse should know about it automatically.

    To achieve this, I created a Power Automate workflow that listens for update emails from the Minecraft Wiki and turns those updates into structured data.


    How It Works (Step by Step)

    1. Minecraft Wiki Sends an Update Email

    Whenever the Minecraft Wiki publishes a list of newly added materials, an email is sent out containing those updates.

    This email becomes the trigger point for the entire system.


    2. Power Automate Listens for the Wiki Email

    A Power Automate flow is configured to trigger when an email arrives from the Minecraft Wiki.

    The flow checks:

    • Sender (to ensure it’s actually from the Wiki)
    • Subject or content indicating a materials update

    Only valid update emails continue through the workflow.


    3. Extract New Materials from the Email

    The flow parses the email content and identifies the newly added materials.

    Each material is treated as its own data record rather than just text in an email.
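Steps 2 and 3 together amount to a filter plus a parser. The sketch below shows that logic in Python; the sender address and the one-material-per-line email format are assumptions for illustration, not the flow's actual configuration.

```python
# Hedged sketch of the filter-and-parse steps. TRUSTED_SENDER and the
# "- Material Name" line format are assumptions, not the real email contract.

TRUSTED_SENDER = "updates@minecraft.wiki"  # hypothetical sender address

def extract_materials(sender: str, body: str) -> list[dict]:
    """Return one record per listed material; drop mail from unknown senders."""
    if sender.lower() != TRUSTED_SENDER:
        return []  # mirrors the flow's sender check: only valid mail continues
    records = []
    for line in body.splitlines():
        line = line.strip()
        if line.startswith("- "):
            records.append({"name": line[2:].strip(), "source": "wiki-email"})
    return records

body = "New materials:\n- Cherry Wood\n- Bamboo Block\n"
print(extract_materials(TRUSTED_SENDER, body))  # two material records
```

In the real flow, the equivalent work is done with Power Automate's trigger conditions and expression functions rather than Python.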


    4. Create Records in Dataverse

    For every new material found, the flow creates a new row in the Materials Dataverse table.

    This table is the single source of truth for:

    • What materials exist
    • What materials are allowed
    • What materials builders can select
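Because the Materials table is the single source of truth, row creation should be idempotent: re-processing the same email must not create duplicates. A minimal sketch of that guard, assuming material names are the uniqueness key:

```python
# Hedged sketch: dedupe before creating Dataverse rows. Assumes the material
# name is the unique key; the real flow may key on something else.

def new_materials(parsed: list[str], existing: set[str]) -> list[str]:
    """Keep only materials not already present (case-insensitive)."""
    seen = {name.lower() for name in existing}
    return [m for m in parsed if m.lower() not in seen]

print(new_materials(["Cherry Wood", "Stone"], {"stone", "dirt"}))  # ['Cherry Wood']
```

In Power Automate the same effect can be had with a "List rows" lookup before the "Add a new row" action.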

    5. Materials Are Instantly Available in the Building Flow

    Because the building approval flow reads directly from the Dataverse Materials table, the new materials are immediately available:

    • In Canvas Apps
    • In approval logic
    • In validation rules

    No redeployment.
    No manual updates.
    No “why can’t I use this block?” messages.


    Why This Is Fun (and Slightly Ridiculous)

    From a pure business perspective, this might be overkill.
    From an automation perspective, it’s perfect.

    This setup demonstrates:

    • Event-driven automation
    • Email parsing as a data source
    • Dataverse as a dynamic configuration layer
    • Zero-touch updates to user-facing logic

    And most importantly:
    Minecraft content updates now trigger enterprise-grade automation.

    Power Automate picture

    Final delivery – Team 404: Building the Impossible (And What Actually Worked)

    Our mission is to connect the reality of the physical world to the reality of the digital world, creating an entertainment platform built on creativity in every form: from the visual, to engineering-focused requirement specifications, to natural language. This approach unites all kinds of people in the game and removes most boundaries to expressing yourself creatively. It bridges ages, cultures, and locations. It unifies the world at a time when we need it.

    TL;DR: What We Set Out to Build

    The Vision: Build in Minecraft using three completely different input methods:

    1. Physical blocks on a camera-monitored plate (build IRL → Minecraft)
    2. Voice commands through an AI contact center agent
    3. Web ordering via a beautiful Power Pages interface

    The Reality: Two out of three ain’t bad. And what we did build? It’s pretty spectacular.

    What Actually Works (And Why It Matters)

    The Physical Building Plate

    Status: FULLY OPERATIONAL

    • micro:bit v1 PIR sensor detects motion → triggers Azure Function
    • Continuous webcam stream (Flask + OpenCV) captures snapshots
    • Stored in Azure Blob Storage with SAS tokens
    • GPT-4o Vision analyzes images for block detection
    • Custom .NET Dataverse plugin processes AI responses
    • Power Automate orchestrates the workflow
    • MCP server builds in Minecraft
    • Power Pages control center and shop

    Why this is impressive: We bridged the physical and digital worlds using a device from 2015, cutting-edge AI vision, and enterprise cloud infrastructure.

    The Web Interface

    Status: BEAUTIFUL AND FUNCTIONAL

    • Power Pages with minimalist design
    • Responsive across all devices (370px to 4K)
    • Pre-built structure templates (house, tower, castle, platform)
    • Real-time building via MCP API
    • Link Mobility SMS integration for ownership notifications
    • SharePoint-based event notification system

    Why this is impressive: Low-code perfection. Template-based deployment. Setup time: 15-25 minutes.

    The Voice Channel

    Status: PLANNED FOR V2 (wink wink)

    What happened: We hit a roadblock with Copilot Studio voice features in our “Early Access” environment. The error: “Voice features are currently not available for your bot.”

    Lesson learned: Sometimes “cutting edge” means you’re the one getting cut. We chose stability for demo day.

    The good news: The architecture is ready. The contact center is configured. The agent is built. We just need the voice activation to work post-hackathon.

    Redstone Realm: Business Solutions with AI

    Our Pitch:

    We built a hybrid solution that combines:

    • Power Platform (Power Pages, Power Automate, Dataverse, Copilot Studio)
    • Azure (Functions, Computer Vision, IoT Hub, Blob Storage, Automation Runbooks)
    • AI (GPT-4o Vision, Mistral-Small 2503, natural language processing)
    • Real-world hardware (micro:bit sensors, webcams, physical building plates)

    This isn’t just cloud-native—it’s world-bridging. We took Microsoft 365 tools and made them control physical reality.

    Measurable Impact:

    • Physical motion → Digital structure in <15 seconds
    • Multi-channel input (physical + web, voice planned)
    • Automated SMS notifications via Link Mobility
    • Event management system used by 2+ other teams

    Innovation:

    • MCP Protocol (so new most devs haven’t heard of it)
    • GPT-4o Vision for spatial reasoning
    • Custom Dataverse plugin for AI response processing
    • Autonomous decision-making with safety controls

    Data, AI & Analytics: Mining Data Diamonds

    Our Pitch:

    We built an AI-powered computer vision pipeline that:

    1. Analyzes images using GPT-4o Vision + Azure Custom Vision
    2. Detects blocks with confidence scoring (0.0-1.0)
    3. Maps coordinates from camera space to Minecraft world
    4. Processes AI responses via custom .NET Dataverse plugin
    5. Stores structured data in Dataverse for auditing and analysis
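Steps 1, 2, and 4 hinge on turning the model's JSON into structured detection records and applying the confidence score. The sketch below shows that shape in Python; the JSON field names are assumptions about the vision response, not the real contract (the production version lives in the .NET plugin).

```python
import json

# Hedged sketch: parsing a vision response into detection records and gating
# on confidence. The "blocks" schema here is illustrative, not the actual
# GPT-4o Vision output contract used by the team's plugin.
RESPONSE = json.dumps({
    "blocks": [
        {"type": "stone",      "x": 3, "y": 0, "z": 5, "confidence": 0.93},
        {"type": "oak_planks", "x": 4, "y": 0, "z": 5, "confidence": 0.61},
    ]
})

def confident_blocks(raw: str, threshold: float = 0.85) -> list[dict]:
    """Keep only detections above the autonomous-action threshold (85%)."""
    return [b for b in json.loads(raw)["blocks"] if b["confidence"] >= threshold]

print(len(confident_blocks(RESPONSE)))  # 1 -- only the 0.93 detection passes
```

Detections below the threshold are not discarded in the real pipeline; they are stored in Dataverse for auditing, which is what makes the audit trail in the next section possible.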

    The Pipeline:

    Webcam → Blob Storage → GPT-4o Vision → JSON Response → 
    Custom Plugin → Dataverse → Power Automate → MCP Server → Minecraft
    

    Why it’s sophisticated:

    • AI confidence thresholds (85%+ for autonomous action)
    • Batch processing with partial success handling
    • Comprehensive audit trail in Dataverse
    • Pattern recognition and learning capabilities
    • Image-to-block-to-coordinate transformation
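The image-to-block-to-coordinate transformation in the last bullet can be sketched as a scale-then-offset: pixel positions are quantized onto the plate's block grid, then offset into the Minecraft world. Every constant below (frame size, grid size, world origin) is an assumed value for illustration.

```python
# Hedged sketch of the camera-space to Minecraft-world mapping.
# PLATE_PX, GRID, and ORIGIN are made-up values, not the real calibration.

PLATE_PX = 640           # camera frame size in pixels (assumed square)
GRID = 16                # plate resolution in blocks (assumed 16x16)
ORIGIN = (100, 64, 200)  # world coordinates of the plate's corner (assumed)

def to_world(px: int, py: int) -> tuple[int, int, int]:
    """Quantize a pixel position to a block cell, then offset into the world."""
    gx = px * GRID // PLATE_PX   # integer division snaps to a grid cell
    gz = py * GRID // PLATE_PX
    return (ORIGIN[0] + gx, ORIGIN[1], ORIGIN[2] + gz)

print(to_world(320, 0))  # centre column, near edge -> (108, 64, 200)
```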

    Data Products:

    • Timestamped snapshots with metadata
    • Block detection records with confidence scores
    • Build history and analytics
    • AI decision audit logs

    We didn’t just use AI—we built a production-ready AI pipeline with safety controls, monitoring, and transparency.

    Low-Code: Power Platform Excellence

    Our Pitch:

    We built three complete low-code solutions without PCF components:

    Solution 1: Minecraft Builder Interface

    • Power Pages responsive website
    • Copilot Studio AI chat interface
    • Power Automate orchestration flows
    • Dataverse for data management
    • Link Mobility custom connector (using paconn for advanced config)

    Solution 2: Event Notification System

    • Power Pages registration
    • SharePoint agenda management
    • Power Automate scheduled checks (every minute)
    • Automatic email delivery
    • Smart “processed” logic (no code required)

    Solution 3: “DO NOT PRESS” Governance App

    • Canvas App with intentional complexity
    • Power Automate integration
    • SharePoint logging
    • Azure Runbook orchestration
    • Automated email notifications

    Why this demonstrates mastery:

    • Zero PCF components
    • Entirely drag-and-drop configuration
    • Custom connector creation with paconn
    • Policy templates for API manipulation
    • Template-based deployment (<30 min setup)
    • Reusable across scenarios

    Code Connoisseur: When You Need Code

    Our Pitch:

    Sometimes low-code isn’t enough. When that happened, we wrote production-quality code:

    Custom .NET Dataverse Plugin (C# .NET 4.6.2)

    ImageBlockUpdatePlugin, which processes AI vision responses:

    • Triggers on the Dataverse Update event
    • Deserializes GPT-4o Vision JSON
    • Batch updates with partial success handling
    • Comprehensive error logging
    • 80%+ success rate in production

    MCP Server (TypeScript/Node.js)

    Minecraft Builder MCP Server:

    • Full Model Context Protocol implementation
    • RESTful API (10+ endpoints)
    • Docker-ready deployment
    • Open-sourced on GitHub
    • Used by multiple teams

    PowerShell Automation

    Azure DevOps CI/CD pipeline:

    • Automated Minecraft plugin deployment
    • mcrcon for graceful server management
    • Scheduled tasks for continuous deployment
    • Zero-downtime updates

    micro:bit Python (MicroPython)

    Extreme memory optimization:

    • 16KB RAM constraint
    • Manual garbage collection
    • AT command serial communication
    • Edge detection algorithms
    • Rate limiting implementation
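Two of those tricks, manual garbage collection and rate limiting, are simple to sketch. The version below is plain Python standing in for the MicroPython on the device; the interval value is an assumption, and on the micro:bit the time source would be `running_time()` rather than a caller-supplied float.

```python
import gc

# Hedged sketch of two micro:bit survival tactics: explicit gc between
# readings, and a send-rate limit. MIN_INTERVAL is an assumed value.

class RateLimiter:
    """Allow at most one send per interval (timestamps supplied by caller)."""
    def __init__(self, interval: float):
        self.interval = interval
        self.last = -interval  # so the very first send is always allowed

    def should_send(self, now: float) -> bool:
        if now - self.last >= self.interval:
            self.last = now
            return True
        return False

def after_reading() -> None:
    """Collect garbage explicitly -- vital inside the micro:bit's 16KB RAM."""
    gc.collect()

limiter = RateLimiter(1.0)
print(limiter.should_send(10.0), limiter.should_send(10.5),
      limiter.should_send(11.2))  # True False True
```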

    Why this is impressive:

    • Full stack expertise (C#, TypeScript, Python, PowerShell)
    • Production-ready error handling
    • Performance optimization (memory, network, compute)
    • Open source contribution (MCP server)
    • CI/CD automation

    Digital Transformation: Measurable Impact

    Our Pitch:

    We didn’t just automate; we transformed how people interact with digital spaces.

    Problem: Building in Minecraft requires keyboard skills, technical knowledge, and time.

    Solution: Three input channels, zero Minecraft knowledge required.

    Impact 1: Physical Building Accessibility
    • Before: kids and non-gamers can’t participate
    • After: place physical blocks, see digital results instantly
    • Measurable: less than 15-second latency from motion to build

    Impact 2: Event Automation
    • Before: manual email reminders, missed sessions
    • After: 100% automated, 1440 checks per day, zero manual work
    • Measurable: used by 2+ other teams, 100% delivery success

    Impact 3: Build Democratization
    • Before: one person building, team watching
    • After: multiple inputs (physical + web), collaborative creation
    • Measurable: shared MCP API, multiple concurrent builders

    Impact 4: Developer Efficiency
    • Before: manual plugin deployment, server downtime
    • After: full CI/CD with Azure DevOps, graceful shutdowns, player notifications
    • Measurable: 15-minute deployment cycle, zero failed deployments

    Business Value:
    • Time saved: 90% reduction in manual notifications
    • Accessibility: non-technical users can build complex structures
    • Collaboration: multiple input channels, team building
    • Developer productivity: automated deployment pipelines

    Real-world applicability:
    • Architecture visualization (physical models to digital twins)
    • Remote collaboration (distributed teams building together)
    • Educational tools (teaching coding through visual building)
    • Accessibility solutions (multiple interaction modes)

    Building in Minecraft after receiving instructions from the web portal.

    The Technical Achievement

    What We Actually Built:

    Core Stack:

    • 5 Azure services (Functions, Blob Storage, Computer Vision, IoT Hub, Automation)
    • 6 Power Platform components (Pages, Automate, Dataverse, Copilot Studio, Custom Connector, Plugin)
    • 4 custom code projects (.NET, TypeScript, Python, PowerShell)
    • 3 input channels (2 working, 1 in progress)
    • 2 complete plug-and-play solutions
    • 1 open-source contribution (MCP server)

    Integration Points:

    micro:bit → ESP8266 WiFi → Azure Function → Blob Storage → GPT-4o Vision → Custom Plugin → Dataverse → Power Automate → MCP Server → Minecraft Plugin → Minecraft World

    Safety & Governance:

    • AI confidence thresholds
    • Action limits (10 blocks autonomous)
    • Audit trails (full Dataverse logging)
    • Kill switch (physical button override)
    • Rate limiting (5 req/sec)
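The action limit and kill switch combine into a single safety gate: the agent may act autonomously only up to the block limit, and the physical override wins over everything. A minimal sketch, with the limit taken from the list above:

```python
# Hedged sketch of the safety gate described above. The 10-block autonomous
# limit comes from the post; the function shape itself is illustrative.

MAX_AUTONOMOUS_BLOCKS = 10

def allowed_actions(requested: int, kill_switch: bool) -> int:
    """How many blocks the agent may place without human approval."""
    if kill_switch:
        return 0  # physical button override trumps everything
    return min(requested, MAX_AUTONOMOUS_BLOCKS)

print(allowed_actions(25, False), allowed_actions(25, True))  # 10 0
```

Anything above the returned count would fall back to a human-approved path rather than autonomous execution.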

    What We Learned and why this matters

    We didn’t build everything we planned. Server capacity ran out. Voice integration hit roadblocks. Some features had to wait for V2.

    Three days. Five developers. Two working input channels. One spectacular demonstration of what happens when you combine:

    • Power Platform low-code simplicity
    • Azure cloud scalability
    • Custom code excellence
    • AI vision intelligence
    • Physical world interaction
    • Open source collaboration

    This is hybrid development at its finest.

    Wins:

    ✅ MCP Protocol works brilliantly for AI-to-game integration
    ✅ GPT-4o Vision can do spatial reasoning
    ✅ Power Platform scales from simple to complex
    ✅ micro:bit from 2015 can still be cutting-edge
    ✅ Open source collaboration speeds everyone up

    Challenges:

    ⚠️ “Early Access” environments are risky for demos
    ⚠️ 16KB RAM requires extreme optimization
    ⚠️ Server capacity planning is critical
    ⚠️ Integration debugging takes longer than expected

    Proud Moments:

    🎉 First successful physical-to-digital build
    🎉 MCP server used by other teams
    🎉 AI confidence system working in production
    🎉 Zero failed deployments
    🎉 Other teams using our notification system

    For the judges:

    We know you’re evaluating across multiple categories. We’ve built solutions that demonstrate excellence in:

    • Redstone Realm: Multi-cloud Microsoft stack with AI
    • Data & AI: Production computer vision pipeline
    • Low-Code: Three template-ready solutions
    • Code Connoisseur: Four custom code projects
    • Digital Transformation: Measurable business impact

    We didn’t finish everything. But what we finished? It’s exceptional 😉


    The Team

    404: Diamonds Not Found

    We may not have found all the diamonds, but we built something that creates them automatically.

    Morten, Erik, Esther, Per and Kariann.

    Crawler: Searching our huge building catalogue

    As our catalogue grows, our agents need a convenient way to search our database of builds. To solve this, we created a custom page that we integrated into our model-driven app. The page uses Dataverse tables as its sources, delegates filtering so it runs in Dataverse rather than in the app, uses layouts for a responsive UI, and relies only on out-of-the-box components.

    You can either search using the search boxes separately

    Or a combination of multiple boxes

    Final Delivery

    Dear judges, please feel free to search through this blog post to find your key words for your categories. Enjoy reading, love PixelPoints <3

    Solution

    Overall technical sketch and diagram. The solution is a Power Platform–centric architecture that integrates external systems, SharePoint, Azure services, and a Minecraft client to enable secure data exchange, automation, and gameplay interaction through APIs and agents.

    At its core, Power Platform acts as the orchestration layer, while Azure handles API exposure and backend services, and the Minecraft client consumes these capabilities through a custom API and agent.

    Power Platform

    Power Apps is used to build our user interface outside Minecraft. A player will be able to sign in with their Minecraft credentials to fetch relevant information about their player profile and inventory by triggering an API call to our Minecraft server. You have to be logged on in-game in order to fetch the information.

    When logging in, you call a custom API through a Custom Connector. It retrieves profile and inventory information that is used to populate the Power App with data such as credits and who the player is.

    PixelStreet, our own Minecraft Wall Street, is where you go to get knowledge: ask questions about your player inventory, PointCoins (our own currency), and how to build and craft items, all through Copilot Studio. Our agent is connected to online resources to find information about Minecraft, and to Microsoft Foundry and Azure Functions to get live data from our Minecraft server. You can get intel on prices that fluctuate and update depending on different world events and on players buying and selling.

    In the Copilot Studio agent, we added a topic with an action that performs a “Custom search”. It runs a web search whose input comes from the conversation with the user: the user is prompted for what they want to craft, how much, and what they have in their inventory before the search goes online to fetch information. For the low-code lovers, we used Power Fx formulas in the topic.

    The output wasn’t structured enough, so we added a custom prompt to transform it into a useful format. The prompt was given instructions on how to handle the input from the custom search and what the output should be. It is built from natural-language instructions, with help from Copilot to write those instructions well.

    Pretty cool to build these custom prompts and get some very valuable output when combined with a clever little search to the big internet 😀

    Minecraft

    We have made some stellar interfaces in Minecraft: a custom marketplace that functions as a shop and lets you see your inventory.

    When you hover over an item, you’re presented with a graph and dashboard showing price trends. You can buy and sell items by right- and left-clicking them.

    Autocrafting: show current options

    Autocrafting: creating planks from available options

    Demo

    Azure

    Read the post on client side salsa to understand a little bit more about how we use the different APIs, Azure Functions and custom APIs we have built: https://acdc.blog/wp-admin/post.php?post=10649&action=edit

    Automatising deployment of Azure Resources: https://acdc.blog/wp-admin/post.php?post=10645&action=edit

    SharePoint and their Microsoft 365 friends

    SharePoint is where we store all the documents generated by our flows and external services. When a code of conduct is signed, it is sent to a SharePoint library and then updated with the correct metadata. SharePoint also provides easy access to the rest of the solution: the PixelStreet app, Power BI reports, and the Teachers’ Lounge. This is where we gather all the information from Dataverse about the player, their teacher, and the contact person.

    The Teachers’ Lounge is the Team where the teachers hang out and get notified by a Teams bot whenever new players arrive. In addition to the Teams bot notification, the responsible teacher gets some Planner tasks assigned to them.

    Governance

    Please read our post on ACDC Craftmanship to get more intel on what we have done for governance: https://acdc.blog/wp-admin/post.php?post=11137&action=edit

    Pipelines

    Copilot Control System framework for agents

    • Use principles to work by

    Power Platform ALM – DEV – TEST – PROD

    • Avoid using Default Environment

    TEST and PROD being Managed environments configured with security groups.

    DLP policies

    Copilot Studio Overview

    Power Platform Pipelines

    Privileged Identity Management for activating Global Admin for our users only for a period of time instead of permanently. We needed to have the Global Admin role to enable features like Managed Environments and Power Platform Pipelines.

    Purview

    Source control

    On the low-code part, as shown before, we are using Power Platform Pipelines and the out-of-the-box Git integration. It is a great balance between easy implementation, tracking changes, always having a backup available, and, of course, easing the deployment of our solutions through our environments.

    On the pro-code part, source control is of course an essential part of our solution as well (not using Git for it would be criminal!). We opted to store all the elements of our solution in a single repository, since we are just two developers who work well together, and we use branching for working on and deploying our changes.

    This is what our repository looks like!

    • CSharp: we store our main C# solutions here. That is:
      • AzureFunctionsApp: the different Azure Functions we use for integrations.
      • DataverseJobs: one-time jobs, scraping, and testing against Dataverse and endpoints.
      • EarlyBound: generation of early-bound classes to work more effectively.
    • Docs: we store all our docs here, synchronised using the wiki-as-code functionality in Azure DevOps.
    • infra: all our infrastructure (CI) code lives here, along with some of the CD part of it.
      • biceps: our Bicep code is stored here, as we use IaC for Azure.
      • scripts: we have a lot of useful scripts for easing the configuration of our VM, setting up our Minecraft server, updating our plugin…
    • plugins/PointPixelAPI: this is our Minecraft plugin. All the functionality we developed for it resides here, mainly our Minecraft shop, autocrafting system, and API endpoints.

    Continuous Integration (CI)

    The first part we consider in this process is how we deploy all our resources in Azure. Since we are continuously deploying new things, we use Bicep code to ensure every customization happens without manual intervention and, in case the unimaginable happens, we can always roll back.

    That includes AI Foundry, Azure Function Apps, Key Vault, Purview, a VM…

    We are also using Managed Identity, so we set the access between resources through Bicep code as well.

    We also have scripts for automatically installing PaperMC and configuring the virtual machine so that we can connect to it and play, and we assign the VM a static IP.


    Security and authentication

    For security and authentication in Azure we opted for RBAC access management plus automatically created Managed Identities.

    And of course, we are using Key Vault for managing our secrets throughout our whole solution:

    Continuous Deployment (CD)

    We have mainly been using scripts for deploying our plugin to our Virtual Machine. That way, a process that might seem quite tedious (building with Maven -> connecting to the VM -> transferring the plugin build -> restarting the server -> checking the server restarts successfully) happens seamlessly!

    Small comment: I am the developer and of course very biased, but this tiny little part is my absolute favourite of the project! Hehe.

    After that, we also set up a pipeline that runs that script automatically on every push merged to the main branch.
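    As an illustration of the sequence (the jar name, host and paths are invented for the sketch, not our real configuration), the deploy script boils down to something like:

    ```python
    # Hypothetical sketch of the plugin deployment steps as a Python helper.
    PLUGIN_JAR = "PointPixelAPI.jar"   # assumed artifact name
    VM_HOST = "minecraft-vm"           # assumed SSH alias for the Azure VM

    def deployment_commands(jar: str = PLUGIN_JAR, host: str = VM_HOST) -> list[str]:
        """Return the shell commands the deploy script would run, in order."""
        return [
            "mvn -q package",                                   # build the plugin with Maven
            f"scp target/{jar} {host}:/opt/papermc/plugins/",   # copy the jar to the VM
            f"ssh {host} 'sudo systemctl restart papermc'",     # restart the PaperMC server
            f"ssh {host} 'systemctl is-active papermc'",        # verify it came back up
        ]

    for cmd in deployment_commands():
        print(cmd)
    ```

    Keeping the steps in one ordered list makes it easy to run the same sequence locally and from the pipeline.
    
    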


    Testing process

    Since the two of us develop our Java plugin at the same time, we agreed on setting up some unit tests. That way, we make sure the core functionality of the app never breaks as we rapidly add new functionality!

    Documentation

    Part of good communication and the ease of working together as a team comes from having good documents that we all read, contribute to and comment on!

    Best practices

    For instance, we wrapped our custom APIs in a Custom Connector that retrieves information about a player and its inventory through two different endpoints, instead of using raw HTTP requests in Power Automate. This makes it more secure and safe, and also enables others in the PointPixels organisation to use it, keeping development and maintenance reusable and centralised.

    The Custom Connector has two actions: Inventory and Player Information.

    Using solutions… Yes, we know it's a given, but we mention it because it's important. All of our components are packaged in solutions, and these solutions are then used to update the TEST and PROD environments with changes and updates.

    The solutions are connected to Git so that we have source control through Git Connection.

    To keep our secrets secure in Power Platform, we’ve added environment variables of the Secret data type, which are linked to our Azure Key Vault.

    Code

    Autocrafting API

    Retrieve all the recipes for use in AI Foundry
    You can also find information about all the existing recipes in the game. Here is an example:

    We created an API with that information that serves as an MCP layer, which we use to show available crafting recipes to Minecraft players.

    Here is an explanation and an example of each of the endpoints:

    [GET] /recipe/{id}/{count} returns the recipe for a given item based on its id. You can also provide an amount to recalculate the ingredients needed.

    [POST] /showCurrentCraftingOptions displays all the available crafting options based on an inventory and tools input:

    — If a crafting table is available more items will be unlocked.

    [POST] /showRemainingItems/{id}/{count} returns a list of the missing ingredients to craft an item, based on the inventory and tools available.

    [POST] /simulateCraftingResult/{id}/{count} simulates the inventory state after crafting a specified item and amount, based on an inventory and tools input. The math: inventory – recipe ingredients expended + newly crafted items.
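    That inventory math can be sketched as a small function; the item ids and the recipe shape below are assumptions for illustration, not the endpoint's actual contract:

    ```python
    from collections import Counter

    def simulate_crafting_result(inventory, recipe, count, crafted_item):
        """inventory - recipe ingredients expended + newly crafted items.
        `inventory` and `recipe` map item ids to amounts; `recipe` is per single craft."""
        result = Counter(inventory)
        for item, amount in recipe.items():
            needed = amount * count
            if result[item] < needed:
                raise ValueError(f"missing {needed - result[item]}x {item}")
            result[item] -= needed          # expend the ingredients
        result[crafted_item] += count       # add the newly crafted items
        return {item: n for item, n in result.items() if n > 0}

    # e.g. 6 planks, craft one crafting table (4 planks per craft)
    print(simulate_crafting_result({"oak_planks": 6}, {"oak_planks": 4}, 1, "crafting_table"))
    # → {'oak_planks': 2, 'crafting_table': 1}
    ```
    
    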

    Microsoft Foundry: Economic Agent to predict future prices based on events

    To understand the power our AI Foundry Agent has over our Minecraft world, it helps to have a small reminder of what our solution consists of, and also what the Data Model looks like (very briefly!).

    Minecraft API -> Canvas App

    The Pro Code layer: Minecraft API
    We have a really cool Minecraft plugin built in Java that, among other things, exposes an API sharing real-time information about the players.

    As an example, here we have an endpoint for retrieving the inventory of a player:

    About our solution

    We created our own Minecraft server in which we simulate a living economy system. That sounds really cool, right?! Here is how we designed our data model to achieve that goal!

    Our data model

    The most important part to understand here is the Minecraft Item Value. We represent the ever-changing value of a Minecraft item by creating a record for every variation in the price, recording both the price and the time range in which it is valid.
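    A minimal sketch (with hypothetical field names, not our real Dataverse columns) of how such time-ranged price records resolve to a current price:

    ```python
    from datetime import datetime

    # Each Minecraft Item Value record carries a price plus its validity window.
    price_records = [
        {"item": "gold_ingot", "price": 12.50,
         "valid_from": datetime(2026, 1, 1, 10, 0), "valid_until": datetime(2026, 1, 1, 11, 0)},
        {"item": "gold_ingot", "price": 14.25,
         "valid_from": datetime(2026, 1, 1, 11, 0), "valid_until": datetime(2026, 1, 1, 12, 0)},
    ]

    def price_at(item, when, records=price_records):
        """Return the price whose validity range contains `when`, else None."""
        for r in records:
            if r["item"] == item and r["valid_from"] <= when < r["valid_until"]:
                return r["price"]
        return None

    print(price_at("gold_ingot", datetime(2026, 1, 1, 11, 30)))  # → 14.25
    ```

    Storing every variation as its own record is what makes the trend dashboards possible later: history is never overwritten, only appended.
    
    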

    Cool! Where does AI come into play?

    Right, right! We have the perfect scenario for generating simulations. And of course, we are not going to develop a custom algorithm to calculate it. We have data-driven Generative AI!

    So we created a beautiful AI Agent in Foundry, which we call from Azure Functions to generate the data and create the records in Dataverse.

    With that in place, we decided to make it fun by adding the possibility to define economy-crashing events, such as wars, economic recessions, inflation, economic crises or even Trump invading Greenland!

    So, to make that work, we added this prompt as instructions:

    You are a price simulation engine for a Minecraft economy.

    Return ONLY a JSON array of objects with exactly these fields:
    – id (string)
    – price (number with 2 decimals)
    – validFrom (ISO 8601 string)
    – validUntil (ISO 8601 string)

    Rules:
    – Generate prices for every item in “items”.
    – Use “history” as the trend signal.
    – validFrom starts at “from”.
    – validUntil = validFrom + stepMinutes.
    – Generate exactly “points” time steps per item.
    – Prices must be > 0.
    – No extra keys. No explanations. No markdown. Output JSON only.
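    As a sketch, the agent's reply could be validated against exactly these rules before anything is written to Dataverse; this function is an illustrative assumption, not part of our actual Azure Function:

    ```python
    import json

    REQUIRED_FIELDS = {"id", "price", "validFrom", "validUntil"}

    def validate_agent_output(raw: str) -> list[dict]:
        """Parse the agent's reply, enforcing the prompt rules:
        a bare JSON array, exactly four fields per object, prices > 0."""
        data = json.loads(raw)
        if not isinstance(data, list):
            raise ValueError("expected a JSON array")
        for row in data:
            if set(row) != REQUIRED_FIELDS:
                raise ValueError(f"unexpected keys: {set(row) ^ REQUIRED_FIELDS}")
            if not (isinstance(row["price"], (int, float)) and row["price"] > 0):
                raise ValueError("prices must be > 0")
        return data

    sample = ('[{"id": "gold_ingot", "price": 14.25, '
              '"validFrom": "2026-01-01T10:00:00Z", "validUntil": "2026-01-01T10:15:00Z"}]')
    print(len(validate_agent_output(sample)))  # → 1
    ```

    Strict validation like this is the safety net for "Output JSON only" style prompts: if the model drifts, the run fails loudly instead of polluting the data.
    
    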


    And, with an input as follows, we can simulate that Trump is taking over Greenland, for example:

    event: the economic event that's taking place.
    items: the list of items we want to simulate.
    history: the last 2 prices of the given items.
    from: the initial datetime from which to generate data.
    stepMinutes: the interval between every fluctuation.
    points: how many iterations we want to generate for each item.
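    To make that concrete, a hypothetical request payload with those fields might look like this (all values are invented for the example):

    ```python
    import json

    # Illustrative input for the price simulation agent, matching the
    # field descriptions above; item ids and prices are made up.
    payload = {
        "event": "Trump invades Greenland",
        "items": ["gold_ingot", "diamond"],
        "history": {"gold_ingot": [12.50, 14.25], "diamond": [80.00, 78.10]},
        "from": "2026-01-01T12:00:00Z",
        "stepMinutes": 15,
        "points": 4,
    }
    print(json.dumps(payload, indent=2))
    ```
    
    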

    That is the result!

    At the heart of the code section of our delivery is a virtual machine responsible for hosting the Minecraft server.

    Low code

    Low-Code Excellence — Minimal Code, Maximum Impact 🧱✨

    In the Arctic Cloud Developer Challenge submissions on ACDC.blog, low-code isn’t a fallback — it’s a super-power. Across multiple team builds, the focus is on using Power Platform’s drag-and-drop, builder-friendly capabilities to deliver real world value quickly, while keeping implementations maintainable, practical, and delightful.

    Here’s how low-code mastery shines through:

    1. Full functionality with zero PCF components
    Several solutions rely entirely on Power Platform building blocks — Canvas Apps, Model-Driven Apps, Power Pages, Power Automate, Dataverse, and Copilot Studio — without writing custom PCF or heavy code. These show that low-code can still be complete and compelling.

    2. Responsive interfaces built with Power Pages
    A highlight is the Minecraft Builder Interface built as a Power Pages website — fully responsive, intuitive, and template-driven. It lets users interact with game elements, build structures, and trigger events without ever seeing a line of code.

    3. Orchestration flows that feel like magic
    Power Automate flows weave business logic, condition paths, approvals, and external requests into automated pipelines that literally build structures in Minecraft. This shows how low-code logic can span from data triggers to API actions, replacing manual toil with orchestration brilliance.

    4. Copilot and Power Fx unlock embedded intelligence
    Teams embed Copilot Studio agents and use Power Fx formulas directly in topics and data-model prompts — essentially weaving AI-assisted logic into low-code constructs, not just UI/screens. This elevates low-code solutions with intelligence without scripting.

    5. Reusable templates and connectors
    Low-code is taken further with template-ready setups and custom connectors (built with paconn) so that solutions can be deployed in minutes and reused across scenarios. This is practical low-code architecture, not just quick prototypes.

    Power Fx in Agents

    Data, AI & Analytics — From raw blocks to data diamonds 💎

    In the PointTaken26-1 PixelPoint category on ACDC.blog, the team showcases creative and technically rich solutions that weave data engineering, AI-driven insights, and dynamic analytics into one cohesive ecosystem — even inside a Minecraft world.

    1. Building a data-centric architecture
    The PixelPoint solution is anchored by a Power Platform and Azure architecture that collects, orchestrates, and stores data from external systems, APIs, and gameplay interactions. At its core, Dataverse acts as the central data store for players, inventories, prices, and economy trends.

    2. Real-time and trend analytics
    Rather than static reports, data flows into real-time dashboards — sometimes literally inside the Minecraft interface itself. For example, hovering over an item in their custom shop presents trend data such as current price, hourly and 24-hour price movement, and calculated trends, all derived from Dataverse records. This is a vivid demonstration of surfacing analytics where it matters most.

    3. AI powered data generation and simulation
    The team uses Microsoft Foundry AI Agents to simulate economic data for their Minecraft economy. Instead of hand-crafting models, they rely on data-driven generative AI to produce price variations, allowing them to model economic events like recessions or inflation within the game world. This showcases AI not just as insight but as a data generator that feeds back into analytics.

    4. Diagnostics and operational insights
    For deeper operational visibility, PixelPoint leverages Azure Application Insights to monitor both Azure Functions and AI Foundry models. This not only helps with debugging but also adds another layer of analytics around backend performance and usage patterns.

    5. Agents and contextual intelligence
    Their Copilot Studio Agent taps into Foundry, external search, and structured AI prompts to bring contextual insights to end users — enabling players to query for crafting information or item trends. By transforming unstructured search output into usable data, they demonstrate AI as a bridge between raw data and decision-ready information.

    Digital Transformation

    We have developed an interactive learning platform that helps students understand trading, economics, and market dynamics in a practical and engaging way. The solution is built inside Minecraft, where students can buy and sell items in a simulated market that reacts to real-world concepts such as supply and demand, pricing, scarcity, and external events like wars or global changes.

    The platform is teacher-controlled, allowing educators to manage student activity. This ensures a safe learning environment while giving teachers full oversight of progress and outcomes.

    To create a scalable and intelligent solution, the platform is integrated with Microsoft technologies including Dataverse, APIs, Canvas Apps, Microsoft Teams, and SharePoint. Game data (transactions, prices, inventory, and student actions) is sent through APIs into Dataverse, where it is stored and processed. Canvas Apps are used for dashboards and administration, giving teachers real-time insights and control, while Teams and SharePoint are used for collaboration, communication and information storing.

    This solution helps educational institutions do more with less by automating data collection, reporting, and monitoring, reducing manual administration for teachers. It significantly improves the student experience by turning complex economic theory into hands-on learning, while giving educators clear, actionable insights through automation and centralized data.

    The platform demonstrates a real-world, intelligent digital transformation by combining automation, data integration, and immersive learning to improve both educational outcomes and operational efficiency.

    Power BI

    To help users gain insight into how the economy fluctuates, we have set up some simple and easy-to-understand dashboards in Power BI, accessible from the PixelPoint SharePoint site. Is the price of gold going up or down? This is where you find out!

    Redstone Realm — Business Logic, Built to Adventure ⚙️🟥

    Rather than focusing on a single app or interface, the teams design connected ecosystems that work across devices, modalities, and user contexts — desktop and web, keyboard and touch, dashboards and chat interfaces.

    1. Business-first solutions across the Microsoft stack
    The solutions integrate familiar tools such as SharePoint, Teams, Viva experiences, Dataverse, and Azure data services to solve concrete business problems. This approach grounds the creativity in reality — proving that the same tools used for collaboration, document management, and operations can also power imaginative, interactive experiences.

    2. Multi-modal, inclusive experiences
    Whether users interact via web apps, Teams chat, dashboards, or conversational agents, the solutions are designed to meet people where they are. Accessibility and usability are treated as first-class concerns, ensuring smooth experiences regardless of device, input method, or working style.

    3. AI as an infused capability, not a bolt-on
    AI is woven directly into the fabric of the solutions. Using Copilot Studio, agents, LLM-powered reasoning, and intelligent prompts, teams create systems that help users mine insights faster, make better decisions, and automate complex reasoning — the equivalent of forging an AI-infused pickaxe rather than placing individual blocks.

    4. Redstone-style orchestration and automation
    Behind the scenes, logic flows, agents, and integrations act like redstone circuits and command blocks — triggering actions, reacting to events, and coordinating services across the platform. These intelligent automations turn static data into responsive systems that adapt to user input and context.

    5. Trust by design: privacy, governance, and experience
    Despite the creativity and speed, the solutions remain rooted in enterprise principles. Data privacy, security, and governance are respected throughout, ensuring that innovation doesn’t come at the cost of trust. The result is solutions that are adventure-worthy and production-ready.


    Redstone Realm Takeaway

    The Redstone Realm category highlights solutions that balance imagination with impact. By combining Microsoft 365, Dynamics 365, Azure, and AI-powered agents, the teams show how modern business platforms can be assembled like redstone machines — modular, responsive, and endlessly extensible.

    Go with the flow

    Go With the Flow: From Dataverse Status Change to a Minecraft Build 🧱

    This Power Automate flow is a full end-to-end automation that turns a simple status change in Dataverse into an actual construction inside Minecraft. Yes — low-code meets blocks.

    Let’s walk through what happens.


    🔁 Trigger: Building Status Changes to Active

    The flow starts when the Building Status column in a Dataverse table changes to Active.
    This status signals that a building is ready to move from idea to planning.


    ✅ First Approval: Planning Phase

    As soon as the status becomes Active, the flow kicks off an approval to decide whether the building can move into the planning stage.

    • If approved:
      • The Dataverse record is updated to Planning
      • The flow continues automatically

    If it’s rejected, the process stops right there — no rogue buildings allowed.


    🏗️ Second Approval: Ready to Build

    Once the building is officially in Planning, the flow starts a second approval, this time asking for permission to actually build it.

    • If approved:
      • The building status is updated to Building
      • The real fun begins

    📐 Fetching Building Instructions

    Now that construction is approved, the flow retrieves all building instructions from a separate Dataverse table that contains:

    • Building layers
    • Coordinate values
    • Material information

    Each row represents a layer or block placement instruction for the Minecraft structure.


    🔢 Coordinate Conversion

    Before sending anything to Minecraft, the flow:

    • Converts the stored coordinate values
    • Applies offsets and transformations defined in the table
    • Prepares the exact X, Y, Z values needed by the Minecraft API

    This allows the same building instructions to be reused and placed dynamically.
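    The conversion step can be sketched like this; the block shape, the `origin` parameter and the `y_offset` name are illustrative assumptions, not the flow's exact expressions:

    ```python
    def to_world_coords(layer_blocks, origin, y_offset=0):
        """Apply a placement origin as an offset to stored block coordinates,
        so the same building instructions can be placed anywhere in the world."""
        ox, oy, oz = origin
        return [
            {"x": b["x"] + ox, "y": b["y"] + oy + y_offset, "z": b["z"] + oz,
             "material": b["material"]}
            for b in layer_blocks
        ]

    # Two blocks of a layer stored relative to (0, 0, 0) in Dataverse,
    # placed at an arbitrary world origin:
    blocks = [{"x": 0, "y": 0, "z": 0, "material": "stone"},
              {"x": 1, "y": 0, "z": 0, "material": "oak_planks"}]
    print(to_world_coords(blocks, origin=(100, 64, -20)))
    ```
    
    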

    Dataverse coordinates


    🌍 Building in Minecraft via HTTP PATCH

    With coordinates and materials ready, the flow sends an HTTP PATCH request to a Minecraft API endpoint.

    This request includes:

    • Exact block coordinates
    • Material type (stone, wood, etc.)
    • Placement instructions

    Minecraft receives the request — and the structure is built automatically, block by block.

    No manual placement. No creative mode chaos. Just pure automation.
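    A rough sketch of what such a PATCH call could look like; the URL path and body fields here are assumptions for illustration, not the real Minecraft API contract:

    ```python
    import json

    def build_patch_request(base_url, blocks):
        """Assemble a hypothetical PATCH request that asks the Minecraft API
        to place a list of blocks (coordinates plus material each)."""
        return {
            "method": "PATCH",
            "url": f"{base_url}/world/blocks",                 # assumed endpoint path
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"blocks": blocks}),
        }

    req = build_patch_request("http://minecraft-vm:8080",
                              [{"x": 100, "y": 64, "z": -20, "material": "stone"}])
    print(req["method"], req["url"])  # → PATCH http://minecraft-vm:8080/world/blocks
    ```

    Separating request construction from sending makes the flow's payload easy to inspect and unit-test before any blocks are placed.
    
    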


    🟢 Final Step: Update Status

    Once the API confirms a successful build:

    • The relevant Dataverse rows are updated
    • The building status reflects that construction is complete (or moved to the next logical state)

    This keeps Dataverse perfectly in sync with what actually exists in the Minecraft world.


    Why This Is Cool (and Slightly Unhinged)

    • Uses Dataverse as a source of truth
    • Chains multiple approvals into a single flow
    • Converts structured data into real-world (or real-game) actions
    • Proves that Power Automate can, in fact, build houses in Minecraft

    This is the flow structure

    This is what we managed to build from the flow