CCCF (CrayCon Creepers Central Farmers) is an end-to-end automated Minecraft resource marketplace. Customers order via Power Pages portal, sign contracts via OneFlow, get SMS confirmation via Link Mobility, and AI bots fulfill orders automatically. All data flows through Dataverse.
Redstone Realm
Full Power Pages portal at https://cccpfactoryportal.powerappsportals.com/ with customer storefront, order management, and admin dashboards. 13 web pages, 18 templates. Dataverse tables for orders, orderlines, resources, harvesters, and harvest summaries. Web API enabled for orders. Azure AD, LinkedIn, Twitter, Facebook authentication. Complete business flow from browse to purchase to contract to fulfillment to tracking.
Governance and Best Practices
Governance wasn’t an afterthought in CrayCon Creepers Central Farming. It’s how we kept the whole “always harvesting, always ordering” idea from turning into a fragile demo. We built with a clear environment strategy (Dev → Test → Prod), so experiments stay contained, validation happens before anything reaches real users, and production remains stable. On top of that, we treated the portal like an actual production surface: anonymous visitors only see public-friendly content, while operational views (admin dashboards, harvester assignments, facility metrics) are protected with authentication and granular table permissions aligned to real roles and responsibilities. (https://acdc.blog/crayon26/security-and-governance-building-systems-that-wont-collapse-under-pressu…)
On the security and data side, we focused on preventing accidental “oops”-moments rather than relying on perfect maker behavior. DLP policies are used to control connector usage and stop risky data flows by default, so operational data doesn’t quietly leak into services that don’t belong in the same trust boundary. That also supports compliance and privacy principles: least privilege, separation of environments, and clear control points for where data can move. The result is a portal and platform setup that’s designed to be resilient under pressure, not just functional when everything goes right. (https://acdc.blog/crayon26/security-and-governance-building-systems-that-wont-collapse-under-pressu…)
We also leaned heavily into traceability and accountability. Every meaningful change is designed to be captured through source control and reviewed before promotion, giving us a clean audit trail of what changed, why, and who approved it. Our ALM flow uses GitHub Actions + PR review as a governance gate, with deployment status and context pushed back into Teams/Dataverse so the team gets real-time visibility (and the ability to stop unsafe changes early). We even use AI in a “governed” way, not to make decisions for users, but to improve transparency by generating documentation/changelogs alongside the code so the system stays explainable and maintainable as it grows. (https://acdc.blog/crayon26/craycon-creepers-automating-solutions-alm-with-github-actions-and-ai/112…)
In our CCCP Factory Portal, we also treated Power Pages security as defense in depth. Public visitors can view basic, non-sensitive resource information, but anything operational like the admin dashboard, harvester assignments, production metrics, and alerts is locked behind authentication and controlled with security roles and granular table permissions. That way we can give the right people exactly the access they need (for example read-only for supervisors, scoped access per team) without handing out broad admin rights, and we avoid exposing data that directly affects real operational decisions. (https://acdc.blog/crayon26/security-and-governance-building-systems-that-wont-collapse-under-pressu…)
Finally, we installed the Power Platform CoE Starter Kit to reinforce governance visibility. It took some wrestling (developer/default environment realities are… spicy), but we were able to get some data out of it, giving us the beginnings of the monitoring baseline we want for long-term operations and responsible scaling. That’s the theme across the whole solution: fun concept, serious foundations, secure defaults, clear ownership, and trustworthy building blocks.
Login throttling (5 attempts per 5 min, 15 min lockout). X-Frame-Options SAMEORIGIN. SameSite cookie policy. Role-based table permissions separate customer, admin, and anonymous access. API keys in environment variables, not code. All orders tracked in Dataverse with timestamps. Bot harvest events logged with bot ID, resource type, quantity. INSECURE_CODING disabled by default.
Data, AI and Analytics
Dataverse as data backbone: ccc_orders, ccc_orderlines, ccc_resources, ccc_harvesters, ccc_harvester_summary tables. Bots POST harvest telemetry every 10 items to Power Automate webhook, which writes to Dataverse. Portal dashboards query live data. 17 LLM providers integrated: OpenAI, Claude, Gemini, Groq, Mistral, DeepSeek, Replicate, HuggingFace, Cerebras, vLLM, Grok, Mercury, Azure OpenAI, OpenRouter, Qwen, LLaMA. Switch providers via JSON config.
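The "report every 10 items" pattern above can be sketched as a small buffered reporter. The webhook URL and payload field names here are placeholders, not the real flow's schema; the actual bots are written in JavaScript against a Power Automate HTTP trigger.

```python
import json

WEBHOOK_URL = "https://example.com/powerautomate/harvest"  # placeholder flow URL
BATCH_SIZE = 10  # bots report once per 10 harvested items

class HarvestReporter:
    """Buffers harvest events and posts a telemetry batch every BATCH_SIZE items."""
    def __init__(self, bot_id: str, post=None):
        self.bot_id = bot_id
        self.count = 0
        self.post = post or self._default_post

    def _default_post(self, payload: dict) -> None:
        import urllib.request
        req = urllib.request.Request(
            WEBHOOK_URL, data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req)

    def harvested(self, resource_type: str) -> None:
        self.count += 1
        if self.count % BATCH_SIZE == 0:
            # Fields mirror the logged attributes (bot ID, resource type,
            # quantity); the real flow's schema may differ.
            self.post({"botId": self.bot_id,
                       "resourceType": resource_type,
                       "quantity": BATCH_SIZE})
```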
Low-Code
Power Pages portal built entirely in maker tools. Liquid templates, content snippets, site settings. Power Automate flows handle OneFlow contract generation, Link Mobility SMS, Dataverse operations. No custom .NET code. 80% business logic in Power Platform. Pro code only for Minecraft protocol and AI orchestration.
Code Connoisseur
remote_control_bot.js: WebSocket bot control with Prismarine Viewer integration. 9 job types: farmer-wheat, farmer-potatoes, farmer-beets, farmer-carrots, sparse-farmer, brigadier, guard, scout, wanderer. Mindcraft AI framework with 17 LLM providers, profile-based config, 100+ skill actions. React/TypeScript website with live ACDC badge dashboard. Cloudflare Workers deployment. TUI CLI to control bots.
Digital Transformation
Customer self-service 24/7. Instant order creation triggers automated contract and SMS. AI bots harvest continuously. Real-time production data in portal. Full audit trail. No manual data entry, no phone calls for status, no paper contracts.
Uses Microsoft 365 & Teams as the primary collaboration and experience hub.
Combines Microsoft Fabric, Dataverse, the Power platform, Azure Functions and more into a unified workflow.
Integrates AI agents in Fabric and Copilot Studio for data interpretation of geoJSON and creating resource insights.
Focuses on accessibility, usability, and a smooth user experience using an adaptive Power App and Teams’ default accessibility properties.
Provides real‑time notifications through a Teams Bot, thereby improving responsiveness and collaboration.
Provides automatic storing of opportunities using Dynamics.
Governance & Best Practices
Uses official, publicly available Norwegian geodata ensuring transparency and compliant sourcing.
Keeps data in Microsoft Fabric + Dataverse, benefiting from enterprise‑grade security, governance and access control.
AI agents are explicitly instructed to handle inconsistent formats, reducing the risk of misinterpretation.
The Copilot Studio agent uses traceable and explainable knowledge sources (NGU, Geonorge).
Workflow ensures AI recommendations feed into human‑in‑the‑loop processes by focusing on consultancy requests and advisor review.
Interactions (requests, opportunities, notifications) are logged and stored for auditability.
Data, AI & Analytics
Ingests large‑scale geodata from a 430k‑line XML + multiple CSV metadata files.
Built a Fabric Lakehouse to store, structure and refine all raw geographic information.
Created pipelines + dataflows to process and transform data into a clean SQL analytics table.
Set up Dataverse virtual tables for live synchronization of processed geodata to Power Platform.
Developed a data‑aware AI Agent inside Fabric for interpreting geoJSON and applying complex geometry logic.
Copilot Studio agents take advantage of external knowledge sources, use the Fabric Data agent as a supporting agent, and trigger automated processes in Power Automate.
Final output enables AI-driven insights, area‑based resource predictions and intelligent automation.
Low‑Code
End‑user interaction handled via a Power App with native map control for shape and area selection.
Copilot Studio agent handles reasoning about geodata, resource discovery and external API querying without custom code.
Uses Dataverse virtual tables to create a low‑code bridge between Fabric and Power Platform.
A Power BI dashboard helps decide whether a request is a suitable opportunity in Dynamics.
User journeys (select area → request consultancy → receive SMS → create CRM opportunity) are built mostly with low‑code components.
Code Connoisseur
Provisioned an Azure Function via Bicep, demonstrating infrastructure‑as‑code mastery.
Built custom functions to:
Convert Markdown → HTML for clean rendering inside Power Apps.
Encode inputs to Base64 when needed.
Developed a Teams Bot using Teams SDK 2.0, integrating:
Microsoft Graph
Dataverse CRM
Azure Maps
Azure Tables for proactive notification logic
Implemented autonomous alert logic tied to an external speaker “sales‑bell” system via custom code.
Demonstrates how pro‑code extends low‑code elegantly and purposefully.
Digital Transformation
Converts complex Norwegian geological data into simple, actionable insights for businesses.
Reduces risk and cost in early‑stage assessments (site evaluation, planning, resource estimation).
Automation:
Data ingestion
Data transformation
Resource detection
Notification
CRM opportunity creation
Improves customer & employee experience with:
A simple map-based interface
Automated consultancy booking
SMS confirmation
Proactive Teams alerts
Demonstrates measurable real‑world impact by accelerating project startup and democratizing access to geodata.
Expanding on the solution
Our initial idea
We wanted to bring the simplicity of the Minecraft way of discovering and gathering natural resources into the real world by helping organizations collect, process, and act on geological and geographical data. By gathering publicly available geodata from official Norwegian sources, bringing it into a data lake in Microsoft Fabric, and synchronizing it with Dataverse, we wanted to end up with a solution that enables companies in real estate, agriculture, mining, infrastructure, and related sectors to reduce risk, lower costs, improve sustainability planning, and accelerate project startup.
By using the Power Platform with Power BI for insights, Power Automate for workflows, Power Apps for interaction, and Copilot Studio for AI assistance, we could provide future customers with accessible, actionable resource knowledge delivered directly through Teams as a unified collaboration and interface hub. Ultimately, leveraging Microsoft’s cloud and low‑code ecosystem to make geodata more usable, intuitive, and strategically valuable.
Using Fabric and Data agent
Norway’s open geodata portal, Geonorge, provides extensive natural‑resource information, and for our solution we decided to use the “N50 Kartdata” dataset from Kartverket as the primary data source. By downloading an XML file with over 430,000 lines of geodata, plus CSV files with supporting metadata, we could populate a Fabric Lakehouse with all this data. Using pipelines and dataflows, we then processed the data into a single SQL analytics table for easier querying. A virtual table in Dataverse was created to automatically synchronize the data, making it easily available to the Power Platform ecosystem and enabling efficient, accurate retrieval of resource information whenever needed.
Within the same Fabric workspace, an AI-driven data agent was created and carefully instructed to ensure seamless use of the dataset and handling of geoJSON data that didn’t always follow standard formatting. This proved very useful: the standard map control in Power Apps lets us select a shape on a map, but the JSON returned from the control was not always valid geoJSON, as it could return an array with the geoJSON as a property on the elements within. The agent also had to be instructed how to handle circle shapes, since a circle is defined in geoJSON as a single point with a radius property, and the agent kept misinterpreting the radius as a real-world distance measured in meters.
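The normalization the agent was instructed to perform could be sketched like this in Python. The property name `geoJson` and the `shape` tag are assumptions for illustration; the real handling lives in the agent's instructions, not in code.

```python
def extract_geojson(map_output):
    """Normalize what the Power Apps map control returns into plain GeoJSON shapes.

    The control sometimes returns an array whose elements carry the GeoJSON
    under a property (the key name 'geoJson' here is an assumption); circles
    arrive as a Point plus a radius property whose units need explicit handling.
    """
    if isinstance(map_output, list):
        shapes = [item.get("geoJson", item) for item in map_output]
    else:
        shapes = [map_output]
    for shape in shapes:
        if shape.get("type") == "Point" and "radius" in shape:
            # Tag circles so downstream logic interprets the radius consistently.
            shape["shape"] = "circle"
    return shapes
```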
Low-code access using the Power Platform
We have created a Power App that allows our customers to use the default Power Apps map control to select an area on a map and submit requests for consultancy assistance on possible resources. When a request is submitted, a Power Automate flow is triggered that sends the area coordinates to an agent in Copilot Studio, which uses these coordinates to determine what resources exist there. This agent is configured with knowledge sources from the website of Norway’s national geological survey (https://www.ngu.no/) and the APIs of Geonorge (https://www.geonorge.no/verktoy/APIer-og-grensesnitt/) as general sources, supplemented by our Fabric Data agent as a supportive agent-in-agent for detailed data. Users can book a consultancy with TNT using the “Book Consultancy” button in the same app, which generates an opportunity for our advisors to work on. Additionally, the button triggers a Power Automate flow that uses the Link Mobility connector to notify the user by SMS, confirming their request for consultancy.
Combining with Pro-code functionality
We have provisioned an Azure Function using a Bicep template to help with certain workflows that are typically difficult to manage in the Power Platform alone. For example, the Copilot agent returns answers in Markdown, which is not particularly suitable for our end users to view in the Power App, so a script in the Azure Function converts it to HTML, which is much more readable within a rich-text field. We also have a function that can receive inputs from a request and convert them to Base64 format.
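The two helpers could look roughly like this. The Markdown converter below is a deliberately tiny sketch (bold, headings, and line breaks only); the real Azure Function may use a full Markdown library.

```python
import base64
import re

def markdown_to_html(md: str) -> str:
    """Minimal Markdown-to-HTML conversion for display in a rich-text field."""
    html = re.sub(r"\*\*(.+?)\*\*", r"<b>\1</b>", md)              # bold
    html = re.sub(r"^### (.+)$", r"<h3>\1</h3>", html, flags=re.M)  # h3 headings
    return html.replace("\n", "<br>")                               # line breaks

def to_base64(text: str) -> str:
    """Encode a request input as Base64, as the second helper function does."""
    return base64.b64encode(text.encode("utf-8")).decode("ascii")
```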
Transforming the world using autonomous agents & proactive Teams Bot
In addition, we have an autonomous geological report agent that will occasionally discover noteworthy information that our advisors should be alerted about. This works on the same data as previously mentioned, in addition to other open sources with valuable knowledge about geology.
A Teams Bot, built with Teams SDK 2.0, integrates with Microsoft Graph, Dataverse (CRM), Azure Maps and more – and even makes some noise on an external speaker if there is a new report generated. We have removed the volume knob, so it does not help to mute the laptop and set the phone in airplane mode. They will have to act! It’s like a sales bell – only cooler!
The notifications are sent proactively by keeping track of the reports in an Azure Table. Notifications are delivered to all users – but only once. They can then choose to investigate further, ignore it, or create an opportunity directly by hitting a button!
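The deliver-only-once logic above can be reduced to a small tracker; here an in-memory set stands in for the Azure Table that actually stores which reports were already announced.

```python
class NotificationTracker:
    """Tracks which report IDs have already been announced, so each report
    is delivered to users exactly once (an in-memory stand-in for the
    Azure Table used by the Teams Bot)."""
    def __init__(self):
        self._sent: set[str] = set()

    def should_notify(self, report_id: str) -> bool:
        if report_id in self._sent:
            return False          # already delivered, stay quiet
        self._sent.add(report_id)  # mark as delivered before sending
        return True
```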
As consultants and developers tired of tedious and repetitive tasks, we wanted to create a toolbox for effective project management. Using AI and automation integrated in the tools we use daily, we hope to get more time to dive into the actual interesting parts of our jobs – getting back into the flow and explore the Minecraft Microsoft world.
Our Agile Toolboxxx is divided into modules:
👩🏻💻 The Project Management App
We have created a model-driven app for project management, where a project manager creates new projects with the necessary details and, with a few clicks, creates a Teams channel for collaboration and a DevOps project with iterations and team members, sends out Outlook invites to the recurring sprint meetings for all team members, and creates contracts for the projects and sends them out for signing.
All integrations from the app to other systems (Teams, Outlook, DevOps, OneFlow) are handled using low-code Power Automate flows, which truly brings out the best of the Microsoft platform. This not only significantly reduces the time spent on administrative tasks; more importantly, it ensures that every critical part of a project is created automatically and consistently, without relying on manual steps or individual discipline.
By standardizing how projects are set up, the app guarantees that structures, data, and relationships are always in place and created in the same way. This consistency makes project execution predictable for everyone involved – team members always know what to expect, where to find information, and how work is organized. At the same time, it enables effective automation and AI: workflows can be automated end-to-end, data can be trusted, and AI agents can operate on complete, well-structured project information to deliver real value.
👾 The Technical Debt Game
The never-ending discussion in all development projects: devs wanting the correct solution, project managers wanting the quick solution, and the board wanting the cheap solution. Our Technical Debt Game is created for educational purposes, to make everyone see the importance of balancing cost, time, and technical debt.
In a single-purpose Canvas App, delivery decisions and technical debt accumulation are simulated across a 10-round project. The game is fully driven by Power Fx logic in the Canvas App, with Dataverse as the backend for game state, choices, events, and leaderboard results.
Players choose between fast, balanced, or robust delivery options, trading time and budget against growing technical debt, with the goal of completing the delivery with the lowest possible debt. Random events triggered by accumulated technical debt introduce realistic delivery disruptions, reinforcing cause-and-effect thinking.
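The round mechanics above can be sketched in a few lines. The numbers below are illustrative only; the real balancing lives in the app's Power Fx formulas, with Dataverse holding the state.

```python
import random

# Illustrative costs; the real values live in the Canvas App's Power Fx logic.
OPTIONS = {
    "fast":     {"time": 1, "budget": 1, "debt": 3},
    "balanced": {"time": 2, "budget": 2, "debt": 1},
    "robust":   {"time": 3, "budget": 3, "debt": 0},
}

def play_round(state: dict, choice: str, rng: random.Random) -> dict:
    """Apply one delivery choice; higher accumulated debt raises the chance
    of a disruptive random event that costs extra time."""
    opt = OPTIONS[choice]
    state = {**state,
             "time": state["time"] - opt["time"],
             "budget": state["budget"] - opt["budget"],
             "debt": state["debt"] + opt["debt"]}
    event_chance = min(0.9, state["debt"] * 0.05)  # debt drives event probability
    if rng.random() < event_chance:
        state["time"] -= 2  # disruption: rework eats schedule
    return state
```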
The solution is intentionally simple, workshop-friendly and discussion-driven. Designed to spark reflection on real delivery trade-offs, or just to compete against your colleagues in long, boring corporate meetings 🥸.
(A hidden easter egg rewards curiosity, because learning should still be fun.)
👨🏻🔧 AI-assisted Pull Request Reviews
In many projects, best practices are well documented but rarely used consistently. Code reviews are manual, time-consuming, and depend heavily on who happens to review the change. Issues are often discovered late, even though the rules were written down from the start.
We chose to move those best practices into the pull request workflow, where changes are already reviewed. The rules live in the repository as a simple best-practices.md file and act as the single source of truth.
An Azure DevOps pipeline exports the Power Platform solution, commits the changes, and creates a pull request automatically.
When the PR is created, Power Automate is triggered. It reads the PR context, fetches the changed files and the best-practices document, and sends this information to Microsoft AI Foundry for review.
The AI posts structured feedback directly on the pull request, grouped by severity. The review supports developers rather than blocking them. If anything is marked as CRITICAL or BLOCKER, a Bug is created automatically in Azure DevOps.
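The triage step described above, grouping findings by severity and deciding whether a Bug should be raised, could be sketched like this. The finding shape is an assumption for illustration; the actual orchestration happens in Power Automate.

```python
CRITICAL_SEVERITIES = {"CRITICAL", "BLOCKER"}  # findings at these levels raise a Bug

def triage_review(findings: list[dict]) -> dict:
    """Group AI review findings by severity and flag whether a Bug work item
    should be created in Azure DevOps."""
    grouped: dict[str, list[str]] = {}
    for f in findings:
        grouped.setdefault(f["severity"], []).append(f["message"])
    needs_bug = any(sev in CRITICAL_SEVERITIES for sev in grouped)
    return {"grouped": grouped, "create_bug": needs_bug}
```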
Secrets are handled securely through Azure Key Vault.
The design is intentionally simple and extensible. While the demo uses one AI reviewer, the same pattern supports multiple specialized reviewers/agents, such as security, best practices, or user impact. Power Automate orchestrates the process, and AI provides consistent, early feedback where it matters most.
🤖 CodeCraft AI
After finishing the initial processes of a project, we go into a critical part. The part where we attend to technical debt, users are starting to take ownership and developers tend to wander off to new projects.
Our app CodeCraft AI is here to help in this situation. If a user has a question about the functionality in the solution, these can be asked directly in the Teams Copilot chat, instead of having to trouble one of the developers. Work items and bugs can be created directly from the chat, no need to spend time writing user stories in DevOps. And if a critical bug is being reported, the responsible developer will be notified by SMS immediately.
During handover to new technical consultants, this becomes especially valuable. Instead of reading through pages of documentation that may or may not cover what they actually need, consultants can get direct answers to their questions when they need them.
A Power BI dashboard gives the responsible project manager an overview of all work items of all projects, so that projects that have been going on for a while and might not be top priority will not go under the radar.
The Medallion Architecture: Bronze → Silver → Gold
Bronze (Raw Blocks): Captures everything as-is from Azure DevOps, Event Hub, APIs. Raw JSON, unprocessed, complete.
Silver (Cleaned & Validated): Data quality checks, standardization, validation. Cleaned Parquet files, structured, and reliable.
Gold (Data Diamonds): Star schema with dimension tables (projects, repositories, teams, users, iterations) and fact tables (work items, commits, pull requests, branches). Daily aggregations pre-compute metrics. Analytics-ready.
The Journey: Raw events → Cleaned data → Star schema → Daily summaries → Instant insights.
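As a toy walk through the Silver and Gold steps, here is the same idea in plain Python. The real pipeline uses PySpark over Parquet in Fabric, and the field names here are illustrative.

```python
def to_silver(bronze_events: list[dict]) -> list[dict]:
    """Silver: validate and standardize raw work-item events."""
    silver = []
    for e in bronze_events:
        if not e.get("id") or not e.get("state"):
            continue  # quality check: drop incomplete records
        silver.append({"id": str(e["id"]),
                       "state": e["state"].strip().lower(),
                       "project": e.get("project", "unknown")})
    return silver

def to_gold(silver: list[dict]) -> dict:
    """Gold: pre-compute a daily aggregation (work items per project/state)."""
    facts: dict[tuple, int] = {}
    for row in silver:
        key = (row["project"], row["state"])
        facts[key] = facts.get(key, 0) + 1
    return facts
```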
Power BI Visualizations: Insights That Tell Stories
KPI cards give a quick view of team pulse. Interactive dashboards reveal productivity patterns, track trends, and enable instant analysis by team, project, and iteration. Data becomes visual, trends become obvious, and questions get answered instantly.
Fabric DataAgents: Strong Foundations for AI Agents
Semantic models (AgileToolboxxModel) enable natural language queries. SQL endpoints provide direct access to the gold layer. MCP servers connect agents to Fabric semantic models. Agents discover data sources dynamically and query structured analytics intelligently.
Real-Time Streaming: Data That Never Sleeps
Azure DevOps Service Hooks → Event Hub → Fabric Streaming → Eventhouse (KQL). Events are captured the moment they happen. KQL queries analyze streaming data, with events processed in under 5 seconds. From action to insight—instantly.
This data-first approach demonstrates how Fabric transforms raw events into actionable insights, from medallion architecture to Power BI visualizations, from Fabric DataAgents to real-time streaming. Every layer refines; every transformation reveals gems. That’s not just data processing; that’s data diamond mining.
Low-Code
🎮 Canvas Apps
The Technical Debt Game is a small, focused Canvas App with a single goal: make delivery trade-offs visible and discussable.
All game mechanics (rounds, resource calculations, probability-based events, win/lose logic, leaderboard and easter eggs) are implemented directly in Power Fx, without custom code.
Dataverse is used as the backend for scenarios, choices, events and scores, keeping the app simple, transparent and easy to extend.
⚙️ Power Fx as the engine
Complex decision logic, event probability tied to accumulated technical debt, and dynamic UI feedback are all handled using Power Fx formulas.
Clear use of Patch, conditional logic and calculated state shows deliberate design choices rather than default patterns.
The app favors “good and understandable” over over-engineering, supporting discussion rather than hiding logic behind abstractions.
🔁 Power Automate where automation adds value
Advanced Power Automate flows automate real delivery work: project setup, Teams and DevOps provisioning, sprint creation, holiday-aware iteration planning, contract creation and signing, and critical bug notifications.
Flows orchestrate across Dataverse, Teams, Azure DevOps, external APIs and third-party services without introducing new platforms or custom services.
🧩 Extending the platform, not rebuilding it
Existing Microsoft tools (Dynamics 365, Teams, Azure DevOps, Power BI) are extended and connected instead of replaced.
Legacy investments are respected and enhanced through low-code integration rather than rewritten.
✨ Low Code Philosophy in Practice
Small apps with clear purpose.
No PCF, no custom backends, no unnecessary perfection.
Bold UI choices balanced by professional framing.
This solution demonstrates how low code can be used for more than automating forms: to model behavior, teach complex concepts, and remove friction from real delivery work, using the Power Platform as it was intended.
Code Connoisseur
The Code Stack: Everything is Code
Infrastructure as Code: Bicep templates + PowerShell scripts deploy everything.
Applications as Code: TypeScript + React + Vite create type-safe, high-performance frontends.
AI Agents as Code: MCP servers + version-controlled prompts enable collaborative AI.
Code Crawlers as Code: Python parsers extract relationships and build knowledge graphs.
API Orchestration as Code: Python async/await coordinates six Microsoft Cloud APIs.
Data Pipelines as Code: PySpark transforms Bronze → Silver → Gold.
This “code-first” approach shows how code can solve everything—from infrastructure to AI agents, from web apps to search indexes. Every part is version-controlled, reproducible, and elegant. This isn’t just development. This is code connoisseurship.
Each language was chosen for a specific purpose: Bicep for version-controlled infrastructure, TypeScript for compile-time safety, Python for asynchronous orchestration, PySpark for distributed processing, and KQL for time-series analysis.
Why code was needed to solve our problems:
Manual Deployments → Bicep + PowerShell automate everything. Code makes deployments reproducible, reviewable, and fast. Result: Zero manual steps, consistent deployments.
Inconsistent Environments → Parameterized templates ensure consistency. Code eliminates configuration drift. Result: One script deploys to multiple tenants.
Slow Frontend Performance → TypeScript + Vite + React optimizations. Code enables optimization at compile time. Result: <1 second load times, 60fps interactions.
Disconnected APIs → Python orchestration connects six APIs. Code enables seamless integration. Result: Six APIs orchestrated seamlessly.
Raw Data, No Insights → PySpark transforms data to gold. Code enables automated transformation. Result: Analytics-ready data, instant insights.
Delayed Data → Event-driven streaming enables real-time. Code enables real-time processing. Result: Real-time data, <5 second latency.
Digital transformation
👩💼 Project Managers
Automated project setup (Teams, DevOps, iterations, meetings) reduces admin work at project start.
Built-in governance ensures projects start correctly every time.
Less coordination overhead, more focus on delivery.
👨💻 Consultants & Developers
Faster feedback through AI-assisted pull request reviews directly in Azure DevOps.
Reduced technical debt through training, gamification, and continuous guidance during the project.
AI knowledge chat helps answer technical questions about existing solutions and implementations.
Quality issues are detected earlier, before they reach test or production.
🧑🤝🧑 Customers & End Users
Better transparency into project progress via Power BI reporting across projects.
Faster access to documentation and help through AI-powered user chat.
Customer questions and gaps automatically translate into structured user stories or bugs.
🏢 Leadership & Stakeholders
Cross-project insights through standardized Power BI reporting
Improved predictability, quality, and traceability across delivery.
Scales across teams and projects without adding new tools or processes.
🌍 Overall Digital Transformation Impact
Works in the real world by enhancing existing workflows, not replacing them.
Automates low-value work so people can focus on high-value outcomes.
Improves both employee and customer experience using intelligent automation.
Governance & Best Practices
The Philosophy: Trustworthy AI Integration
Responsible AI governance isn’t optional – it’s foundational. We address ethics, transparency, data privacy, security, fairness, regulatory compliance, and risk management in every component. AI integrated thoughtfully into real-world use cases, designed to be accountable and trustworthy.
Ethics & Safety: AI with Conscience
Safety evaluation ensures AI refuses harmful requests. Adversarial testing probes ethical boundaries. AI recognizes right from wrong.
Transparency: Visible Reasoning
Thought process visibility shows how AI reasons. Citations reveal data sources. Users understand AI decision-making. Explainable AI builds trust.
Data Privacy & Security: Multi-Layer Protection
Multi-Tenant Isolation: Search index filtering by customer, project, repository. Access control enforces document-level permissions. Customer A never sees Customer B’s data.
Azure Security: Managed Identity provides unified authentication. Key Vault stores all secrets securely. Zero credentials in code, zero traces left behind.
User Access Control: Not everyone has access to everything. User-based and group-based permissions enforced. Defense-in-depth architecture.
Fairness & Compliance
Dual assistant modes adapt to user type. Data governance through medallion architecture. Audit trails enable compliance. Risk management integrated into every layer.
The Bottom Line
We didn’t just build AI. We built responsible AI. Every component addresses governance – ethics, transparency, privacy, security, fairness, compliance, risk management. Responsible AI governance is built into every layer. That’s accountable, trustworthy AI design.
This governance-first approach demonstrates how responsible AI is integrated into real-world use cases, from safety evaluation to access control, from transparency to fairness. Every safeguard is intentional, every protection is built-in. That’s not just AI development -that’s responsible AI governance.
Redstone Realm
Built entirely on the Microsoft 365 & Dynamics 365 stack (Power Platform, DevOps, Teams, Power BI, Azure AI Foundry).
Low cost by reusing existing tools and licenses — no new platforms, no heavy custom code.
Easy to implement with modular, low-code solutions and standard APIs.
Quick business value by automating small, repetitive daily tasks that add up over time.
AI used where it matters: faster insights, better reviews, smarter knowledge access.
Secure and responsible: secrets in Key Vault, AI is advisory and transparent.
Improves employee and customer experience without changing how people already work.
Quick rewards and high business value: is it too good to be true? 👉🏻 No. The secret is keeping it simple and improving the small, repetitive tasks we perform every day.
Time saved isn’t lost revenue, it’s time reinvested in real value creation.
Our solution is built on a physical Minecraft idea. Each player goes around the area scanning QR codes. Each QR scan represents a user mining some resources – but players should be careful: there may be resources they don’t want. We used both Low-Code and Pro-Code in our solution.
Beyond the gameplay, the application shows how real-time data can be captured and used through simple physical interactions combined with digital systems. By turning data collection into a game, we make participation more engaging while still maintaining full visibility and control of what is happening in real time. The same approach can be reused in real-world scenarios such as asset tracking, live monitoring, or event-based systems, which makes the solution relevant beyond the game and easy to relate to practical use cases.
Category: Redstone Realm
Our solution is a real business application disguised as a game. Built on a physical, Minecraft-inspired concept, it uses QR codes and real-world movement to generate live operational data, similar to how IoT sensors, assets, or events work in real scenarios. Players interact through simple and accessible interfaces on mobile and web, while administrators manage and monitor everything through structured back-office tools.
By turning data collection into gameplay, the solution makes participation engaging while still maintaining full control and visibility of what is happening in real time. This approach demonstrates how physical interaction and digital systems can work together to create meaningful data, support decision-making, and mirror real operational challenges in a way that is easy to understand and fun to experience.
Category: Governance & Best Practices
In our development we aimed to follow best practices to keep our solution clean and structured.
Pipelines in Power Platform
We created three environments in Power Platform (Dev, Test, Prod) and set up a pipeline for deployments.
DevOps
We also set up Azure DevOps for the team. It was used to share documentation for our solution and keep everybody on track, as well as to store the repos and pipelines for our frontend and backend applications.
Security Group – Entra ID
A security group was created for the team to provide appropriate access to the components of our solution, both in Power Platform and Azure.
AI Governance
Users should be clearly informed when they are exposed to AI-generated content. We therefore clearly marked all AI-generated content to avoid misleading our users. As part of the solution, we activated Frontier to access the M365 Agent Overview. Neither of us had used this before, so this provided valuable insight into how we can control our agents, especially as the number of agents grows.
Secure Access
All resources are connected using Azure Managed Identity to eliminate hardcoded secrets and credentials.
Category: Data, AI & Analytics
Our solution focuses on real-time operational data, where immediate feedback and live data are essential. Players generate data while playing and expect to see the results and statistics directly on the score point screen, while we as admins keep track of everything that is going on. Because of this, our primary goal was to reduce latency in reading and writing.
Data Platform Choice
We use Dataverse as our operational data platform, where we store player data, team data, spawn locations, spawn chunks, and resource types. Dataverse also allowed us to use a Copilot Studio agent with very little setup required.
We also considered Fabric as a solution with better scaling, but since we are not expecting enormous amounts of data, we landed on Dataverse as the better option.
Data Visualization
In the beginning we discussed using Power BI as our visualization layer, but since we had experience with Power BI not being optimal for live data (in terms of cost), we decided to hand this task over to pro-code!
AI and Analytics Layer
To give our data more depth and additional insight, we added AI agents to our solution. These are not used to modify the data the players collect, but rather to enhance the gameplay by providing creative puns and jokes about the current or newest stats! We also wanted to gain more insight from the data without having to manually go through the tables, which was the reason for creating the second agent: it gives us admins deeper insight into the data. We let the AI mine the gems from our data!
Category: Low-Code
Our solution is built with a low-code first approach. That includes the continuous evaluation of all parts of our solution, and whether a low-code tool is actually the right choice for each specific need. In this very case, with Dataverse as the system of record and single source of truth, we found a natural fit for a combination of Power Apps, Power Automate and Copilot Studio.
Dataverse
We have used Dataverse out-of-the-box, with standard tables, extended with custom columns to fit our data-model.
Power Platform Environments
We set up dev, test and prod environments to be able to test, before shipping our solutions to prod.
Power Platform pipelines are used with a host environment: the low-code way to push solutions.
We have also divided up our solution under the Creep publisher, so that it is easy to work on one area without having to worry about dependency issues.
Power Apps
Power Platform solution: 02 – Itera Scope Creepers Back office
For our main administration tool, we created a model-driven app. We use it to:
Control and keep an overview of the data.
Add new teams and players.
Add new resources when they are published.
Control when the agent should do commentary or not.
Power Automate
Solution: 04 – Itera Scope Creepers – Flows and Power FX
To make our data flow with little effort we are using Power Automate.
A good reason for us to use Power Automate is that we also use Dataverse for our data model and a model-driven app on top of those tables. This keeps things in the same universe, and it is also very simple to expand on our automation needs.
Using Power Automate we create spawn chunks which can be collected by miners. When a miner collects a spawn chunk, we want to know that it has been claimed.
Our miners' interaction with the QR code is all pro-code, using the Web API towards Dataverse. We could probably do all automation the same way, but we have a low-code-first approach. So once our pro-code cousin has told Dataverse who claimed a spawn chunk, we let Power Automate do the rest: an automatic trigger reacts when a row is modified with a new miner, sets the spawn chunk to claimed, and releases the spawn location for new spawns.
A scheduled flow goes off every 3 minutes. It collects all available spawn locations (available = yes) and triggers a child flow which generates a spawn chunk. It also triggers a flow which checks locations for TNT. Spawn chunks that have not been picked in the last 15 minutes are deleted and can be replaced by a new resource. When a resource is claimed, a flow sets the status for the chunk and opens the location for a new resource.
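The 15-minute expiry rule above can be sketched in a few lines. This is an illustrative sketch, not the flow itself: the row shape (`id`, `claimedBy`, `createdAt`) is an assumption standing in for the real Dataverse columns.

```javascript
// Hypothetical shape of a spawn chunk row: { id, claimedBy, createdAt }.
// Returns the chunks the cleanup step should delete: unclaimed and older than 15 minutes.
function expiredChunks(chunks, now, ttlMinutes = 15) {
  const cutoff = now - ttlMinutes * 60 * 1000;
  return chunks.filter(
    (c) => !c.claimedBy && new Date(c.createdAt).getTime() < cutoff
  );
}
```

In the real solution this filtering is expressed as a Dataverse query inside the scheduled flow rather than in code.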
Copilot Studio Agents
Solution: 03 – Itera Scope Creepers – Copilot studio Agents
We are using the Microsoft Dataverse MCP Server as a Tool in both our agents:
We gave both agents clear instructions and guardrails, and use generative AI to get valuable and fun insights from the data, both for players and for the back-office workers, with low code as the motor.
Category: Code Connoisseur
Although our solution has been built with a low-code-first approach, some of our components were clearly better off built with other techniques. For our desktop-targeted dashboard, our mobile-targeted QR code gathering tool and its dedicated backend, a pro-code solution was a natural choice.
Backend (for frontend)
Our backend API is a neatly organized group of Azure Functions, written in C#, running on .NET 10 in a Linux environment. Easily readable endpoints ensure a clean flow of data and requests, with emphasis on clear separation of concerns.
Here is an example of one of our endpoints:
The updating of our database is handled by calling the Dataverse Web API. Authentication and authorization is handled through Managed Identity.
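As an illustration of what such a Dataverse Web API call looks like, here is a minimal sketch of building the update request. The table name (`cr123_spawnchunks`) and column (`cr123_claimedby`) are placeholders, not the solution's real schema names, and token acquisition via Managed Identity is abbreviated to a parameter.

```javascript
// Sketch of a Dataverse Web API update (PATCH). Table and column names are
// placeholders; the access token would come from Managed Identity in Azure.
function buildClaimRequest(orgUrl, chunkId, minerId, accessToken) {
  return {
    url: `${orgUrl}/api/data/v9.2/cr123_spawnchunks(${chunkId})`,
    options: {
      method: "PATCH",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
      },
      body: JSON.stringify({ cr123_claimedby: minerId }),
    },
  };
}
// Usage: const { url, options } = buildClaimRequest(...); await fetch(url, options);
```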
The backend API follows solid programming principles like proper error handling for each endpoint, efficient transfers with DTOs, and data classes that accurately model the elements of the solution like Miners, Teams, Resources, etc.
The QR codes we have placed around the venue connect directly to a route in the backend, which then internally manages the gathering of a resource.
Frontend (dashboard + QR code gathering)
The frontend is built as a modern React SPA, hosted as an Azure Static Web App. The backend-for-frontend is linked in and proxied, which both avoids CORS issues and keeps a separation of concerns between these two parts of the solution.
See more details about our frontend in these blog posts:
Now, getting the data is not the problem when we have the possibility to generate mock data with AI or be sneaky by gamifying data production to our advantage. The real value is when we can use our data to tweak, optimize, and make better decisions moving forward.
We meet customers who have a lot of data, such as IoT devices feeding data in real time. While many face the challenge of data living in many different systems, another major problem is using this data efficiently to optimize current production or make the workday more efficient through better insights.
In our solution, we tried to mimic this challenge, but with Minecraft gamification to make it more fun to develop. We use AI as live commentary in our solution, motivating players by turning Dataverse tables into fun, catchy statements that add more depth to the gameplay. We also use another AI agent to help us make sense of the data and optimize gameplay. This allows us to talk to our data, understand which players we should engage, identify low-performing locations, and detect abnormalities in the data, such as someone writing scripts to win the game, and use these insights to handle things better.
Managing land usage in a Minecraft multiplayer environment requires structure, fairness, and clear governance to avoid conflicts between players building in shared worlds. To address this, we have implemented a digital building permit solution using the Microsoft Power Platform. Dataverse serves as the core data foundation, Power Pages provides a user-friendly portal where players apply for building permits, and Dynamics 365 Copilot Service Workspace supports case management and escalation when human review is required.
When submitting an application, players define the full build area using start and end (x, y, z) coordinates and describe in detail what they plan to build. This information is stored in Dataverse and made available through Microsoft Fabric to a set of agents built in Microsoft Foundry. Multiple agents analyze each application in parallel, evaluating coordinates and descriptions to detect overlaps with existing builds, restricted zones, excessive depth or height, and other potential rule breaches. The agents operate with access to the critical tables and fields in the case management process, ensuring decisions are based on authoritative and up-to-date data.
If all automated checks pass, the application is deemed ready for approval and a building permit is granted. If clear violations are identified, the application can be rejected or escalated for manual review. Power Automate orchestrates the overall process, handling status changes and approvals, while Customer Insights Journeys is used to notify applicants via SMS and email. Applicants are informed when their permit has been approved, rejected, or requires further review, and chat agents provide real-time interaction and support throughout the application lifecycle. The result is a scalable, automated, and transparent system that protects player creations, minimizes conflicts, and maintains order across the multiplayer server.
Redstone realm
As a Minecraft server admin it is difficult to resolve neighbor problems, who was where first, and so on. Without organizing who owns what and which rules apply where, mayhem is inevitable.
Additionally, approving applications based on zoning, researching which rules apply where, which permits apply to which zoning, and handling overlapping zones is time consuming, and might require a team that knows the server well if scaled up.
This is why we have made the block permit solution. It leverages AI to do research, validate the risk, and comment on permit applications. It makes the process take minutes, not weeks or months.
We leverage agent pods (Microsoft Foundry Workflows): dedicated AI teams for a specific purpose, for example checking a permit application's compliance with the zoning rules for the area it applies to.
With these agent teams the research process is easy, and compliance suggestions, risk analysis, and negotiations are just minutes away.
User interface – How to get a building permit on the Minecraft Server
We are leveraging Power Pages for players to apply for a building permit in the portal. Currently, we only accept applications from players with Entra ID.
However, there are plans to evolve the registration process to support username and password sign-up based on email, with double opt-in authentication.
Home page
The main navigation has five pages, where players can apply for a building permit as well as view their applications and building permits.
Application form
Once building permit applications are successfully submitted, they are managed through the case management functionality within Dynamics 365 Copilot Service Workspace. All communications related to the building application process, culminating in the issuance of a permit upon approval, are coordinated via Dynamics 365 Customer Insights.
View my Building applications and Building Permits
Players can also see their building applications and their status (pending, approved, or rejected), as well as view their building permits once approved.
Customer service representatives can be reached in several ways if players have questions regarding their permits or applications. Click "Send an Email" and your email client will open; once the email is submitted, logic creates a case in Customer Service.
Governance & Best Practices
The ability to observe everything that happens is important, and we use Microsoft Foundry to observe our agents.
These dashboards provide transparency to the usage of our agents.
But how do we prevent our agents from simply "hallucinating" the rules and decisions? And how do we handle the fact that sometimes an agent does not find all the data we want?
The solution is an agent that is specifically designed to be critical and give feedback to the researcher agents, essentially giving an agent the job of saying "try harder". Each instance of the combination (Fabric data agent, Microsoft Foundry agent, and red-team agent) is placed in a Foundry workflow that works as an agent pod and makes sure it figures out the correct answer.
Identities are important throughout the usage of agents. To use the Fabric data agents you must have access in Fabric, even when you use the workflows. This means that the container apps that host the Foundry workflows as an API must have a managed identity with access in Fabric.
The illustration below shows the apps in use to manage the approval process.
Data & AI
If you don't have good data, you can't have good AI.
We might only have a single data scientist on our team, with 3 months of experience in the role. But that does not mean we don't have data traveling from Dataverse to Fabric and into a medallion structure.
We use a dataflow. This is not optimal, but in our tenant, linking directly to Fabric through shortcuts was broken. Shortcuts were plan A, but we had to adapt and use plan B.
We have a medallion structure that is made in the spirit of a data scientist.
In our bronze layer we just get the data to where it should be. In our silver layer we filter out the data we do not want to use. And in our gold layer we make the data useful with nice merges and useful tables. There are many ways to do a medallion structure, and this does the job I want it to do, but there is room for improvement in the silver layer.
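As a toy illustration of the bronze, silver, and gold split described above, here is a sketch over in-memory rows. The real layers live in Fabric lakehouse tables; the field names (`permitId`, `status`) and the "corrupt" filter are invented for the example.

```javascript
// Illustrative medallion pipeline (field names are assumptions for the sketch).
const bronze = (rawRows) => rawRows; // bronze: land the data as-is
const silver = (rows) =>
  rows.filter((r) => r.permitId && r.status !== "corrupt"); // silver: drop unusable rows
const gold = (rawRows) => {
  // gold: merge into a useful, query-ready shape (permit counts per status)
  const byStatus = {};
  for (const r of silver(bronze(rawRows))) {
    byStatus[r.status] = (byStatus[r.status] || 0) + 1;
  }
  return byStatus;
};
```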
This is the part where I rip the band-aid off and say we have not had time to do Power BI, but we do at least have visualization of our data.
Data is at the core of our solution. And as a very new data scientist who is better at AI than at data, I know AI will outperform me by far. That's why mining in our data is done with the Fabric data agent.
The example above is a query created by Fabric data agent to figure out the contents of permit with the ID A100C206-FEF8-F011-92B9-00224806C768. In this case it searches in bronze for demo purposes.
These Fabric data agents are further used by embedding them into other agents, like Copilot and Microsoft Foundry agents. These implementations help drive data-driven decisions and empower humans to make them quicker and more accurately.
Low-Code
Low Code is NOT Dead… Not for me, not for us, not for the Enderdogs.
That low code is dead is an expression we have heard for a while... and yes, we do see that agents will do more of the low code than we do today. But still, there is so much we CAN do with so little.
Our main purpose here was to create a solution within Minecraft. Our Business Case is that you need to have a permit before you can start to build your building in Minecraft, and for that we use;
Power Pages
Dynamics Customer Service
Power Platform
Contact Center
We also use Microsoft Fabric and Microsoft Foundry.
We start in Power Pages, where the applicant can create an application. The application is built with the standard schema in Power Pages. When the application is created, we send it into Dynamics CE using a Power Automate flow.
In Dynamics CE, Customer Service, and Dataverse, the application arrives as a case, and within the case we have the application (N:1). Now we can start the process that handles the case.
The Approval Process
First screening is done with AI. The AI can comment on the case, give suggestions, negotiate, and do a risk analysis for the application. When the agent is ready, it sends an updated status, Approved or Rejected, back to Dataverse. We are also able to approve or reject manually if the agent decides that is required.
This is connected to Dynamics Marketing (Customer Insights Journeys). In Customer Insights Journeys we have created a trigger that gets the data from the case in Dynamics Customer Service, and a journey that sends out emails to those who have been approved or rejected.
Contact Center
We have set up a chat channel from the Contact Center that is integrated into the portal. The channels are configured in the Customer Service workspace admin center. After the setup, we copy the code and add it to the portal, and voilà: you have installed a chat so the user can easily chat directly with you. An agent starts the conversation, but it can easily connect you to a human. This agent is configured in Copilot Studio.
Now the Human Agent can start the conversation with the person that is on the other side.
Copilot helps store the conversation on your case
Record Creation and update Rule.
We have also configured the Record Creation and Update Rule to manage emails from the web.
The best part is that you do not have to do much, because it is already configured for you. You just add your requirements for the case.
Then activate it and you are good to go.
Connected to Teams and Outlook
The Bible
Thank you for the tip
We use Planner to create tasks in M365
Booking
We can store all our documents in SharePoint and connect it to Dynamics 365
And of course collaborate with Microsoft Copilot.
Code Connoisseur
We love code so much that we code into solutions that are not even meant to be coded in yet.
We use a total of five languages in our solution: PowerShell, Python, Terraform, C#, and Bash.
We have IaC on all the infrastructure in Azure by using Terraform.
This includes the Foundry Workflows
Container apps:
And AI agents:
Our entire Terraform infrastructure is built for ease of configuration and vibe-code friendliness (because that is a thing we need to think about now). All Foundry configs are folder- and JSON-based.
This config will create 10 agents, 4 flows, and one model.
The same mindset works for the service deployments in container apps. Just add your script and config in a folder, and it will deploy the service infrastructure.
But IaC is far from the only place we have code.
We have notebooks in Fabric that create valuable data and really mine for pure diamonds in our data.
And yes, it is vibe coded, but it works flawlessly.
Digital Transformation
As a player on a multiplayer server it feels good to have a space you can call your own, regardless of what the other players think. Permits are a nice way to claim an area as your own and prove that you are the rightful owner of the land.
Going beyond that, permits can be a good tool to make sure players are aligned on common areas by adding zoning rules. The combination makes it really easy to manage what gets built where. A good example is a cabin area with thick trees, where the zoning rule can dictate: "No open fires are allowed unless inspected and approved. Height limit 25 blocks. Size limit 40×40. Minimum distance to approved permits 50 blocks. Styling must be mainly stone and wood."
The zoning description would make the zone only approve “cabin like” permits.
But permits are not cool if you spend hours applying and wait months before they are approved (like in the real world).
The application process is super simple and has good user instructions so end users understand the process.
But what happens once you have sent the application? How is it approved, and how do we help humans approve and make decisions?
This is where AI comes into the picture. The first screening of all permit applications is done with AI. We have a team of 14 AI agents, with dedicated agents for validating zoning compliance, assessing the risk of the application, finding out whether neighbors should be warned, and negotiating the permit terms if an application is close to valid.
These agents originate in Microsoft Foundry, but each is connected to its own Fabric data agent to which it can communicate its data-investigation needs, and which figures out the data.
The Docker container contains unmined and run.sh, a Bash script to run unmined. We chose this instead of building a console app because the container is already running Linux. This is the code to run the unmined console; the environment variable for the area limits the size of the map so it doesn't get too big and take too long to render. It then uploads the rendered image to Azure Blob Storage.
All cases that are approved can easily be viewed in Dynamics 365.
We can also see the approved applications in a view in the Zone Manager portal.
The approval process runs with Customer Insights Journeys.
Custom code and Web API Fully functional, client-side JavaScript implementation for a Minecraft zone and application management system. The code is intended for integration with Power Pages/Power Platform sites, with extension points for backend connections (Dataverse, Power Automate). The manager supports zone creation, application workflow, and interactive map visualizations.
1. Global State Management
A single source of truth for dynamic, interactive components.
const STATE = {
  zones: [],        // Created zones
  map: [],          // Zones currently shown on the map (deploy queue)
  applications: [], // Building/mining applications
  zoom: 1,          // Map zoom factor
  activeTab: 'applications'
};
Benefit: Keeps all UI in sync with code-driven updates.
2. Zone and Application Data Models
Zones: Each zone has coordinates (startX/Y/Z, endX/Y/Z), biome, category, protection and PvP flags, maxPlayers, and unique id.
Applications: Building requests featuring playerName, role, experience, requestedZone, type, dimensions, and status.
function createZoneElement(zone, seed) {
  // Organic SVG shape per zone (see generateZoneSVG)
}
Example SVG Generation for “Organic” Zone Shape:
function generateZoneSVG(zone, width, height, seed) {
  // Color per zone category, organic curved borders using calculated path
}
Benefit: Makes map visually attractive and clear to use.
Application Marker Display
Pending applications appear as building icons on the map, using coordinate and type data.
5. Tab Navigation and UI Management
Tab switching logic:
function switchTab(tabName) {
  STATE.activeTab = tabName;
  // Update UI classes, render specific content by tab
}
Benefit: Mimics SPA-style navigation; content fetch is scalable (supports async API integration).
6. User Feedback & Validation
Error Message Display: When a form is invalid or zones overlap.
Success Alerts: E.g., after deployments or successful creation.
Confirmation Dialogs: For destructive actions (deletion, deploy).
7. Extensibility & Integration Points
Places in the code marked with // TODO are prepared for linking live flows to Dataverse and Power Automate APIs:
// TODO: Call Power Automate flow to update Dataverse
Usage Instructions
Embed zone-manager.js in a Power Pages site alongside the appropriate HTML structure (element IDs must match).
Edit sample data in loadSampleData() as needed, or replace with calls to backend API/Dataverse for production.
Manage zones/applications through forms, buttons, and interactive map.
Trigger deploy to simulate finalizing zone layouts (extend deployment logic as needed).
API Integration: Connect event hooks to Dataverse/Power Automate for real data handling.
Security: Implement server-side validation for production.
Styling: Enhance CSS for better responsive usability.
Performance Tuning: Optimize for large numbers of zones/applications as needed.
3D Zone Overlap Logic
function checkZoneOverlap(zone1, zone2) {
  // Axis-aligned bounding boxes overlap when their intervals intersect on all three axes
  const overlapX = zone1.startX <= zone2.endX && zone2.startX <= zone1.endX;
  const overlapY = zone1.startY <= zone2.endY && zone2.startY <= zone1.endY;
  const overlapZ = zone1.startZ <= zone2.endZ && zone2.startZ <= zone1.endZ;
  return overlapX && overlapY && overlapZ;
}
Zone SVG Path Generation (“Organic” Shape)
function generateOrganicPath(width, height, seed) {
  // Generates smooth, wavy polygon for zone display
}
Application Card Render (with actions)
function createApplicationCard(app) {
  // Renders HTML for player requests, with approve/reject buttons
}
Delivery Complete: Ready for Acceptance & Integration
Client-side sample and demonstration code is robust.
Extendable for backend integration; clearly documented.
Clean architecture for zone/app management and map interface.
Azure Functions
The idea was to use Power Automate to trigger the Azure Functions.
The PostRenderFunction receives a .zip file of the whole Minecraft map. It validates the zip file, creates a new "Render Job" row in Dataverse, and uses that GUID as the name of the .zip file. After the blob has been saved, it starts the Azure Container Instance and returns a message with the job ID, the status "Queued" (since the job has been sent to the ACI), and the name of the world in Blob Storage.
The GetRenderFunction receives the job ID and checks it against "Render Jobs". If the job has run, it returns the image. If not, it checks Blob Storage for the file's GUID; if the file is in rendered, it returns the rendered image and changes the status of the job.
A little introduction to our final delivery here at ACDC:
We have built an end-to-end production flow where business demand starts inside the enterprise stack, gets approved and signed digitally, becomes an ERP production order, and is then executed automatically by an AI bot that gathers materials and reports progress back, complete with real-time updates and dashboards.
Minecraft is our safe, visual execution layer; the real-world use case is deployable physical robots harvesting and performing tasks in mines, quarries, and remote locations.
Our project mimics real-life customer work we have done with Rana Gruber, where they are deploying a dog-bot that gathers data in the mines, increasing safety for workers.
The flow (start to end)
Before going through the categories, a brief description of the architectural flow provides useful context for what follows.
1) Approval + Signing turns intent into an approved transaction
A purchase requisition is created in Dynamics 365 and enters an Approval/Signing stage. When it hits that stage, a Power Automate flow triggers the OneFlow contract creation, adds participants, publishes it for signing, and listens for contract status updates. Once fully signed, the flow retrieves contract details and automatically creates the Purchase Order header + lines in Dynamics 365.
2) Purchase Order triggers a Production Order and the worker starts
Once the approved PO exists, it triggers creation of a Production Order in our very own Minecraft FO Module. The Production Order reads the Bill of Materials (BOM) and determines required resources. In our module, those requirements are already translated into Minecraft blocks/items.
3) Execution happens in Minecraft: mine, gather, craft, deliver, and report back
An AI-controlled NPC worker is spawned in the Minecraft world and executes the translated task list:
mines required resources,
gathers/harvests materials,
crafts required items,
…and sends the data to Power BI and our canvas app dashboard in real time
4) Completion + status goes back to ERP (loop closed)
When the materials are ready, completion status is sent back so the Production Order can be updated/closed in F&O.
Boom, and you have automated production.
Now to the fun part;
Categories
Redstone Realm
We built a complete business process that begins where real work begins in internal systems. Approvals and contract signing are mechanisms that convert intention into trusted demand.
From there, Dynamics 365 Finance & Operations becomes the system of record for purchasing and production planning, and our downstream services treat it like the authoritative source, not just another data feed.
We deliver a real business solution for Minecraft server automation and management, integrating Microsoft Azure (Service Bus, Event Hub) and APIs. The experience is built around in-game automation (quest system + bot feedback), with an AI-autonomous bot that executes tasks while using self-preservation logic.
Finance & Operations overview and one-click ordering experience
Inside Dynamics 365 Finance & Operations, we provide a clear, user-friendly overview of Minecraft worlds and their connected resources. Worlds and resources are presented in a structured view with visual indicators so users can instantly understand key status information, such as whether a world is online, what resource type it represents, and whether it is currently ordered, without needing to read through large volumes of data.
Built to extend without redesign
The underlying structure is designed to scale predictably as the solution grows: it’s straightforward to expand the model, add new screens, connect reporting, and support additional integrations without having to rebuild the foundation.
This is the bridge between Dynamics 365 Finance & Operations and the Minecraft world: D365 business events (for example, production requests for raw materials) are dispatched through Azure messaging and orchestrated by a Logic App workflow, "TriggerAI". The orchestrator transforms structured business data into an AI-readable instruction, and the bot executes the job in the world: navigating terrain, collecting materials, depositing outputs, and reporting completion back through the same cloud path.
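The transform the orchestrator performs can be pictured as follows. This is a hypothetical sketch: the event field names (`resourceType`, `quantity`, `dropOff`, `orderId`) and the instruction wording are assumptions, not the actual TriggerAI payload.

```javascript
// Hypothetical sketch: a structured D365 business event becomes a
// plain-language instruction the bot can execute. Field names are assumed.
function toBotInstruction(event) {
  return (
    `Collect ${event.quantity} ${event.resourceType} ` +
    `and deposit them at ${event.dropOff}. ` +
    `Report completion with order ${event.orderId}.`
  );
}
```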
Player experience: narrated diary + conversational recall
We extended the experience beyond pure automation by adding a diary and interaction layer: Steve can be asked questions like “What happened on January 23rd?” and responds using memories grounded in diary entries and logs via Retrieval Augmented Generation. We also provide audio narration of diary entries through a containerized text-to-speech service and a web diary player that lets users browse history, play audio, and chat through a clean interface.
Why we deserve points in this category:
We deserve Redstone Realm points because we built a real business solution on Microsoft tech where Minecraft is just the “factory floor” or the real life mine.
The process starts where business work actually starts: inside the company. Approvals and contract signing turn a request into something trusted, and then Dynamics 365 Finance & Operations becomes the single source of truth for purchasing and production planning. We don't treat FO like a dashboard feed; we treat it like the system that controls what's allowed to happen.
From there, we use Azure to make the flow fast and reliable. When FO raises a business event (like a request for stone, wood, ore), it’s sent through Azure messaging and picked up by our orchestrator (TriggerAI). TriggerAI turns the structured business data into an instruction the AI can understand, and then the bot executes the work in Minecraft: it navigates, gathers the right materials, deposits them, and reports status back through the same path so the loop closes.
We also made it usable for real people. This can be, and will be, adapted in the future for many businesses, and we did it in Minecraft.
In FO, users get a clean overview of worlds and resources with visual status indicators, and they can order or cancel with one click, with no complicated steps. The structure is built so we can extend it without redesigning everything: new screens, reporting, and integrations plug in cleanly. And on the experience side, we added a diary + chat layer with audio narration so users can follow what happened and ask "what did Steve do?" without digging through raw logs.
In short: it’s a Microsoft-first business workflow (FO + Azure + automation + AI) that turns enterprise requests into automated resource-gathering execution, with a smooth user experience end-to-end, exactly what the Redstone Realm category is asking for.
Data, AI & Analytics
Telemetry sync + analytics-ready foundation
We collect and sync player inventories and bot telemetry (for example: backend/azure-telemetry.js and the HTTP Bridge mod), enabling analytics on quests, bot orders, and player activity. This also lays the groundwork for AI-driven features through the Mineflayer bot and quest automation patterns.
We transform raw Minecraft server logs into structured event records (deaths, combat, exploration, achievements, mining, crafting, farming, building, etc.) using a dedicated parser. Outputs are stored as dated artifacts (logs/diary/chat per day), creating a time-series dataset that can be queried and analyzed over time instead of being trapped as text files.
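As a minimal sketch of this parsing step, assuming hypothetical event rules and field names (the real parser covers far more event types), a raw server log line can be turned into a dated, structured event record like this:

```python
import re

# Hypothetical sketch of the log-to-event parsing step; the real parser
# handles many more event types (combat, exploration, crafting, ...).
LOG_LINE = re.compile(
    r"^\[(?P<time>\d{2}:\d{2}:\d{2})\] \[Server thread/INFO\]: (?P<message>.+)$"
)

EVENT_RULES = [
    ("death",       re.compile(r"^(?P<player>\w+) fell from a high place")),
    ("achievement", re.compile(r"^(?P<player>\w+) has made the advancement \[(?P<name>[^\]]+)\]")),
    ("join",        re.compile(r"^(?P<player>\w+) joined the game")),
]

def parse_line(line: str, day: str):
    """Turn one raw log line into a structured, dated event record (or None)."""
    m = LOG_LINE.match(line)
    if not m:
        return None
    for event_type, rule in EVENT_RULES:
        hit = rule.match(m["message"])
        if hit:
            return {
                "type": event_type,
                "timestamp": f"{day}T{m['time']}",  # dated artifact key
                **hit.groupdict(),
            }
    return None

event = parse_line(
    "[13:05:22] [Server thread/INFO]: Steve joined the game", "2026-01-23"
)
```

Records shaped like this can then be appended to the per-day artifacts and queried as a time series.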
Memory indexing + retrieval for grounded answers
The RAG indexing system processes both diary prose and technical logs, chunks them with overlap to preserve context, embeds the chunks, and retrieves the most relevant passages using cosine similarity. When users ask questions, we embed the query, fetch top matches, and inject them into the prompt so Steve can recall specific incidents because they exist in indexed memory.
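The chunk-overlap-retrieve loop can be sketched as below. The `embed` function here is a toy stand-in for the real embedding model, and the chunk sizes are illustrative, not the production values:

```python
import math

def embed(text: str):
    # Toy bag-of-letters embedding, for illustration only; the real system
    # uses a proper embedding model.
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def chunk(text: str, size: int = 40, overlap: int = 10):
    """Split text into overlapping chunks so context survives the boundaries."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

def retrieve(query: str, chunks, top_k: int = 2):
    """Rank chunks by cosine similarity against the embedded query."""
    q = embed(query)
    scored = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return scored[:top_k]

diary = "Steve mined iron near the river. Later a creeper exploded at the base."
passages = chunk(diary)
top = retrieve("what exploded?", passages, top_k=1)
```

The retrieved passages are then injected into the chat prompt so answers stay grounded in indexed memory.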
Traceability between enterprise transactions and in-world actions
We log D365 event details (for example, identifiers, quantities, resource types, event times) alongside in-world task execution. This enables correlation questions like “Which D365 order triggered the collection of X blocks?” by aligning timestamps and task IDs for end-to-end traceability.
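A hedged sketch of that correlation, with made-up identifiers and field names (the real records carry more context), joining a D365 business event to a bot task on a shared task ID within a time window:

```python
from datetime import datetime, timedelta

# Illustrative records only; field names are assumptions.
d365_events = [
    {"order_id": "PO-1001", "task_id": "T-42", "resource": "stone",
     "quantity": 64, "raised_at": "2026-01-23T10:00:05"},
]
bot_tasks = [
    {"task_id": "T-42", "action": "collect", "blocks": 64,
     "started_at": "2026-01-23T10:00:11"},
]

def correlate(events, tasks, max_lag_s=60):
    """Join business events to bot tasks on task_id within a time window."""
    matches = []
    for e in events:
        for t in tasks:
            if e["task_id"] != t["task_id"]:
                continue
            lag = (datetime.fromisoformat(t["started_at"])
                   - datetime.fromisoformat(e["raised_at"]))
            if timedelta(0) <= lag <= timedelta(seconds=max_lag_s):
                matches.append({"order_id": e["order_id"],
                                "task_id": t["task_id"],
                                "lag_seconds": lag.total_seconds()})
    return matches

trace = correlate(d365_events, bot_tasks)
```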
Engagement analytics from chat history
Chat sessions are stored as JSONL conversation turns with role, content, timestamps, and session IDs. This allows analysis of interaction patterns like conversation length, common question topics, and response latency based on recorded usage.
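A small sketch of reading those JSONL turns and computing basic engagement metrics; the field names mirror the description above but are assumptions, not the exact schema:

```python
import json
from io import StringIO
from collections import Counter

# Inline sample standing in for a real per-date JSONL file.
sample = StringIO("\n".join(json.dumps(turn) for turn in [
    {"session_id": "s1", "role": "user",      "content": "What did Steve do?", "ts": "2026-01-23T12:00:00"},
    {"session_id": "s1", "role": "assistant", "content": "He mined iron.",     "ts": "2026-01-23T12:00:02"},
    {"session_id": "s2", "role": "user",      "content": "Any deaths today?",  "ts": "2026-01-23T13:00:00"},
]))

# One JSON object per line: parse, then aggregate.
turns = [json.loads(line) for line in sample if line.strip()]
turns_per_session = Counter(t["session_id"] for t in turns)
user_questions = [t["content"] for t in turns if t["role"] == "user"]
```

From here, conversation length, common topics, and response latency (from the timestamps) fall out naturally.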
Orchestration layer as a workflow boundary
We use a Logic App named “TriggerAI” as an orchestration boundary that receives structured business events and transforms them into AI-ready instructions, keeping routing and transformation logic maintainable and observable as a workflow rather than hard-coding every integration step in services.
An example POST to the TriggerAI endpoint:
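A minimal sketch of such a POST, assuming a hypothetical Logic App trigger URL (the real callback URL with its SAS signature comes from the Azure portal) and an illustrative payload shape:

```python
import json
import urllib.request

# Hypothetical placeholder URL; a real Logic App HTTP trigger URL includes
# the workflow path and a signed "sig" query parameter.
TRIGGER_URL = "https://example-region.logic.azure.com/workflows/EXAMPLE/triggers/manual/paths/invoke?sig=EXAMPLE"

def build_request(event):
    """Wrap a D365 business event as a POST to the TriggerAI Logic App."""
    body = json.dumps(event).encode("utf-8")
    return urllib.request.Request(
        TRIGGER_URL, data=body, method="POST",
        headers={"Content-Type": "application/json"},
    )

req = build_request({"resource": "stone", "quantity": 64, "task_id": "T-42"})
# urllib.request.urlopen(req)  # not executed here; requires the real trigger URL
```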
Data consumption from the Minecraft server on Raspberry Pi
The Raspberry Pi posts Minecraft player statistics, inventory data, and world data from the Minecraft server to an Azure Function endpoint.
The Azure Function forwards the data to Azure Event Hub.
A Microsoft Fabric Eventstream consumes the data from Event Hub and processes it into a Fabric Lakehouse destination.
Dataverse uses virtual tables to surface the data in Power Platform and apply its business process capabilities.
A Power Automate flow processes the data into Dynamics 365 Finance & Operations for further handling of inventories and orders.
Data injection to Minecraft server
An Azure Service Bus queue acts as the gateway from the cloud to the on-premises Raspberry Pi Minecraft server, delivering processed data from Dynamics 365 Finance & Operations, Dataverse, and Azure.
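The payload the Pi sends on the first hop of this pipeline might look roughly like this; the field names are illustrative assumptions, not the actual schema expected by the Azure Function:

```python
import json
import time

def build_telemetry(player, inventory, world):
    """Shape one telemetry envelope for the Pi -> Azure Function hop."""
    return {
        "source": "raspberry-pi-minecraft",  # hypothetical source tag
        "sent_at": int(time.time()),
        "player_stats": player,
        "inventory": inventory,
        "world": world,
    }

payload = build_telemetry(
    player={"name": "Steve", "deaths": 3, "blocks_mined": 1204},
    inventory=[{"item": "iron_ore", "count": 17}],
    world={"day": 152, "weather": "clear"},
)
body = json.dumps(payload)  # this JSON body is POSTed to the Function endpoint
```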
Our very own Minecraft Workspace in FO
Why we deserve the points for Data, AI & Analytics:
We deserve Data, AI & Analytics points because we don’t just “use AI” for the sake of using AI. We built a real data pipeline where raw gameplay and business events become structured, queryable, and useful, and where AI is grounded in that data instead of guessing.
First, we’ve made the Minecraft world measurable (not with a tape measure, though). We sync inventories, world data, player statistics, and bot telemetry from the Raspberry Pi server stack into Azure, route it through Azure Functions > Event Hub, and then consume it in Microsoft Fabric (Eventstream > Lakehouse). That means we’re not stuck with scattered JSON or log files; we have a foundation that supports real analytics and reporting on quests, bot purchasing orders, and player activity, and it scales as the system grows.
Second, we turn unstructured signals into clean data. Server logs are parsed into structured event records (deaths, combat, exploration, achievements, mining, crafting, with more event types easy to add) and stored as dated artifacts, creating a time-series dataset.
This is the difference between “cool logs” and “usable data,” because it lets you analyze behavior over time, correlate activity, and build meaningful metrics.
Third, our AI is data-backed. The RAG system indexes diary entries and technical logs, chunks them with overlap, embeds them, and retrieves relevant passages with cosine similarity. When someone asks Steve a question, the response is grounded in retrieved memories, so it can accurately recall specific incidents because the data exists and is referenced in context, not pulled out of thin air.
Finally, we built end-to-end traceability between enterprise and execution. We log D365 event details alongside task execution, so you can answer questions like “which order triggered this collection run?” by correlating identifiers, timestamps, quantities, and task IDs. And we keep the operational loop tight: Fabric and Dataverse can expose the data back into the Power Platform, and Power Automate can process it into FO for inventory/order handling with Service Bus acting as the gateway back down to the Raspberry Pi for cloud-to-edge injection.
Low-Code
Power Platform-ready APIs and endpoints
Our API and webhook design is built to integrate cleanly with Power Platform (Power Automate, Logic Apps). RESTful endpoints make it easy to automate workflows and build dashboards, and the solution can be extended with low-code tooling for reporting and orchestration.
We have built low-code components with Power Platform, including Power Apps (model-driven and canvas apps), Power Automate, custom connectors, and Power BI.
Canvas Apps
Crafting Creepers Inventory Admin App
Helps generate a model from a captured picture of an object. It uses the logiqraft.futurein API through a custom connector.
Creation of Order from CanvasApp
Generates a purchase summary using mscrm-addons.com and handles further processing in Dataverse and the model-driven app.
Minecraft Gold Converter App
Uses external APIs to fetch material data (such as gold prices) and exchange rates.
Here we use a low-code canvas app, with the option of adding pro-code PCF components.
We created a low-code app in Microsoft Power Platform based on an original idea, using accelerator components. It includes phone-position adjustment and video streaming within a canvas app, with a simple purpose: a dynamic experience that responds to the movement of your phone. See the video.
Custom Connector
We have created a custom connector for another Teams API.
Power Automate
We have created several Power Automate flows to automate processes and integrate data with external services such as LinkMobility, OneFlow, and mscrm-addons.
We also set up an approval flow for approving and signing purchase requisitions.
Why we deserve Low-code points:
We deserve Low-Code points because we used Power Platform to make the solution easy to run, easy to change, and fast to extend.
Power Platform building blocks (real, not just a demo)
We built with Canvas apps + Model-driven apps, Power Automate, Power BI, and Custom Connectors, so users can create orders, manage data in Dataverse, and see outcomes without needing developers for every change.
Apps that solve specific jobs
We delivered practical Canvas apps: an Inventory Admin app (using a custom connector to call an external API), a Minecraft Gold Converter (using external gold + exchange-rate APIs, with optional PCF), and a mobile experience app with phone-position adjustment and video streaming (video provided).
Automation + integrations with low-code flows
We created Power Automate flows for approvals and signing, and for integrating with OneFlow, LinkMobility, and mscrm-addons, which makes the process run automatically instead of manually.
Code Connoisseur
Purpose-built code across backend, bot, and mod
We implemented custom code across the stack: backend services (Node.js/Express), the Mineflayer bot, and a Java/Fabric mod. The codebase is modular and structured (see AGENTS.md for patterns) and includes innovations like the HTTP Bridge mod, bot self-preservation logic, and RCON integration.
The Fabric mod inventory integration converts in-game inventory into products in Finance & Operations.
Structured LLM interactions with guardrails (schemas, validation, planning)
We implemented structured LLM interaction patterns using DSPy signatures for command parsing, quest generation, and multi-task planning. To keep execution reliable, our planning pipeline enforces strict output schemas and validates every task structure before the bot runs it (e.g., required fields for collect/navigate/deposit), preventing malformed plans from becoming runtime failures.
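A simplified sketch of that validation guardrail; the required fields per task type here are assumptions based on the description above, not the exact production schema:

```python
# Required fields per task type (illustrative): a plan is rejected before
# execution if any task is malformed.
REQUIRED_FIELDS = {
    "collect":  {"resource", "quantity"},
    "navigate": {"x", "y", "z"},
    "deposit":  {"container", "items"},
}

def validate_plan(plan):
    """Reject malformed plans before they become runtime failures."""
    errors = []
    for i, task in enumerate(plan):
        task_type = task.get("type")
        required = REQUIRED_FIELDS.get(task_type)
        if required is None:
            errors.append(f"task {i}: unknown type {task_type!r}")
            continue
        missing = required - task.keys()
        if missing:
            errors.append(f"task {i}: missing {sorted(missing)}")
    return errors

good = [{"type": "collect", "resource": "stone", "quantity": 64}]
bad  = [{"type": "navigate", "x": 10, "y": 64}]  # missing the z coordinate
```

Only plans that validate cleanly are handed to the bot; anything else is rejected at the planning boundary.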
Steve RAG: semantic memory built from scratch
We built a Retrieval Augmented Generation system that indexes diary entries and server logs into chunked segments with metadata, embeddings, and cosine similarity retrieval. The index supports reprocessing when files change (hash-based detection), and the retrieval layer feeds relevant passages into the chat prompt so Steve’s answers stay consistent with recorded events rather than invented details.
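The hash-based change detection can be sketched as follows, with the index store simplified to a dict (the real index also carries chunk metadata and embeddings):

```python
import hashlib

def file_hash(content: bytes) -> str:
    """Content fingerprint used to detect changed files."""
    return hashlib.sha256(content).hexdigest()

def needs_reindex(path: str, content: bytes, index: dict) -> bool:
    """True if the file is new or its content changed since last indexing."""
    return index.get(path) != file_hash(content)

index = {}
diary = b"Day 152: Steve mined iron near the river."

assert needs_reindex("diary/2026-01-23.md", diary, index)  # never indexed
index["diary/2026-01-23.md"] = file_hash(diary)            # record after indexing

unchanged = needs_reindex("diary/2026-01-23.md", diary, index)
changed = needs_reindex("diary/2026-01-23.md", diary + b" A creeper exploded.", index)
```

This is what lets the indexer reprocess only files whose content actually changed.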
Full-stack delivery: services, storage, and edge deployment
We wrote the chat server and storage model (JSONL conversation history per date with IDs and timestamps), plus container and web delivery plumbing (Docker builds, persistent volumes, Nginx serving with CORS). We also produced deployment scripting across environments (shell scripts for edge devices and PowerShell for local workflows), turning the stack into something repeatable to run.
Why we deserve points:
This is not just a standard “LLM in a box” stack; we engineered reliability:
we validated task outputs so the bot doesn’t break at runtime, built memory retrieval so answers stay consistent with recorded data, and delivered everything as a runnable, repeatable system (containers + scripts + edge-ready setup).
In short (as I understand it myself): innovative code that actually ships and works, from ERP mapping to autonomous execution and real operational services.
Governance & Best Practices
Safety, privacy, monitoring, and operational robustness
We emphasize reliability and responsibility through error handling and logging across backend and bot services. Player inventories and “quest data” are handled securely, while telemetry and monitoring are streamed via Azure Event Hub. The bot is designed with responsible safety logic (self-preservation, health checks).
Monitoring:
As we run the Minecraft server as a production environment, we have set up a comprehensive monitoring stack with:
cAdvisor
Node exporter
Prometheus
Grafana
D365 Security Roles
Because Dynamics 365 Finance & Operations is the system of record in our solution, we also use it as a security boundary. We implemented the FO security model (security roles, duties, and privileges). And trust me, having worked with this for the past months, it can take some time to fully master.
The result is that only the right people can view, create, approve, and dispatch the transactions that ultimately trigger bot work. That means the ability to “send work to the bot” is controlled the same way you’d control purchasing or production actions in a real enterprise process: through FO permissions, not informal access.
In practice, this keeps the automation constrained to its intended scope. The bot and edge device only operate on what FO authorizes, and the integration only processes actions that the user is permitted to perform.
Workers and admins can control the bot through the governed workflow which makes sure the bot doesn’t become a free-running actor, but an execution agent that acts on behalf of authorized users.
Why we deserve points for this category:
We deserve Governance & Best Practices points because we treated this like a system that must be trusted, and we have spent time making sure the bot handles its duties without breaching its limits.
First, we built for operational robustness. Both the backend and the bot services include deliberate error handling and structured logging, so failures don’t turn into silent chaos. When something breaks, we can see what happened, where, and why.
That matters because our solution is event-driven and automated; if you don’t invest in observability, you don’t have automation, you have a surprise generator.
Second, we designed with privacy and data responsibility in mind. We handle player inventories and quest-related data as real user data: it’s stored and processed in a controlled way, with clear boundaries between what belongs in the game world and what belongs in the business/automation layers. The system doesn’t need broad access to everything; it only moves the minimum data required to execute and report on tasks.
Third, we made the solution monitorable by default. Telemetry and monitoring are streamed through Azure Event Hub, which gives us a consistent way to observe execution, detect anomalies, and build reporting on top of reliable signals. That monitoring isn’t only for dashboards but it’s part of governance: if you can’t measure behavior, you can’t manage risk.
Because we built FO security (roles, duties, privileges) directly into the execution pipeline, we deserve governance points for making automation permission-driven and bounded by enterprise access.
Lastly, for governance and monitoring, we deserve best-practice points because we treat the Minecraft server like a real production workload, with a proper monitoring stack (cAdvisor, Node Exporter, Prometheus, Grafana) to ensure visibility, stability, and fast incident response.
Our AI and autonomous bot are built with safety logic, not just wishful vibe-coding.
The bot includes self-preservation behavior and health checks, so it doesn’t blindly execute tasks until it dies or gets stuck. We also constrain autonomy through structure: tasks are planned and executed within defined patterns instead of free-form behavior, and the bot reports status back through the same controlled pipeline. In other words, we’ve designed autonomy that stays predictable, inspectable, and stoppable. It is also hosted locally, on its own Raspberry Pi, enabling us to have full control over what goes in and out of the hardware. The pipeline also follows best practices for CI/CD and dockerization, which makes things hard to break once they’re in production.
Digital transformation
The solution automates Minecraft server management to reduce manual effort and improve the player/admin experience through intelligent automation (quests, bot orders, notifications). It demonstrates measurable impact through faster task execution, real-time feedback loops, and extensibility for future needs. And the needs, we believe, will come!
We are transforming Minecraft inventory management with Finance & Operations, using business processes that automate directly into Minecraft.
Why we deserve points for Digital Transformation:
We replace manual steps with automation, give real-time feedback on progress, and make outcomes measurable (faster execution, clearer status, and easier operations). The key transformation is linking inventory and production thinking from FO to Minecraft execution, proving the same pattern can scale to real-world harvesting and remote operations, which we have seen both with our own customers and other production companies. THIS IS THE FUTURE.
Final summary
This is an end-to-end production system that starts with trusted business demand and ends with autonomous execution, all while keeping the enterprise system safe and the outcome measurable. It combines real workflow automation, event-driven cloud integration, autonomous task execution, and a data pipeline that turns activity into operational intelligence.
Thanks from us, it was a pleasure. Until next time!
The climate emergency is no longer a distant threat on the horizon – it is a present reality reshaping lives, economies, and ecosystems around the world. Scientists have issued repeated warnings that the planet is at “code red,” with key indicators such as greenhouse gas concentrations, global temperatures, and ice mass loss reaching critical levels, underscoring the need for transformative action now rather than later. Meanwhile, recent global health analyses show that climate-related impacts – like extreme heat, air pollution, wildfires, and food insecurity – are already contributing to millions of deaths annually, with heat-related fatalities alone up significantly compared to past decades.
Despite this urgency, there remains a glaring disconnect between what young people experience and what many educational systems equip them to understand or act upon. Surveys reveal that a large majority of youth feel they receive inadequate climate education – 70% question the quality of what they’re taught, and a startling number cannot fully explain even basic concepts of climate change. In fact, more than eight out of ten young people say they know little to nothing about how their governments are tackling the climate emergency. At the same time, youth around the globe report deep concern about the future: many feel anxious, powerless, and emotionally affected by climate change, highlighting both a gap in knowledge and a need for meaningful engagement.
These realities are not abstract – they have concrete consequences. Climate-exacerbated disasters disrupt schooling for tens of millions of students every year, threatening education, wellbeing, and future opportunities. The cost of inaction is not just environmental but societal: without equipping young people with knowledge, agency, and tools for climate solutions, we risk perpetuating cycles of unpreparedness, disengagement, and systemic vulnerability that will burden generations to come.
This is where EcoCraft steps in. We’re building a platform designed to bridge the gap between current climate realities and the education systems meant to prepare young people for them – empowering a generation with the understanding, skills, and motivation needed to be informed participants in climate action and resilient contributors to a sustainable future, wrapped around a fun and engaging experience within Minecraft.
EcoCraft – The Movie
Before we dive into the detail, why not check out our Hollywood-quality movie explaining our solution?
And if you want more (of course!) then we have an awesome Bratwurst & Biscuits theme song! Enjoy.
Project Overview: What We Built
We built EcoCraft, an educational ecosystem embedded inside a Minecraft server, designed to help NGOs, teachers, and children explore ecological impact through interactive play. Minecraft is used as a hands-on learning platform where children encounter sustainability challenges and must complete missions to solve them. This gives the children direct feedback and reinforces learning. Minecraft works brilliantly for this audience because it blends exploration, creativity, and systems thinking in a space that kids already understand and enjoy. Behind the scenes, we connected this Minecraft world to modern cloud services and automation flows (e.g., Dataverse, Azure Functions, agent-driven world changes) so that NGO workers can influence scenarios in real time, track progress, and automate educational workflows without manual overhead.
The Big Idea Behind EcoCraft
The idea is to make complex topics like sustainability, systems thinking, and responsible decision-making understandable through play. Instead of explaining these concepts in abstract terms, EcoCraft lets kids experience them directly in a world they can shape. Every action such as building, consuming resources, changing the environment has visible consequences, encouraging curiosity and reflection rather than right or wrong answers. By combining Minecraft’s creative freedom with intelligent systems in the background, EcoCraft turns learning into experimentation and shows how thoughtful choices can lead to healthier, more balanced worlds.
High-Level Architecture
Our solution brings the best in breed capabilities from Microsoft’s low-code app development solution, Power Platform, alongside the existing set of customer engagement applications in the form of Dynamics 365, with the limitless scale and enterprise-grade capabilities within the Microsoft Azure platform. We further complement this by bringing together existing capabilities from Independent Software Vendors, such as mscrmaddons, LinkMobility and OneFlow. This results in a solution that can be tailored for internal use, can facilitate our required extensibility requirements and combines the best of low-code and pro-code capability into a fully working, end-to-end solution.
Category: Redstone Realm
EcoCraft should win the Redstone Realm category because it is not just an educational game concept, but a real business solution built on Microsoft 365, Power Platform, Microsoft Azure and Dynamics 365 technologies plus Minecraft! It combines scalable architecture, AI-driven innovation, strong governance, and a smooth player experience into a platform that can be operated globally by NGOs, schools, and educators.
From a business perspective, EcoCraft enables scalable and repeatable learning experiences wherever Minecraft is available. EcoCraft can manage content, scenarios, and world changes through structured workflows instead of custom game development or manual intervention. This allows the same solution to be reused across schools, regions, and programs without increasing operational complexity. As climate change and waste pollution continue to accelerate globally, with rising temperatures and increasing waste volumes posing long-term risks, we offer a way to create awareness and drive behavioural change at scale by turning global challenges into interactive learning experiences.
The global relevance of this scalability is critical. Climate change and waste pollution are challenges that affect everyone on our planet. Scientific evidence confirms a persistent rise in global temperatures, while global waste volumes continue to grow at an alarming rate. We translate these abstract, large-scale problems into interactive, hands-on learning scenarios. By doing so, it enables education at scale and empowers children worldwide to understand environmental responsibility early, where behavioral change has the greatest long-term effect.
AI/Copilot plays a central role in this architecture. Copilot agents are used to generate and evolve new scenarios using natural language, dramatically reducing the effort required to design and adapt educational content. The complexity of AI, data processing, and orchestration runs outside the game world, while the results are delivered seamlessly to players. This is the AI-infused pickaxe helping us mine insights and impact more efficiently.
Accessibility is our foundational design principle. Minecraft is already familiar to the target audience, which significantly lowers the entry barrier for children and teachers alike. No additional tools, accounts, or technical onboarding steps are required. Players can immediately engage with the content in an environment they already understand. On the operational side, the supporting platform built on Microsoft Power Platform and Dynamics 365 follows established accessibility standards designed by Microsoft. Built-in accessibility features such as keyboard navigation and screen reader support ensure that NGO users with different abilities can confidently operate the system with as few barriers as possible.
Privacy and trust are integral to the system architecture, as they must be for EcoCraft to be recognized and taken seriously as an NGO platform. EcoCraft does not store the real names of the children within the Dynamics 365 environment. Personal contact information is protected through field-level security, ensuring that our users cannot access sensitive data. Data usage is governed by formal contracts with participating schools, and all processes follow ISO-certified standards. This creates a trustworthy foundation that allows EcoCraft to be used safely in educational and non-profit contexts without compromising data protection or compliance.
The player experience remains smooth because technical complexity is kept entirely behind the scenes. Automation and integration enhance the world’s responsiveness without interrupting gameplay. For players, interactions feel natural and immersive. The world reacts, evolves, and responds seamlessly to their actions, allowing learners to focus on creativity, collaboration, and problem-solving rather than technology.
EcoCraft delivers scalable global impact through a trusted, accessible, and well-architected system that keeps learning engaging and safe. Its ability to combine environmental education, strong governance, and a seamless player experience makes it the clear choice to win the Redstone Realm category.
Category: Governance & Best Practices
As part of our commitment to strong governance and responsible use of Artificial Intelligence (AI), we’ve taken a principled approach to how our platform features are designed, tested, and deployed. This has required us to define clear development and oversight structures, that leverage native platform features where appropriate, whilst also bringing in other tools from across the stack.
Our use of environments and managed environments gives us a strong foundation. We can immediately ensure a healthy Application Lifecycle Management (ALM) process is in place, while also affording opportunities to incorporate useful capabilities, such as Power Platform Pipelines.
From there, we apply additional configurations that tap into the most appropriate platform level features. To ensure auditability and traceability, we’ve built rich monitoring and logging into our development pipeline from the ground up. With Azure DevOps, YAML pipelines, and Application Insights, every build, configuration change, and model interaction can be traced and reviewed. This not only supports effective debugging and performance measurement but also creates a clear audit trail for compliance and governance reviews. Version control via Git source control and environment separation between development, test, and production strengthen this auditability, giving stakeholders confidence in the correctness and integrity of each component of the solution.
On the security boundaries front, we’ve deliberately separated concerns and protected critical assets such as API keys and secrets using Azure Key Vault, coupled with managed identities. This approach enforces strong security postures and minimizes the risk of credential leakage or unauthorized access. By implementing role-based access controls and environment segregation, our solution mitigates common attack vectors and ensures that sensitive operations are contained within trusted execution contexts. Aligning with principles of secure and robust system design is essential to maintaining trust and operational resilience.
Our use of Artificial Intelligence in the solution is guided by our awareness of the risks involved. First and foremost, we ensure we build in appropriate prompt-level protections for all the AI agents we use, accepting that without this, we are potentially opening our solution to risk of abuse, data leakage and other issues. From there, we leverage Copilot Studio, which, thanks to the Copilot Control System-level protections that it provides, ensures the technology can be used safely and aligned to all data privacy concerns.
Finally, and speaking of data privacy, this is woven into the architecture rather than treated as an afterthought. Data access controls via Dataverse security roles and Azure Role Based Access Control (RBAC) features, anonymization where appropriate, and encryption in the form of Column Level Security profiles all contribute to a privacy-preserving design that respects user rights and regulatory expectations. These measures align with widely accepted privacy frameworks that emphasize consent, secure storage, and minimal data exposure throughout the AI lifecycle. By adopting these data governance and protection strategies, we build systems that not only deliver value but also safeguard sensitive information in a way that’s reliable, defensible, and future-ready.
Altogether, our approach considers the common things that are required to ensure any successful business application deployment, while also appreciating some of the more pressing concerns that are arising from newer technology areas, such as AI.
Category: Data, AI & Analytics
Operational Reporting
Data sits at the centre of EcoCraft operations. Every action the player takes in the Minecraft world is captured as an event and stored with enough context to be useful later. This allows us to understand player behaviour over time and monitor the player experience, getting ahead of issues that might arise.
Gameplay data from Dataverse is surfaced using Power BI, with a dashboard embedded directly into the model-driven app (see this ‘Dash It Out’ post for further details). This dashboard gives NGO users and organisers a clear, shared view of what is happening in the world – such as player activity, common actions, and outcomes – without needing to switch tools. The focus is on simple, readable visuals that support decisions, over complex reports.
Real-time Analytics
For real-time visibility, we use Application Insights to monitor the system as it runs. This helps us spot issues quickly, understand performance, and see how commands and events are flowing through the platform. During the challenge, this proved valuable for troubleshooting and for building confidence that the system was behaving as expected.
Our ‘Right Now’ badge post gives further information about how we use Application Insights to monitor our Azure Functions, Power Automate flows, and Power Apps usage.
AI for Good (Gameplay)
AI is used in a few targeted ways. An agent supports the NGO user by interpreting gameplay data and suggesting or triggering changes to the world to improve the player experience. Rather than acting automatically, the agent works within clear limits and uses the data already captured to guide its decisions.
In this scenario, the NGO user is using the ‘Change the World’ agent to ask for changes to a player’s Minecraft world using natural language. The agent interprets the request and generates the required Minecraft instructions (e.g. add more trash to pick up) to send as a command to the player’s Minecraft session. See our post for the Existential Risk badge for more information.
We also used AI to generate realistic sample data, including players, teachers, parents, and error logs. This allowed us to test analytics, dashboards, and monitoring early, without relying on live usage. It helped validate both the data model and the overall observability of the solution. Our post for the ‘Nasty Hacker’ badge describes how we did this in more detail.
Overall, the aim was to show how data, analytics and AI can work together in a practical way – starting with good data, adding insight through visualisation and using AI where it genuinely improves the experience and team productivity.
We feel we’re a strong contender in Data, AI and Analytics category because data is built into the solution from the start. All gameplay is captured as structured events, analysed through Power BI dashboards embedded directly in the model-driven app, and monitored in real time using Application Insights. AI is used in a focused and explainable way to support NGO users by improving the player experience. The result is a clear, practical example of how data, analytics and AI can work together to support the NGO user’s insight, monitoring and help them make better decisions while EcoCraft is in use.
Category: Low-Code Innovation
Why EcoCraft is the strongest low-code solution built at ACDC2026
What makes our solution stand out is not a single feature or clever workaround. It is the architectural consistency: every capability builds on first-party Microsoft tooling, stays as much as possible in low code, and still delivers enterprise-grade outcomes.
At the centre of the solution is Customer Insights – Journeys (CI-J), a first-party engagement platform that combines segmentation, orchestration, and communication without requiring custom code. With CI-J comes no-code segmentation, highly personalised email and SMS journeys, and a robust event structure out of the box. The key advantage: it is easy to use and easy to trust! Data stays within one ecosystem, security and compliance are inherited from Dataverse, and the solution is supported by Microsoft and gains new features every month.
To connect with existing and new users, we relied on Customer Insights marketing forms, published directly by Microsoft, with copy-and-paste-ready code for embedding them on any website. The forms are accessible from our website and act as the primary touchpoint for schools requesting an EcoCraft world, and they were created entirely via drag and drop. From a business perspective, this eliminates infrastructure overhead. From a technical perspective, it avoids the custom frontends, APIs, and security concerns that usually come with public forms.
We extended the standard customer journeys using Custom Triggers in Customer Insights. While these triggers could easily act as a bridge into pro-code scenarios like Azure Functions or Java, we intentionally stayed low code and combined them with Power Automate. The result is a system that can evolve into more complex architectures later, without forcing that complexity upfront.
Power Automate became the backbone for operational workflows. We used it to involve the ACDC sponsors (OneFlow and mscrm.addons), handle contract management (send-out and capture of signatures), print certificates and information sheets, and connect to Azure services for document storage and external integrations to Minecraft. From a design perspective, this keeps business logic readable and auditable for a wider audience and reduces the need for developers. From an operational perspective, it allows citizen developers to understand and adapt the system.
Building approval processes to keep humans in the loop was important. We want to use automated processes, but we do not want to trust automation and AI blindly. New requests from schools, data usage, and outputs can be reviewed and approved, or rejected, directly in Outlook or Teams. This is essential for an NGO like EcoCraft: accountability and transparency matter as much as innovation.
For internal use of EcoCraft we decided to go with a model-driven app. NGO users can access and manage data in a structured, secure way without needing technical knowledge. On top of that, Power BI dashboards make impact visible, whether we want to see how many kids we trained in the last month, how many schools received an information sheet about our mission, or the impact of EcoCraft on the world.
Model-driven apps are not always known for the best user experience, so we extended our app with Custom Pages to make interactions feel smooth and intuitive. We have to admit that we needed to surf with an extended view of low code (like a black belt needs to in some situations) and use JavaScript to open our Custom Pages. The improvement in user experience made the decision even easier, because our mission does not allow us to lose users in a bad app.
As the last step we extended our solution with Copilot Studio agents to be a frontier NGO. The agents help generate new EcoCraft scenarios for children to solve. Our users can write scenarios in natural language, with local information about the school or even the personal preferences of a child included, and the agents transform them into a JSON format that the Minecraft server understands, automatically transforming the world. With these agents we reduce the number of pro-code developers needed and keep creativity as the most important skill for new scenarios.
Taken together, our EcoCraft solution demonstrates what low code looks like when it is treated as a strategic platform of choice. It is extensible and aligned with both business goals and technical best practices. That is why we believe it represents the strongest low-code solution developed during the ACDC2026.
Category: Code Connoisseur
Given the key requirement to integrate directly into Minecraft, it was always going to be necessary to incorporate pro-code extensibility within our solution. Here, we started off with a simple requirement – get something into a Minecraft world based on an external event. The two approaches open to us were as follows:
Use Minecraft Bedrock Edition: This appeared ideal from the outset, as the techniques required aligned closely to the team’s technical skills (Visual Studio Code, TypeScript, NPM etc.). However, it became apparent that our key dependency was the @minecraft/server-net module, which is currently in pre-release, and we were unsuccessful in using it to initiate an outbound HTTP request to an endpoint we stood up.
Use Minecraft Java Edition: In our early research, we had highlighted the potential to use the Remote Console (RCON) protocol to manipulate the Minecraft world. There were concerns around the security aspects of this and around how deeply the team would need to know Java to code in this manner. But it soon became apparent that this was the only viable way to achieve the outcome we needed and, with some support available for using RCON from C#, the pathway was open to us.
Having conducted our proof of concept, we commenced with setting up our Azure DevOps environment, which was then used to host all the code we planned to create:
Our core Azure function, configured using a HTTP trigger, handles the following workflow:
On creation of a new Command Execution record in Dataverse, a cloud flow issues a POST request to the Function App endpoint, with a payload containing our Minecraft RCON commands, e.g.:
```json
[
  "setblock -2 -60 16 minecraft:dried_ghast",
  "setblock -2 -60 20 minecraft:dried_ghast",
  "setblock 3 -60 19 minecraft:dried_ghast",
  "setblock -2 -60 14 minecraft:fire",
  "setblock 3 -60 16 minecraft:fire",
  "setblock 1 -60 12 minecraft:fire",
  "setblock 2 -60 22 minecraft:fire"
]
```
The Function App validates the payload and parses it, then opens an RCON connection to the server and sends each command individually. A response is returned to indicate whether each command was processed successfully.
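Our actual implementation follows the C# route described above; as a rough sketch, the same parse-and-dispatch logic looks like this in Python. All names here are illustrative, and the RCON client itself is injected as a callable rather than reproduced:

```python
import json

def parse_commands(body: str) -> list[str]:
    """Validate the cloud flow's payload: a JSON array of RCON command strings."""
    commands = json.loads(body)
    if not isinstance(commands, list) or not all(isinstance(c, str) for c in commands):
        raise ValueError("Payload must be a JSON array of strings")
    # Drop empty entries so we never send a blank command to the server
    return [c.strip() for c in commands if c.strip()]

def dispatch(commands: list[str], send_rcon) -> dict:
    """Send each command individually and collect per-command results.

    `send_rcon` is a callable (command -> response string) wrapping the open
    RCON connection; injecting it keeps this logic testable without a server."""
    results = {}
    for command in commands:
        try:
            results[command] = send_rcon(command)
        except Exception as exc:  # report failures per command, keep going
            results[command] = f"error: {exc}"
    return results
```

Injecting the connection also mirrors how the real function reports success or failure back to the flow per command.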
As we had introduced Azure into the equation, it was a natural next step to also implement Infrastructure as Code (IaC). We described our detailed approach in an earlier blog post, but the result is separate Bicep templates covering all the components we have deployed. This allows us to quickly redeploy our environment, track our changes, maintain our Azure IT assets formally and, through Bicep’s innate capabilities, more easily handle common issues such as incorrect resource definitions or missing dependencies.
Finally, we have brought in custom code to support our more complex automation tasks within Dataverse. Our first preference has always been to use Power Automate cloud flows for this, but there will always be specific scenarios where a plug-in can express and execute our desired logic most effectively. For example, we have a plug-in that runs on creation of a new Session and links up all the required related rows using a few lines of code:
Our plug-ins benefit from the following features we proactively implemented:
Unit and integration testing built directly into the project.
All code hosted within our Azure DevOps Git repository.
Clear, logical folder structure, grouping all plug-in classes based on the logical table name.
In all our coding efforts, the assistance of tools like GitHub Copilot and ChatGPT cannot be overstated. They allowed for very rapid iteration and development of solutions, streamlining the time it took to build out the required extensibility points and allowing us to stress-test our prototyping so we could make an informed go/no-go decision. Thanks to the experience levels on the team, we were able to produce higher-quality outputs, rather than just trusting any potential “slop” generated by the AI tools.
Overall, our solution balances low-code and pro-code extensibility by reaching for custom code only when the requirements and complexity mandate it. From there, we take a measured and focused view on how we can maintain our code artifacts in the best possible way, so that critical aspects of the development lifecycle, such as testing and deployment, can be streamlined and often automated entirely.
Category: Digital Transformation
EcoCraft is a clear example of digital transformation because it connects a familiar digital environment with real-world processes, people, and decisions, while enabling organizations to do more with less. It demonstrates how intelligent automation can improve experience, scalability, and operational efficiency without increasing complexity or resource requirements.
Real-world relevance is at the core of our design. Sustainability, waste reduction, and systemic thinking are not treated as abstract concepts, but translated into interactive experiences that reflect challenges faced by schools, NGOs, and communities worldwide. Actions taken by children inside Minecraft directly mirror real world cause and effect relationships. This creates feedback loops where learning outcomes are tied to realistic scenarios. We are making the experience both educational and applicable beyond the digital environment.
Digital automation plays a central role in making it scalable and practical. World changes, scenario progression, feedback loops, and data-driven responses are triggered through automated, event-driven workflows. This significantly reduces manual effort for educators and NGO users while ensuring consistent and reliable execution across sessions and locations. We can deliver high-quality experiences repeatedly without needing additional staff or technical intervention, directly supporting the goal of doing more with fewer resources.
At the same time, we keep the humans in the loop. NGO Users and educators remain in control of critical decisions such as approving scenarios, adjusting difficulty, or responding to individual learning needs. Automation supports their work rather than replacing it. This balance ensures accountability, contextual judgment, and responsible use of technology, while still benefiting from efficiency and scale.
The impact of this approach extends beyond the hackathon itself. The underlying architecture follows reusable digital transformation patterns such as event-driven automation, consent-aware data handling, and agent-assisted decision making. These patterns can be applied to education platforms, community programs, nonprofit services, or other customer-facing and internal solutions. We are demonstrating not only a successful implementation, but a transferable model for intelligent automation in real-world business and, more importantly, social scenarios.
EcoCraft improves the experience for both learners and operators while reducing operational overhead. It shows how intelligent automation, combined with human oversight, can deliver measurable impact and practical value in real-world contexts, making it a strong competitor for the Digital Transformation category. Maybe the best competitor?
What Went Well
Keeping the scope small and playing to the strengths and experience of our team meant we always had something working, even as the solution evolved
Power Platform and AI enabled rapid productivity and build
Clear separation between components reduced complexity and improved the ability of our team to work independently towards the shared build goals
Minecraft provided an intuitive, engaging front end with a low learning curve
Agile team – our use of a “manual” kanban board and hourly standups ensured we could keep on top of all tasks, pivot / adjust our approach and foster a closer working relationship, leading to an extraordinary team spirit.
What We’d Do Next
Improve reporting so in-game player behaviour patterns are easier to ingest and trigger additional automations.
Track player progress over time to support more tailored responses, using an experience / “level up” based system.
Extend the command model to handle more complex in-game outcomes, using agentic capabilities to generate required RCON scripts
Final Thoughts & Wrap-Up
The climate emergency demands more than awareness – it demands understanding, agency, and sustained action. Our solution is designed to meet young people where they already are, using engaging, interactive experiences to transform abstract climate concepts into something tangible, relatable, and empowering. By combining education with play, exploration, and problem-solving, we aim to turn climate learning from a passive experience into an active journey—one that builds confidence, curiosity, and a sense of ownership over the future.
The potential impact of EcoCraft is significant. Research consistently shows that experiential and game-based learning can improve knowledge retention by 20-40% compared to traditional instruction, while also increasing motivation and long-term engagement. By introducing climate education earlier and reinforcing it through repeat interaction, we foresee meaningful gains in climate literacy, systems thinking, and real-world problem-solving skills. Even modest improvements at scale – such as helping one in five learners better understand climate cause-and-effect – can compound into long-term behavioural shifts that influence families, communities, and workplaces.
Crucially, this solution doesn’t just benefit children. It supports parents with clearer conversations, shared learning moments, and a sense of reassurance that their children are being equipped for an uncertain future. For NGOs and educators, it offers a scalable, adaptable tool that can reach underserved audiences, reduce barriers to climate education, and provide measurable engagement data to inform programmes and policy. Over time, we believe this approach can contribute to stronger participation in local climate initiatives, increased adoption of sustainable behaviours, and a generation that feels informed rather than overwhelmed.
Taking no action today locks in higher costs tomorrow – environmentally, socially, and economically. By investing now in education that inspires understanding and action, we are choosing resilience over resignation. Our solution is not a silver bullet, but it is a meaningful step: one that helps turn concern into capability, and capability into change. The future is being shaped right now – and we believe young people deserve a fun and engaging tool to help them understand the important role they can play.
Criteria: Real-time collaboration using socket.io, signalR, WebSocket, etc. Show us the code!
Our Implementation: Micro:bit IR sensor → Azure Functions → Webcam server → Minecraft builds in real-time
The Problem: Physical Meets Digital
So here’s the thing – we built this sweet Minecraft house builder platform with AI, forms, templates, shopping carts, the whole nine yards. But you know what’s missing? The physical world. Sitting at a keyboard typing coordinates is fine, but what if you could just… point at something and build it?
That’s where the micro:bit comes in. We’ve got an IR proximity sensor hooked up to a micro:bit that detects when you place a MicroBit Board plate on it. The moment that sensor triggers, we want to capture what’s on the plate with a webcam, send it to Azure, and start building in Minecraft. Not in 5 seconds, not “eventually” – RIGHT NOW. Real-time, baby.
The Architecture: A Chain of Real-Time Events
Here’s the flow, and I want you to appreciate how fast this happens:
Azure Function Receives Event (HTTP trigger) – ~20ms
Function Requests Webcam Snapshot (HTTP) – ~200ms
Webcam Server Captures Image (Python Flask) – ~100ms
Image Stored in Azure Blob (Storage SDK) – ~150ms
Minecraft Building Triggered (Game API) – ~500ms
Total time from “I placed the plate” to “blocks are spawning”: ~1 second. That’s real-time enough for government work.
The Micro:bit: Edge Computing at Its Finest
The micro:bit runs MicroPython and is connected to a WiFi module (ESP8266 or similar). When the IR sensor detects a proximity change, it immediately fires an HTTP POST to our Azure Function:
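Our exact firmware isn’t reproduced here, so treat the following as a hedged sketch of the idea rather than the real code: `event_payload` builds the POST body, while `on_ir_change` and `EVENT_URL` are illustrative names for the sensor handler and our Function endpoint.

```python
import json
import time

def event_payload(sensor: str, state: int) -> str:
    """Build the JSON body POSTed to the Azure Function on a state change."""
    return json.dumps({"sensor": sensor, "state": state, "ts": time.time()})

# On the device itself (MicroPython) the handler is wired to the IR pin and
# fires when the sensor state flips -- shown as comments because the
# `microbit` and `urequests` modules only exist on-device, and EVENT_URL
# is a placeholder for our Function endpoint:
#
#   def on_ir_change(state):
#       urequests.post(EVENT_URL,
#                      data=event_payload("ir-plate", state),
#                      headers={"Content-Type": "application/json"})
```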
This is edge computing – the micro:bit makes the decision locally and triggers the cloud immediately. No polling, no delays, just pure event-driven architecture.
The Azure Function: The Real-Time Orchestrator
The Azure Function is where the magic happens. It’s an HTTP-triggered function that receives the micro:bit event and immediately kicks off the snapshot process:
HTTP trigger fires immediately when micro:bit calls
No queues, no delays – straight HTTP pipeline
Timeout set to 30 seconds (generous, but we’re typically done in <1s)
Logging at every step so we can measure actual performance
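In outline, the orchestration reads like this. This is a sketch, not our deployed function: the three callables stand in for the real webcam HTTP call, the Blob Storage SDK upload, and the Minecraft build trigger, so the flow itself stays visible (and testable) on its own.

```python
import time

def handle_microbit_event(fetch_snapshot, upload_blob, trigger_build, log=print):
    """HTTP-trigger body: snapshot -> blob -> Minecraft, logging each step.

    fetch_snapshot: () -> JPEG bytes        (GET against the webcam server)
    upload_blob:    bytes -> blob URL       (Azure Blob Storage upload)
    trigger_build:  blob URL -> None        (kick off block spawning)"""
    t0 = time.time()
    image = fetch_snapshot()
    log(f"snapshot after {(time.time() - t0) * 1000:.0f}ms")
    blob_url = upload_blob(image)
    log(f"uploaded after {(time.time() - t0) * 1000:.0f}ms")
    trigger_build(blob_url)
    log(f"build triggered after {(time.time() - t0) * 1000:.0f}ms")
    return {"status": "ok", "blob": blob_url}
```

Logging the elapsed time at each step is what gives us the per-stage numbers quoted later in this post.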
The Webcam Server: Local Speed, Cloud Integration
The webcam server runs locally on a laptop because that’s where the USB webcam is. It’s a simple Flask server that exposes one endpoint: `/snapshot`. When Azure Function calls it, it captures a frame immediately:
```python
# camera_server_continuous.py
from flask import Flask, Response, jsonify
import cv2
import time

app = Flask(__name__)

# Open camera once at startup
camera = cv2.VideoCapture(0)
camera.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)
camera.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)

@app.route('/snapshot')
def snapshot():
    """Capture and return a single frame as JPEG"""
    start_time = time.time()

    # Grab frame immediately
    success, frame = camera.read()
    if not success:
        return jsonify({'error': 'Failed to capture image'}), 500

    # Encode to JPEG
    _, buffer = cv2.imencode('.jpg', frame, [cv2.IMWRITE_JPEG_QUALITY, 90])

    elapsed = (time.time() - start_time) * 1000
    print(f"📸 Snapshot captured in {elapsed:.1f}ms")

    # Return raw bytes with JPEG mime type
    return Response(
        buffer.tobytes(),
        mimetype='image/jpeg',
        headers={
            'X-Capture-Time': f'{elapsed:.1f}ms',
            'Cache-Control': 'no-cache'
        }
    )

@app.route('/health')
def health():
    """Health check endpoint"""
    return jsonify({
        'status': 'healthy',
        'camera': camera.isOpened(),
        'timestamp': time.time()
    })

if __name__ == '__main__':
    print("🎥 Webcam server starting on port 8080...")
    print("📹 Camera initialized")
    app.run(host='0.0.0.0', port=8080, threaded=True)
```
The camera is opened once at startup and stays open. This is crucial – opening the camera takes ~500ms, but grabbing a frame from an already-open camera takes ~100ms. That’s the difference between “instant” and “noticeable lag.”
We expose this local server via ngrok so Azure Functions can reach it, but the actual capture is happening on local hardware at USB speeds.
The Real-Time Magic: It’s All About Latency
So where’s the “real-time collaboration” part? Let’s break it down:
1. Event-Driven Architecture
The micro:bit doesn’t poll anything. It reacts to hardware interrupts from the IR sensor. The moment that sensor state changes, code executes. No loops, no waiting, just pure event handling.
2. HTTP as a Real-Time Protocol
Yeah, I said it. HTTP can be real-time if you do it right. We’re not using HTTP for long-polling or any of that nonsense – we’re using it as a fast RPC mechanism. POST request hits the function, function responds in <1 second. That’s real-time enough for physical interaction.
3. No Message Queues
We could have put a message queue between the micro:bit and Azure Functions. Azure Service Bus, Azure Event Grid, whatever. But why? That adds latency for no benefit. Direct HTTP connection means minimal hops, minimal latency.
4. Local Processing Where It Matters
The webcam server runs locally because USB cameras don’t work in the cloud (shocking, I know). This means the actual image capture happens at full USB 2.0 speeds – no network latency in the critical path.
The Code You Actually Wanted to See
Let’s talk about the hairy parts – the stuff that makes this actually work in production.
Handling ngrok’s Browser Warning
ngrok (free tier) shows a browser warning page. Azure Functions sees this HTML instead of the webcam image. Solution? Special header:
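The header in question is ngrok’s documented `ngrok-skip-browser-warning` request header (any value works). The original snippet isn’t reproduced here, so this is a sketch in the same raw-TCP spirit, with the host as a placeholder:

```python
def build_snapshot_request(host: str) -> bytes:
    """Raw HTTP/1.1 GET for /snapshot with the header that makes ngrok's
    free tier skip its browser-warning page and serve the JPEG directly."""
    return (
        f"GET /snapshot HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"ngrok-skip-browser-warning: true\r\n"  # any value suppresses the page
        f"Connection: close\r\n"
        f"\r\n"
    ).encode("ascii")

# To actually send it over raw TCP/TLS:
#   import socket, ssl
#   raw = socket.create_connection((host, 443))
#   sock = ssl.create_default_context().wrap_socket(raw, server_hostname=host)
#   sock.sendall(build_snapshot_request(host))
```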
Welcome to 1999. But hey, it works, and it’s fast. No library overhead, just raw TCP.
Camera Initialization Performance
```python
# This takes ~500ms - do it ONCE, at startup
camera = cv2.VideoCapture(0)

# This takes ~100ms - do it per request
success, frame = camera.read()
```
Keep the camera open. Seriously. Re-opening on every request will kill your latency budget.
Why This Is Actually Real-Time
The “Right Now” badge asks for real-time collaboration. Here’s why our solution qualifies:
Event-Driven, Not Polling:
The micro:bit doesn’t sit in a loop asking “did something happen?” every second. The IR sensor triggers an interrupt, code runs immediately. That’s true event-driven architecture.
Sub-Second Response:
From sensor trigger to blocks appearing in Minecraft: ~1 second. That’s fast enough that users perceive it as instant. Human reaction time is ~200ms, so anything under 1 second feels immediate.
Continuous Connection:
The webcam server keeps the camera open and ready. Azure Functions are warm (mostly). The ngrok tunnel is persistent. Everything is ready to go at a moment’s notice.
Physical-Digital Synchronization:
This is real-time in the truest sense – it’s synchronizing the physical world (MicroBit plate on sensor) with the digital world (Minecraft blocks) in human-perceptible real-time.
The Full Stack
Every arrow in this diagram represents a real-time connection. No queues, no batch processing, no eventual consistency. Just pure, synchronous, low-latency event flow.
Performance Metrics (The Numbers Don’t Lie)
IR Sensor to WiFi Send: 10ms (measured via micro:bit display timing)
WiFi to Azure Function: 50ms (Azure Function logs timestamp delta)
Azure Function Processing: 20ms (internal logging)
HTTP to Webcam Server: 200ms (ngrok overhead + network)
Webcam Capture: 100ms (X-Capture-Time header)
Blob Storage Upload: 150ms (Azure SDK logs)
Total (sensor to storage): ~530ms
That’s half a second from physical action to cloud storage. Add another ~500ms for Minecraft API calls and block spawning, and you’re at 1 second total. Fast enough to feel magical.
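The budget adds up; as a quick sanity check (numbers copied from the list above, with the final stage being the ~500ms Minecraft estimate):

```python
# Per-stage latency budget, in milliseconds (from the measurements above)
stages_ms = {
    "ir_to_wifi": 10,
    "wifi_to_function": 50,
    "function_processing": 20,
    "http_to_webcam": 200,
    "webcam_capture": 100,
    "blob_upload": 150,
}

sensor_to_storage = sum(stages_ms.values())  # sensor trigger -> cloud storage
end_to_end = sensor_to_storage + 500         # plus Minecraft API + spawning
print(sensor_to_storage, end_to_end)         # 530 1030
```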
What We Learned
1. Real-Time Doesn’t Mean WebSockets
Everyone assumes real-time means WebSockets or SignalR. Nah. Real-time means “low latency” and “event-driven.” HTTP can absolutely be real-time if you keep connections warm and minimize hops.
2. Edge Devices Are Powerful
The micro:bit is a $15 computer with 16KB of RAM. And it’s fast enough to be part of a real-time system. The lesson? Don’t underestimate what you can do at the edge.
3. The Weakest Link
Our latency budget is dominated by two things: ngrok tunnel latency (~150ms) and webcam capture (~100ms). Everything else is <50ms. Optimize the slow parts first.
4. Logging Is Your Friend
We log timestamps at every step. This lets us see exactly where time is spent and identify bottlenecks. You can’t optimize what you don’t measure.
Beyond the Badge
The “Right Now” badge celebrates real-time systems. But the real win here isn’t the technology – it’s the experience. Kids (and adults) place a MicroBit plate on the sensor, and *immediately* see blocks appear in Minecraft. That’s magic. That’s what technology should feel like.
We didn’t build a real-time system because a badge asked us to. We built it because it makes the experience better. The badge just recognizes what we were already trying to do.
Try It Yourself
Want to see this in action? Here’s what you need:
Hardware:
BBC micro:bit with WiFi module (ESP8266)
IR proximity sensor (any digital output sensor works)
USB webcam
Computer to run the webcam server
Software:
Azure Function (Node.js 18+)
Azure Blob Storage account
Flask + OpenCV for webcam server
ngrok for tunneling
Badge Status: Right Now ⚡
We built a system where physical actions trigger digital reactions in less than 1 second. That’s real-time. That’s “right now.” That’s how you bridge the physical and digital worlds without making users wait.
Getting All Contract-Related Activities from a Minecraft Building Claim using mscrm.addons
One of the recurring challenges when working with Dataverse and Dynamics-style data models is connecting the dots between records that are related… but not always in obvious ways.
In our Minecraft Building Claim solution, contracts are at the center of everything:
Design contracts
Approval conversations
Emails, notes, and follow-ups
All of these live as activities, and we needed a reliable way to fetch all contract-related activity data directly from a Building Claim.
This is where mscrm.addons quietly becomes your best friend.
The Problem
A Minecraft Building Claim:
Is linked to a Contract
The Contract owns multiple Activities
Activities are polymorphic and stored across tables (email, task, note, etc.)
The challenge:
How do we retrieve every activity related to the contract — starting only from the building claim?
Direct relationships won’t get you all the way there.
Activities are not directly related to the building claim — they’re related to the contract, and those relationships are handled through Dataverse activity pointers.
Why mscrm.addons?
mscrm.addons lets you:
Extend FetchXML queries
Traverse complex relationships
Include activity-related data without manually querying every activity table
Instead of running:
One query for emails
One for tasks
One for notes
One for phone calls
You let Dataverse do the heavy lifting.
The Approach
Step 1: Start from the Building Claim
We begin with the Minecraft Building Claim record, which contains a lookup to the related contract.
Step 2: Traverse to the Contract
Using FetchXML, we link the building claim to the contract entity.
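Step 2 can be sketched as a single FetchXML query. Every logical name below (`new_buildingclaim`, `new_contractid`, the claim ID) is illustrative rather than our real schema; the point is the shape: start from `activitypointer`, join to the contract through the polymorphic `regardingobjectid` lookup, then join to the claim through its contract lookup.

```python
import xml.etree.ElementTree as ET

# Placeholder claim ID -- in practice this comes from the building claim record
CLAIM_ID = "00000000-0000-0000-0000-000000000000"

fetch_xml = f"""
<fetch>
  <entity name="activitypointer">
    <attribute name="subject" />
    <attribute name="activitytypecode" />
    <attribute name="scheduledend" />
    <!-- activity -> contract via the polymorphic regarding lookup -->
    <link-entity name="contract" from="contractid" to="regardingobjectid">
      <!-- contract -> building claim via the claim's contract lookup -->
      <link-entity name="new_buildingclaim" from="new_contractid" to="contractid">
        <filter>
          <condition attribute="new_buildingclaimid" operator="eq" value="{CLAIM_ID}" />
        </filter>
      </link-entity>
    </link-entity>
  </entity>
</fetch>
"""

root = ET.fromstring(fetch_xml)  # sanity-check the query is well-formed XML
```

Because `activitypointer` spans every activity type, this single query replaces the four separate per-table queries listed earlier.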
Problem statement: We came to the ACDC Hackathon 2026 with the ambition to solve a problem in the supply chain that affects everyone involved. Global spare-parts supply chains are slow, fragile, carbon-heavy, and often fail when parts are obsolete or manufacturers no longer exist.
Original solution we wanted to build: CraftPortal replaces shipping with digital “crafting”, using AI and cloud technology to match, recreate, and produce parts locally – fast, resilient, and sustainable. It was, however, focused on the idea from our own perspective, and we planned to refine the concept during the hackathon.
Evolution journey: and then came what we love the ACDC Hackathon for.
– “Put the customer in focus”, said Sara, our beloved judge in the Digital Transformation category, on the opening day.
So we worked on enriching our web application solution to include the ISV package and give our potential customers the UI and UX they recognize well, namely M365, Power Platform, and BizApps.
– “I love your futuristic concept…” said Mikael from Redstone Realm perspective and inspired us to combine Microsoft 365 / Dynamics 365, SharePoint, Teams, and Azure
– “A simple code screenshot is not enough, show me how it solves your problem”, noted Keith (Code Connoisseur), so we challenged the status quo and set out to generate a bigger impact. What we did is…
– “Turn data into insights…”, reminded Cathrine from Data, AI and Analytics, so we revised our data model, focusing on building a solid foundation for our solution so we could start mapping external data sources; this also motivated us to explore RAG.
– “Everything you have there has to be there for a strong reason”, warned Fredrik from the low-code angle at the start, so we critically reviewed our ecosystem to keep a strict focus on the power of low code.
– “No security holes”, declared Scott on Governance and Best Practices – and you don’t mess with Scott. No fluff: we need proper, best-practice-focused ALM and governance for the whole application.
As a result of continued brainstorming and dialogue during these three days, our refined solution started to look like this… scroll down :)
DIGITAL TRANSFORMATION
The Concept
Supply chains are slow, fragmented, and carbon-heavy. Parts ship across the world when they could be printed locally. CraftPortal changes this – a marketplace where recipes travel through the portal, parts get made nearby. Faster. Greener. Smarter.
Customer needs a part. Can they print it? Yes – browse marketplace, select blueprint from IP Owner, print, deliver. No – publish a tender, receive bids from Manufacturers, select, award, contract, they print, deliver. Two paths. Same portal.
We built two paths to CraftPortal. A SaaS web application for users who want to jump straight in. And an ISV package for customers who want CraftPortal wired into their Microsoft 365 environment.
Users choose: Web App or Power Apps:
Customers who prefer Model Driven App get a clean, familiar UI – the Power Apps experience they already know, tuned for digital inventory workflows:
But they can surely use our fancy web app. Vendors – IP Owners and Manufacturers – use the Portal interface. They browse public tenders, submit bids, upload recipes, manage contracts, track orders. All through Power Pages.
Value & Monetization
CraftPortal sits in the middle of every transaction. Recipe rented? We’re there. Part printed? We’re there. That’s the value.
Monetization options:
Subscription – monthly/annual access to the platform
Transaction-based – percentage per recipe rental, per tender, per print job
Or both. Base subscription for access, transaction fee for volume.
LOW-CODE
Low-Code: The Redstone Behind CraftPortal
We built CraftPortal in 3 days. A marketplace. Tender flows. Vendor management. Document automation. AI agents and a Power Pages portal. How? Low-code.
The Building Blocks are as follows:
Power Pages for the Portal. Model Driven Apps for back office. Power Automate for every flow. Copilot Studio for autonomous agents. Dataverse for data. Generative Pages for dashboards.
We wired it together with clicks, not code.
The Low-Code Highlights
Autonomous agents – Copilot Studio detects Dataverse changes, posts to Teams, triggers RPA
RPA integration – Power Automate Desktop opens Bambu Lab Studio and clicks Print.
Teams + SharePoint + Dataverse – Fully automated channel and document location creation via Power Automate
Generative Pages – KPI dashboard pulling live Dataverse data
OneFlow contracts – Power Automate creates and sends contracts for signature
A full digital inventory platform. Built by a small team. In 3 days. Low-code made it possible.
CODE CONNOISSEUR
Low-code gets you far. But sometimes you hit a wall – a custom UI that doesn’t exist, real-time updates that Power Automate can’t handle, or an API that needs to be built from scratch. That’s when we switch gears. Pro-code fills the gaps.
We have various code projects and components in our solution:
Power Pages Portal – it manages turning basic Minecraft resources into different tools, helps clients find an appropriate vendor for printing, and adapts to all devices and screen sizes (see our Chameleon badge submission).
Our CraftPortal KPI Dashboard brings it all together. Built with Generative Pages and React, pulling live data from Dataverse:
Summary cards – Total Projects, Open Projects, Total Bids, Wandering Traders
Project Status Distribution – Donut chart showing lifecycle states
Bid Conversion – Submitted vs selected bids
Win Rate by Trader – Performance leaderboard
Projects per Month – Trend analysis over time
Top Wandering Traders – Gamified rankings
Light mode. Dark mode. Minecraft item icons from the official API. Business intelligence with a blocky twist.
We built the foundation. Dataverse as our core. Power BI dashboards for KPIs – tender status, bid conversion, vendor performance, projects per month. Live telemetry streaming from our IoT-connected Crafting Tables via Azure IoT Hub. Real-time monitoring of print jobs, temperatures, and device health.
Last year we went deep on Microsoft Fabric – Medallion architecture, Data Activator triggers, the whole pipeline. We didn’t want to repeat ourselves.
This year, the plan was to ingest data via Azure Data Factory and deploy a proper RAG pipeline – chunking strategies, metadata filtering, semantic search, hybrid search, custom retrievers. RAG done the right way.
Unfortunately, we barely finished the data platform in time. The RAG adventure stays on the roadmap.
Sometimes three days isn’t enough. But the foundation is solid. The diamonds are waiting to be mined.
GOVERNANCE & BEST PRACTICES
Essence
Our goal during the hackathon was to show a complete implementation of the project across its different aspects.
Even though the industry focus has switched to AI-related topics, it still requires an advanced level of solution design to enable existing services for an LLM.
That is why we brought in advanced technologies such as Azure Local, Lighthouse, and IoT Hub.
By deliberately meeting and exceeding every requirement across Digital Transformation, Low-Code, Pro-Code, Data & AI, Redstone Realm, and Governance & Best Practices – while continuously refining our solution through your direct feedback – we believe CraftPortal represents the complete ACDC vision, and we thank the judges for challenging us, guiding us, and inspiring us to build something truly worthy of this win.