CCCF (CrayCon Creepers Central Farmers) is an end-to-end automated Minecraft resource marketplace. Customers order via Power Pages portal, sign contracts via OneFlow, get SMS confirmation via Link Mobility, and AI bots fulfill orders automatically. All data flows through Dataverse.
Redstone Realm
Full Power Pages portal at https://cccpfactoryportal.powerappsportals.com/ with customer storefront, order management, and admin dashboards. 13 web pages, 18 templates. Dataverse tables for orders, orderlines, resources, harvesters, and harvest summaries. Web API enabled for orders. Azure AD, LinkedIn, Twitter, Facebook authentication. Complete business flow from browse to purchase to contract to fulfillment to tracking.
Governance and Best Practices
Governance wasn’t an afterthought in CrayCon Creepers Central Farming. It’s how we kept the whole “always harvesting, always ordering” idea from turning into a fragile demo. We built with a clear environment strategy (Dev → Test → Prod), so experiments stay contained, validation happens before anything reaches real users, and production remains stable. On top of that, we treated the portal like an actual production surface: anonymous visitors only see public-friendly content, while operational views (admin dashboards, harvester assignments, facility metrics) are protected with authentication and granular table permissions aligned to real roles and responsibilities. (https://acdc.blog/crayon26/security-and-governance-building-systems-that-wont-collapse-under-pressu…)
On the security and data side, we focused on preventing accidental “oops”-moments rather than relying on perfect maker behavior. DLP policies are used to control connector usage and stop risky data flows by default, so operational data doesn’t quietly leak into services that don’t belong in the same trust boundary. That also supports compliance and privacy principles: least privilege, separation of environments, and clear control points for where data can move. The result is a portal and platform setup that’s designed to be resilient under pressure, not just functional when everything goes right. (https://acdc.blog/crayon26/security-and-governance-building-systems-that-wont-collapse-under-pressu…)
We also leaned heavily into traceability and accountability. Every meaningful change is designed to be captured through source control and reviewed before promotion, giving us a clean audit trail of what changed, why, and who approved it. Our ALM flow uses GitHub Actions + PR review as a governance gate, with deployment status and context pushed back into Teams/Dataverse so the team gets real-time visibility (and the ability to stop unsafe changes early). We even use AI in a “governed” way, not to make decisions for users, but to improve transparency by generating documentation/changelogs alongside the code so the system stays explainable and maintainable as it grows. (https://acdc.blog/crayon26/craycon-creepers-automating-solutions-alm-with-github-actions-and-ai/112…)
In our CCCP Factory Portal, we also treated Power Pages security as defense in depth. Public visitors can view basic, non-sensitive resource information, but anything operational like the admin dashboard, harvester assignments, production metrics, and alerts is locked behind authentication and controlled with security roles and granular table permissions. That way we can give the right people exactly the access they need (for example read-only for supervisors, scoped access per team) without handing out broad admin rights, and we avoid exposing data that directly affects real operational decisions. (https://acdc.blog/crayon26/security-and-governance-building-systems-that-wont-collapse-under-pressu…)
Finally: we did install the Power Platform CoE Starter Kit to reinforce governance visibility. It took some wrestling (developer/default environment realities are… spicy), but we were able to get some data out of it, giving us the beginnings of the monitoring baseline we want for long-term operations and responsible scaling. That’s the theme across the whole solution: fun concept, serious foundations, secure defaults, clear ownership, and trustworthy building blocks.
Login throttling (5 attempts per 5 min, 15 min lockout). X-Frame-Options SAMEORIGIN. SameSite cookie policy. Role-based table permissions separate customer, admin, and anonymous access. API keys in environment variables, not code. All orders tracked in Dataverse with timestamps. Bot harvest events logged with bot ID, resource type, quantity. INSECURE_CODING disabled by default.
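The throttling policy above (5 attempts per 5 minutes, 15-minute lockout) can be sketched as a small sliding-window check. This is illustrative only; Power Pages implements this internally, and the names below are assumptions:

```javascript
const WINDOW_MS = 5 * 60 * 1000;    // 5-minute sliding window
const LOCKOUT_MS = 15 * 60 * 1000;  // 15-minute lockout
const MAX_ATTEMPTS = 5;

const attempts = new Map(); // user -> { times: [], lockedUntil: 0 }

function recordFailedLogin(user, now = Date.now()) {
  const entry = attempts.get(user) ?? { times: [], lockedUntil: 0 };
  if (now < entry.lockedUntil) return { locked: true };
  // keep only failures inside the sliding 5-minute window
  entry.times = entry.times.filter((t) => now - t < WINDOW_MS);
  entry.times.push(now);
  if (entry.times.length >= MAX_ATTEMPTS) {
    entry.lockedUntil = now + LOCKOUT_MS;
    entry.times = [];
  }
  attempts.set(user, entry);
  return { locked: now < entry.lockedUntil };
}
```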
Data, AI and Analytics
Dataverse as data backbone: ccc_orders, ccc_orderlines, ccc_resources, ccc_harvesters, ccc_harvester_summary tables. Bots POST harvest telemetry every 10 items to Power Automate webhook, which writes to Dataverse. Portal dashboards query live data. 17 LLM providers integrated: OpenAI, Claude, Gemini, Groq, Mistral, DeepSeek, Replicate, HuggingFace, Cerebras, vLLM, Grok, Mercury, Azure OpenAI, OpenRouter, Qwen, LLaMA. Switch providers via JSON config.
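Switching providers via JSON config could look roughly like this. The config shape and field names are assumptions for illustration, not the actual Mindcraft profile format:

```javascript
// Hypothetical profile: pick a primary model, fall back if unavailable
const profile = JSON.parse(`{
  "model": { "provider": "openai", "name": "gpt-4o" },
  "fallback": { "provider": "groq", "name": "llama-3.1-70b" }
}`);

function resolveProvider(profile, available) {
  // prefer the configured provider; use the fallback if it is not available
  const primary = profile.model.provider;
  return available.includes(primary) ? profile.model : profile.fallback;
}
```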
Low-Code
Power Pages portal built entirely in maker tools. Liquid templates, content snippets, site settings. Power Automate flows handle OneFlow contract generation, Link Mobility SMS, Dataverse operations. No custom .NET code. 80% business logic in Power Platform. Pro code only for Minecraft protocol and AI orchestration.
Code Connoisseur
remote_control_bot.js: WebSocket bot control with Prismarine Viewer integration. 9 job types: farmer-wheat, farmer-potatoes, farmer-beets, farmer-carrots, sparse-farmer, brigadier, guard, scout, wanderer. Mindcraft AI framework with 17 LLM providers, profile-based config, 100+ skill actions. React/TypeScript website with live ACDC badge dashboard. Cloudflare Workers deployment. TUI CLI to control bots.
Digital Transformation
Customer self-service 24/7. Instant order creation triggers automated contract and SMS. AI bots harvest continuously. Real-time production data in portal. Full audit trail. No manual data entry, no phone calls for status, no paper contracts.
Uses Microsoft 365 & Teams as the primary collaboration and experience hub.
Combines Microsoft Fabric, Dataverse, the Power platform, Azure Functions and more into a unified workflow.
Integrates AI agents in Fabric and Copilot Studio for data interpretation of geoJSON and creating resource insights.
Focuses on accessibility, usability, and a smooth user experience using an adaptive Power App and Teams' default accessibility properties.
Provides real-time notifications through a Teams Bot, thereby improving responsiveness and collaboration.
Provides automatic storage of opportunities using Dynamics.
Governance & Best Practices
Uses official, publicly available Norwegian geodata ensuring transparency and compliant sourcing.
Keeps data in Microsoft Fabric + Dataverse, benefiting from enterprise‑grade security, governance and access control.
AI agents are explicitly instructed to handle inconsistent formats, reducing the risk of misinterpretation.
The Copilot Studio agent uses traceable and explainable knowledge sources (NGU, Geonorge).
Workflow ensures AI recommendations feed into human‑in‑the‑loop processes by focusing on consultancy requests and advisor review
Interactions (requests, opportunities, notifications) are logged and stored for auditability.
Data, AI & Analytics
Ingests large‑scale geodata from a 430k‑line XML + multiple CSV metadata files.
Built a Fabric Lakehouse to store, structure and refine all raw geographic information.
Created pipelines + dataflows to process and transform data into a clean SQL analytics table.
Set up Dataverse virtual tables for live synchronization of processed geodata to Power Platform.
Developed a data-aware AI agent inside Fabric for interpreting geoJSON and applying complex geometry logic.
Copilot Studio agents take advantage of external knowledge sources, use the Fabric data agent as a supportive agent, and trigger automated processes in Power Automate.
Final output enables AI-driven insights, area‑based resource predictions and intelligent automation.
Low‑Code
End‑user interaction handled via a Power App with native map control for shape and area selection.
Copilot Studio agent handles reasoning about geodata, resource discovery and external API querying without custom code.
Uses Dataverse virtual tables to create a low‑code bridge between Fabric and Power Platform.
Power BI dashboards help decide whether a request is a suitable opportunity in Dynamics.
User journeys (select area → request consultancy → receive SMS → create CRM opportunity) are built mostly with low‑code components.
Code Connoisseur
Provisioned an Azure Function via Bicep, demonstrating infrastructure‑as‑code mastery.
Built custom functions to:
Convert Markdown → HTML for clean rendering inside Power Apps.
Encode inputs to Base64 when needed.
Developed a Teams Bot using Teams SDK 2.0, integrating:
Microsoft Graph
Dataverse CRM
Azure Maps
Azure Tables for proactive notification logic
Implemented autonomous alert logic tied to an external speaker “sales‑bell” system via custom code.
Demonstrates how pro‑code extends low‑code elegantly and purposefully.
Digital Transformation
Converts complex Norwegian geological data into simple, actionable insights for businesses.
Reduces risk and cost in early‑stage assessments (site evaluation, planning, resource estimation).
Automation:
Data ingestion
Data transformation
Resource detection
Notification
CRM opportunity creation
Improves customer & employee experience with:
A simple map-based interface
Automated consultancy booking
SMS confirmation
Proactive Teams alerts
Demonstrates measurable real‑world impact by accelerating project startup and democratizing access to geodata.
Expanding on the solution
Our initial idea
We wanted to bring the simplicity of the Minecraft way of discovering and gathering natural resources into the real world by helping organizations collect, process, and act on geological and geographical data. By gathering publicly available geodata from official Norwegian sources, bringing it into a data lake in Microsoft Fabric and synchronizing it with Dataverse, we wanted to end up with a solution that enables companies in real estate, agriculture, mining, infrastructure and related sectors to reduce risk, lower costs, improve sustainability planning, and accelerate project startup.
By using the Power Platform with Power BI for insights, Power Automate for workflows, Power Apps for interaction, and Copilot Studio for AI assistance, we could provide future customers with accessible, actionable resource knowledge delivered directly through Teams as a unified collaboration and interface hub. Ultimately, leveraging Microsoft’s cloud and low‑code ecosystem to make geodata more usable, intuitive, and strategically valuable.
Using Fabric and Data agent
Norway’s open geodata portal, Geonorge, provides extensive natural-resource information, and for our solution we decided to use the “N50 Kartdata” dataset from Kartverket as the primary data source. By downloading an XML file with over 430,000 lines of geodata, plus CSV files with supporting metadata, we could populate a Fabric Lakehouse with all this data. Using pipelines and dataflows, we then processed the data into a single SQL analytics table for easier querying. A virtual table in Dataverse was created to automatically synchronize the data and make it easily available to the Power Platform ecosystem, enabling efficient and accurate retrieval of resource information whenever needed.
Within the same Fabric workspace, an AI-driven data agent was created and carefully instructed to ensure seamless use of the dataset and handling of geoJSON data that didn’t always follow standard formatting. This proved very useful: the standard map control in Power Apps lets us select a shape on a map, but the JSON returned from the control was not always valid geoJSON, as it could return an array with the geoJSON as a property on the elements within. The agent also had to be instructed how to handle circle shapes, as a circle is defined here as a single point with a radius property, and the agent kept misinterpreting the radius as a real-world distance measured in meters.
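The normalization the agent performs could be sketched in plain JavaScript like this. The field names are assumptions based on the behavior described above; in the real solution this lives in the agent's instructions, not in code:

```javascript
function normalizeShape(raw) {
  // the map control sometimes returns an array whose elements carry the
  // geoJSON as a property rather than the geoJSON object itself
  const shape = Array.isArray(raw) ? raw[0].geoJSON ?? raw[0] : raw;
  if (shape.geometry?.type === 'Point' && shape.properties?.radius != null) {
    // circles arrive as a single point with a radius property; tag them
    // explicitly so downstream logic handles the radius correctly
    return {
      kind: 'circle',
      center: shape.geometry.coordinates,
      radius: shape.properties.radius,
    };
  }
  return { kind: 'polygon', geometry: shape.geometry };
}
```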
Low-code access using the Power Platform
We have created a Power App for our end users/customers that allows them to use the default Power Apps map control to select an area on a map and submit requests for consultancy assistance on possible resources. When a request is submitted, a Power Automate flow is triggered that sends the area coordinates to an agent in Copilot Studio, which uses these coordinates to determine what resources exist there. This agent is configured with knowledge sources from the website of Norway’s national geological survey (https://www.ngu.no/) and the APIs of Geonorge (https://www.geonorge.no/verktoy/APIer-og-grensesnitt/) as general sources, supplemented by our Fabric data agent as a supportive agent-in-agent for detailed data. The user is able to book a consultancy with TNT using the “Book Consultancy” button, available in the same app. This generates an opportunity for our advisors to work on. Additionally, this button triggers a Power Automate flow that uses the Link Mobility connector to notify the user by SMS, confirming their request for consultancy.
Combining with Pro-code functionality
We have provisioned an Azure Function using a Bicep template to help with certain workflows that are typically difficult to manage in the Power Platform alone. For example, the Copilot agent returns answers in Markdown, which is not particularly suitable for our end users to view in the Power App. So we have a script in the Azure Function that converts it to HTML, which is much more readable in a rich-text field. We also have a function that can receive inputs from a request and convert them to Base64 format.
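A minimal sketch of the two helpers, assuming Node.js. The real function likely uses a full Markdown library; this regex-based converter covers only bold, italics and line breaks:

```javascript
function markdownToHtml(md) {
  // convert a tiny subset of Markdown to HTML for rich-text display
  return md
    .replace(/\*\*(.+?)\*\*/g, '<strong>$1</strong>') // **bold**
    .replace(/\*(.+?)\*/g, '<em>$1</em>')             // *italic*
    .replace(/\n/g, '<br/>');                         // line breaks
}

function toBase64(input) {
  // Base64-encode arbitrary input for transport
  return Buffer.from(String(input), 'utf8').toString('base64');
}
```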
Transforming the world using autonomous agents & proactive Teams Bot
In addition, we have an autonomous geological report agent that will occasionally discover noteworthy information that our advisors should be alerted about. This works on the same data as previously mentioned, in addition to other open sources with valuable knowledge about geology.
A Teams Bot, built with Teams SDK 2.0, integrates with Microsoft Graph, Dataverse (CRM), Azure Maps and more – and even makes some noise on an external speaker if there is a new report generated. We have removed the volume knob, so it does not help to mute the laptop and set the phone in airplane mode. They will have to act! It’s like a sales bell – only cooler!
The notifications are sent proactively by keeping track of the reports in an Azure Table. Notifications are delivered to all users – but only once. They can then choose to investigate further, ignore it, or create an opportunity directly by hitting a button!
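The notify-once logic can be sketched like this, with the Azure Table replaced by an in-memory Set purely for illustration:

```javascript
// Report IDs that have already triggered a notification
// (in the real solution this state lives in an Azure Table)
const notified = new Set();

function shouldNotify(reportId) {
  if (notified.has(reportId)) return false; // already delivered once
  notified.add(reportId);
  return true;
}
```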
Managing land usage in a Minecraft multiplayer environment requires structure, fairness, and clear governance to avoid conflicts between players building in shared worlds. To address this, we have implemented a digital building permit solution using the Microsoft Power Platform. Dataverse serves as the core data foundation, Power Pages provides a user-friendly portal where players apply for building permits, and Dynamics 365 Copilot Service Workspace supports case management and escalation when human review is required.
When submitting an application, players define the full build area using start and end (x, y, z) coordinates and describe in detail what they plan to build. This information is stored in Dataverse and made available through Microsoft Fabric to a set of agents built in Microsoft Foundry. Multiple agents analyze each application in parallel, evaluating coordinates and descriptions to detect overlaps with existing builds, restricted zones, excessive depth or height, and other potential rule breaches. The agents operate with access to the critical tables and fields in the case management process, ensuring decisions are based on authoritative and up-to-date data.
If all automated checks pass, the application is deemed ready for approval and a building permit is granted. If clear violations are identified, the application can be rejected or escalated for manual review. Power Automate orchestrates the overall process, handling status changes and approvals, while Customer Insights Journeys is used to notify applicants via SMS and email. Applicants are informed when their permit has been approved, rejected, or requires further review, and chat agents provide real-time interaction and support throughout the application lifecycle. The result is a scalable, automated, and transparent system that protects player creations, minimizes conflicts, and maintains order across the multiplayer server.
Redstone Realm
As a Minecraft server admin it is difficult to resolve neighbor problems: who was where first, and so on. Without organizing who owns what, and which rules apply where, mayhem is inevitable.
Additionally, approving applications based on zoning, researching which rules apply where, which permits apply to which zoning, and handling overlapping zones is time consuming, and might require a team that knows the server well if scaled up.
This is why we have made the block permit solution. It leverages AI to do research, validate risk, and comment on permit applications. It makes the process take minutes, not weeks or months.
We leverage agent pods (Microsoft Foundry Workflows) that are dedicated AI teams for a specific purpose. For example checking the permit applications compliance with zoning rules for where the application is for.
With these agent teams the research process is easy, and compliance suggestions, risk analysis, and negotiations are just minutes away.
User interface – How to get a building permit on the Minecraft Server
We are leveraging Power Pages for players to apply for a building permit in the portal. Currently, we only accept applications from players with Entra ID.
However, there are plans to evolve the registration process to support email- and password-based sign-up with double opt-in authentication.
Home page
The main navigation has five pages, where players can apply for a building permit as well as view their applications and building permits.
Application form
Once building permit applications are successfully submitted, they are managed through the case management functionality within Dynamics 365 Copilot Service Workspace. All communications related to the building application process, culminating in the issuance of a permit upon approval, are coordinated via Dynamics 365 Customer Insights.
View my Building applications and Building Permits
Players can also see their building applications and their status (pending, approved or rejected), as well as view their building permits once approved.
Customer Service representatives can be reached in several ways if players have any questions regarding their permits or applications. Click “Send an Email” and your email client will open; once the email is submitted, logic creates a case in Customer Service.
Governance & Best Practices
The ability to observe everything that happens is important, and we use Microsoft Foundry to observe our agents.
These dashboards provide transparency to the usage of our agents.
But how do we prevent our agents from simply “hallucinating” the rules and decisions? And how do we handle the fact that sometimes an agent does not find all the data we want?
The solution is an agent that is specifically designed to be critical and give feedback to the researcher agents, essentially delegating the job of saying “try harder” to an agent. Each instance of the combination Fabric data agent, Microsoft Foundry agent, and red-team agent is placed in a Foundry workflow that works as an agent pod and makes sure it figures out the correct answer.
Identities are important throughout the usage of agents. To be able to use the Fabric data agents you have to have access in Fabric, even when you use the workflows. This means that the container apps that host the Foundry workflows as an API must have a managed identity with access in Fabric.
The illustration below shows the apps in use to manage the approval process.
Data & AI
If you don’t have good data, you can’t have good AI.
We might only have a single data scientist on our team, with three months’ experience in the role. But that does not mean we don’t have data traveling from Dataverse to Fabric and a medallion structure.
We use a dataflow. This is not optimal, but in our tenant, linking directly to Fabric through shortcuts was broken. Shortcuts were option A, but we had to adapt and use plan B.
We have a medallion structure that is made in the spirit of a data scientist.
In our bronze layer we just get the data to where it should be. In our silver layer we filter out the data we do not want to use. And in our gold layer we make the data useful with nice merges and useful tables. There are many ways to do a medallion structure, and this does the job I want it to do, but there is room for improvement in the silver layer.
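The bronze → silver → gold flow described above, sketched as plain transformations (illustrative field names; the real implementation lives in Fabric pipelines, dataflows and notebooks):

```javascript
// Bronze: raw rows, landed as-is
const bronze = [
  { id: 1, status: 'approved', area: 120 },
  { id: 2, status: 'draft', area: null }, // incomplete row
  { id: 3, status: 'rejected', area: 80 },
];

// Silver: filter out the data we do not want to use
const silver = bronze.filter((r) => r.area != null);

// Gold: shape the data into a directly useful table
const gold = silver.map((r) => ({ ...r, approved: r.status === 'approved' }));
```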
This is the part where I rip the band-aid off and say we have not had time to do Power BI, but we have visualization of our data at least.
Data is at the core of our solution. And as a very new data scientist who is far better at AI than at data, I know AI will outperform me by far. That’s why mining in our data is done with the Fabric data agent.
The example above is a query created by the Fabric data agent to figure out the contents of the permit with ID A100C206-FEF8-F011-92B9-00224806C768. In this case it searches in bronze for demo purposes.
These Fabric data agents are further used by embedding them into other agents, like Copilot and Microsoft Foundry agents. These integrations help drive data-driven decisions and empower humans to make those decisions more quickly and accurately.
Low-Code
Low Code is NOT Dead… Not for me, not for us, not for the Enderdogs.
That “Low Code is dead” is an expression we have heard for a while... and yes, we do see that agents will do more of the low code than we do today. But still, there is so much we CAN do with so little.
Our main purpose here was to create a solution within Minecraft. Our business case is that you need to have a permit before you can start building in Minecraft, and for that we use:
Power Pages
Dynamics Customer Service
Power Platform
Contact Center
We also use Microsoft Fabric and Microsoft Foundry.
We start in Power Pages, where the applicant can create an application. The application is built with the standard schema in Power Pages. When the application is created, we send it into Dynamics CE using a Power Automate flow.
In Dynamics CE, Customer Service and Dataverse, we get the application as a case, with the application related to the case (N:1). Now we can start the process that handles the case.
The Approval Process
First screening is done with AI. The AI can comment on the case, give suggestions, negotiate, and do risk analysis for the application. When the agent is ready, it sends an updated status, Approved or Rejected, back to Dataverse. We are also able to approve or reject manually if the agents decide that it is required.
Connected to Dynamics Marketing (Customer Insights Journeys): in Customer Insights Journeys we have created a trigger that gets the data from the case in Dynamics Customer Service, and a journey that sends out emails to those who have been approved or rejected.
Contact Center
We have set up a chat channel from the Contact Center that is integrated in the portal. The channel setup is configured in the Customer Service workspace admin center. After the setup, we copy the code and add it into the portal, and voilà: you have installed a chat so the user can easily chat directly with you. An agent starts the conversation, but it can easily connect you to a human. This agent is configured in Copilot Studio.
Now the Human Agent can start the conversation with the person that is on the other side.
Copilot helps store the conversation on your case
Record Creation and Update Rules
We have also configured the Record Creation and Update Rule to manage emails from the web.
The best part of this is that you do not have to do so much because it is already configured for you. You just have to add your requirement for the case.
And activate it and you are good to go.
Connected to Teams and Outlook
We use Planner to create tasks in M365.
Booking
We can store all our documents in SharePoint and connect it to Dynamics 365
And of course, collaborate with Microsoft Copilot.
Code Connoisseur
We love code so much that we code into solutions that are not meant to be coded in yet.
We use a total of five coding languages in our solution: PowerShell, Python, Terraform, C#, and Bash.
We have IaC for all the infrastructure in Azure using Terraform.
This includes the Foundry Workflows
Container apps:
And AI agents:
Our entire Terraform infrastructure is built for ease of configuration and vibe-code friendliness (because that is a thing we need to think about now). All Foundry configs are folder- and JSON-based.
This config will create 10 agents, 4 flows, and one model.
The same mindset works for the service deployments in container apps: just add your script and config in a folder, and it will deploy the service infrastructure.
But IaC is far from the only place we have code.
We have notebooks in Fabric that create valuable data and really mine for pure diamonds in our data.
And yes, it is vibe coded, but it works flawlessly.
Digital Transformation
As a player on a multiplayer server it feels good to have a space you can call your own, regardless of what the other players think. Permits are a nice way to claim an area as your own and prove that you are the rightful owner of the land.
Beyond that, it is a good tool to make sure players are aligned on common areas by adding zoning rules. The combination makes it really easy to manage what gets built where. A good example is a cabin area with thick trees, where the zoning rule can dictate: “No open fires are allowed unless inspected and approved. Height limit 25 blocks. Size limit 40×40. Minimum distance to approved permits 50 blocks. Styling must be mainly stone and wood.”
The zoning description would make the zone approve only “cabin-like” permits.
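A hypothetical sketch of screening a permit against the cabin-zone rules quoted above. The field names are illustrative, and the real screening is done by the agent pod, not by hand-written checks:

```javascript
function checkCabinZoneRules(permit) {
  const violations = [];
  if (permit.height > 25) violations.push('height over 25 blocks');
  if (permit.sizeX > 40 || permit.sizeZ > 40) violations.push('footprint over 40x40');
  const allowed = ['stone', 'wood'];
  if (!permit.materials.every((m) => allowed.includes(m))) {
    violations.push('styling must be mainly stone and wood');
  }
  return violations; // empty array means the permit passes this zone's rules
}
```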
But permits are not cool if applying takes hours and approval takes weeks or months (like in the real world).
The application process is super simple, with good instructions so end users understand the process.
But what happens after you have sent the application? How is it approved, and how are humans helped to approve and make decisions?
This is where AI comes into the picture. The first screening of all permit applications is done with AI. We have a team of 14 AI agents, with dedicated agents for validating zoning compliance, assessing the risk of the application, finding out whether neighbors should be warned, and negotiating permit terms if an application is close to valid.
These agents originate in Microsoft Foundry, but each is connected to its own Fabric data agent that they can communicate their data investigation needs to, and it will figure out the data.
The Docker container contains uNmINeD plus run.sh, a Bash script to run it. We chose this instead of building a console app because the container already runs Linux. The environment variable for the area limits the size of the map so it doesn’t turn too big and take too long to render. The rendered image is uploaded to Azure Blob Storage.
All approved cases can easily be viewed in Dynamics 365.
We can also see the approved applications in a view in the Zone Manager portal.
The approval process runs with Customer Insights Journeys.
Custom code and Web API: a fully functional, client-side JavaScript implementation for a Minecraft zone and application management system. The code is intended for integration with Power Pages/Power Platform sites, with extension points for backend connections (Dataverse, Power Automate). The manager supports zone creation, the application workflow, and interactive map visualizations.
1. Global State Management
A single source of truth for dynamic, interactive components.
```javascript
const STATE = {
  zones: [],        // Created zones
  map: [],          // Zones currently shown on the map (deploy queue)
  applications: [], // Building/mining applications
  zoom: 1,          // Map zoom factor
  activeTab: 'applications'
};
```
Benefit: Keeps all UI in sync with code-driven updates.
2. Zone and Application Data Models
Zones: Each zone has coordinates (startX/Y/Z, endX/Y/Z), biome, category, protection and PvP flags, maxPlayers, and unique id.
Applications: Building requests featuring playerName, role, experience, requestedZone, type, dimensions, and status.
```javascript
function createZoneElement(zone, seed) {
  // Organic SVG shape per zone (see generateZoneSVG)
}
```
Example SVG Generation for “Organic” Zone Shape:
```javascript
function generateZoneSVG(zone, width, height, seed) {
  // Color per zone category, organic curved borders using a calculated path
}
```
Benefit: Makes map visually attractive and clear to use.
Application Marker Display
Pending applications appear as building icons on the map, using coordinate and type data.
5. Tab Navigation and UI Management
Tab switching logic:
```javascript
function switchTab(tabName) {
  STATE.activeTab = tabName;
  // Update UI classes, render specific content by tab
}
```
Benefit: Mimics SPA-style navigation; content fetch is scalable (supports async API integration).
6. User Feedback & Validation
Error Message Display: When a form is invalid or zones overlap.
Success Alerts: E.g., after deployments or successful creation.
Confirmation Dialogs: For destructive actions (deletion, deploy).
7. Extensibility & Integration Points
Places in the code marked with // TODO are prepared for linking live flows to Dataverse and Power Automate APIs:
```javascript
// TODO: Call Power Automate flow to update Dataverse
```
Usage Instructions
Embed zone-manager.js in a Power Pages site alongside the appropriate HTML structure (element IDs must match).
Edit sample data in loadSampleData() as needed, or replace with calls to backend API/Dataverse for production.
Manage zones/applications through forms, buttons, and interactive map.
Trigger deploy to simulate finalizing zone layouts (extend deployment logic as needed).
API Integration: Connect event hooks to Dataverse/Power Automate for real data handling.
Security: Implement server-side validation for production.
Styling: Enhance CSS for better responsive usability.
Performance Tuning: Optimize for large numbers of zones/applications as needed.
3D Zone Overlap Logic
```javascript
function checkZoneOverlap(zone1, zone2) {
  // ...math for checking bounding box overlap in X, Y, Z...
  return overlapX && overlapY && overlapZ;
}
```
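Filled in, the bounding-box check could look like this (assuming each zone stores startX/endX and so on, with start ≤ end on every axis):

```javascript
function checkZoneOverlap(zone1, zone2) {
  // Two axis-aligned boxes overlap iff their ranges overlap on every axis
  const overlapX = zone1.startX <= zone2.endX && zone2.startX <= zone1.endX;
  const overlapY = zone1.startY <= zone2.endY && zone2.startY <= zone1.endY;
  const overlapZ = zone1.startZ <= zone2.endZ && zone2.startZ <= zone1.endZ;
  return overlapX && overlapY && overlapZ;
}
```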
Zone SVG Path Generation (“Organic” Shape)
```javascript
function generateOrganicPath(width, height, seed) {
  // Generates a smooth, wavy polygon for zone display
}
```
Application Card Render (with actions)
```javascript
function createApplicationCard(app) {
  // Renders HTML for player requests, with approve/reject buttons
}
```
Delivery Complete: Ready for Acceptance & Integration
Client-side sample and demonstration code is robust.
Extendable for backend integration; clearly documented.
Clean architecture for zone/app management and map interface.
Azure Functions
The idea was to use Power Automate to trigger the Azure Functions.
The PostRenderFunction receives a .zip file of the whole Minecraft map. It validates the zip file, creates a new “Render Job” row in Dataverse, and uses that row’s GUID as the name of the .zip blob. After the blob has been saved, it starts the Azure Container Instance and returns a message with the Job ID, the status “Queued” (since the job has been handed off to the ACI), and the name of the world blob in Blob Storage.
The GetRenderFunction receives the Job ID and checks it against “Render Jobs”. If the job has run, it returns the image. If not, it checks Blob Storage for a blob named with the job’s GUID; if the render is there, it returns the rendered image and updates the job status.
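The decision logic of the GetRenderFunction can be sketched as a plain function, separate from the Azure Functions plumbing; the field names and status codes here are illustrative, not the deployed code:

```javascript
// Sketch of the GetRenderFunction decision logic (illustrative only).
// job: row from the "Render Jobs" table, or null if the Job ID is unknown.
// blobExists: whether a rendered image blob named by the job GUID exists.
function resolveRenderJob(job, blobExists) {
  if (!job) return { status: 404, body: "Unknown Job ID" };
  if (job.status === "Done") return { status: 200, body: job.imageUrl };
  if (blobExists) {
    // The render finished since the last check: update status, return the image.
    job.status = "Done";
    return { status: 200, body: `rendered/${job.id}.png` };
  }
  return { status: 202, body: "Queued" };
}
```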
A little introduction to our final delivery here at ACDC:
We have built an end-to-end production flow where business demand starts inside the enterprise stack, gets approved and signed digitally, becomes an ERP production order, and is then executed automatically by an AI bot that gathers materials and reports progress back, complete with real-time updates and dashboards.
Minecraft is our safe, visual execution layer; the real-world use case is deployable physical robots harvesting and performing tasks in mines, quarries, and remote locations.
Our project mimics real-life customer work we have done with Rana Gruber, where they are deploying a dog-bot that gathers data in the mines, increasing safety for workers.
The flow (start to end)
Before going through the categories, here is a brief description of the architectural flow to give some context for later.
1) Approval + Signing turns intent into an approved transaction
A purchase requisition is created in Dynamics 365 and enters an Approval/Signing stage. When it hits that stage, a Power Automate flow triggers the OneFlow contract creation, adds participants, publishes it for signing, and listens for contract status updates. Once fully signed, the flow retrieves contract details and automatically creates the Purchase Order header + lines in Dynamics 365.
2) Purchase Order triggers a Production Order and the worker starts
Once the approved PO exists, it triggers creation of a Production Order in our very own Minecraft FO Module. The Production Order reads the Bill of Materials (BOM) and determines required resources. In our module, those requirements are already translated into Minecraft blocks/items.
3) Execution happens in Minecraft: mine, gather, craft, deliver, and report back
An AI-controlled NPC worker is spawned in the Minecraft world and executes the translated task list:
mines required resources,
gathers/harvests materials,
crafts required items,
and sends the data to Power BI and our Canvas app dashboard in real time.
4) Completion + status goes back to ERP (loop closed)
When the materials are ready, completion status is sent back so the Production Order can be updated/closed in F&O.
Boom, and you have automated production.
Now to the fun part:
Categories
Redstone Realm
We built a complete business process that begins where real work begins in internal systems. Approvals and contract signing are mechanisms that convert intention into trusted demand.
From there, Dynamics 365 Finance & Operations becomes the system of record for purchasing and production planning, and our downstream services treat it like the authoritative source, not just another data feed.
We deliver a real business solution for Minecraft server automation and management, integrating Microsoft Azure (Service Bus, Event Hub) and APIs. The experience is built around in-game automation (quest system + bot feedback), with an AI-autonomous bot that executes tasks while using self-preservation logic.
Finance & Operations overview and one-click ordering experience
Inside Dynamics 365 Finance & Operations, we provide a clear, user-friendly overview of Minecraft worlds and their connected resources. Worlds and resources are presented in a structured view with visual indicators so users can instantly understand key status information, such as whether a world is online, what resource type it represents, and whether it is currently ordered, without needing to read through large volumes of data.
Built to extend without redesign
The underlying structure is designed to scale predictably as the solution grows: it’s straightforward to expand the model, add new screens, connect reporting, and support additional integrations without having to rebuild the foundation.
This is the bridge between Dynamics 365 Finance & Operations and the Minecraft world: D365 business events (for example, production requests for raw materials) are dispatched through Azure messaging and orchestrated by a Logic App workflow, “TriggerAI”. The orchestrator transforms structured business data into an AI-readable instruction, and the bot executes the job in the world: navigating terrain, collecting materials, depositing outputs, and reporting completion back through the same cloud path.
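How TriggerAI's transformation step might look, reduced to a plain function; the event fields and the resource-to-block mapping are assumptions for illustration, not the actual D365 event schema or Logic App definition:

```javascript
// Illustrative sketch: turn a structured D365 business event into a bot task.
function toBotInstruction(businessEvent) {
  // Map the ERP resource name to a Minecraft block/item id (assumed mapping).
  const resourceMap = {
    Stone: "minecraft:stone",
    Wood: "minecraft:oak_log",
    IronOre: "minecraft:iron_ore",
  };
  const item = resourceMap[businessEvent.resourceType];
  if (!item) throw new Error(`Unknown resource type: ${businessEvent.resourceType}`);
  return {
    taskId: businessEvent.orderId, // kept for end-to-end traceability
    action: "collect",
    item,
    quantity: businessEvent.quantity,
    reportTo: "erp",               // completion flows back through the same path
  };
}
```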
Player experience: narrated diary + conversational recall
We extended the experience beyond pure automation by adding a diary and interaction layer: Steve can be asked questions like “What happened on January 23rd?” and responds using memories grounded in diary entries and logs via Retrieval Augmented Generation. We also provide audio narration of diary entries through a containerized text-to-speech service and a web diary player that lets users browse history, play audio, and chat through a clean interface.
Why we deserve points in this category:
We deserve Redstone Realm points because we built a real business solution on Microsoft tech where Minecraft is just the “factory floor”, or the real-life mine.
The process starts where business work actually starts: inside the company. Approvals and contract signing turn a request into something trusted, and then Dynamics 365 Finance & Operations becomes the single source of truth for purchasing and production planning. We don’t treat FO like a dashboard feed; we treat it like the system that controls what’s allowed to happen.
From there, we use Azure to make the flow fast and reliable. When FO raises a business event (like a request for stone, wood, ore), it’s sent through Azure messaging and picked up by our orchestrator (TriggerAI). TriggerAI turns the structured business data into an instruction the AI can understand, and then the bot executes the work in Minecraft: it navigates, gathers the right materials, deposits them, and reports status back through the same path so the loop closes.
We also made it usable for real people. This can and will be adopted in the future by many businesses, and we did it in Minecraft.
In FO, users get a clean overview of worlds and resources with visual status indicators, and they can order or cancel with one click, no complicated steps. The structure is built so we can extend it without redesigning everything: new screens, reporting, and integrations plug in cleanly. And on the experience side, we added a diary + chat layer with audio narration so users can follow what happened and ask “what did Steve do?” without digging through raw logs.
In short: it’s a Microsoft-first business workflow (FO + Azure + automation + AI) that turns enterprise requests into automated resource-gathering execution, with a smooth user experience end-to-end, exactly what the Redstone Realm category is asking for.
Data, AI & Analytics
Telemetry sync + analytics-ready foundation
We collect and sync player inventories and bot telemetry (for example, backend/azure-telemetry.js and the HTTP Bridge mod), enabling analytics on quests, bot orders, and player activity. This also lays the groundwork for AI-driven features through the Mineflayer bot and quest automation patterns.
We transform raw Minecraft server logs into structured event records (deaths, combat, exploration, achievements, mining, crafting, farming, building, etc.) using a dedicated parser. Outputs are stored as dated artifacts (logs/diary/chat per day), creating a time-series dataset that can be queried and analyzed over time instead of being trapped as text files.
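The parsing idea can be sketched with a small rule table; the regexes and event shapes below are illustrative, not the project's actual parser:

```javascript
// Each rule maps a raw server-log line to a structured event record.
const RULES = [
  { type: "death",  re: /^\[(\d{2}:\d{2}:\d{2})\].*?: (\w+) (was slain by|drowned|fell)/ },
  { type: "mining", re: /^\[(\d{2}:\d{2}:\d{2})\].*?: (\w+) mined (\w+)/ },
];

// Returns a structured event, or null for unrecognized lines (skipped).
function parseLogLine(line, date) {
  for (const { type, re } of RULES) {
    const m = line.match(re);
    if (m) return { type, timestamp: `${date}T${m[1]}`, player: m[2], detail: m[3] };
  }
  return null;
}
```

Storing the parsed events as dated artifacts is then what turns "cool logs" into a queryable time series.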
Memory indexing + retrieval for grounded answers
The RAG indexing system processes both diary prose and technical logs, chunks them with overlap to preserve context, embeds the chunks, and retrieves the most relevant passages using cosine similarity. When users ask questions, we embed the query, fetch top matches, and inject them into the prompt so Steve can recall specific incidents because they exist in indexed memory.
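The retrieval step boils down to cosine similarity over stored embedding vectors. A minimal sketch (embeddings shown as plain arrays; in the real system they come from an embedding model):

```javascript
// Cosine similarity between two equal-length vectors.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Score every indexed chunk against the query embedding and keep the top k.
function topMatches(queryVec, chunks, k = 3) {
  return chunks
    .map(c => ({ ...c, score: cosine(queryVec, c.embedding) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```

The top matches are then injected into the prompt so the answer is grounded in indexed memory.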
Traceability between enterprise transactions and in-world actions
We log D365 event details (for example, identifiers, quantities, resource types, event times) alongside in-world task execution. This enables correlation questions like “Which D365 order triggered the collection of X blocks?” by aligning timestamps and task IDs for end-to-end traceability.
Engagement analytics from chat history
Chat sessions are stored as JSONL conversation turns with role, content, timestamps, and session IDs. This allows analysis of interaction patterns like conversation length, common question topics, and response latency based on recorded usage.
Orchestration layer as a workflow boundary
We use a Logic App named “TriggerAI” as an orchestration boundary that receives structured business events and transforms them into AI-ready instructions, keeping routing and transformation logic maintainable and observable as a workflow rather than hard-coding every integration step in services.
An easy POST to the TriggerAI endpoint:
Data consumption from the Minecraft server on the Raspberry Pi
From the Raspberry Pi, we post Minecraft player statistics, inventory data, and world data from the Minecraft server to an Azure Function endpoint.
The Azure Function posts the data to Azure Event Hub.
A Microsoft Fabric eventstream consumes the data from Event Hub and processes it into a Fabric lakehouse destination.
Dataverse uses virtual tables to surface the data in Power Platform, so we can utilize the business process capabilities there.
A Power Automate flow processes the data into Dynamics 365 Finance & Operations for further processing of inventories and orders.
Data injection into the Minecraft server
An Azure Service Bus queue is used as a gateway from the cloud to the on-prem Raspberry Pi Minecraft server, delivering processed data from Dynamics 365 Finance & Operations, Dataverse, and Azure.
Our very own Minecraft Workspace in FO
Why we deserve the points for Data, AI & Analytics:
We deserve Data, AI & Analytics points because we don’t just “use AI” for the sake of using AI; we built a real data pipeline where raw gameplay and business events become structured, queryable, and useful, and where AI is grounded in that data instead of guessing.
First, we’ve made the Minecraft world measurable (not with a tape measure, though). We sync inventories, world data, player statistics, and bot telemetry from the Raspberry Pi server stack into Azure, route it through Azure Functions > Event Hub, and then consume it in Microsoft Fabric (Event Stream > Lakehouse). That means we’re not stuck with scattered JSON or log files; we have a foundation that supports real analytics and reporting on quests, bot purchasing orders, and player activity, and it scales as the system grows.
Second, we turn unstructured signals into clean data. Server logs are parsed into structured event records (deaths, combat, exploration, achievements, mining, crafting, and more) and stored as dated artifacts, creating a time-series dataset.
This is the difference between “cool logs” and “usable data,” because it lets you analyze behavior over time, correlate activity, and build meaningful metrics.
Third, our AI is data-backed. The RAG system indexes diary entries and technical logs, chunks them with overlap, embeds them, and retrieves relevant passages with cosine similarity. When someone asks Steve a question, the response is grounded in retrieved memories, so it can accurately recall specific incidents because the data exists and is referenced in context, not pulled out of thin air.
Finally, we built end-to-end traceability between enterprise and execution. We log D365 event details alongside task execution, so you can answer questions like “which order triggered this collection run?” by correlating identifiers, timestamps, quantities, and task IDs. And we keep the operational loop tight: Fabric and Dataverse can expose the data back into the Power Platform, and Power Automate can process it into FO for inventory/order handling with Service Bus acting as the gateway back down to the Raspberry Pi for cloud-to-edge injection.
Low-Code
Power Platform-ready APIs and endpoints
Our API and webhook design is built to integrate cleanly with Power Platform (Power Automate, Logic Apps). RESTful endpoints make it easy to automate workflows and build dashboards, and the solution can be extended with low-code tooling for reporting and orchestration.
We have built low-code components from Power Platform such as Power Apps (model-driven and canvas apps), Power Automate, custom connectors, and Power BI.
Canvas Apps
Crafting Creepers Inventory Admin App
Helps generate a model from a captured picture of an object. It uses the API from logiqraft.futurein via a custom connector.
Creation of orders from a canvas app
Generates a purchase summary using mscrm-addons.com and handles further processing in Dataverse and the model-driven app.
Minecraft Gold Converter App
Uses external APIs to fetch materials data, such as gold prices and exchange rates.
Here we use a low-code canvas app with the possibility of using pro-code PCF controls.
We created a low-code app in Microsoft Power Platform based on an original idea using accelerator components. It includes phone-position adjustment and video streaming within a canvas app, with a simple purpose: a dynamic experience that responds to the movement of your phone. See the video.
Custom Connector
We have created a custom connector for another team's API.
Power Automate
We have created a few Power Automate flows to automate the process and integrate the data with external services such as Link Mobility, OneFlow, and mscrm-addons.
We also set up an approval flow for approving and signing purchase requisitions.
Why we deserve Low-code points:
We deserve Low-Code points because we used Power Platform to make the solution easy to run, easy to change, and fast to extend.
Power Platform building blocks (real, not just a demo)
We built with Canvas apps + Model-driven apps, Power Automate, Power BI, and Custom Connectors, so users can create orders, manage data in Dataverse, and see outcomes without needing developers for every change.
Apps that solve specific jobs
We delivered practical Canvas apps: an Inventory Admin app (using a custom connector to call an external API), a Minecraft Gold Converter (using external gold + exchange-rate APIs, with optional PCF), and a mobile experience app with phone-position adjustment and video streaming (video provided).
Automation + integrations with low-code flows
We created Power Automate flows for approvals and signing, and for integrating with OneFlow, LinkMobility, and mscrm-addons, which makes the process run automatically instead of manually.
Code Connoisseur
Purpose-built code across backend, bot, and mod
We implemented custom code across the stack: backend services (Node.js/Express), the Mineflayer bot, and a Java/Fabric mod. The codebase is modular and structured (see AGENTS.md for patterns) and includes innovations like the HTTP Bridge mod, bot self-preservation logic, and RCON integration.
The Fabric mod's inventory integration converts the in-game inventory to products in Finance and Operations.
Structured LLM interactions with guardrails (schemas, validation, planning)
We implemented structured LLM interaction patterns using DSPy signatures for command parsing, quest generation, and multi-task planning. To keep execution reliable, our planning pipeline enforces strict output schemas and validates every task structure before the bot runs it (e.g., required fields for collect/navigate/deposit), preventing malformed plans from becoming runtime failures.
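The validation guard can be sketched like this (shown in JavaScript for illustration; the actual pipeline uses DSPy signatures, and the required fields per task type are assumptions based on the collect/navigate/deposit examples):

```javascript
// Required fields per planned task type (assumed schema).
const REQUIRED = {
  collect:  ["item", "quantity"],
  navigate: ["x", "y", "z"],
  deposit:  ["item", "container"],
};

// Reject malformed plans before the bot runs them, so a bad LLM output
// becomes a validation error instead of a runtime failure.
function validateTask(task) {
  const required = REQUIRED[task.type];
  if (!required) return { ok: false, error: `Unknown task type: ${task.type}` };
  const missing = required.filter(f => !(f in task));
  return missing.length
    ? { ok: false, error: `Missing fields: ${missing.join(", ")}` }
    : { ok: true };
}
```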
Steve RAG: semantic memory built from scratch
We built a Retrieval Augmented Generation system that indexes diary entries and server logs into chunked segments with metadata, embeddings, and cosine similarity retrieval. The index supports reprocessing when files change (hash-based detection), and the retrieval layer feeds relevant passages into the chat prompt so Steve’s answers stay consistent with recorded events rather than invented details.
Full-stack delivery: services, storage, and edge deployment
We wrote the chat server and storage model (JSONL conversation history per date with IDs and timestamps), plus container and web delivery plumbing (Docker builds, persistent volumes, Nginx serving with CORS). We also produced deployment scripting across environments (shell scripts for edge devices and PowerShell for local workflows), turning the stack into something repeatable to run.
Why we deserve points:
This is not just a standard “LLM in a box” stack; we engineered reliability:
we validated task outputs so the bot doesn’t break at runtime, built memory retrieval so answers stay consistent with recorded data, and delivered everything as a runnable, repeatable system (containers + scripts + edge-ready setup).
In short (how I understand it myself) : innovative code that actually ships and works, from ERP mapping to autonomous execution and real operational services.
Governance & Best Practices
Safety, privacy, monitoring, and operational robustness
We emphasize reliability and responsibility through error handling and logging across backend and bot services. Player inventories and “quest data” are handled securely, while telemetry and monitoring are streamed via Azure Event Hub. The bot is designed with responsible safety logic (self-preservation, health checks).
Monitoring:
As we are running the Minecraft server as a prod environment, we have set up a comprehensive monitoring stack with:
cAdvisor
Node Exporter
Prometheus
Grafana
D365 Security Roles
Because Dynamics 365 Finance & Operations is the system of record in our solution, we also use it as a security boundary. We implemented the FO security model (security roles, duties, and privileges) so that only the right people can view, create, approve, and dispatch the transactions that ultimately trigger bot work. (And trust me, having worked with this for the past months, it can take some time to fully master.) That means the ability to “send work to the bot” is controlled the same way you’d control purchasing or production actions in a real enterprise process: through FO permissions, not informal access.
In practice, this keeps the automation constrained to its intended scope. The bot and edge device only operate on what FO authorizes, and the integration only processes actions that the user is permitted to perform.
Workers and admins can control the bot through the governed workflow which makes sure the bot doesn’t become a free-running actor, but an execution agent that acts on behalf of authorized users.
We deserve points for this category because:
We deserve Governance & Best Practices points because we treated this like a system that must be trusted, and we have spent time making sure the bot handles its duties without breaching limits.
First, we built for operational robustness. Both the backend and the bot services include deliberate error handling and structured logging, so failures don’t turn into silent chaos. When something breaks, we can see what happened, where, and why.
That matters because our solution is event-driven and automated: if you don’t invest in observability, you don’t have automation, you have a surprise generator.
Second, we designed with privacy and data responsibility in mind. We handle player inventories and quest-related data as real user data: it’s stored and processed in a controlled way, with clear boundaries between what belongs in the game world and what belongs in the business/automation layers. The system doesn’t need broad access to everything; it only moves the minimum data required to execute and report on tasks.
Third, we made the solution monitorable by default. Telemetry and monitoring are streamed through Azure Event Hub, which gives us a consistent way to observe execution, detect anomalies, and build reporting on top of reliable signals. That monitoring isn’t only for dashboards but it’s part of governance: if you can’t measure behavior, you can’t manage risk.
Because we built FO security (roles, duties, privileges) directly into the execution pipeline, we deserve governance points for making automation permission-driven and bounded by enterprise access.
Lastly, for governance and monitoring: we deserve governance and best-practice points because we treat the Minecraft server like a real production workload, with a proper monitoring stack (cAdvisor, Node Exporter, Prometheus, Grafana) to ensure visibility, stability, and fast incident response.
Our AI and autonomous bot are built with safety logic, not just wishful vibe coding.
The bot includes self-preservation behavior and health checks, so it doesn’t blindly execute tasks until it dies or gets stuck. We also constrain autonomy through structure: tasks are planned and executed within defined patterns instead of free-form behavior, and the bot reports status back through the same controlled pipeline. In other words, we’ve designed autonomy that stays predictable, inspectable, and stoppable. It is also hosted locally, on its own Raspberry Pi, enabling us to have full control over what gets in and out of the hardware. The pipeline also follows best practices for pipelines and dockerization, which makes things hard to break once they are in production.
Digital transformation
The solution automates Minecraft server management to reduce manual effort and improves player/admin experience through intelligent automation (quests, bot orders, notifications). It demonstrates measurable impact through faster task execution, real-time feedback loops, and extensibility for future needs. And the needs we believe will come!
We are transforming Minecraft inventory management with Finance and Operations, with a business process that automates all the way into Minecraft.
Why we deserve points for Digital Transformation:
We replace manual steps with automation, give real-time feedback on progress, and make outcomes measurable (faster execution, clearer status, and easier operations). The key transformation is linking inventory and production thinking from FO to Minecraft execution, proving the same pattern can scale to real-world harvesting and remote operations, which we have seen both with our own customer and other production companies. THIS IS THE FUTURE.
Final summary
This is an end-to-end production system that starts with trusted business demand and ends with autonomous execution, all while keeping the enterprise system safe and the outcome measurable. It combines real workflow automation, event-driven cloud integration, autonomous task execution, and a data pipeline that turns activity into operational intelligence.
Thanks from us, it was a pleasure. Until next time!
Our solution is a fix for all those awkward conversations you might have with friends, family, or employees. The Howler is a message delivery service sending what you wish to send, in the desired tone of voice, straight to the physical Howler on the recipient's desk.
Technical Architecture
Our flow is pretty simple, yet elegant, if we might say so ourselves:
Flow starts with the input from your chosen Wizard/Witch. Input whatever you want, to who you want, in the tone of voice you want. In one of two possible ways:
Into our seamless React Static Web Application. The app fetches secrets from GitHub Secrets to ensure that no messages get out of hand.
Use our public API directly if you are more of a hacker-witch
The message is posted to our Power Automate flow and archived in Dataverse.
A Python Flask app accepts the incoming letter, processes it to add a magical touch (tone of voice), and transforms it to speech through Azure Speech Services and OpenAI APIs.
Audio file is played on a Bluetooth speaker and the Arduino makes the letter physically move.
{
  "sender": "Bellatrix",
  "recipient": "Sirius",
  "sentiment": "Angry but polite",
  "subject": "The end",
  "body": "You're definitely fired."
}
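A minimal sketch of building and posting that payload from the client; the field list comes from the sample above, and the flow trigger URL is a placeholder (a real Power Automate flow exposes its own HTTP trigger URL):

```javascript
// Fields expected by the flow, taken from the sample payload.
const FIELDS = ["sender", "recipient", "sentiment", "subject", "body"];

// Validate the letter and serialize it before posting.
function buildLetter(input) {
  const missing = FIELDS.filter(f => !input[f]);
  if (missing.length) throw new Error(`Missing fields: ${missing.join(", ")}`);
  return JSON.stringify(input);
}

// Posting is then a single fetch (URL is a placeholder for the flow's trigger URL):
// fetch("https://.../triggers/manual/paths/invoke", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: buildLetter(letter),
// });
```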
Fabric Fera Verto
We set up a Fabric instance, but sadly ran into region problems.
For the future we would like to add the data from Dataverse through a Fabric Link to prepare our data for our analyst department
The data platform team can use Howler data to gain valuable insights by combining it with other business data.
Low-Code Charms
Half of our solution is built on low-code.
Power Automate is the skeleton of the solution, and Dataverse is the brain and blood. The rest of the body is made in Python and React.
Without low-code, it would have been impossible to deliver such a good solution in such a short time. This helps us greatly to shorten the distance from design to finished product.
We also have a Model-driven app for Howler support that helps our magical advisors if something goes wrong during message delivery
Pro-Code Potions
For the frontend, we’ve created a React Router app, with hot-reloading for DX, sass for styling and react-scripts providing the necessary scripts for running, building, and testing the project
For the backend, we are using Flask, written in Python (aka. Parselmouth). We’re doing the following:
Use Azure Cognitive Services to enchant the input message into a letter appropriate for reading aloud, adjusting the tone of voice, etc.
We use the same service to create an audio file with the enchanted message.
Send commands to the Arduino in order to start and stop the physical motor in the howler-letter while the audio is playing.
Digital transformation
Our solution provides value to any business that wants to shorten the time it takes for awkward conversations or long messages.
We streamline everyday life by allowing users to enter a short message, which is read out to the recipient as a longer, more appropriate text. This will save everyone a lot of time and awkward moments.
ALM Magic
Using GitHub actions for CI/CD; we are building the application, and then deploying the build/image to our Azure Static Web app.
We were planning to set up GitHub security with code scanning and Dependabot to ensure updated dependencies, but didn't have the budget for GitHub Advanced Security.
Magic Matrix
As this is primarily designed for the Hogwarts school, we were also planning to archive the messages and audio files in SharePoint, with proper access control and governance, done through Power Automate.
In addition to SMS notifications, we were planning to send Teams notifications as an additional service.
A low-code app to help wizards with their mental health: after so many attacks and traumatic experiences, there is no doubt many of them need to keep track of their working hours and mental well-being.
First they must register when they enter or leave Hogwarts.
Then a report will be created for each of them:
Then a score will be provided:
In case they receive a bad score they will be prompted to take a mental health quiz:
We went out on a mission to elevate the businesses of Diagon Alley by improving the customer experience and creating a solution to help businesses automate their day-to-day tasks and improve their efficiency and revenue. As a way of simplifying the development, we have chosen to focus on the local pet store as a proof-of-concept solution, but always with plans of expanding in the back of our heads.
In our attempt to move the store lines to the cloud, we have given the customers a Power Portal website where they can sign in to order pets, food, and cages, select a preferred estimated delivery time, and see the cost of individual items and in total. To help the customers use this site, we have also developed a “magic” mirror that gives a recommendation for a pet and provides a QR link to the registration form with the suggestion automatically set.
The customers can also get an easy access to the website using our “magical” smart mirror known as “Mirror of Noitceles”. While standing in front of this mirror, a camera will take their picture and, using Azure Open AI, will recognize facial features and come up with a suggestion for a pet. They will also be given a QR-code they can scan that provides them a link to the registration website with the recommended pet already set in the registration form.
By using the Power Platform to create the solution, we have created an easily accessible solution that could be implemented into Teams and therefore be available for business owners who work on the floor all day. This way, we can also provide an integrated connection with SharePoint and their document handling, and simplify communication by implementing text messages, phone calls, and pre-made Power App components.
This also gives an easily maintained database in the shape of Dataverse, and by synchronizing this to Fabric we also have a solid foundation for adding future modules that can improve the experience by sharing data between different businesses and developing complex AI solutions that will improve their experience and return on investment further.
As the business using our solution, and their customers, register data into Dataverse, we are syncing this into Fabric in the backend part of the solution. To maximize the value of Fabric, we have also connected a Power BI report that displays the data we need to improve the solution in the best way possible.
By doing this we are creating a massive data collection that can be used to improve the solution across businesses and train AI models to create an even better solution, with modules to add on in the future.
Category: Low-Code Charms
Our solution mainly takes advantage of the magical powers within the low-code Power Platform. We are using Power Pages to maintain a website for customers where they can register their order to the store. This is also connected to our smart mirror that recognizes customers, suggests a pet based on their appearance, and provides them a QR link to this website with the suggested pet already selected.
We are also using a model-driven Power App to maintain the data within Dataverse and allow the employees to handle the customers' orders. They are able to see all orders, pets, and types of food in an entity view (tables), as well as an admin form with a kanban board (thank you, Resco) for updating order statuses and seeing the sales stream, and a Power BI report (connected to Fabric). The store owners also have an “Admin” page set up with an overview of available pets, their aisle location and more, with the possibility to register new pets as they become available.
The solution is further improved by using Power Automate to automate tasks that usually take unnecessary time, like sending and receiving text messages or generating supplier agreements in SharePoint and sending supply orders.
Category: Pro-Code Potions
As a lot of the customer functionality is handled by low-code solutions, the heavier aspects of the solution are handled in the backend using more traditionally coded functionality.
We have a Raspberry Pi configured with the “Magic Mirror” Node.js solution (https://magicmirror.builders/), and have improved this by adding and adjusting modules, including a list view of a Dataverse table and the generation of a QR code that sends customers to the ordering website. We have also adjusted the Power Pages website by using jQuery to implement URL parameters so the suggested pet shows up automatically.
We have set up a scraping solution for the Harry Potter wiki page to integrate data from this universe, and are using the external Python package “cv2” (OpenCV) for facial recognition. Other technologies we have used are MSAL, axios, Flask, and WebSockets.
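The jQuery URL-parameter trick mentioned above can be sketched like this (the parameter name "pet" and the form field id are assumptions for illustration):

```javascript
// Read the suggested pet from the query string, e.g. "?pet=owl".
function getSuggestedPet(search) {
  return new URLSearchParams(search).get("pet");
}

// In the browser, jQuery preselects the pet in the registration form on load.
if (typeof $ !== "undefined") {
  $(function () {
    const pet = getSuggestedPet(window.location.search);
    if (pet) $("#pet-select").val(pet).trigger("change");
  });
}
```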
Category: Digital transformation
Category: Magic Matrix
We are using a combination of Teams and the Power Platform to create a magical solution for businesses and customers. We have document handling in SharePoint and customer communication with phone calls and text messages in Teams, as well as good integration with the Power Platform.
The Arctic Cloud Developer Challenge has been an exhilarating ride, pushing our creativity and technical prowess to new heights. For our final solution, we created a Sorting Hat Quiz app together with a Physical Talking Hat, merging the best of the digital and physical worlds. In this blog post, we’ll break down how we approached the challenge and highlight how each category shaped the final product.
Inspired by the magic of the Harry Potter universe, we decided to build a Sorting Hat Quiz app that not only sorts users into their Hogwarts houses but also has a real-world interactive element. Students can choose how they would like to be sorted: digitally or physically.
Now, let’s see how our solution corresponds to each category:
Fabric Fera Verto
We have created a dashboard in Power BI showing deep magical insights and statistics about the distribution of students' answers.
The hat interface uses Python to orchestrate TTS, STT, and backend communication.
We use Terraform to deploy the entire backend, package the backend web app, and update the code.
The backend code is JavaScript that can handle multiple conversations at once.
One backend to rule them all
We are also customizing the low-code web app with our pro code.
Thank you for following our journey, and we hope you feel as much magic in our solution as we do!
Digital transformation
By offloading the work from the old Sorting Hat to the digital version, we enable seamless logging and analytics.
It enables multiple students to be sorted at the same time from all types of locations.
We can pull statistics from the sorting, giving insight into the details of the sorting process.
ALM Magic
Environment variables in bot
Terraform automatically updates the backend to the newest code
We have sign-in using phone number and username.
The app is automatically deployed to Teams users that are in the tenant.
We make pretty code
The question pools are stored in Dataverse, making it easy to manipulate the quiz.
We are using an ALM (CI/CD) pipeline and release the CRM solution from development to UAT and production environments.
Magic Matrix
We are using the magic of Teams
Accessibility: the backend is API-compatible, making it possible to integrate with anything. Speech-to-text with Whisper and text-to-speech with ElevenLabs make it accessible as long as you can talk.
Privacy: the LLM does not train on user conversations, and voice input is not stored.
By leveraging the strengths of each category—Fabric Fera Verto, Low-Code Charms, Pro-Code Potions, Digital Transformation, ALM Magic, and the Magic Matrix—we were able to create a Sorting Hat Quiz app and a physical talking Sorting Hat that deliver a magical, interactive experience. Our solution pushed the boundaries of what’s possible, seamlessly blending digital and physical technologies to engage users in an unforgettable way.
We are incredibly proud of our final product, and we believe it not only meets the requirements of the Arctic Cloud Developer Challenge but also sets a new standard for how apps can bridge the gap between the virtual and physical worlds.
At Hogwarts, we’ve revolutionized the administration of social events and classes, and student attendance tracking through a seamless integration of Dynamics Customer Insights Journeys, Pro-Code, Power Automate, Liquid, custom pages and Power pages.
By combining these technologies we have made both classes and social events easier to administer for the system users. The system users only have to create the events and there is no need for the users to create segments, emails or journeys. By combining technology from MappedIn, Power Automate and Power pages the status of the event registrations are handled by the system and removing a burden from the users since they no longer have to delete data (to cancel an event registration) or manually check in an event registration in a other table.
Navigating the vast and magical halls of Hogwarts can be quite the adventure. To make it easier for students and professors to find their way around, we’ve integrated Mappedin’s innovative search bar and routing functionality. Whether you’re looking for a specific classroom, common room, or office, Mappedin provides multiple ways to find your destination quickly and efficiently.
In addition to searching for rooms, Mappedin allows users to set routes to their desired destinations. Once a room is selected, the system provides step-by-step directions, ensuring that students and professors can navigate the castle efficiently. This routing functionality is particularly useful in a large and complex building like Hogwarts, where finding the shortest path can be a challenge. Whether you’re heading to the Great Hall for a feast or the Astronomy Tower for a late-night class, Mappedin has you covered.
By using Mappedin, we're able to send out real-time communication as students approach classrooms, using the Customer Insights Journeys functionality. We use Customer Insights journeys to send communication that contains a link with UTM parameters. When a student opens the link, a customer journey is triggered using the "When Marketing Site Visited" trigger. We're then able to read the link they opened, which also contains the contact ID and the event registration ID. Using a Power Automate flow with a custom trigger, we can fetch the event registration and change its status to "Attended".
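The tracked link itself is just a URL with UTM parameters and the two IDs riding along as query parameters. A minimal sketch (the parameter names `contactid` and `registrationid` are our own illustration, not necessarily what the journey emits):

```python
from urllib.parse import urlencode, urlparse, parse_qs

def build_tracked_link(base_url, contact_id, registration_id,
                       campaign="class-checkin"):
    """Append UTM parameters plus the IDs the check-in flow needs.

    Parameter names (contactid, registrationid) and the campaign value are
    illustrative; the real journey link may encode them differently.
    """
    params = {
        "utm_source": "customer-insights",
        "utm_medium": "email",
        "utm_campaign": campaign,
        "contactid": contact_id,
        "registrationid": registration_id,
    }
    return f"{base_url}?{urlencode(params)}"
```

When the "When Marketing Site Visited" trigger fires, the flow can parse these same parameters back out of the visited URL to locate the event registration.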
Image of the process for classes:
Image of the process for Social Event:
User interface and landing zones
Students and professors are presented with a model-driven application where they open a landing page (custom page). Students have a separate landing zone from the professors, and from there they can navigate to other areas of the application. The custom page has three navigation points: the social events, the active classes for today, and the Marauder's Map for getting a guide to show you around the school. The navigation points are connected to views, forms, and other custom pages in the model-driven app using the Power Fx Navigate() function. From the student landing zone, there is a PCF with a gamification aspect: students can compete in catching the Golden Snitch to score House Points and level up. Each time they level up, the Snitch moves faster.
See your classes and get guidance from the map. When approaching the classroom, you get a location-based message that registers you for the class automatically.
Category Alignment
Fabric Fera Verto
Our solution creates deep magical insights through:
POWER BI
In our Hogwarts App, we’ve conjured up some magical tools to make life even more exciting for our students and teachers. These enchanting features are designed to inspire, motivate, and bring a touch of wonder to everyday activities. Here is a list over our Magical Insights:
House Points Tracker The battle for house glory just got more thrilling! We’ve introduced a House Points Chart, where students can keep track of the current standings and see which house is leading the charge. The bars shimmer in the house colors—scarlet for Gryffindor, emerald for Slytherin, sapphire for Ravenclaw, and gold for Hufflepuff—bringing the spirit of friendly competition to life.
Golden Snitch Leaderboard Catch the Snitch, and claim your glory! The Golden Snitch Leaderboard highlights the top players in the beloved Snitch-chasing game. The rankings sparkle with magical hues: gold for first place, silver for second, and bronze for third. Will you be the next Seeker extraordinaire?
Class Attendance Tile At Hogwarts, striving for 100% class attendance is a noble goal! To aid in this quest, we’ve added an Attendance Tile to the report. It reveals how each class fares in attendance, with fascinating insights. However, beware—the data shows a troublingly low turnout in Defense Against the Dark Arts! Could a bit of extra encouragement—or perhaps a strong Patronus—be the solution?
Students by House Chart Curious about the balance of Hogwarts houses? The Students by House Pie Chart offers a delightful overview, showing the distribution of students across the four houses. Each slice of the pie is enchanted with its house color, providing a clear and colorful glimpse into Hogwarts’ magical diversity. This tool fosters greater understanding and unity among the houses.
With these mystical updates, we’re ensuring that every student and professor can dive deeper into the wonders of Hogwarts life. Keep exploring, keep competing, and most importantly—keep the magic alive!
Low-Code Charms We’ve woven powerful low-code spells through: Power Pages for event registration Power Automate flows for attendance tracking Custom pages with responsive design to enhance users’ experience Enhanced date handling through PCF Model-driven app to help teachers administrate the school
Marketing Events automation To ease the task of creating new Subjects every Semester, we have decided to use some Power Automate automation 😊
This flow runs at the beginning of each year; for every semester it contains, it iterates through all the subjects that take place during that semester and creates an event for every occurrence of the subject.
Finally, for every student that enrolled into the subject, it will create an event registration for every event.
Teachers can: Assign points to the different houses.
Handle emails sent to the customer and check the events and event registrations created through our different automation processes.
Define the different rooms in which subjects can take place and link them to the interactive app
Power Pages
We used Power Pages to customize our event registrations and upcoming sessions – responsive and user friendly.
Through our Power Pages portal, students can browse and sign up for both classes and social events. Upon registration, a Power Automate flow checks event capacity and automatically assigns the appropriate status, followed by a confirmation email containing event details, a QR code, and a cancellation link (generated by Power Automate). Students receive timely reminders as the event approaches. Should plans change, students can easily cancel their registration through a personalized link, triggering another Power Automate flow that updates their status to "Cancelled".
When a student is created in Dataverse, a Power Automate flow is triggered, generating a unique URL to an overview of their upcoming classes. The URL is sent out with marketing emails to all students as the semester start approaches. When an event is created, a unique URL to that event is generated and connected to the buttons on each event card in the event gallery.
We used FetchXML to fetch data from Microsoft Dataverse. It allowed The Golden Snitches to run complex queries with filtering, sorting, and aggregation, making it ideal for creating dynamic and interactive content in our Power Pages portal for the students and professors.
Because we fetch the form dynamically through FetchXML, we remove another burden from the user, who would otherwise need to copy a script and add it to a page. Instead, we can display the form by using Liquid.
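The same style of FetchXML query can also be sent to the Dataverse Web API, which accepts URL-encoded FetchXML directly on an entity set. A minimal Python sketch (the entity and attribute names are illustrative, and acquiring the bearer token via MSAL is left out of scope):

```python
from urllib.parse import quote
import urllib.request

# Illustrative FetchXML: the top 10 upcoming events, ordered by start date.
FETCH_UPCOMING_EVENTS = """
<fetch top="10">
  <entity name="msevtmgt_event">
    <attribute name="msevtmgt_name" />
    <order attribute="msevtmgt_eventstartdate" />
  </entity>
</fetch>
"""

def build_fetchxml_url(org_url, entity_set, fetchxml):
    """The Dataverse Web API accepts URL-encoded FetchXML on an entity set."""
    return f"{org_url}/api/data/v9.2/{entity_set}?fetchXml={quote(fetchxml)}"

def query_dataverse(org_url, entity_set, fetchxml, access_token):
    """Run the query; token acquisition (e.g. via MSAL) happens elsewhere."""
    req = urllib.request.Request(
        build_fetchxml_url(org_url, entity_set, fetchxml),
        headers={"Authorization": f"Bearer {access_token}",
                 "Accept": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

`build_fetchxml_url` only URL-encodes the query; the `Accept` header and bearer token are the standard Web API requirements.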
Custom page – Responsive design
We use containers, horizontal and vertical, to structure the layout and provide proper responsiveness, resizing and adjusting the layout according to the user's screen size. We utilize the dynamic values of Parent to set the width and height of controls within the screen.
HTML text, syntax, and code principles:
Use of $ to reference dynamic values within a string.
Seamless navigation: Utilizing the custom page within model driven apps makes it possible for us to easily navigate to other areas of the model-driven app.
We developed a PowerApps Component Framework (PCF) using MappedIn’s React-based SDK documentation. This allowed us to embed their robust mapping and navigation functionality directly into Dynamics 365. The solution enables users to visualize spaces, navigate between rooms, and even calculate optimal routes within buildings. Custom PCF for semester date management.
Map in action
PCf’s react fragment example.
Custom PCF with an animated catchable golden snitch
Azure AI implementation for voice recognition (Ask Cathrine)
PowerShell for advanced pipeline deployment
To ease the deployment of our PCFs, we made a short PowerShell script that automatically updates the versions of both the solution and the PCF before compilation.
Specialized field customizations: Integration with Resco components
We used DateRangeBuilder PCF from Resco to enhance the creation of new Semesters in our App.
OneFlow:
• Auto-dispatches contracts faster than a Nimbus 2000
• Tracks signatures with the precision of the Marauder's Map
• Sends confirmation owls (emails) that would make Hedwig proud
Now Muggle parents can sign from their “mobile devices” (fascinating contraptions), while Professor McGonagall tracks everything from her enchanted dashboard. The whole system runs smoother than a well-brewed Felix Felicis!
Digital Transformation We’ve bridged operational gaps through: Maximum event capacity field:
Power automate that updates the event registration status based on the maximum capacity of the event.
We are then using generic journeys to send out the event registration with a QR code. And reminders for the event. Point Taken also use Power Automate to generate a cancellation URL on the event registration that will be sent out with the emails in the generic journey.
Cancel url on the event registration:
Power automate that generate the cancel url for sending out with the marketing emails:
The students will then be able to click the link and be navigated to a site on the power page where they can cancel their event registration.
When the students click the button they trigger a power automate that changes the event registration to status “Canceled”:
Since the social events also is only for students and employees at the school we will be able to track when the students is getting near the room of the event. We will then again use generic journeys with emails to track the attendance. The administration will then be able to send out communication after the event like surveys or other content.
ALM Magic
Our ALM strategy includes:
Git Connect integration
We used Power Platform’s new Git Connect functionality to automate the commit process for solution changes.
This feature allows changes made in Power Platform to be committed directly to an Azure DevOps repository, with the user who made the changes reflected as the commit author. This integration not only saved time but also ensured proper traceability and alignment with our team’s Git workflows.
1. Components contained in the solution, separated by name. 2. Authors of the commits.
DevOps pipelines for building PCFs
To handle the custom PowerApps Component Framework (PCF) components, we reused and adapted an existing pipeline. This pipeline automates:
1. Version Updates: Using a PowerShell script, we increment version numbers to maintain proper versioning.
2. Solution Packaging: The pipeline packs the updated PCF into a deployable solution.
3. Deployment: The final step deploys the solution directly to our environments. This automation reduced manual effort, minimized errors, and ensured smooth delivery of components across environments.
Solution segmentation
Entra ID security implementation
Microsoft Entra ID Security Groups: To simplify user administration and gain better control of user access to the environments, we have created teams of the type Microsoft Entra ID Security Group.
We created a security group in Entra ID where we add the users we want to give access to the environment.
We then created a team of the type Microsoft Entra ID Security Group in Power Platform and assigned the Entra group to the team. We then assigned the team the desired security roles:
We now only have to add users to the group in Entra ID to grant them access and assign their security roles.
Power Automate: force-sync users from Entra ID groups. Now that we have automated and gained better control over granting users access to the environment, apps, and automations, we need to force-sync the users.
Users will not appear in Dataverse until they have logged into the environment, but by building a Power Automate flow we can solve this issue:
The flow is triggered once a day and gets the members of the Entra ID group; it then uses the force-sync action to sync the users.
Solution Strategy: The team has created a solution strategy that separates the different types of components into different solutions for easy deployment and a better development process.
Power Platform Pipelines We have used Power Platform Pipelines to automate the deployment process between environments.
Power Automate Connectors – Service Principal:
A service principal is specific to Microsoft Azure and is used to grant applications access to Azure resources. We have used a service principal for our Power Automate connectors to keep the governance of the flows solid.
Magic Matrix We’ve created an enchanting experience using: If another pandemic hits or a troll attacks the school, teachers will be able to stream the classes to teams. With Dynamics Customer Insights Journeys and Microsoft Teams, the professor can easily plan, host, and follow up on live-streamed events, so that the students never miss a class.
Concept: Imagine stepping into Professor Snape’s potion lab, where magic meets modern technology. At PowerPotters of Cepheo, we’ve reimagined potion production using IoT, AI, and automation to create a seamless, efficient, and intelligent system—perfectly blending the wizarding world with real-world innovation.
End-to-End Scenario: Our solution begins with a voice command or sensor data triggering a production order in Dynamics 365 Finance and Operations (D365FO). From there:
Approval workflows in a beautifully designed PowerApps interface ensure strict oversight.
IoT sensors monitor potion brewing in real time.
Power BI dashboards visualize live inventory, production orders, and trends, enabling better decision-making.
Legacy ASCII-based systems, like the ETON packaging machine, are integrated through automated ASCII file generation.
Every component works harmoniously to eliminate manual steps, increase efficiency, and enhance user experiences for potion masters and house-elves alike.
User Story: Professor Snape manages a bustling potion lab where students brew elixirs daily. When a student initiates brewing:
A voice command or sensor input creates a production order in D365FO.
Snape reviews the potion request in the Alchemy Approval App, approving or rejecting it.
Approved potions are tracked in real time on the Power BI dashboard, where Snape monitors ingredient usage, inventory, and production status.
Students add ingredients, telling the cauldron which ones they use, and the cauldron's IoT sensors report the ingredient details back to D365FO.
A plethora of support systems has been added as well to make the complete process of managing a laboratory simple and effective, such as:
SharePoint and Teams messaging to the team of elves working in the laboratory
AI-based input of master data into D365FO
Third-party solutions for ease of work
Impressive backend FO X++ actions.
This end-to-end process ensures a smooth, magical potion production workflow!
Important notice for our fabulous judges. We know you have a lot to read, so from here on, the blog post is split into each of your categories so that you can save time!
2. Fabric Fera Verto
Concept: Fabric Fera Verto, or “Data Transformation,” is the magical core of our potion production analytics. Using Power BI and Dynamics 365 Finance and Operations (D365FO), we’ve built a seamless flow of data to empower potion masters with insights. By visualizing inventory, production orders, and potion trends, we’ve turned raw data into actionable magic.
Key Components:
Dataflow from Dynamics 365 FO to Power BI:
Data from D365FO is extracted using an OAuth legacy connection, ensuring a secure and reliable pipeline.
Key information includes inventory levels, production orders, and ingredient usage.
Power BI Dashboard:
Business Value: The dashboard is displayed in the alchemy lab, providing real-time insights into:
Inventory Levels: Tracks current stocks of potion ingredients and finished elixirs.
Recent Potions: Displays recently completed brews for trend analysis.
Active Production Orders:
Potion name, quantity, and approval status (e.g., “Pending Approval”).
Bill of Materials (BOM) visualization showing ingredients required for active orders.
Impact: Potion masters can manage resources effectively, reduce ingredient shortages, and plan future brews.
Cloud-Powered Updates:
The Power BI report is published to the Power BI Cloud with scheduled updates, ensuring near real-time data for potion production insights.
Future Scalability with Fabric
The integration is designed to grow with the needs of the alchemy lab. As potion production increases, transitioning to Fabric tables would:
Improve Efficiency: Fabric provides a unified data source that ensures faster queries and streamlined data transformations, reducing latency.
Enable Real-Time Updates: Potion masters could make decisions instantly, whether adjusting production schedules or responding to ingredient shortages.
Support Increased Data Volume: As more IoT devices feed data into the system, Fabric’s robust infrastructure can handle higher data loads without compromising performance.
Why This Earns Fabric Fera Verto:
Seamless Data Integration: The solution connects D365FO to Power BI, bridging transactional data with analytical insights.
Real-Time Insights: The dashboard equips potion masters with live data for informed decision-making.
Scalability: The system is designed to evolve, with future enhancements like Fabric enabling even more seamless updates.
3. Low-Code Charms
Concept: Our low-code innovations ensure that the magic of potion production flows effortlessly across the lab. Using Power Automate and a responsive Canvas app, we’ve created intuitive workflows for approvals, notifications, and user interactions—eliminating manual effort and ensuring streamlined processes.
Key Components:
Production Order Creation Flow:
Trigger: When a student initiates brewing (via voice or IoT sensor), a Power Automate flow creates a production order in D365FO.
Key Steps:
Retrieve potion details, including name, quantity, and requirements.
Generate the production order in D365FO, ensuring every potion has a structured workflow.
Update status fields to track progress through the brewing lifecycle.
Approval Workflow in the Alchemy Approval App:
Canvas App: A beautifully designed app lets Professor Snape approve or reject potion requests with ease.
Notifications to sponsors (via SMS or Teams) about potion status or ingredient shortages.
Updates to SharePoint lists for elf task management, ensuring no steps are missed in potion production.
Why This Earns Low-Code Charms:
Automation First: Manual processes like potion approval and ingredient monitoring are replaced with automated workflows.
User-Centric: The app and flows simplify approvals and provide clear, actionable information to users.
Visual and Accessible: The responsive, Harry Potter-themed app adds both functionality and flair, making it easy for potion masters to manage brewing on the go.
4. Pro-Code Potions
Concept: The true magic of our potion production system lies in the pro-code components, where custom Python scripts, X++ extensions, and hardware-level integrations make the impossible possible. By blending IoT, AI, and robust coding practices, we’ve created a solution that bridges the physical and digital realms with precision and scalability.
Key Components:
Python for IoT and AI Integration:
IoT Sensor Management:
Python scripts like sensor_script.py process data from the ultrasonic level sensor, ensuring precise liquid measurement in potion cauldrons.
Real-time readings are transmitted to Power Automate, enabling immediate workflow adjustments.
Voice Recognition:
Using voice_script.py, the system leverages the OpenAI Whisper API to translate spoken potion names into actionable triggers.
This enables seamless, hands-free initiation of potion brewing.
Integration Orchestration:
integration_script.py ties sensor readings, voice inputs, and Power Automate flows together, ensuring that each step of the process aligns perfectly.
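As a hedged sketch of what that orchestration boils down to (the level conversion and the flow URL handling are our own illustration, not the actual scripts), an ultrasonic echo time becomes a fill level, which is then posted to a Power Automate HTTP trigger:

```python
import json
import urllib.request

SPEED_OF_SOUND_CM_S = 34300  # approx. speed of sound in air

def echo_to_level_cm(echo_seconds, cauldron_depth_cm):
    """Convert the ultrasonic round-trip time to a liquid level.

    The sensor sits at the rim: gap = (time * speed) / 2 is the distance
    between sensor and liquid surface, so level = depth - gap.
    """
    gap_cm = (echo_seconds * SPEED_OF_SOUND_CM_S) / 2
    return max(0.0, cauldron_depth_cm - gap_cm)

def post_reading(flow_url, cauldron_id, level_cm):
    """POST the reading to a Power Automate HTTP trigger (URL is hypothetical)."""
    body = json.dumps({"cauldron": cauldron_id, "level_cm": level_cm}).encode()
    req = urllib.request.Request(flow_url, data=body,
                                 headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req)
```

The flow on the receiving end can then compare `level_cm` against thresholds and adjust the production order accordingly.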
X++ Extensions in Dynamics 365 FO:
Custom Extensions:
Example: AdCustGroup_Frm_Extension ensures tailored functionality for potion-specific processes like production orders and inventory updates.
Field extensions and event handlers provide flexibility for tracking potion-specific data in D365FO.
Business Logic:
Custom X++ code facilitates real-time updates to production orders, inventory tracking, and seamless integration with Power Automate.
ESP32 Firmware Deployment:
Automated Deployment:
A GitHub Actions workflow automates the process of flashing firmware to the ESP32 microcontroller.
Key features include dependency installation (esptool), direct flashing via shell commands, and a self-hosted runner for stable deployments.
Hardware Integration:
The ESP32 monitors potion brewing equipment, sending real-time data to the cloud and triggering workflows when thresholds are met.
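The flashing step in that workflow essentially boils down to invoking esptool. A sketch of what the command might look like, driven from Python (port, baud rate, and flash offset are illustrative defaults, not the pipeline's actual configuration):

```python
import subprocess

def flash_command(port, firmware_path, baud=460800, offset="0x10000"):
    """Build an esptool invocation for flashing an ESP32.

    Port, baud rate, and flash offset are illustrative defaults; a real
    pipeline would read them from the workflow configuration.
    """
    return ["python", "-m", "esptool",
            "--chip", "esp32", "--port", port, "--baud", str(baud),
            "write_flash", offset, firmware_path]

def flash(port, firmware_path):
    """Run the flash step; in CI this would execute on the self-hosted runner."""
    subprocess.run(flash_command(port, firmware_path), check=True)
```

Keeping the command construction separate from the `subprocess.run` call makes the step easy to test without hardware attached.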
Why This Earns Pro-Code Potions:
Hardware and Software Synergy: Python scripts and ESP32 firmware ensure seamless integration between IoT devices and workflows.
Advanced Customization: X++ extensions tailor D365FO for potion production, enabling real-time updates and efficient inventory management.
Efficiency and Reliability: Automated firmware deployments and modular script design make the system scalable and error-resistant.
5. Digital Transformation
Concept: Our potion production platform is a shining example of digital transformation. By integrating IoT, AI, dashboards, and workflows, we’ve revolutionized potion brewing at Hogwarts, streamlining processes for students, potion masters, and house-elves alike. This transformation isn’t just magical—it’s impactful, reducing manual effort and enhancing collaboration while creating a more efficient and connected potion lab.
Key Components:
IoT-Driven Automation:
Sensors: Ultrasonic level sensors continuously monitor potion cauldrons, sending precise measurements to Power Automate for real-time adjustments.
Voice Commands: AI-powered voice recognition eliminates manual input, enabling potion masters to initiate production hands-free.
Workflows and Approvals:
Power Automate orchestrates end-to-end processes, from potion initiation to approvals and inventory management.
The Alchemy Approval App ensures oversight with an intuitive interface for reviewing and approving potion requests.
Real-Time Data Visibility:
The Power BI dashboard provides live updates on:
Inventory levels for ingredients and potions.
Production status, including Snape’s approvals and BOM requirements.
This visibility empowers potion masters to make informed decisions, reducing bottlenecks and ensuring resource availability.
Legacy System Integration:
The ETON packaging machine, reliant on ASCII files, is seamlessly integrated into the workflow using automated file generation.
This modernization reduces manual input, ensuring the machine operates in sync with modern systems.
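A minimal sketch of such file generation (the fixed-width record layout below is our own invention for illustration, not ETON's actual format):

```python
def eton_line(order_no, potion_name, quantity):
    """Format one fixed-width ASCII record.

    Hypothetical layout: 10-char order number, 30-char name,
    6-digit zero-padded quantity (46 characters total).
    """
    return f"{order_no:<10.10}{potion_name:<30.30}{quantity:06d}"

def write_eton_file(path, orders):
    """Write approved orders as a plain-ASCII file with CRLF line endings."""
    with open(path, "w", encoding="ascii", newline="\r\n") as fh:
        for order in orders:
            fh.write(eton_line(*order) + "\n")
```

Forcing `encoding="ascii"` makes the writer fail loudly if a potion name sneaks in a character the legacy machine cannot parse.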
Accessibility for All Users:
House-Elf-Friendly SharePoint Lists: Tasks and potion requirements are published in a simple format, ensuring elves can manage their duties effectively with minimal digital training.
Why This Defines Digital Transformation:
Efficiency: IoT and workflows automate repetitive tasks, saving time and eliminating human error.
Collaboration: Students, Snape, and house-elves are all connected through the platform, ensuring transparency and teamwork.
Enhanced Decision-Making: Real-time dashboards and AI-driven insights provide actionable information for smarter, faster decisions.
Legacy Modernization: Integrating old systems like ETON showcases how digital transformation can bridge the gap between the past and the future.
6. ALM Magic
Concept: True magic lies in the details, and for Team PowerPotters, Application Lifecycle Management (ALM) has been the wand that brings order to the chaos of customization. From naming conventions to robust deployment practices, we’ve built a foundation of clarity, consistency, and scalability that ensures our solution runs as smoothly as a perfectly brewed Felix Felicis.
Key Components:
Naming Standards:
Consistent naming conventions across all artifacts make customizations easy to identify and manage.
Examples:
AdCustGroup_Frm_Extension: Form extension for custom potion-related workflows.
AdCustGroup_Frm_dsCustGroup_Extension: Datasource extension to enhance potion data tracking.
AdCustGroup_Frm_EventHandler: Event handler for custom logic tied to potion production orders.
These standards ensure every artifact is self-explanatory, reducing the effort required for onboarding or debugging.
Error Handling and Logging:
Python scripts include robust try...except blocks with detailed logging, enabling traceability and quick resolution of issues.
Example: In sensor_script.py, logs capture real-time sensor readings and flag anomalies, ensuring the potion brewing process remains uninterrupted.
Similarly, X++ code in D365FO includes structured error handling to protect data integrity during custom operations like inventory updates.
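The pattern in those Python scripts can be sketched like this (function names and the plausibility check are illustrative, not lifted from the actual code):

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("sensor")

class SensorError(Exception):
    """Raised when a reading falls outside the physically possible range."""

def validate_reading(level_cm, depth_cm):
    """Flag anomalies instead of letting bad data reach the workflow."""
    if not 0 <= level_cm <= depth_cm:
        raise SensorError(
            f"implausible level {level_cm} cm for depth {depth_cm} cm")
    return level_cm

def safe_read(read_fn, depth_cm):
    """Wrap a raw read in try/except with logging, in the spirit of the scripts."""
    try:
        level = validate_reading(read_fn(), depth_cm)
        log.info("level reading ok: %.1f cm", level)
        return level
    except Exception:
        log.exception("sensor read failed; skipping this cycle")
        return None
```

A failed cycle is logged with a full traceback and simply skipped, so one flaky reading never interrupts the brewing workflow.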
CI/CD Deployment Cycle:
Azure DevOps Pipelines:
Pre-build pipelines prepare the build server, ensuring all dependencies are available before starting.
Express builds validate core functionality, while full builds ensure a complete release package.
Webhooks for Notifications:
Successful deployments trigger webhooks to notify the team via Teams, ensuring everyone stays informed.
GitHub Actions for IoT:
ESP32 firmware deployment is automated using GitHub Actions, streamlining hardware updates and maintaining consistency.
Mocking for Testability:
The inclusion of mock modules (e.g., GPIO mock) enables rigorous testing of hardware-dependent Python scripts without requiring physical devices. This ensures stability before deployment.
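That mocking approach can be sketched with the standard library alone (the module and pin names below are illustrative, not the team's actual wiring):

```python
import sys
from unittest import mock

# Install a mock before any hardware-dependent code imports RPi.GPIO,
# so the scripts run on machines without a Raspberry Pi attached.
sys.modules["RPi"] = mock.MagicMock()
sys.modules["RPi.GPIO"] = sys.modules["RPi"].GPIO

import RPi.GPIO as GPIO  # now resolves to the mock

TRIGGER_PIN = 23  # hypothetical wiring

def fire_trigger_pulse():
    """Code under test: raise then lower the ultrasonic trigger pin."""
    GPIO.output(TRIGGER_PIN, True)
    GPIO.output(TRIGGER_PIN, False)

fire_trigger_pulse()
# The MagicMock records every call, so the pulse sequence can be asserted
# in a unit test without touching real GPIO hardware.
```

Because the mock records each call, a test can verify the exact pin sequence the script would drive on real hardware.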
Why This Earns ALM Magic:
Clarity: Naming standards and modularity make the codebase easy to understand and extend.
Reliability: Error handling and logging practices ensure stability and quick debugging.
Efficiency: CI/CD pipelines and automated firmware updates minimize manual intervention while maintaining quality.
Future-Proofing: Mocking and deployment practices enhance scalability and adaptability.
7. Magic Matrix
Concept: The Magic Matrix represents the interconnected web of systems, tools, and platforms that make our potion production platform a seamless, unified experience. By integrating Dynamics 365, Power BI, IoT, Teams, and SharePoint, we’ve created a cohesive solution where every component works in harmony to ensure efficiency, transparency, and collaboration.
Key Connections:
Dynamics 365 Finance and Operations (D365FO):
Central Hub: D365FO serves as the backbone, handling production orders, inventory management, and BOM tracking.
Custom Extensions: X++ code enhances functionality, ensuring potion-specific processes like ingredient monitoring and approval workflows are perfectly aligned.
Power BI Dashboards:
Data Visualization: Directly connected to D365FO, Power BI provides real-time insights into production statuses, inventory levels, and potion trends.
Cloud Integration: Published to Power BI Cloud, the dashboards are accessible across devices and updated regularly to keep potion masters informed.
IoT Devices:
ESP32 Sensor Integration: Sensors monitor potion cauldrons in real-time, feeding data into Power Automate for precise workflow adjustments.
Firmware Automation: GitHub Actions workflows ensure ESP32 firmware is always up to date, enabling reliable data collection.
Power Platform:
Power Automate: Acts as the orchestrator, connecting IoT inputs, D365FO workflows, and approval processes.
Canvas App: The Alchemy Approval App integrates directly with D365FO, allowing Professor Snape to review and manage potion requests in a user-friendly interface.
Teams and SharePoint:
Notifications via Teams: Webhooks notify team members when a new production order is approved or when a deployment is completed.
Elf-Friendly SharePoint Lists: Simple lists provide house-elves with clear instructions for managing ingredients and potion schedules, ensuring accessibility.
Why This Earns Magic Matrix:
Unified Ecosystem: By connecting tools like D365FO, Power BI, IoT, and SharePoint, we’ve created a fully integrated platform.
Cross-System Collaboration: Teams and SharePoint ensure seamless communication and task management across all users.
Legacy Modernization: Integrating the ASCII-reliant ETON machine demonstrates our ability to bridge old and new technologies.
Real-Time Data Flow: Connections between IoT devices, D365FO, and Power BI enable informed, real-time decision-making.
8. A Magical Farewell: Mischief Managed!
As the final chapter of our journey at ACDC 2025 closes, Team PowerPotters would like to extend a heartfelt “Thank you” to the organizers, judges, and fellow participants who’ve made this hackathon an unforgettable experience. Like a well-brewed Polyjuice Potion, this competition has shown us how a perfect blend of creativity, teamwork, and technical prowess can create magic!