Final Delivery from the Cepheo Crafting Creepers

A short introduction to our final delivery here at ACDC:

We have built an end-to-end production flow where business demand starts inside the enterprise stack, gets approved and signed digitally, becomes an ERP production order, and is then executed automatically by an AI bot that gathers materials and reports progress back, complete with real-time updates and dashboards.

Minecraft is our safe, visual execution layer; the real-world use case is deployable physical robots that harvest and perform tasks in mines, quarries, and remote locations.

Our project mimics real-life customer work that we have done with Rana Gurber, where they are deploying a dog-bot that gathers data in the mines, increasing safety for workers.

The flow (start to end)

Before going through the categories, a brief description of the architectural flow provides useful context for what follows.

1) Approval + Signing turns intent into an approved transaction

A purchase requisition is created in Dynamics 365 and enters an Approval/Signing stage. When it hits that stage, a Power Automate flow triggers the OneFlow contract creation, adds participants, publishes it for signing, and listens for contract status updates. Once fully signed, the flow retrieves contract details and automatically creates the Purchase Order header + lines in Dynamics 365.

2) Purchase Order triggers a Production Order and the worker starts

Once the approved PO exists, it triggers creation of a Production Order in our very own Minecraft FO Module. The Production Order reads the Bill of Materials (BOM) and determines required resources. In our module, those requirements are already translated into Minecraft blocks/items.
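Conceptually, the BOM-to-Minecraft translation is a lookup from ERP item requirements to in-game collection tasks. A minimal sketch (the item IDs, item names, and ratios below are illustrative examples, not our actual mapping table):

```javascript
// Hypothetical mapping from ERP BOM item numbers to Minecraft blocks/items.
// IDs, names, and perUnit ratios are illustrative only.
const BOM_TO_MINECRAFT = {
  'RAW-STONE': { item: 'cobblestone', perUnit: 1 },
  'RAW-WOOD':  { item: 'oak_log',     perUnit: 1 },
  'RAW-IRON':  { item: 'iron_ore',    perUnit: 2 }, // e.g. 2 ore per BOM unit
};

// Translate a production order's BOM lines into a Minecraft task list.
function translateBom(bomLines) {
  return bomLines.map(({ itemId, quantity }) => {
    const mapping = BOM_TO_MINECRAFT[itemId];
    if (!mapping) throw new Error(`No Minecraft mapping for BOM item ${itemId}`);
    return { type: 'collect', item: mapping.item, count: quantity * mapping.perUnit };
  });
}

console.log(translateBom([{ itemId: 'RAW-IRON', quantity: 3 }]));
// → [ { type: 'collect', item: 'iron_ore', count: 6 } ]
```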

3) Execution happens in Minecraft: mine, gather, craft, deliver, and report back

An AI-controlled NPC worker is spawned in the Minecraft world and executes the translated task list:

  • mines required resources,
  • gathers/harvests materials,
  • crafts required items,

  • …and sends the data to Power BI and our Canvas app dashboard in real time.

4) Completion + status goes back to ERP (loop closed)

When the materials are ready, completion status is sent back so the Production Order can be updated/closed in F&O.

Boom, and you have automated production.

Now to the fun part:

Categories

Redstone Realm

We built a complete business process that begins where real work begins in internal systems. Approvals and contract signing are mechanisms that convert intention into trusted demand.

From there, Dynamics 365 Finance & Operations becomes the system of record for purchasing and production planning, and our downstream services treat it like the authoritative source, not just another data feed.

We deliver a real business solution for Minecraft server automation and management, integrating Microsoft Azure (Service Bus, Event Hub) and APIs. The experience is built around in-game automation (quest system + bot feedback), with an AI-autonomous bot that executes tasks while using self-preservation logic.

Finance & Operations overview and one-click ordering experience

Inside Dynamics 365 Finance & Operations, we provide a clear, user-friendly overview of Minecraft worlds and their connected resources. Worlds and resources are presented in a structured view with visual indicators so users can instantly understand key status information, such as whether a world is online, what resource type it represents, and whether it is currently ordered, without needing to read through large volumes of data.

Built to extend without redesign

The underlying structure is designed to scale predictably as the solution grows: it’s straightforward to expand the model, add new screens, connect reporting, and support additional integrations without having to rebuild the foundation.

This is the bridge between Dynamics 365 Finance & Operations and the Minecraft world: D365 business events (for example, production requests for raw materials) are dispatched through Azure messaging and orchestrated by a Logic App workflow, “TriggerAI”. The orchestrator transforms structured business data into an AI-readable instruction, and the bot executes the job in the world: navigating terrain, collecting materials, depositing outputs, and reporting completion back through the same cloud path.
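Roughly, the orchestrator's transformation step looks like this (the field names and instruction template are illustrative; the actual Logic App schema differs):

```javascript
// Illustrative version of what the TriggerAI orchestrator does:
// turn a structured D365 business event into an AI-readable instruction.
function toAiInstruction(businessEvent) {
  const { resourceType, quantity, orderId } = businessEvent;
  return {
    orderId, // kept on the message for end-to-end traceability
    instruction: `Collect ${quantity} units of ${resourceType}, deposit them ` +
                 `at the drop-off chest, then report completion for order ${orderId}.`,
  };
}

const msg = toAiInstruction({ resourceType: 'stone', quantity: 64, orderId: 'PO-000123' });
console.log(msg.instruction);
```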

Player experience: narrated diary + conversational recall

We extended the experience beyond pure automation by adding a diary and interaction layer: Steve can be asked questions like “What happened on January 23rd?” and responds using memories grounded in diary entries and logs via Retrieval Augmented Generation. We also provide audio narration of diary entries through a containerized text-to-speech service and a web diary player that lets users browse history, play audio, and chat through a clean interface.

Why we deserve points in this category:

We deserve Redstone Realm points because we built a real business solution on Microsoft tech where Minecraft is just the “factory floor”, or the real-life mine.

The process starts where business work actually starts: inside the company. Approvals and contract signing turn a request into something trusted, and then Dynamics 365 Finance & Operations becomes the single source of truth for purchasing and production planning. We don’t treat FO like a dashboard feed; we treat it like the system that controls what’s allowed to happen.

From there, we use Azure to make the flow fast and reliable. When FO raises a business event (like a request for stone, wood, ore), it’s sent through Azure messaging and picked up by our orchestrator (TriggerAI). TriggerAI turns the structured business data into an instruction the AI can understand, and then the bot executes the work in Minecraft: it navigates, gathers the right materials, deposits them, and reports status back through the same path so the loop closes.

We also made it usable for real people. This can and will be adopted in the future by many businesses, and we did it in Minecraft.

In FO, users get a clean overview of worlds and resources with visual status indicators, and they can order or cancel with one click, with no complicated steps. The structure is built so we can extend it without redesigning everything: new screens, reporting, and integrations plug in cleanly. And on the experience side, we added a diary + chat layer with audio narration so users can follow what happened and ask “what did Steve do?” without digging through raw logs.

In short: it’s a Microsoft-first business workflow (FO + Azure + automation + AI) that turns enterprise requests into automated, resource-gathering execution, with a smooth user experience end-to-end, exactly what the Redstone Realm category is asking for.

Data, AI & Analytics

Telemetry sync + analytics-ready foundation

We collect and sync player inventories and bot telemetry (for example, backend/azure-telemetry.js and the HTTP Bridge mod), enabling analytics on quests, bot orders, and player activity. This also lays the groundwork for AI-driven features through the Mineflayer bot and quest automation patterns.

We transform raw Minecraft server logs into structured event records (deaths, combat, exploration, achievements, mining, crafting, farming, building, etc.) using a dedicated parser. Outputs are stored as dated artifacts (logs/diary/chat per day), creating a time-series dataset that can be queried and analyzed over time instead of being trapped as text files.

Memory indexing + retrieval for grounded answers

The RAG indexing system processes both diary prose and technical logs, chunks them with overlap to preserve context, embeds the chunks, and retrieves the most relevant passages using cosine similarity. When users ask questions, we embed the query, fetch top matches, and inject them into the prompt so Steve can recall specific incidents because they exist in indexed memory.
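The retrieval step boils down to cosine similarity over embedding vectors. A minimal sketch of that step, using toy 2-d vectors in place of real model embeddings:

```javascript
// Cosine similarity between two embedding vectors.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k chunks most similar to the query embedding.
function topK(queryVec, chunks, k = 3) {
  return chunks
    .map(c => ({ ...c, score: cosine(queryVec, c.embedding) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}

// Toy example: 2-d "embeddings" standing in for real model output.
const chunks = [
  { text: 'Fought a creeper near the mine', embedding: [1, 0] },
  { text: 'Crafted an iron pickaxe',        embedding: [0, 1] },
];
console.log(topK([0.9, 0.1], chunks, 1)[0].text); // the closest chunk wins
```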

Traceability between enterprise transactions and in-world actions

We log D365 event details (for example, identifiers, quantities, resource types, event times) alongside in-world task execution. This enables correlation questions like “Which D365 order triggered the collection of X blocks?” by aligning timestamps and task IDs for end-to-end traceability.
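The correlation itself is a join between the two log streams on task ID. A sketch of the idea (the record shapes are illustrative, not our actual log schema):

```javascript
// Join D365 order events to in-world task executions by task ID,
// answering "which order triggered this collection run?".
// Field names are illustrative examples.
function correlate(orderEvents, taskLogs) {
  const byTask = new Map(taskLogs.map(t => [t.taskId, t]));
  return orderEvents
    .filter(e => byTask.has(e.taskId))
    .map(e => ({
      orderId: e.orderId,
      resource: e.resourceType,
      collected: byTask.get(e.taskId).collected,
      lagMs: byTask.get(e.taskId).completedAt - e.eventTime, // order → done
    }));
}

const joined = correlate(
  [{ orderId: 'PO-1', resourceType: 'stone', taskId: 'T-9', eventTime: 1000 }],
  [{ taskId: 'T-9', collected: 64, completedAt: 61000 }],
);
console.log(joined[0]);
// → { orderId: 'PO-1', resource: 'stone', collected: 64, lagMs: 60000 }
```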

Engagement analytics from chat history

Chat sessions are stored as JSONL conversation turns with role, content, timestamps, and session IDs. This allows analysis of interaction patterns like conversation length, common question topics, and response latency based on recorded usage.
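One conversation turn in that JSONL format might be produced like this (the exact field names are illustrative; they mirror the description above):

```javascript
// Serialize one conversation turn as a single JSONL line.
// Field names are illustrative, matching the description above.
function toJsonlTurn(sessionId, role, content) {
  return JSON.stringify({
    sessionId,
    role,               // 'user' or 'assistant'
    content,
    timestamp: new Date().toISOString(),
  });
}

// Each turn is one line; a day's history file is just these lines appended.
const line = toJsonlTurn('sess-42', 'user', 'What happened on January 23rd?');
console.log(line);
```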

Orchestration layer as a workflow boundary

We use a Logic App named “TriggerAI” as an orchestration boundary that receives structured business events and transforms them into AI-ready instructions, keeping routing and transformation logic maintainable and observable as a workflow rather than hard-coding every integration step in services.

An easy POST to TriggerAI

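The request can be sent from any HTTP client; for example, with Node's built-in fetch (the payload is free-form by design — the Logic App's AI step parses whatever you send; the message text here is just an example):

```javascript
// Any JSON payload works; the Logic App's AI step extracts the intent.
const payload = { message: 'Please mine 32 cobblestone and deposit it at base' };

const request = {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(payload),
};

// Uncomment to actually send it to the TriggerAI endpoint from our API docs:
// fetch('https://resourcedv.azure-api.net/When_an_HTTP_request_is_received/paths/invoke', request)
//   .then(r => r.text())
//   .then(console.log);

console.log(request.body);
```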

Data consumption from the Minecraft server on the Raspberry Pi

The Raspberry Pi posts Minecraft player statistics, inventory data, and world data from the Minecraft server to an Azure Function endpoint.

The Azure Function posts the data to Azure Event Hub.

A Microsoft Fabric event stream consumes the data from Event Hub and processes it into a Fabric lakehouse destination.

Dataverse uses virtual tables to bring the data into Power Platform, so its business process capabilities can be utilized.

A Power Automate flow processes the data into Dynamics 365 Finance & Operations for further processing of inventories and orders.

Data injection to the Minecraft server

An Azure Service Bus queue is used as a gateway from the cloud to the on-prem Raspberry Pi Minecraft server, delivering the processed data from Dynamics 365 Finance & Operations, Dataverse, and Azure.

Our very own Minecraft Workspace in FO

Why we deserve the points for Data, AI & Analytics:

We deserve Data, AI & Analytics points because we don’t just “use AI” for the sake of using AI; we built a real data pipeline where raw gameplay and business events become structured, queryable, and useful, and where AI is grounded in that data instead of guessing.

First, we’ve made the Minecraft world measurable (not with a tape measure, though). We sync inventories, world data, player statistics, and bot telemetry from the Raspberry Pi server stack into Azure, route it through Azure Functions to Event Hub, and then consume it in Microsoft Fabric (Event Stream to Lakehouse). That means we’re not stuck with scattered JSON or log files; we have a foundation that supports real analytics and reporting on quests, bot purchasing orders, and player activity, and it scales as the system grows.

Second, we turn unstructured signals into clean data. Server logs are parsed into structured event records (deaths, combat, exploration, achievements, mining, crafting, and more, with room to adapt new categories) and stored as dated artifacts, creating a time-series dataset.

This is the difference between “cool logs” and “usable data,” because it lets you analyze behavior over time, correlate activity, and build meaningful metrics.

Third, our AI is data-backed. The RAG system indexes diary entries and technical logs, chunks them with overlap, embeds them, and retrieves relevant passages with cosine similarity. When someone asks Steve a question, the response is grounded in retrieved memories, so it can accurately recall specific incidents because the data exists and is referenced in context, not pulled out of thin air.

Finally, we built end-to-end traceability between enterprise and execution. We log D365 event details alongside task execution, so you can answer questions like “which order triggered this collection run?” by correlating identifiers, timestamps, quantities, and task IDs. And we keep the operational loop tight: Fabric and Dataverse can expose the data back into the Power Platform, and Power Automate can process it into FO for inventory/order handling with Service Bus acting as the gateway back down to the Raspberry Pi for cloud-to-edge injection.

Low-Code

Power Platform-ready APIs and endpoints

Our API and webhook design is built to integrate cleanly with Power Platform (Power Automate, Logic Apps). RESTful endpoints make it easy to automate workflows and build dashboards, and the solution can be extended with low-code tooling for reporting and orchestration.

We have built low-code components across Power Platform: Power Apps (model-driven and canvas apps), Power Automate, custom connectors, and Power BI.

Canvas Apps

Crafting Creepers Inventory Admin App

Helps generate a model from a captured picture of an object, using the logiqraft.futurein API via a custom connector.

Creation of an Order from a Canvas App

Generates a purchase summary using mscrm-addons.com and handles further processing in Dataverse and the model-driven app.

Minecraft Gold Converter App

Uses external APIs to fetch material data, such as gold prices, together with an exchange-rate API.

Here we use a low-code canvas app, with the option of adding pro-code PCF components.

Cepheo Crafting Creepers App

We created a low-code app in Microsoft Power Platform based on an original idea using accelerator components. It includes phone-position adjustment and video streaming within a canvas app, with a simple purpose: a dynamic experience that responds to the movement of your phone. See the video.

Custom Connector

We have created a custom connector for another Teams API.

Power Automate

We have created a few Power Automate flows to automate the process and integrate the data with external services such as LinkMobility, OneFlow, mscrm-addons, etc.

We also set up an approval flow for approving and signing the purchase requisition.

Why we deserve Low-code points:

We deserve Low-Code points because we used Power Platform to make the solution easy to run, easy to change, and fast to extend.

Power Platform building blocks (real, not just a demo)

We built with Canvas apps + Model-driven apps, Power Automate, Power BI, and Custom Connectors, so users can create orders, manage data in Dataverse, and see outcomes without needing developers for every change.

Apps that solve specific jobs

We delivered practical Canvas apps: an Inventory Admin app (using a custom connector to call an external API), a Minecraft Gold Converter (using external gold + exchange-rate APIs, with optional PCF), and a mobile experience app with phone-position adjustment and video streaming (video provided).

Automation + integrations with low-code flows

We created Power Automate flows for approvals and signing, and for integrating with OneFlow, LinkMobility, and mscrm-addons, which makes the process run automatically instead of manually.

Code Connoisseur

Purpose-built code across backend, bot, and mod

We implemented custom code across the stack: backend services (Node.js/Express), the Mineflayer bot, and a Java/Fabric mod. The codebase is modular and structured (see AGENTS.md for patterns) and includes innovations like the HTTP Bridge mod, bot self-preservation logic, and RCON integration.

The Fabric inventory mod converts the in-game inventory into products in Finance and Operations.

Structured LLM interactions with guardrails (schemas, validation, planning)

We implemented structured LLM interaction patterns using DSPy signatures for command parsing, quest generation, and multi-task planning. To keep execution reliable, our planning pipeline enforces strict output schemas and validates every task structure before the bot runs it (e.g., required fields for collect/navigate/deposit), preventing malformed plans from becoming runtime failures.
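A simplified sketch of that validation gate, with the schema reduced to a few fields per task type (the real required-field lists are richer than shown):

```javascript
// Required fields per task type; a plan is rejected before execution
// if any task is malformed. Field lists here are simplified examples.
const TASK_SCHEMAS = {
  collect:  ['item', 'count'],
  navigate: ['x', 'y', 'z'],
  deposit:  ['item', 'count', 'chest'],
};

function validatePlan(tasks) {
  const errors = [];
  tasks.forEach((task, i) => {
    const required = TASK_SCHEMAS[task.type];
    if (!required) {
      errors.push(`task ${i}: unknown type '${task.type}'`);
      return;
    }
    for (const field of required) {
      if (task[field] === undefined) errors.push(`task ${i}: missing '${field}'`);
    }
  });
  return errors; // an empty array means the plan is safe to run
}

console.log(validatePlan([{ type: 'collect', item: 'oak_log' }]));
// → [ "task 0: missing 'count'" ]
```

Because the check runs before execution, a malformed LLM plan becomes a rejected plan instead of a runtime failure mid-task.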

Steve RAG: semantic memory built from scratch

We built a Retrieval Augmented Generation system that indexes diary entries and server logs into chunked segments with metadata, embeddings, and cosine similarity retrieval. The index supports reprocessing when files change (hash-based detection), and the retrieval layer feeds relevant passages into the chat prompt so Steve’s answers stay consistent with recorded events rather than invented details.

Full-stack delivery: services, storage, and edge deployment

We wrote the chat server and storage model (JSONL conversation history per date with IDs and timestamps), plus container and web delivery plumbing (Docker builds, persistent volumes, Nginx serving with CORS). We also produced deployment scripting across environments (shell scripts for edge devices and PowerShell for local workflows), turning the stack into something repeatable to run.

Why we deserve points:

This is not just a standard stack with an LLM in a box; we engineered reliability:

We validated task outputs so the bot doesn’t break at runtime, built memory retrieval so answers stay consistent with recorded data, and delivered everything as a runnable, repeatable system (containers + scripts + edge-ready setup).

In short (as I understand it myself): innovative code that actually ships and works, from ERP mapping to autonomous execution and real operational services.

Governance & Best Practices

Safety, privacy, monitoring, and operational robustness

We emphasize reliability and responsibility through error handling and logging across backend and bot services. Player inventories and “quest data” are handled securely, while telemetry and monitoring are streamed via Azure Event Hub. The bot is designed with responsible safety logic (self-preservation, health checks).

Monitoring:

As we are running the Minecraft server as a production environment, we have set up a comprehensive monitoring stack with:

  • cAdvisor
  • Node Exporter
  • Prometheus
  • Grafana

D365 Security Roles

Because Dynamics 365 Finance & Operations is the system of record in our solution, we also use it as a security boundary. We implemented the FO security system (security roles, duties, and privileges) so that only the right people can view, create, approve, and dispatch the transactions that ultimately trigger bot work. And trust me, having worked with this for the past months, it can take some time to fully master.

That means the ability to “send work to the bot” is controlled the same way you’d control purchasing or production actions in a real enterprise process: through FO permissions, not informal access.

In practice, this keeps the automation constrained to its intended scope. The bot and edge device only operate on what FO authorizes, and the integration only processes actions that the user is permitted to perform.

Workers and admins can control the bot through the governed workflow which makes sure the bot doesn’t become a free-running actor, but an execution agent that acts on behalf of authorized users.

We deserve points for this category because:

We deserve Governance & Best Practices points because we treated this like a system that must be trusted, and we have spent time making sure the bot handles its duties without breaching limits.

First, we built for operational robustness. Both the backend and the bot services include deliberate error handling and structured logging, so failures don’t turn into silent chaos. When something breaks, we can see what happened, where, and why.

That matters because our solution is event-driven and automated; if you don’t invest in observability, you don’t have automation, you have a surprise generator.

Second, we designed with privacy and data responsibility in mind. We handle player inventories and quest-related data as real user data: it’s stored and processed in a controlled way, with clear boundaries between what belongs in the game world and what belongs in the business/automation layers. The system doesn’t need broad access to everything; it only moves the minimum data required to execute and report on tasks.

Third, we made the solution monitorable by default. Telemetry and monitoring are streamed through Azure Event Hub, which gives us a consistent way to observe execution, detect anomalies, and build reporting on top of reliable signals. That monitoring isn’t only for dashboards but it’s part of governance: if you can’t measure behavior, you can’t manage risk.

Because we built FO security (roles, duties, privileges) directly into the execution pipeline, we deserve governance points for making automation permission-driven and bounded by enterprise access.

Lastly, for governance and monitoring: we deserve governance and best-practice points because we treat the Minecraft server like a real production workload, with a proper monitoring stack (cAdvisor, Node Exporter, Prometheus, Grafana) to ensure visibility, stability, and fast incident response.

Our AI and autonomous bot are built with safety logic, not just wishful vibe-coding.

The bot includes self-preservation behavior and health checks, so it doesn’t blindly execute tasks until it dies or gets stuck. We also constrain autonomy through structure: tasks are planned and executed within defined patterns instead of free-form behavior, and the bot reports status back through the same controlled pipeline. In other words, we’ve designed autonomy that stays predictable, inspectable, and stoppable. It is also hosted locally, on its own Raspberry Pi, enabling us to have full control over what goes in and out of the hardware. The pipeline also follows best practices for CI/CD and dockerization, which makes things hard to break once they are in production.
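The self-preservation idea reduces to a decision gate checked between task steps. A sketch of that gate (the thresholds and state fields are illustrative, not the tuned production values):

```javascript
// Decide what the bot should do next based on its vital signs.
// Thresholds and field names are illustrative examples.
function nextAction(state) {
  if (state.health <= 6) return 'flee';      // critically low: retreat to safety
  if (state.food <= 6) return 'eat';         // hungry: refill before resuming
  if (state.hostilesNearby) return 'evade';  // threat detected: avoid combat
  return 'continue';                         // safe: keep executing the task
}

console.log(nextAction({ health: 20, food: 18, hostilesNearby: false })); // → continue
console.log(nextAction({ health: 4,  food: 18, hostilesNearby: false })); // → flee
```

Because the gate runs between steps rather than inside them, the bot stays interruptible: any step boundary is a safe point to stop, flee, or hand control back.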

Digital transformation

The solution automates Minecraft server management to reduce manual effort and improve the player/admin experience through intelligent automation (quests, bot orders, notifications). It demonstrates measurable impact through faster task execution, real-time feedback loops, and extensibility for future needs. And the needs, we believe, will come!

We are transforming Minecraft inventory management with Finance and Operations, with a business process that automates all the way into Minecraft.

Why we deserve points for Digital Transformation:

We replace manual steps with automation, give real-time feedback on progress, and make outcomes measurable (faster execution, clearer status, and easier operations). The key transformation is linking inventory and production thinking from FO to Minecraft execution, proving the same pattern can scale to real-world harvesting and remote operations. We have seen this both with our own customer and with other production companies. THIS IS THE FUTURE.

Final summary

This is an end-to-end production system that starts with trusted business demand and ends with autonomous execution, all while keeping the enterprise system safe and the outcome measurable. It combines real workflow automation, event-driven cloud integration, autonomous task execution, and a data pipeline that turns activity into operational intelligence.

Thanks from us, it was a pleasure. Until next time!

Sincerely, The Cepheo Crafting Creepers

The Great Vibecoding Incident of January 2026

The Setup: A 5,454-line commit lands with the message lots of fucking about. Nobody reads it. Why would they? It’s working, right?

The Plot Twist: Buried deep in the diff, two critical bot functions were quietly replaced with this masterpiece:

async collectItems(itemType, quantity) {
    console.log(`[Collection] Collecting ${quantity}x ${itemType} (not implemented yet)`);
    await this.wait(1000);  // the code equivalent of “thoughts and prayers”
}

async depositItems(items, chestLocation) {
    console.log(`[Deposit] Depositing items (not implemented yet)`);
    await this.wait(1000);  // vibes only
}
The Result: A bot that would receive orders to gather 64 diamonds, announce its intentions to the void, take a power nap, and report back: “Mission complete, boss.”

Lessons Learned:

  • Maybe read the 5,000-line commit
  • (not implemented yet) in production is a cry for help
  • await this.wait(1000) is not a substitute for functionality
  • The commit message was accurate. It was, indeed, lots of fucking about.
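For contrast, here is a sketch of what a non-placeholder collectItems could look like, with the actual world interaction stubbed behind an injected `mineOne` function so the structure is visible. This is illustrative, not the code that eventually replaced the placeholder:

```javascript
// Illustrative fix: actually loop until the quantity is gathered,
// delegating the in-world work to an injected mineOne function.
async function collectItems(itemType, quantity, mineOne) {
  const collected = [];
  while (collected.length < quantity) {
    const item = await mineOne(itemType); // find + dig one block in-world
    if (!item) throw new Error(`No more ${itemType} reachable`);
    collected.push(item);
  }
  console.log(`[Collection] Collected ${collected.length}x ${itemType} (for real this time)`);
  return collected;
}

// Demo with a fake miner instead of a live Minecraft connection.
collectItems('diamond', 3, async () => 'diamond').then(items =>
  console.log(items) // → [ 'diamond', 'diamond', 'diamond' ]
);
```

The injected miner also makes the loop testable without a running server, which is exactly the kind of check that would have caught the original `await this.wait(1000)` masterpiece.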

___

DOH:(

Raw Material Ordering: Order resources with Natural Language. Shared with our Neighbours

Sharing is caring

Ever wanted to order a bot to just mine resources for you? Well, we want to share our API so you can do it with natural language.

When participating in a Minecraft hackathon, there is always one shared problem:
resources take time to gather, and manual grinding slows everything down.

To lower that barrier, we exposed a simple API that anyone can use to turn natural language requests into Minecraft bot actions.

No SDK.
No authentication.
No setup ceremony.

Just send a request and the bot goes to work.

Get access here:

TriggerAIV1 – API Documentation

Full Swagger docs and no need to log in!

TriggerAIV1 is an Azure Logic App–backed API that processes any incoming message and converts it into structured Minecraft bot commands using Azure OpenAI.

It acts as a shared entry point for:

  • hackathon teammates
  • external systems
  • tools like Postman
  • ERP or business events

Quick Start Guide

Purpose

Send any message or command.
The AI parses the intent and converts it into Minecraft bot actions.

Method

POST

Authentication

None required

Endpoint

POST https://resourcedv.azure-api.net/When_an_HTTP_request_is_received/paths/invoke

How it works

  1. You send a POST request with any JSON payload
  2. Azure OpenAI analyzes the message
  3. Intent is extracted and normalized
  4. A structured Minecraft bot command is produced
  5. The bot executes the action in-game

This allows both humans and systems to request work in the same way.

With TriggerAIV1, anyone can:

  • send an order request using natural language
  • connect instantly
  • have the order converted into bot instructions
  • and watch the AI bot mine the resources in Minecraft

That’s why we shared it.<3

AAAALSO, if you want an easy 30-second API integration, you can also use the Postman version; use this link and POST:

https://prod-30.swedencentral.logic.azure.com:443/workflows/5a0eb91725eb483493c76e6af537e8db/triggers/When_an_HTTP_request_is_received/paths/invoke?api-version=2016-10-01&sp=%2Ftriggers%2FWhen_an_HTTP_request_is_received%2Frun&sv=1.0&sig=yvZsHMzJ4hI37B4rwLCQ88osKJuaSeke-pFYGleLgLo

Change body, do what you like 🙂 Have fun

___-

Cepheo Crafting Creepers

Feature Bombing: Steve’s Diary

For the Feature Bombing badge, we deliberately challenged ourselves to combine multiple user-facing features into a single, coherent experience rather than spreading them across separate tools or screens. The result is Steve’s Diary, hosted as a containerized API running in Azure, which acts as both a narrative layer and a control surface for the Minecraft bot.

Steve’s Diary continuously builds a story and recap of what the bot has been doing in the world. It pulls live data from the game, including inventory state and coordinates, and uses this information as input to a large language model to generate readable diary entries that describe the bot’s actions and progress. This turns raw execution data into something meaningful and human-readable.

On top of this, the diary adds several interactive features in the same interface. It supports text-to-speech, allowing diary entries to be read out loud. It exposes a chatbot interface so users can talk directly to the bot in natural language. It also allows users to send commands back to the bot, closing the loop between observation, narration, and control.

All of these features are intentionally combined into a single screen and a single experience. While feature-dense by design, each feature reinforces the others and contributes to a unified goal: making the bot’s behavior understandable, interactive, and engaging.

Steve’s Diary combines the following user-facing features in one interface:

  1. AI-generated diary entries using a large language model
  2. Text-to-speech playback of diary content
  3. Live Minecraft data integration (inventory and coordinates)
  4. Command execution to control the bot
  5. Conversational chatbot interface to interact with the bot

This approach embraces Feature Bombing not as feature overload, but as purposeful density, demonstrating how multiple user features can coexist in one place while still making sense as a complete experience.

Power of the Shell: Scripting a Complete Minecraft Infrastructure

How we used Azure DevOps, Docker Compose, and Bash to create a fully automated deployment pipeline for a cloud-connected Minecraft server.

The Challenge

We needed to deploy and manage a multi-service Minecraft ecosystem that includes:

  • A Fabric Minecraft server with custom mods
  • A Node.js backend API
  • An AI-powered Mineflayer bot
  • An Azure Service Bus listener for cloud integration

All of this running on a Raspberry Pi, with zero manual intervention after code commits.

Infrastructure as Code

Docker Compose: The Foundation

Our entire infrastructure is defined in a single docker-compose.yml file. No clicking through portals, no manual container creation—just declarative YAML:

Every service, network, volume, and environment variable is version-controlled. Need to spin up the entire stack? One command:


docker-compose up -d

Environment-Driven Configuration

Secrets and configuration are externalized through environment variables and .env files.

This separation means the same compose file works across development, staging, and production—only the environment changes.
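In the Node services, that pattern reduces to reading the environment with a fail-fast check at startup. A sketch, with illustrative variable names (not our actual .env keys):

```javascript
// Read configuration from the environment, failing fast when a
// required value is missing. Variable names here are illustrative.
function loadConfig(env = process.env) {
  const required = ['MC_SERVER_HOST', 'SERVICE_BUS_CONNECTION'];
  const missing = required.filter(k => !env[k]);
  if (missing.length) throw new Error(`Missing env vars: ${missing.join(', ')}`);
  return {
    host: env.MC_SERVER_HOST,
    serviceBus: env.SERVICE_BUS_CONNECTION,
    port: Number(env.MC_SERVER_PORT || 25565), // default Minecraft port
  };
}

const cfg = loadConfig({ MC_SERVER_HOST: 'pi.local', SERVICE_BUS_CONNECTION: 'sb://example' });
console.log(cfg.port); // → 25565
```

Failing fast at startup is what lets the same compose file travel between environments: a misconfigured container dies immediately instead of limping along half-configured.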

CI/CD Pipeline: Azure DevOps

Our Azure DevOps pipeline automates the entire deployment lifecycle in three stages.

Key automation features in our Bash scripts:

Automatic Backups with Rotation

Health Checks with Retry Logic

Stage 3: Verification

Post-deployment verification ensures everything is running correctly.

The Complete Picture

Benefits of This Approach

Aspect          | Traditional              | Our Scripted Approach
Deployment      | Manual SSH, copy files   | Automatic on git push
Rollback        | Hope you have a backup   | Last 3 backups auto-retained
Configuration   | Scattered across servers | Version-controlled in git
Reproducibility | “Works on my machine”    | Identical every time
Audit Trail     | Who changed what?        | Full git history

Key Takeaways

  1. Everything is Code: Docker Compose defines infrastructure, YAML defines pipelines, Bash scripts handle orchestration.
  2. Self-Hosted Agents: Running an Azure DevOps agent on the Raspberry Pi eliminates SSH complexity and firewall issues.
  3. Defensive Scripting: Every script handles failures gracefully with continueOnError, retry loops, and fallback options.
  4. Preserve What Matters: The rsync exclusions protect world data, databases, and secrets during deployments.
  5. Automated Verification: Don’t just deploy—verify. Health checks and resource monitoring catch issues before users do.

With the power of the shell, our entire Minecraft infrastructure deploys itself. Push to main, grab a coffee, and come back to a running server.

Client-Side Experience Through Automation: The Minecraft Bot as a Client

In our solution, the Minecraft bot is not just a background process or a server-side script. It is a dedicated client entity that connects to the Minecraft server and operates as its own participant in the game world.

The bot exists as an independent entity and acts as an execution proxy for user intent. The player remains the decision-maker, while the bot performs repetitive, low-value tasks such as mining, gathering resources, crafting, and collecting materials. Instead of requiring the user to manually execute these tasks, intent is expressed once and execution is delegated to the bot.

From a user experience perspective, this directly maps to common enterprise UX goals:

  1. reducing manual labor,
  2. automating repetitive work,
  3. allowing users to focus on higher-level decisions.

 Translated into Minecraft, the player continues to interact with the world and make choices, while the bot handles the repetitive execution work in parallel.

(Yes this is our little BOT friend)

Technically, the bot behaves like a client rather than a backend component. It connects to the Minecraft server as its own entity, authenticates and joins the world, receives intent derived from ERP-triggered production orders, executes actions locally in the game environment, and reports state and completion status back to the system. This makes it comparable to non-human clients such as background agents, RPA bots, or headless clients in enterprise architectures.

The bot owns execution, local state, and responsiveness on the client side. This clear separation between intent (defined by the user and upstream systems) and execution (handled by the bot) results in a maintainable and predictable architecture, while significantly improving the user experience by removing repetitive, low-value work.
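That separation between intent and execution can be sketched in TypeScript. The shapes and names below (BomLine, toTaskList, the task kinds) are illustrative assumptions, not our exact code: the ERP side supplies a BOM already translated into Minecraft terms, and the bot derives its ordered task list from it.

```typescript
// A production order's BOM, already translated into Minecraft terms
// (BomLine and toTaskList are illustrative names, not our exact code).
interface BomLine {
  block: string;
  quantity: number;
}

type BotTask =
  | { kind: "mine"; block: string; quantity: number }
  | { kind: "deliver" };

// Turn the BOM into the bot's ordered task list:
// one mining task per material, then a single delivery step.
function toTaskList(bom: BomLine[]): BotTask[] {
  const tasks: BotTask[] = bom.map((line) => ({
    kind: "mine",
    block: line.block,
    quantity: line.quantity,
  }));
  tasks.push({ kind: "deliver" });
  return tasks;
}
```

The user (or upstream system) only ever touches the BOM; everything below `toTaskList` belongs to the bot.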

____

Cepheo Salsa Crafting Creepers

Deployment: ACDC Craftsman

By day three of ACDC, craftsmanship is no longer about proving that something works, but about proving that it works the right way. For our Minecraft Production Order module, that craftsmanship shows most clearly in how the solution is packaged as a deployment ZIP and deployed to Dynamics 365 Finance & Operations.

In Finance & Operations, the Application Object Server is the source of truth. If a module is not present under AosService\PackagesLocalDirectory, it simply does not exist from the platform’s perspective. Because of this, our deployment approach follows the same principles Microsoft uses internally: the module is delivered as a ZIP and installed directly into the PackagesLocalDirectory as its own folder.

The PowerShell installer script is intentionally simple but deliberate. It runs relative to its own location, automatically detects where Finance & Operations is installed, and extracts the module into a folder named exactly after the ZIP file. This removes environment-specific assumptions, avoids hard-coded paths, and ensures the module identity is always consistent.

Just as important, the script is safe by default. It will not overwrite an existing module unless this is explicitly requested with -Force. This reflects production thinking: destructive actions should be intentional, not accidental.
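That safe-by-default guard can be sketched as follows; note this is TypeScript for illustration, while our actual installer is PowerShell, and the function and parameter names here are made up for the sketch.

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Sketch of the installer's safe-by-default behavior (our real installer is
// PowerShell; these names are illustrative). The module folder is named
// exactly after the ZIP, and an existing module is never overwritten
// unless force is explicitly requested.
function resolveInstallTarget(packagesDir: string, zipName: string, force = false): string {
  const moduleName = path.basename(zipName, ".zip");
  const target = path.join(packagesDir, moduleName);
  if (fs.existsSync(target) && !force) {
    throw new Error(`Module '${moduleName}' already exists; use force to overwrite`);
  }
  return target;
}
```

The key design choice is that the destructive path requires an explicit opt-in, exactly like the `-Force` switch in the real script.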

It's deployable, repeatable, and trustworthy. It behaves like a native Finance & Operations module, can be installed the same way across environments, and is ready for real DevOps scenarios.

When it comes to deploying to the Raspberry Pi:

We treat it exactly like any other production environment by using Azure DevOps pipelines instead of manual SSH or copy-paste workflows. The pipeline validates the Docker configuration, connects to the Raspberry Pi as a self-hosted agent, stops any running containers, deploys the updated files, and then rebuilds and starts the containers in a controlled sequence.

Posted on Saturday 😉

Using mscrm-addons.com to generate Purchase Order Summary Document

We use mscrm-addons.com as a lightweight but powerful extension to bridge low-code and structured document output in our solution. The purchasing process starts in a Power Apps canvas app, where the user enters purchase details in a simple, Minecraft-themed purchase order form. When the order is created, we use the MSCRM Addons DocumentsCorePack to generate a structured Purchase Order Summary document from a predefined Word template that can be linked to resources on our Minecraft server. The addon pulls the relevant Dataverse data (product, quantity, prices, purchaser details), merges it into the template, and automatically generates a clean, consistent document that can be emailed or stored. This lets us keep the front end low-code and user-friendly while still producing professional, repeatable purchasing documents without custom document-generation code.

Link to our Linkedin Post:

https://www.linkedin.com/posts/mads-frisvold-8667231a4_at-arctic-cloud-developer-challenge-at-holmenkollen-activity-7420600438653132800-jcAZ?utm_source=share&utm_medium=member_desktop&rcm=ACoAAC_ODw4BAWrxtkgUsDztq9DP8aVF2MKWwYc

Nasty Hacker: Production Orders Powered by Minecraft!

A Nasty Hack means taking a serious system and making it do something it was never designed for. In our case, that system was Dynamics 365 Production Orders.

What production orders normally do

In a standard ERP setup, releasing a production order typically triggers:

  • Warehouse picking
  • Manufacturing execution systems
  • Shop floor machines
  • Industrial control software

In short: very serious, very real-world manufacturing.

What our production order does

When a production order is released in our solution, it triggers:

  • A Minecraft bot

The production order is treated as valid, approved, and executed exactly as intended by the traditional ERP.

The difference is only in what executes it.

The semantic hack

We deliberately reinterpreted the meaning of manufacturing:

  • Minecraft acts as the manufacturing execution system
  • Blocks are treated as inventory
  • Mining and crafting are treated as production steps
  • Gameplay becomes the production process

From Dynamics 365’s perspective:

  • A production order was created
  • Materials were consumed
  • Output was produced
  • Status was reported back

All of this is technically correct.
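What the ERP sees on report-back can be sketched as a small status payload. The field names and the `reportStatus` helper below are assumptions for illustration, not our exact integration contract:

```typescript
// Hypothetical shape of the status the bot reports back to Dynamics 365
// (field names are assumptions, not our exact contract).
interface ProductionStatus {
  orderId: string;
  consumed: Record<string, number>; // blocks counted as consumed materials
  produced: Record<string, number>; // crafted output
  state: "InProgress" | "ReportedAsFinished";
}

// Report the order as finished once every required material is gathered.
function reportStatus(
  orderId: string,
  required: Record<string, number>,
  gathered: Record<string, number>,
): ProductionStatus {
  const done = Object.entries(required).every(
    ([block, qty]) => (gathered[block] ?? 0) >= qty,
  );
  return {
    orderId,
    consumed: gathered,
    produced: done ? required : {},
    state: done ? "ReportedAsFinished" : "InProgress",
  };
}
```

From Dynamics 365's side this payload is indistinguishable from one produced by a real shop floor, which is the whole point of the hack.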

Why this qualifies as a Nasty Hack

  • No custom manufacturing logic was added to Dynamics
  • No simulation layer was introduced
  • No special exceptions were built

We reused standard ERP production flows and simply pointed the execution at something they were never designed for.

The ERP believes it is running manufacturing.
In reality, a bot is mining blocks… kinda absurd, if I do say so myself.

Conclusion

This is not how production orders are meant to be used.
But it works.

Which makes it a proper Nasty Hacker solution.

Real-Time Service Bus: How the Minecraft Bot Reacts Instantly

When someone creates a production order in Dynamics 365, how fast does the Minecraft bot react?

Answer: Instantly. No polling. No delays. Just push.

The Problem with Polling

Most systems check for new work every few seconds:

Bot asks server: “Any new orders?”

Server says: “Nope.”

Bot waits 5 seconds.

Bot asks again: “Any new orders?”

Server says: “Nope.”

Bot waits 5 seconds.

Bot asks again: “Any new orders?”

Server says: “Finally yes, here’s one from 10 seconds ago.”

Slow. Wasteful. Annoying.
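As a sketch, the anti-pattern above looks like this, where `checkForOrders` stands in for the repeated HTTP call (the names are illustrative, not from our codebase):

```typescript
// The polling loop we avoided: ask, wait, ask again. In the worst case an
// order sits unnoticed for almost a full interval before the bot sees it.
async function pollForOrder(
  checkForOrders: () => Promise<string | null>,
  intervalMs: number,
): Promise<string> {
  for (;;) {
    const order = await checkForOrders();
    if (order !== null) return order;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```

Every "Nope." is a wasted round trip, and the interval is a built-in floor on latency.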

Our Solution: Service Bus Push

Order created in D365
   ↓
Azure Service Bus receives event
   ↓
ASB Listener gets notified INSTANTLY
   ↓
Backend API processes and forwards to bot
   ↓
Bot starts working in under 1 second

No waiting. No polling. The moment an event hits the queue, our listener knows about it.

How It Works

We built a dedicated ASB Listener service using TypeScript and the Azure Service Bus SDK. It sits there with an open subscription to the queue.

The listener uses ServiceBusClient to connect to Azure, creates a ServiceBusReceiver that subscribes to the queue with peekLock mode, and processes messages the instant they arrive.

receiver.subscribe({
  processMessage: async (message) => {
    // Forward the order payload to our backend the moment it arrives.
    await forwardToBackend(message.body);
  },
  // The SDK requires an error handler alongside processMessage.
  processError: async (args) => {
    console.error(args.error);
  },
});

When a message arrives, the listener parses the order data, forwards it to the backend API via HTTP, and the backend triggers the Minecraft bot. All of this happens in real time.
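The parsing step is a good place to be defensive. A minimal sketch, assuming a simple message body shape (`parseOrderMessage` and its fields are illustrative, not our exact schema):

```typescript
// Assumed shape of the Service Bus message body for a production order
// (illustrative, not our exact schema).
interface ProductionOrderMessage {
  orderId: string;
  item: string;
  quantity: number;
}

// Validate the raw message body before forwarding it to the backend,
// so malformed events fail fast instead of confusing the bot.
function parseOrderMessage(body: unknown): ProductionOrderMessage {
  const o = (body ?? {}) as Record<string, unknown>;
  const { orderId, item, quantity } = o;
  if (
    typeof orderId !== "string" ||
    typeof item !== "string" ||
    typeof quantity !== "number"
  ) {
    throw new Error("Malformed production order message");
  }
  return { orderId, item, quantity };
}
```

Anything that fails validation can be dead-lettered by the Service Bus machinery instead of silently breaking the bot.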

Why Service Bus

We could have used HTTP webhooks, but Service Bus gives us guaranteed delivery, built-in retry and dead-letter queues, auto-scaling, decoupling between producer and consumer, and native Azure integration.

The flow is clean:

Production Order → Service Bus Queue → ASB Listener → Backend API → Minecraft Bot

No database polling. No scheduled jobs. Just pure event-driven architecture.

____

Cepheo Crafting Creepers