It’s a wrap — Final delivery at ACDC 2026 🚀

This is our final contribution for the ACDC 2026 Hackathon: our final delivery.
We made a video of the end-to-end solution. Enjoy!

The image below describes the overall design of the solution.

🧱 Redstone Realm

Showing the Jetson Nano edge AI LLM providing guidance to the customer off-grid.
  • Built a real, working solution while actively exploring new Microsoft platform capabilities
  • Used in-box AI such as Prompt Columns and Copilot to embed AI directly into the data model and user experience
  • Grounded AI output in structured data to keep interactions predictable and explainable
  • Used Code Apps to experiment with new ways of building user-friendly experiences and validate ideas quickly
  • Experimented with Edge AI using an NVIDIA Jetson Nano to run LLMs closer to execution
  • Explored trade-offs between edge-based and cloud-based AI through hands-on experimentation

Redstone Realm, for us, was about building, testing, learning — and pushing understanding forward using real tools on real platforms.

Relevant blogpost:
Existential Risk: intelligence without agency, Nvidia Jetson Nano, Glossy Pixels | Arctic Cloud Developer Challenge Submissions

🛡️ Governance & Best Practices

  • Stored all secrets in Azure Key Vault, accessed at runtime by Azure Functions and Power Automate via environment variables
  • Used a clear DEV / TEST / PROD environment strategy with a structured ALM setup for predictable deployments
  • Maintained clear architectural separation between UI, integration, and execution, with deterministic and testable backend logic
  • Applied consistent naming conventions across fields, flows, and assets
  • Used a medallion data structure (raw, refined, curated) to ensure data quality and traceability
  • Used Copilot as an assistive, explanatory layer — not an autonomous decision-maker
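
As a sketch of the secret-handling point above: runtime code only ever reads configuration, never a hard-coded value. In Azure Functions the app setting is a Key Vault reference surfaced as an environment variable; the setting name below is a hypothetical example, not our actual configuration.

```typescript
// Sketch: resolve a secret from configuration at runtime rather than embedding it.
// "RCON_PASSWORD" is a hypothetical setting name used for illustration.
function getRequiredSetting(
  env: Record<string, string | undefined>,
  name: string,
): string {
  const value = env[name];
  if (!value) {
    // Fail fast on misconfiguration instead of continuing with an empty secret.
    throw new Error(`Missing required setting: ${name}`);
  }
  return value;
}

// At runtime this would be called as: getRequiredSetting(process.env, "RCON_PASSWORD")
```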

Governance was built in from the start to ensure the solution is secure, maintainable, and trustworthy beyond the hackathon.

Relevant blogposts: ALM implemented

🧠 Data, AI & Analytics

Even in a hackathon setting, we designed with structure and responsibility in mind.

From raw blocks to blazing insights: use Microsoft Fabric to take messy data through a structured refinement process, model it into trusted semantic layers, unlock visual storytelling with Power BI, and build a foundation with Fabric IQ that helps both AI agents and data scientists uncover the real value in your datasets. If something doesn’t add value, keep polishing until it sparkles! 💎

Relevant blogposts: From raw blocks to data diamonds

⚡ Low-Code

  • Used low-code to move fast while keeping structure and maintainability intact
  • Built a back-office Model-Driven App for governance, search, and operational overview
  • Used Prompt Columns to embed AI directly in the data model and enable predictable Copilot behavior
  • Leveraged new Power Platform capabilities to deliver advanced functionality quickly and securely
  • Established an analytics foundation using Microsoft Fabric with a medallion architecture (raw, refined, curated)

Relevant blogposts: OneFlow and LINK Mobility Sponsor Badge and more, Go With The Flow | Arctic Cloud Developer Challenge Submissions


🧑‍💻 Code Connoisseur

  • Built a Code App using vibe coding, outside traditional Model-Driven and Canvas patterns
  • Implemented Azure Functions in C# (.NET 8) with RCON API integration to a Minecraft server
  • Ensured backend logic is deterministic, testable, and decoupled from UI and AI
  • Experimented with Edge AI using NVIDIA Jetson Nano, leveraging Linux shell tooling and low-level configuration
  • Explored trade-offs between edge-based LLMs and cloud-hosted AI services
  • Kept business logic in code, with clear separation between experience, AI, and execution

Code was used deliberately — where control and predictability mattered most.

Relevant blogposts: OneFlow and LINK Mobility Sponsor Badge and more

🌍 Digital Transformation

  • Built a solution that starts from user intent rather than technical specifications
  • Transformed intent into structured data that can be reasoned about, adjusted, and reused
  • Used AI and Copilot to explain consequences and trade-offs instead of automating decisions
  • Connected business logic, data, and visualization into a continuous feedback loop
  • Used Minecraft as a visualization engine to make outcomes tangible and easy to understand
  • Demonstrated how low-code, pro-code, and AI can work together to support better decisions

The transformation was not just technical — it changed how users understand and act on complex decisions.

Relevant blogposts: OneFlow and LINK Mobility Sponsor Badge and more, NASTY! If it doesn’t work, expose it to the world | Arctic Cloud Developer Challenge Submissions

    NASTY! If it doesn’t work, expose it to the world

    Our original idea was to use Mineflayer with an MCP server to automate builds in Minecraft. But we had issues connecting this to online servers and making it efficient enough to be justified. We were successful in connecting to local instances, but we wanted our solution to be fully in the cloud.

    As a workaround, we instead use RCON to remotely connect to a live server’s console. From there we can easily send AI-generated commands based on user input to very quickly create builds on a live server. All this despite the RCON wiki stating:
    “It is dangerous to expose RCON ports to the internet. RCON is not encrypted and can be a subject to man-in-the-middle attacks.”

    We gather data from user input:

    Run a prompt to generate commands using the new (preview) prompt columns in Dataverse.

    And run those commands via a Power Automate flow that triggers an Azure function that connects to the live server using RCON and passes along the commands:

    The big benefit is that we don’t need to send a virtual player into Minecraft to build the structures. RCON directly updates the Minecraft data, which means a build can take seconds instead of minutes, and, again, the solution operates fully in the cloud.
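
For the curious, the RCON wire protocol itself is tiny. Here is a minimal packet encoder sketched in TypeScript for illustration; our actual integration lives in the C# Azure Function, so treat this as a protocol sketch rather than our implementation.

```typescript
// Minimal Source RCON packet encoder (illustrative sketch).
// Wire format: int32 size (little-endian), int32 request id, int32 type,
// ASCII body, then two null terminator bytes.
function buildRconPacket(id: number, type: number, body: string): Uint8Array {
  const bodyBytes = new TextEncoder().encode(body);
  const size = 4 + 4 + bodyBytes.length + 2; // id + type + body + terminators
  const packet = new Uint8Array(4 + size);
  const view = new DataView(packet.buffer);
  view.setInt32(0, size, true); // length prefix (excludes itself)
  view.setInt32(4, id, true);   // request id, echoed back by the server
  view.setInt32(8, type, true); // 3 = SERVERDATA_AUTH, 2 = SERVERDATA_EXECCOMMAND
  packet.set(bodyBytes, 12);    // body; the final two bytes stay 0x00
  return packet;
}
```

Over a single TCP connection you would first authenticate with a type-3 packet carrying the RCON password, then send each command as a type-2 packet.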

    Nasty hacks? Absolutely. We are hardening the process by using non-standard ports and limiting inbound connections to IPs from our own machines and the Power Platform. Even then, additional work is needed to fully encrypt and protect our Minecraft environments.

    Our goal was to get running software, which was achieved. Connecting the Power Platform to Minecraft turned out to be much more challenging than first believed.

    We feel dirty, nasty, and deserving of the Nasty Hack badge.

    DOH!

    Nothing like spinning your wheels for *hours* trying to diagnose networking issues.

    There is a lot of great community content on connecting Minecraft *locally* to local MCP servers, running Node.js applications using the Mineflayer APIs.

    The information gets a lot more cryptic when you try to connect to cloud-based services and the Minecraft server edition.

    After spending hours trying to diagnose connection issues (and not even sure what I was trying to achieve was even technically possible), I really felt the wheels spin.

    Why would it work one way and not another?

    Even to the point I had to dig deep in the toolbox for old tools like Telnet!

    It turns out a minor case of finger dyslexia was the cause of the problem.

    IP address 24.5.6.7 is not the same as 20.5.6.7…

    1 digit, 20, NOT 24… hours lost. tears shed. swears muttered.

    Once that was resolved I was unstuck and back to making progress, only a few hours behind.

    D’oh!

    From raw blocks to data diamonds

    Our road from raw blocks to data diamonds

    The Medallion Architecture

    The medallion architecture is a data organization pattern that structures your data platform into three distinct layers, each representing a different level of data quality and refinement.

    Think of it like refining raw ore into jewelry: you start with rough material and progressively transform it into something valuable and ready to use.

    Why Use It?

    It creates a clear, logical flow for data processing. Each layer has a single responsibility, making the system easier to understand, debug, and maintain. You always know where to find data at any stage of its journey.

    The Three Layers

    Bronze — The raw landing zone. Data arrives here exactly as it came from source systems, unchanged and unvalidated. It’s your safety net and audit trail.

    Silver — The cleaned and conformed layer. Data is validated, deduplicated, and standardized here. Think of it as your “single source of truth” where business rules are applied.

    Gold — The business-ready layer. Data is aggregated, enriched, and shaped for specific use cases like reports, dashboards, or machine learning models. This is what end users consume.
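
The same shaping steps can be illustrated as plain functions over records. This is a hedged sketch of the pattern, not our actual Fabric dataflow; the field names are hypothetical, not our real Dataverse schema.

```typescript
// Bronze rows arrive raw: nullable, unvalidated, source-shaped.
interface BronzeRow { name: string | null; budget: number | null; estimatedCost: number | null; }
// Silver rows are cleaned and conformed.
interface SilverRow { name: string; budget: number; estimatedCost: number; }

// Silver: validate, drop incomplete rows, standardize values.
function toSilver(rows: BronzeRow[]): SilverRow[] {
  return rows
    .filter(r => r.name !== null)
    .map(r => ({
      name: r.name as string,
      budget: r.budget ?? 0,
      estimatedCost: r.estimatedCost ?? 0,
    }));
}

// Gold: add the business measure that reports and dashboards consume.
function toGold(rows: SilverRow[]): (SilverRow & { variance: number })[] {
  return rows.map(r => ({ ...r, variance: r.budget - r.estimatedCost }));
}
```

Each layer has one responsibility, which is exactly what makes the pipeline easy to debug: a bad number in Gold can be traced back through Silver to the untouched Bronze copy.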

    Let’s get started

    First things first, let’s create our Workspaces:

    • ACDC 2026 Dev
    • ACDC 2026 Test
    • ACDC 2026 Production

    Let’s start by creating a lakehouse for our Bronze layer.

    New Lake for the Bronze 🥉

    Then we create a Dataflow Gen2 for retrieving data from Dataverse into our bronze lake. Creating the connection to Dataverse from Fabric using Dataflow Gen2:

    Selecting the tables we want to report on. In our case it is the Dream Project (basstards_dreamproject) table.

    We are adding this raw data to our bronze data lake (after creating it doh).

    Adding the data to the Bronze lake:

    Choosing the destination to the bronze db.

    Use the settings as is with automatic mapping. Works for now:

    Save the settings and we are retrieving the data and adding it to our bronze lake as raw data. Now the data is in our Bronze lake 🤓

    Let’s go further on to the silver medal. Creating a new lakehouse for the silver layer:

    Next, create a new Dataflow connected to our bronze lake, where we transform the data and then update the silver lake:

    Finding our bronze lake

    Finding our Lakehouse

    And connecting to the bronze lake:

    Choosing the data we want to work with:

    Now, the transformation begins:

    • Removing unwanted columns
    • Cleaning data
    • Renaming fields
    • Etc.

    Removing some unwanted columns and renaming fields to make the data cleaner for the silver layer

    Then adding the changes to the silver lake

    Creating a connection for the silver lake

    Then choosing the destination of the transformed table

    Then using automapping, which does the magic for us, and saving the settings.

    If we go to our ACDC_Silver lake we should see the data updated (after the Dataflow Gen2 has run). For now we refresh it manually. You need to click Save and Run, obviously. Oops…

    Now the silver lake is updated with the transformed columns

    Now lets move on to the gold layer:

    And again we need to create the lake for the gold layer

    Then a new Dataflow Gen 2 for the last transformations for the gold layer

    Then we need to connect to the silver lake like we did for the “FromBronzeToSilver” dataflow

    And finding the lakehouse source

    Next connecting to the silver lake

    Selecting the Silver lake data and clicking Create:

    Now we retrieve the data from the Silver table

    Now we have the same data from the silver lake

    Now let’s add some aggregated data or business rules to the dataset that can be used in dashboards, reports, or other subscribing systems.

    For this example let’s just create a column that shows the difference between the budget and the estimated cost. And for the fun of it, let’s see what the AI Prompt can help us with:

    It looks like we need to connect to Fabric AI:

    Oops… that didn’t work…

    Let’s do it another way.

    Now we have a new column with the variance between the budget and the estimated cost. It contains null values, which we can clean up by replacing them with 0, as seen in the next steps.

    By replacing null values with zeros it looks a bit cleaner.

    After replacing:

    And now we want to move this to the gold layer.

    Like this:

    Now adding the data to the gold lake destination

    Using automatic settings to the destination again and then saving the settings:

    Next we need to Save and Run the Dataflow Gen2, as we learned from the FromBronzeToSilver 🤓

    After the Dataflow is saved and run, we should see the data in the Gold lake. And look at that, it actually worked. Woohoo! Now we have our Gold medallion ready, or the diamond data we want 💎

    Then creating a new semantic model for use in Power BI

    Now, we have a semantic model:

    Let’s try to make a report out of it

    And then we use the semantic model we just created

    And then we select the semantic model based on our gold lake:

    Aaaaaaaaand 🥁 There we have a report📈

    And we’re out of time this year…

    I hope this gave deep insight into how we created a very simple data platform based on the medallion structure, with examples of each step from start to finish.

    That was the end of this post. I hope this warms the diamond heart (💎) of Catherine Wilhelmsen, and I hope you give us plenty of points in this category. Best regards, Fredrik Engseth 🫶

    Existential Risk: intelligence without agency

    For the Existential Risk badge, we focused on a clear boundary: using generative capabilities without granting autonomy.

    Our solution is built entirely on out-of-the-box features. There is no custom model training, no goal-seeking behavior, and no self-directed decision-making. The system transforms human intent into structured instructions, and nothing more.

    The process starts with a Generative Description, which produces a natural-language description of a house and enriches it with explicitly required attributes such as size, materials, and layout constraints. This step does not infer intent or optimize outcomes—it provides structured context.

    That output is then passed to Generative Instructions, where the description and attributes are converted into a strict, parsable JSON array. The prompt performs a controlled transformation, designed for predictability rather than creativity.

    From there, the flow is purely mechanical.
    The JSON array is sent to Power Automate, which iterates over the commands and forwards them to an Azure Function. The function acts as a thin integration layer, relaying the instructions to Minecraft for execution exactly as provided.

    By design, this approach mitigates existential risk. The system never holds goals, evaluates results, or adapts its behavior. Intelligence is limited to composition and formatting, while agency remains firmly with the human.
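
To make that boundary concrete, the forwarded payload can be validated as nothing more than a flat array of command strings before anything reaches the server. This is an illustrative guard sketch, not our production validation, which lives in the flow and the function.

```typescript
// Guard sketch: accept only a flat JSON array of slash-command strings,
// dropping any stray prose the model may have emitted alongside them.
function parseCommandArray(raw: string): string[] {
  const parsed: unknown = JSON.parse(raw);
  if (!Array.isArray(parsed)) {
    throw new Error("Expected a JSON array of command strings");
  }
  return parsed
    .filter((c): c is string => typeof c === "string")
    .filter(c => c.startsWith("/"));
}
```

Because the guard only filters and never rewrites, the system still cannot originate behavior: everything executed traces back to a string the generation step produced from human intent.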

    Generative Instructions: this prompt takes additional attributes and formats them into a parsable JSON array that we send off to a Power Automate flow, which passes the requests to an Azure Function connected to Minecraft.

    The first prompt column (Generated Description) generates the general description of the house.

    OneFlow and LINK Mobility Sponsor Badge and more

    We hereby claim the badges:

    • Stairway to Heaven
    • Thieving Bastards,
    • Power User Love
    • OneFlow Sponsor Badge
    • LINK Mobility Sponsor Badge

    The components in the current solution:

    We have created a solution containing both pro code and low code, AI, Microsoft APIs and third party tools.

    • Code App for collecting Dream Project requests
    • Power Automate that uses Dataverse, OneFlow, AI Builder, Outlook and Link Mobility actions
    • Model-driven app as a back-office app for project follow-up. A perfect low-code app for managing dream project requests.
    • OneFlow Portal for managing contracts

    A walkthrough of the solution

    The Code App is vibe coded using TypeScript and React and is used for collecting Dream Project requests. This creates a Dataverse record that triggers a Power Automate flow that creates a contract in OneFlow. Pure pro code as far as the eye can see 👀
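
As a hedged sketch of that first step, the Code App can create the record through the Dataverse Web API. The entity set name and column logical names below are assumptions for illustration, not our actual schema.

```typescript
// Sketch of the Code App's create call against the Dataverse Web API.
// Entity set "basstards_dreamprojects" and the column names are hypothetical.
function buildCreateDreamProjectRequest(orgUrl: string, name: string, budget: number) {
  return {
    url: `${orgUrl}/api/data/v9.2/basstards_dreamprojects`,
    method: "POST" as const,
    headers: {
      "Content-Type": "application/json",
      "OData-Version": "4.0",
      "OData-MaxVersion": "4.0",
    },
    // The created row is what triggers the downstream Power Automate flow.
    body: JSON.stringify({ basstards_name: name, basstards_budget: budget }),
  };
}
```

In the app this request would be sent with `fetch` and a bearer token from the signed-in user's session.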

    Further, we created a Power Automate flow that is triggered when a new Dream Project is created. It automates our contract process by using OneFlow’s actions to create the contract and add the customer to it. LINK Mobility and Outlook actions are used to send an SMS and an email to the responsible customer success manager for quick follow-up and contract validation before sending. This is a perfect combination of pro code and low code.

    The flow looks like this.

    The AI Prompt in Power Automate looks like this, and this is what is sent to the AI Prompt.

    The “Run a prompt” action sends the data to the prompt below and gets the outline for the contract details back. And, we are using this in the contract generation.

    We used Template Groups in OneFlow to populate documents from Power Automate.

    The fields are created in the OneFlow portal. These must be created and added to the template before they are available in the Power Automate flow.

    The fields are added to the Template in OneFlow like this.

    After sending the contract out from the OneFlow portal, the customer gets an email so they can sign.

    The contracts look like this after being generated by the Power Automate flow. The description is generated using AI Prompts fed with data from the dream project request in Dataverse.

    LINK Mobility

    We are using the LINK Mobility action in Power Automate, which lets us send SMS with ease using a source number, platform ID, partner ID, and the phone number to send the SMS to. This works like a charm.

    The SMS that is sent looks like this:

    Business value:
    These tools help us stitch our solution together and improve the value for our customers by providing faster contract creation and automation, plus SMS notifications for quick follow-up by the customer success manager.

    Claims:

    Stairway to Heaven

    In our contract generation flow we are using three Microsoft APIs to solve our business need:

    • Dataverse API
    • AI Prompt API
    • Outlook API

    Thieving bastards

    Using OneFlow and LINK Mobility as paid third-party solutions to improve our business process, making use of existing tools in the market that are developed by market experts and are continuously updated and improved.

    OneFlow

    Using OneFlow for contract generation and posted on LinkedIn

    LINK Mobility

    Using LINK Mobility for notifications to customer success managers, with more SMS notifications planned in an upcoming sprint. And posted on LinkedIn

    Power User Love

    We combined the power of pro code, for a more customizable look, feel, and usability when collecting data, with low-code apps for back-office project follow-up, which offer a more fixed look and feel built with drag and drop.

    Code App using TypeScript and React for more control over the user interface and interactivity:

    Using a Model-driven app as a back office with a more “strict” look and feel, but perfect for following up and adding data.

    Proof of LinkedIn post

    https://www.linkedin.com/posts/fredrikengseth_oneflow-and-link-mobility-sponsor-badge-and-activity-7420762390587166720-9Gl4?utm_source=share&utm_medium=member_desktop&rcm=ACoAACAQyBoBq-xtaRrmS1pVkkIip0jNA_TMbIo

    ALM implemented

    We have implemented ALM to deploy solutions between environments using GitHub Actions. We have three workflows:
    – Export solutions PR: Exports chosen solutions from dev and creates a PR to main.

    – Deploy Solutions (Test): Triggers manually or on merge to main. On merge, it decides which solutions to deploy based on the contents of the PR.

    – Deploy Solutions (Prod): Runs manually on selected solutions and deploys to Prod.

    For the Power Of The Shell badge, we’ve leveraged PowerShell to determine the deploy package based on the triggering PR. This script reads the contents of the last PR, decides which solutions have changed, and packages and deploys them based on this information:
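
The core of that script is simple selection logic. Expressed in TypeScript for illustration (the actual script is PowerShell, and the `solutions/<Name>/...` folder layout is an assumption), it boils down to:

```typescript
// Derive which solution folders were touched by the merged PR.
// Assumes exported solutions live under solutions/<SolutionName>/...
function changedSolutions(changedFiles: string[]): string[] {
  const names = changedFiles
    .filter(f => f.startsWith("solutions/"))
    .map(f => f.split("/")[1])
    .filter(n => n !== undefined && n.length > 0);
  // Deduplicate: many changed files map to one solution to deploy.
  return Array.from(new Set(names));
}
```

The workflow then packs and deploys only the solutions in the returned list, keeping deployments small and predictable.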

    Retro badge


    When things don’t behave the way we expect, we tend to step backwards in time. Pen and paper on the table. Telnet in the terminal. No fancy tooling—just enough to answer the most basic question first: is the server actually there?

    Paper didn’t help us auto-align boxes. It didn’t fix spelling mistakes. It didn’t make anything look pretty.
    What it did do was slow us down in the right way.

    With pen and paper, we talked more. We argued a little. We erased things. We redrew them. We spent time thinking about what we were building instead of how well the boxes lined up. The lack of undo forced intent. Every line had a cost, and that cost made decisions visible.

    Telnet played the same role. No abstractions, no helpers—just a raw connection attempt. Either it responds or it doesn’t. Brutally honest feedback beats elegant dashboards when you’re lost.

    This wasn’t about being anti-tools or romanticizing the past. It was about stripping away convenience until understanding had no place to hide. Once the thinking was clear, the modern tools became useful again.

    Retro isn’t old. Retro is intentional.

    Remarkable Team Spirit

    One of our teammates had to step out mid-challenge. That’s part of real projects, and definitely part of real teams.

    A replacement stepped in, context was transferred fast, and the team adjusted without drama. Now we’re shifting gears for the final iteration—focused, aligned, and still enjoying the ride.

    And yes. We’re also staying hydrated. Team rituals matter.

    Final stretch. Same energy. 🚀

    Linkedin Post

    Nvidia Jetson Nano

    A big part of ACDC is experimenting with new ideas and technologies, even when the outcome is uncertain. As part of that, we set up an Nvidia Jetson Nano that may—or may not—end up as part of our final implementation.

    What makes this device interesting is that it is an edge AI computing unit with very low power consumption (around 15–25W), yet still capable of running a local LLM.

    We experimented with running Ollama using Gemma 3, exploring how a small, embedded model could be used in off-grid or low-connectivity scenarios. One potential use case is a locally deployed language model that interprets environmental data from sensors—such as soil conditions, temperature, or image recognition—without relying on cloud services.
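
Talking to the on-device model is a plain HTTP call against Ollama's local REST endpoint. This is a sketch of the request shape under our assumptions: the default Ollama port and a `gemma3` model tag matching our Jetson setup.

```typescript
// Request shape for Ollama's local generate endpoint (POST /api/generate).
// Host, port, and the "gemma3" model tag are assumptions from our setup.
function buildOllamaRequest(prompt: string, model = "gemma3") {
  return {
    url: "http://localhost:11434/api/generate",
    method: "POST" as const,
    // stream: false asks for one complete JSON response instead of chunks.
    body: JSON.stringify({ model, prompt, stream: false }),
  };
}
```

On the device this request would be sent with `fetch`, and the generated text read from the `response` field of the returned JSON.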

    These boards are also used in robotics applications and could potentially be the heart of our house-building robot in the real world.


    The concrete use cases are still evolving, but the goal is clear:
    to enable intelligent behavior at the edge, where power, connectivity, and infrastructure are limited.

    It takes around 15 minutes to half an hour to resolve these prompts, so we need patient customers…