Final Submission – OUR JOURNEY AT ACDC 2026

Problem statement: We came to the ACDC Hackathon 2026 with the ambition to solve a supply-chain problem that affects everyone involved. Global spare-parts supply chains are slow, fragile, carbon-heavy, and often fail when parts are obsolete or manufacturers no longer exist.

Original Solution we wanted to build: CraftPortal replaces shipping with digital “crafting,” using AI and cloud technology to match, recreate, and produce parts locally – fast, resilient, and sustainable. It was, however, an idea framed from our own perspective, and we planned to refine the concept during the hackathon.

Evolution Journey: and then came the part we love the ACDC Hackathon for.

– “Put the customer in focus,” said Sara, our beloved judge in the Digital Transformation category, on the opening day.

So we enriched our web application solution with an ISV package, giving our potential customers a UI and UX they already recognize: M365, Power Platform, and BizApps.

– “I love your futuristic concept…” said Mikael from the Redstone Realm perspective, inspiring us to combine Microsoft 365 / Dynamics 365, SharePoint, Teams, and Azure.

“A simple code screenshot is not enough – show me how it solves your problem,” noted Keith (Code Connoisseur), so we challenged the status quo and set out to generate a bigger impact.

“Turn data into insights…,” reminded Cathrine from Data, AI and Analytics, so we revised our data model to build a solid foundation for our solution, started mapping external data sources, and were motivated to explore RAG.

“Everything you have there has to be there for a strong reason,” Fredrik warned us early on from the Low-Code angle, so we critically reviewed our ecosystem and kept a strict focus on the power of low code.

“No security holes,” declared Scott on Governance and Best Practices – and you don’t mess with Scott. No fluff: we needed proper, best-practice-focused ALM and governance for the whole application.

As a result of continuous brainstorming and dialogue during these three days, our refined solution started to look like this… scroll down :)

DIGITAL TRANSFORMATION

The Concept

Supply chains are slow, fragmented, and carbon-heavy. Parts ship across the world when they could be printed locally. CraftPortal changes this – a marketplace where recipes travel through the portal, parts get made nearby. Faster. Greener. Smarter.

Customer needs a part. Can they print it? Yes – browse marketplace, select blueprint from IP Owner, print, deliver. No – publish a tender, receive bids from Manufacturers, select, award, contract, they print, deliver. Two paths. Same portal.

We built two paths to CraftPortal. A SaaS web application for users who want to jump straight in. And an ISV package for customers who want CraftPortal wired into their Microsoft 365 environment.

Users choose: Web App or Power Apps:

Customers who prefer a Model-Driven App get a clean, familiar UI – the Power Apps experience they already know, tuned for digital inventory workflows:

But they can surely use our fancy web app. Vendors – IP Owners and Manufacturers – use the Portal interface. They browse public tenders, submit bids, upload recipes, manage contracts, track orders. All through Power Pages.

Value & Monetization

CraftPortal sits in the middle of every transaction. Recipe rented? We’re there. Part printed? We’re there. That’s the value.

Monetization options:

  • Subscription – monthly/annual access to the platform
  • Transaction-based – percentage per recipe rental, per tender, per print job

Or both. Base subscription for access, transaction fee for volume.

LOW-CODE

Low-Code: The Redstone Behind CraftPortal

We built CraftPortal in 3 days. A marketplace. Tender flows. Vendor management. Document automation. AI agents and a Power Pages portal. How? Low-code.

The Building Blocks are as follows:

Power Pages for the Portal. Model Driven Apps for back office. Power Automate for every flow. Copilot Studio for autonomous agents. Dataverse for data. Generative Pages for dashboards.

We wired it together with clicks, not code.

The Low-Code Highlights

  • Autonomous agents – Copilot Studio detects Dataverse changes, posts to Teams, triggers RPA
  • RPA integration – Power Automate Desktop opens Bambu Lab Studio and clicks Print.
  • Teams + SharePoint + Dataverse – Fully automated channel and document location creation via Power Automate
  • Generative Pages – KPI dashboard pulling live Dataverse data
  • OneFlow contracts – Power Automate creates and sends contracts for signature
  • Link Mobility SMS – Automated bid notifications
  • Custom Connectors – ISV package API integration
  • FetchXML Builder – Low-code query generation (we click buttons, it writes XML)
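As an illustration of the kind of query FetchXML Builder writes for us, here is a hypothetical example – the entity and attribute names are assumptions for illustration, not our actual Dataverse schema:

```python
# Hypothetical FetchXML of the kind FetchXML Builder generates with clicks:
# all open tenders joined with their bids. Entity/attribute names are assumed.
import xml.etree.ElementTree as ET

fetch_xml = """
<fetch top="50">
  <entity name="cr_tender">
    <attribute name="cr_name" />
    <attribute name="statuscode" />
    <filter>
      <condition attribute="statuscode" operator="eq" value="1" />
    </filter>
    <link-entity name="cr_bid" from="cr_tenderid" to="cr_tenderid">
      <attribute name="cr_amount" />
    </link-entity>
  </entity>
</fetch>
"""

# Sanity-check that the query is well-formed XML before sending it to Dataverse.
root = ET.fromstring(fetch_xml.strip())
print(root.tag, root.find("entity").get("name"))
```

The point of the tool is exactly this: we click buttons, and it emits well-formed XML like the above.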

Deep Dives

Want the details? We documented everything here:

The Result

A full digital inventory platform. Built by a small team. In 3 days. Low-code made it possible.

CODE CONNOISSEUR

Low-code gets you far. But sometimes you hit a wall – a custom UI that doesn’t exist, real-time updates that Power Automate can’t handle, or an API that needs to be built from scratch. That’s when we switch gears. Pro-code fills the gaps.

We have various code projects and components in our solution:

  1. PCF Control – lets customers order the appropriate equipment to be printed. Client Side Salsa | Arctic Cloud Developer Challenge Submissions
  2. Power Pages Portal – manages turning basic Minecraft resources into different tools, helps clients find an appropriate vendor for printing, and adapts to all devices and screen sizes. Chameleon | Arctic Cloud Developer Challenge Submissions
  3. Web Portal for the 3D printer – a custom local web portal through which we manage our printer device. Crafting, Crafting, Crafting…. Category: Pro Code | Arctic Cloud Developer Challenge Submissions
  4. Model builder app – helps customers recover lost recipes from a single shot. Right now – bring real-time data to the app | Arctic Cloud Developer Challenge Submissions
  5. Azure Function – used to communicate with the external vendor API. ISV Package – the missing link | Arctic Cloud Developer Challenge Submissions

DATA, AI, ANALYTICS

The Dashboard

Our CraftPortal KPI Dashboard brings it all together. Built with Generative Pages and React, pulling live data from Dataverse:

  • Summary cards – Total Projects, Open Projects, Total Bids, Wandering Traders
  • Project Status Distribution – Donut chart showing lifecycle states
  • Bid Conversion – Submitted vs selected bids
  • Win Rate by Trader – Performance leaderboard
  • Projects per Month – Trend analysis over time
  • Top Wandering Traders – Gamified rankings

Light mode. Dark mode. Minecraft item icons from the official API. Business intelligence with a blocky twist.

We built the foundation. Dataverse as our core. Power BI dashboards for KPIs – tender status, bid conversion, vendor performance, projects per month. Live telemetry streaming from our IoT-connected Crafting Tables via Azure IoT Hub. Real-time monitoring of print jobs, temperatures, and device health.

Last year we went deep on Microsoft Fabric – Medallion architecture, Data Activator triggers, the whole pipeline. We didn’t want to repeat ourselves.

The Vision

The vision was fun: a custom knowledge-based Copilot powered by Fabric. Pull external data from the official Minecraft API (GitHub – PrismarineJS/minecraft-data) via Azure Data Factory. Deploy a proper RAG pipeline – chunking strategies, metadata filtering, semantic search, hybrid search, custom retrievers. Do RAG the right way.
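The pipeline itself stayed on the roadmap, but the first step – chunking – could be sketched in a few lines. This is a hypothetical fixed-size strategy with overlap, not an implementation we built:

```python
# Hypothetical first chunking strategy for a RAG pipeline: fixed-size chunks
# with overlap, so context at chunk boundaries is not lost. Sketch only.
def chunk_text(text: str, size: int = 40, overlap: int = 10) -> list[str]:
    """Split text into overlapping chunks ready for embedding/retrieval."""
    if size <= overlap:
        raise ValueError("size must be larger than overlap")
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap  # step forward, keeping `overlap` chars shared
    return chunks

doc = "Minecraft recipes describe which materials combine into which item."
pieces = chunk_text(doc, size=30, overlap=5)
print(len(pieces), all(len(p) <= 30 for p in pieces))
```

Real chunking would split on semantic boundaries and attach metadata for filtering, but the shape of the problem is the same.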

Unfortunately, we barely finished the data platform in time. The RAG adventure stays on the roadmap.

Sometimes three days isn’t enough. But the foundation is solid. The diamonds are waiting to be mined.

GOVERNANCE & BEST PRACTICES

Essence

Our goal during the hackathon was to show a complex, end-to-end implementation of the project from many different angles.

Even as the industry focus shifts to AI-related topics, enabling existing services for an LLM still requires an advanced level of solution design.

That is why we brought in advanced technologies such as Azure Local, Azure Lighthouse, and IoT Hub.

At the same time, complex solutions usually require more implementation effort. By following the best practices for each piece of that puzzle, we increase the overall success rate of the delivery. ISV Package – the missing link | Arctic Cloud Developer Challenge Submissions

REDSTONE REALM

We created our business solution using Microsoft technologies – Azure OpenAI, Azure DevOps, LLMs, Azure Functions, Outlook, and Microsoft Teams. Check out the article ISV Package – the missing link | Arctic Cloud Developer Challenge Submissions

By deliberately meeting and exceeding every requirement across Digital Transformation, Low-Code, Pro-Code, Data & AI, Redstone Realm, and Governance & Best Practices – while continuously refining our solution through your direct feedback – we believe CraftPortal represents the complete ACDC vision, and we thank the judges for challenging us, guiding us, and inspiring us to build something truly worthy of this win.

Thank you for ACDC 2026!

With love, team LogiQraft.

PS! We really wanted to share a PREMIERE of our final movie with you, before the official release so here you go https://www.youtube.com/watch?v=KV2p3FNTLNI

Event-Driven Autonomous Agent Solution

The challenge described in our blog post – Who is the King of Integrations? Or what? (Add link?) – was that existing integration libraries for connecting to the Bambu Lab local API provided only very limited, not fully optimized functionality for interacting with our 3D printer.

Here is our second game plan using low-code tools, where we turned to the RPA capabilities of Power Automate. This also matters because, as a solution provider, LogiQraft wants multiple options available for customers who decide to use our app but might run into networking or other issues.

We combined Copilot Studio autonomous agent with Power Automate Desktop. When a user updates a recipe status to “Ready to Print” in the Model-Driven App, the autonomous agent detects the change in Dataverse, posts a notification to Teams, and triggers the RPA flow. The RPA then opens Bambu Lab Studio and clicks Print. The physical printer starts the job.
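The decision the agent makes can be sketched as plain logic. The real flow lives in Copilot Studio and Power Automate; the field names, statuses, and action names below are illustrative assumptions:

```python
# Sketch of the event-driven agent logic: when a recipe's status changes to
# "Ready to Print", notify Teams and trigger the RPA flow. All names here
# (fields, statuses, flow names) are assumptions for illustration.

READY_TO_PRINT = "Ready to Print"

def handle_recipe_change(event: dict) -> list[dict]:
    """Given a Dataverse-style change event, return the actions to run."""
    if event.get("entity") != "recipe":
        return []
    if event.get("new_status") != READY_TO_PRINT:
        return []
    recipe = event.get("recipe_name", "unknown recipe")
    return [
        {"action": "post_teams_message",
         "message": f"Recipe '{recipe}' is ready to print."},
        {"action": "trigger_rpa_flow",
         "flow": "OpenBambuStudioAndPrint",
         "recipe": recipe},
    ]

actions = handle_recipe_change(
    {"entity": "recipe", "recipe_name": "Diamond Sword Hilt",
     "new_status": READY_TO_PRINT})
print([a["action"] for a in actions])
```

Any other status change falls through with no actions, which is exactly how the autonomous agent stays quiet until the user flips the switch.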

ISV Package – the missing link

We’ve shown you the concept. The roles. The flows. The tech stack. Now let’s talk about how this actually lands in enterprise.

The simple Steve logs into an external portal. Rents recipes. Publishes tenders for printing by manufacturing vendors. Monitors production and delivery status. All from the portal interface.

Works great.

But enterprise Steve, the one working at Equinor, IKEA, or Siemens – doesn’t just browse marketplaces. Enterprise Steve has:

  • An ERP system
  • A CRM system
  • Procurement workflows
  • Approval chains
  • Compliance requirements
  • IT policies

Enterprise Steve needs CraftPortal connected to his world. His tenant. His systems. His processes.

The ISV package bridges these two worlds. Customer data stays in customer tenant. CraftPortal handles the marketplace, IP Owners, Manufacturers, the recipe catalog.

No workflow fragmentation. No copy-paste between systems. No “let me check the other portal.” One flow. Connected.

What Customers Get

Power Platform Components

Component – CraftPortal Examples:

  • Tables – Tenders, Recipes, Vendors, Manufacturers, Bids, Projects, Parts
  • Dashboards – Tender Overview, Production Status, Vendor Performance
  • PCF Widgets – Recipe Viewer, 3D Model Preview, Status Tracker
  • Power Automate templates – “New Tender → Notify Vendors”, “Bid Awarded → Create Project”, “Part Printed → Update Inventory”
  • Security Model – Roles: Procurement Manager, Vendor, Manufacturer, Viewer

What We Get

  • AppSource listing = discoverability, Microsoft co-sell, enterprise credibility
  • Do this right and CraftPortal becomes invisible infrastructure — always there, impossible to replace

That’s not a customer. That’s a permanent relationship.

The tech stuff

Our ISV package includes two data integration flows. Custom Connector for on-demand requests — direct calls to CraftPortal API when you need real-time actions. Cached Core Data for near real-time sync — we push core data into the customer’s environment via Azure EventGrid into Azure SQL and Dataverse. Why both? Cached data enables full delegation in Power Apps. No query limits. Instant performance. Citizen developers query local tables instead of external APIs. CraftPortal data that feels like their own.
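The cached-core-data path boils down to a subscriber applying change events to a local store. In the real setup the events arrive via Azure Event Grid and land in Azure SQL and Dataverse; the event shape and names below are assumptions for illustration:

```python
# Sketch of cached core data sync: CraftPortal publishes change events and a
# subscriber in the customer's environment upserts them into a local table.
# Event types and field names are illustrative assumptions.

local_recipes: dict[str, dict] = {}   # stands in for the customer's local table

def on_core_data_event(event: dict) -> None:
    """Apply one CraftPortal change event to the local cache."""
    data = event["data"]
    key = data["recipe_id"]
    if event["eventType"] == "CraftPortal.Recipe.Deleted":
        local_recipes.pop(key, None)
    else:  # Created or Updated: upsert the full record
        local_recipes[key] = data

on_core_data_event({"eventType": "CraftPortal.Recipe.Updated",
                    "data": {"recipe_id": "r-001", "name": "Gear Mk2"}})
on_core_data_event({"eventType": "CraftPortal.Recipe.Deleted",
                    "data": {"recipe_id": "r-001"}})
print(len(local_recipes))
```

Because the data ends up in local tables, Power Apps delegation works against it with no external-API query limits, which is the whole point of the cache.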

The Code

Azure Function Trigger & Interface

Trigger Type

1) HTTP-triggered Azure Function (API Gateway)

  • Used for synchronous operations and for publishing events
  • Secured via Azure AD authentication

2) Event Grid–triggered Azure Function (Event Processor)

  • Subscribes to Event Grid topic events
  • Processes vendor integration asynchronously
  • Updates Dataverse with final status/result

Interface Characteristics

  • REST-style endpoints
  • JSON request/response payloads
  • Versioned route (example):

/api/v1/vendor/operation
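The request/response contract of that route can be sketched as a plain function. In production it runs as the HTTP-triggered Azure Function behind Azure AD; the payload field names here are assumptions:

```python
# Minimal sketch of the contract on POST /api/v1/vendor/operation.
# In production this is an HTTP-triggered Azure Function secured via
# Azure AD; here it is modeled as a plain function. Field names are assumed.
import json

def vendor_operation(body: str) -> tuple[int, str]:
    """Handle POST /api/v1/vendor/operation; return (status_code, json_body)."""
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return 400, json.dumps({"error": "invalid JSON"})
    if "vendorId" not in payload or "operation" not in payload:
        return 400, json.dumps({"error": "vendorId and operation are required"})
    # Synchronous path: acknowledge, then hand off to the Event Grid processor,
    # which updates Dataverse with the final status asynchronously.
    return 202, json.dumps({"status": "accepted",
                            "vendorId": payload["vendorId"],
                            "operation": payload["operation"]})

status, response = vendor_operation(
    json.dumps({"vendorId": "v-42", "operation": "print-job"}))
print(status)
```

Returning 202 (accepted) rather than 200 reflects the split between the synchronous gateway and the asynchronous Event Grid processor described above.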

The Release

Power Platform Environment Strategy

Source control strategy

Repo structure (example)

  • /solutions/<SolutionName>/ (exported source using Power Platform CLI/PAC)
  • /pipelines/ (YAML for CI/CD)
  • /tests/ (integration test scripts, Postman collections, Playwright scripts, etc.)
  • /docs/ (release notes templates, runbooks)

CI Pipeline

CD Pipeline

Building the ISV package deliverable

An ISV-style deliverable usually includes:

Managed solution ZIP(s)

  • Core solution (managed)

Installation guide

  • Required licenses and prerequisites
  • Import steps
  • How to set environment variables
  • How to create/bind connections
  • Security roles to assign

Configuration workbook

  • List of env vars, defaults, required values
  • Connection references mapping
  • Any URLs/endpoints

Release notes + known issues

  • What changed, what to verify

Support / troubleshooting

  • Common import errors
  • How to re-run failed flows
  • Health check steps

Infrastructure as Code

In our solution, we use both a managed Power Platform ISV package and dedicated cloud infrastructure for each customer.

This means each customer deploys their own Azure infrastructure to unlock the Power Platform solution distributed via AppSource.

We introduce this via a one-click Azure infrastructure deployment process. The model-driven app includes a dedicated admin app that lets the user run the deployment themselves after installing the main package.

Reference: https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/deploy-to-azure-button?WT.mc_id=IoT-MVP-5002324

Deployment

Note: the selection of the resource group is not part of the Bicep/ARM template; the template expects the resource group to already exist. That selection is provided by the portal.

To support this scenario, we need to provide all the required Bicep files, which are the blueprints for the Azure services.

Benefits of Bicep

Bicep provides the following advantages:

  • Support for all resource types and API versions
  • Orchestration
  • Repeatability

The following examples show the difference between a Bicep file and the equivalent JSON template. Both examples deploy a storage account:
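A minimal sketch of the Bicep side, deploying a storage account (resource names and SKU are illustrative, not our actual template):

```bicep
// Minimal sketch: a storage account deployment in Bicep.
// Names and SKU are illustrative assumptions.
param location string = resourceGroup().location
param storageName string = 'stcraftportal${uniqueString(resourceGroup().id)}'

resource storage 'Microsoft.Storage/storageAccounts@2023-01-01' = {
  name: storageName
  location: location
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'StorageV2'
}
```

The equivalent ARM JSON template wraps the same resource in an explicit `$schema` / `contentVersion` / `resources` envelope and ends up several times longer – which is exactly the readability argument for Bicep.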

Directory Structure

Deployment

We are using a couple of Docker Compose files for the prod and dev deployments.

This allows us to simplify the infrastructure for the local development setup.

Keep in mind the main rule of using docker-compose: no sensitive data should be hardcoded inside; all environment-specific details must go in the .env file.
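A minimal sketch of what that rule looks like in a compose file (service names, image names, and variables are assumptions for illustration):

```yaml
# Sketch: nothing environment-specific is hardcoded; every ${VAR} is resolved
# from the .env file next to this compose file. Names are illustrative.
services:
  backend:
    image: ${ACR_NAME}.azurecr.io/craftportal-backend:${TAG}
    environment:
      - DATAVERSE_URL=${DATAVERSE_URL}
      - API_KEY=${API_KEY}    # value lives only in .env, never committed
    ports:
      - "8080:8080"
  frontend:
    image: ${ACR_NAME}.azurecr.io/craftportal-frontend:${TAG}
    ports:
      - "80:80"
```

Swapping prod for dev then means swapping the .env file, not editing the compose file.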

We are following a classic approach with a private Azure container registry (ACR) to store the frontend and backend Docker images.

How to convince IP Owners?

The idea is that IP Owners receive all the benefits of owning the infrastructure, plus the extra features of the platform owner (LogiQraft), such as dedicated, purpose-built AI services, a proactive monitoring system, and more.

To make it work, we are using Azure Lighthouse + Managed Identity.

How it works:

  1. The customer delegates a scope (subscription / resource group) to us via Azure Lighthouse.
  2. Our identity in our tenant (either a Managed Identity or a service principal/app) is granted a role on their resources through that delegation.
  3. We then query their Application Insights / Log Analytics data using our identity, and Azure enforces the delegated RBAC.

Why Lighthouse is ideal:

  • No need to create/maintain identities in every customer tenant
  • Customers can revoke access easily 
  • Scales across many tenants
  • We assume IP Owners will be cautious about sharing their IP, and this approach addresses those concerns
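The delegation itself is just data: the customer deploys a Lighthouse registration definition whose authorizations grant our identity a role on their scope. A sketch of one such authorization entry (the GUIDs are placeholders, not real identities):

```python
# Sketch of an Azure Lighthouse authorization entry: the customer's
# registration definition grants our principal a built-in role on their scope.
# Principal GUIDs are placeholders; the Reader role ID is the well-known one.
import json

READER_ROLE_ID = "acdd72a7-3385-48ef-bd42-f606fba81ae7"  # built-in Reader role

def lighthouse_authorization(principal_id: str, display_name: str,
                             role_definition_id: str = READER_ROLE_ID) -> dict:
    """One authorization entry for a Lighthouse registration definition."""
    return {
        "principalId": principal_id,
        "principalIdDisplayName": display_name,
        "roleDefinitionId": role_definition_id,
    }

auths = [lighthouse_authorization("00000000-0000-0000-0000-000000000000",
                                  "LogiQraft monitoring identity")]
print(json.dumps(auths, indent=2))
```

Revoking access is then just the customer deleting the registration assignment on their side – no identities of ours ever live in their tenant.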
Categories: Redstone Realm, Governance & Best Practices, Code Connoisseur, Digital Transformation

Badges:  Power Of The Shell, ACDC Craftsman, Plug N’ Play

Going with the Flow: Wiring Steve to the World

Some recipes are simple. A user-friendly UI is enough: Steve publishes a request, Wandering Traders reply with recipes, deal done.

But some recipes are complex. Specs are unclear. Steve doesn’t know exactly what he needs. There’s back-and-forth: questions, clarifications, files, photos, technical drawings.

We need proper channels. Share specs. Exchange files. Chat in real time.

Discord would work in the gaming world. But we’re building on Microsoft cloud.

So we use Modern Workplace, AI WORKFORCE suite: Teams, SharePoint, and the whole M365 family.


The Challenge: Connecting SharePoint, Teams, and Dataverse

As experienced consultants, we know the options:

Option 1: Server-side sync. Quick. Efficient. But locked – not flexible when you want to extend. And no Teams integration.

Option 2: Dynamics 365 Teams integration. The out-of-the-box “Collaborate” button. Sounds good, but: it creates a separate Team or channel for each record. For our solution, that means a new channel for every single Request. Hundreds of channels.

And it’s not automatic – users still need to click “Collaborate,” choose to create new or merge to existing. Manual steps. Extra friction. Our users aren’t ready for that level of complexity just yet!

Our Use Case

As customer-oriented Redstone Engineers, we know adoption is everything. We wanted:

  • Fully automated
  • Zero clicks from user
  • One Team, multiple channels (one per tender)
  • Documents synced to the right place

The Solution: Power Automate

We know how OOB sync works under the hood. There’s a Document Location entity – the infrastructure connecting SharePoint to Dataverse.

We also know that when you create a Teams channel, SharePoint auto-generates a document library for it.

So we wired it up ourselves.

When a Tender is created:

  1. Teams channel created under the main Tender Team
  2. SharePoint folder structure generated
  3. Document Location record created in Dataverse
  4. Everything linked. Automatically.

Fully customized. Total freedom. Zero lines of pro code. (Shoutout to FetchXML Builder from XRM Toolbox – making us look like we know what we’re doing since day one.)
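Step 3 above – the Document Location record – is just a small payload created via the Dataverse Web API. The field names follow the `sharepointdocumentlocation` entity; the custom entity name, GUIDs, and URLs below are illustrative assumptions:

```python
# Sketch of the Document Location record that links a Dataverse Tender row to
# the SharePoint folder created for its Teams channel. Field names follow the
# sharepointdocumentlocation entity; entity name cr_tender, GUIDs, and URLs
# are placeholders/assumptions.

def document_location_payload(tender_id: str, tender_name: str,
                              parent_location_id: str) -> dict:
    """Payload for creating a sharepointdocumentlocation row via the Web API."""
    return {
        "name": f"Documents for {tender_name}",
        "relativeurl": tender_name.replace(" ", ""),
        # Bind the record to the Tender and to the parent SharePoint location.
        "regardingobjectid_cr_tender@odata.bind": f"/cr_tenders({tender_id})",
        "parentsiteorlocation_sharepointdocumentlocation@odata.bind":
            f"/sharepointdocumentlocations({parent_location_id})",
    }

payload = document_location_payload(
    "11111111-1111-1111-1111-111111111111", "Tender 0042",
    "22222222-2222-2222-2222-222222222222")
print(payload["relativeurl"])
```

Once this row exists, the standard Documents grid in the Model-Driven App resolves straight to the channel’s SharePoint folder – which is all the OOB sync does under the hood anyway.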

Categories: #Redstone Realm, #low-code

Azure Local: Bring your own servers to the Azure Cloud 

CATEGORIES: Redstone Realm, Governance & Best Practices.

There are business cases that must comply with strict country regulations, such as keeping data in a specific location or region, or that need to reduce inter-service communication delays by hosting critical parts of the infrastructure close by.

In business, it’s called “Bring Your Own Server” (BYOS), which means using your own hardware with a provider’s infrastructure (like a data center or cloud) for hosting, or a Hybrid Cloud approach. 

In our project, we are using our own K8s cluster hosting microservices on the customer site, which allows us to build a solution that keeps working even when the internet connection is lost, without stopping any processes.

Big customers who own a huge number of 3D printers/farms require extra governance and management capabilities. These can be covered by edge IoT gateways on the client side. At the same time, seamless Azure integration is still required to cover the security model.
 
IP owners licensing digital inventory – where manufacturers grant printing rights instead of shipping physical parts – often have stringent data sovereignty requirements (research), making our hybrid approach essential for industries like defense and aerospace.

Luckily, Microsoft already has an approach for this case: it’s called Azure Local.

Azure Local is the new, unified branding for what was previously known as Azure Stack HCI, extending Azure services to your own infrastructure for hybrid cloud management, while Azure Stack HCI was the specific hyper-converged solution.

 
 
Technical implementation  

As an edge device, we are using the Sipeed Nano Cluster, which serves as the base board for several Raspberry Pi CM5 modules.

High-Level diagram 

In our case we are hosting the following services: 

Core microservices responsible for: 

  • 3D printer orchestration and job scheduling 
  • Print pipeline execution and real-time monitoring 
  • Device control, firmware integration, and safety checks 

Edge IoT gateway services: 

  • Secure printer and sensor connectivity 
  • Local buffering and processing of telemetry data 

Following this approach, we get the benefits of both the public cloud and a private cloud.
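The “keeps working offline” promise rests on the gateway’s local telemetry buffering: readings queue up on the edge and flush to the cloud (IoT Hub in our setup) whenever the connection is back. A sketch with the transport stubbed out:

```python
# Sketch of the edge gateway's local telemetry buffering: readings are queued
# locally and drained to the cloud whenever the connection is available.
# The upload transport (IoT Hub in our setup) is stubbed as a callable.
from collections import deque

class TelemetryBuffer:
    def __init__(self, send):
        self._queue = deque()
        self._send = send          # callable that uploads one reading

    def record(self, reading: dict, online: bool) -> None:
        self._queue.append(reading)
        if online:
            self.flush()

    def flush(self) -> int:
        """Try to drain the queue; return how many readings were sent."""
        sent = 0
        while self._queue:
            self._send(self._queue.popleft())
            sent += 1
        return sent

uploaded = []
buf = TelemetryBuffer(uploaded.append)
buf.record({"nozzle_temp_c": 211.5}, online=False)   # connection lost: buffered
buf.record({"nozzle_temp_c": 212.0}, online=True)    # back online: both flushed
print(len(uploaded))
```

A production gateway would add persistence and retry policies, but the shape is the same: buffer locally, drain opportunistically, never stop the print job.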

Show and tell – CraftPortal Evolved: From Minecraft to Supply Chain

In our first post, we talked about the problem. Supply chains are slow. Parts ship across the world. Weeks pass. Carbon burns. Sometimes the manufacturer doesn’t even exist anymore.

We asked: what if supply chain worked like Minecraft?

Now let’s show you what we mean. (No, it’s not the same picture.)

Think about how Steve plays Minecraft.

Steve needs a diamond sword. He doesn’t call a supplier in another country. He doesn’t wait for a ship. He finds the recipe, gathers materials, walks to his crafting table, and makes it. If he needs something far away, he uses a portal – instant.

Now think about CraftPortal.

A customer needs a part. Instead of ordering and waiting for shipping, they log into the Portal and post a request: “I need this part, these specs.”

IP Owners see the request. We call them Wandering Traders. They don’t ship physical parts – they sell recipes. Digital blueprints. They bid on the request.

Customer picks a recipe. Downloads it through the portal. Instant – like Steve stepping through a Nether portal.

Now the customer has two options:

Option 1: They have their own 3D printer – their Crafting Table. They print the part locally. Done in hours.

Option 2: They can’t print it themselves. They order from a Manufacturer – a Villager with better equipment. The Villager crafts it for them at a local facility.

Either way: recipe travels through the portal, part gets made locally.

No ships. No planes. No weeks. No carbon.

The Tech Behind It

The Portal is Power Pages – the marketplace. Recipes live in Dataverse. Copilot and AI Builder handle automation and intelligence. Azure IoT connects the Crafting Tables. Model Driven App runs the back office. Teams keeps everyone talking.

And LogiQraft? We’re the Redstone Engineers. We build the wiring that makes it all work.