Final Submission – OUR JOURNEY AT ACDC 2026

Problem statement: We came to the ACDC Hackathon 2026 with an ambition to solve a problem that exists in the supply chain and affects everyone involved. Global spare-parts supply chains are slow, fragile, carbon-heavy, and often fail when parts are obsolete or manufacturers no longer exist.

Original solution we wanted to build: CraftPortal replaces shipping with digital “crafting,” using AI and cloud technology to match, recreate, and produce parts locally – fast, resilient, and sustainable. It was, however, focused on the idea from our perspective, and we planned to refine the concept during the hackathon.

Evolution Journey: and then came the part we love the ACDC Hackathon for.

– “Put the Customer in focus”, said Sara, our beloved judge for the Digital Transformation category, on the opening day.

So, we worked to enrich our web application solution with the ISV package and give our potential customers a UI and UX they recognize well, namely M365, Power Platform, and BizApps.

– “I love your futuristic concept…” said Mikael from the Redstone Realm perspective, inspiring us to combine Microsoft 365 / Dynamics 365, SharePoint, Teams, and Azure.

“A simple code screenshot is not enough – show me how it solves your problem”, noted Keith (Code Connoisseur), and we challenged the status quo, wanting to generate bigger impact. So here is what we did:

“Turn data into insights…”, reminded Cathrine from Data, AI and Analytics, and we revised our data model, focusing on building a solid foundation for our solution so we could start mapping external data sources – which also motivated us to explore RAG.

“Everything you have there has to be there for a strong reason”, warned Fredrik from the Low-Code angle at the start, and we critically reviewed our ecosystem to keep a strict focus on the power of low code.

“No security holes”, declared Scott on Governance and Best Practices – and you don’t mess with Scott. No fluff: we need proper, best-practice-focused ALM and governance for the whole application.

As a result of continued brainstorming and dialogue during these 3 days, our refined solution started to look like this… scroll down :)

DIGITAL TRANSFORMATION

The Concept

Supply chains are slow, fragmented, and carbon-heavy. Parts ship across the world when they could be printed locally. CraftPortal changes this – a marketplace where recipes travel through the portal, parts get made nearby. Faster. Greener. Smarter.

Customer needs a part. Can they print it? Yes – browse marketplace, select blueprint from IP Owner, print, deliver. No – publish a tender, receive bids from Manufacturers, select, award, contract, they print, deliver. Two paths. Same portal.

We built two paths to CraftPortal. A SaaS web application for users who want to jump straight in. And an ISV package for customers who want CraftPortal wired into their Microsoft 365 environment.

Users choose: Web App or Power Apps:

Customers who prefer Model Driven App get a clean, familiar UI – the Power Apps experience they already know, tuned for digital inventory workflows:

But they can surely use our fancy web app. Vendors – IP Owners and Manufacturers – use the Portal interface. They browse public tenders, submit bids, upload recipes, manage contracts, track orders. All through Power Pages.

Value & Monetization

CraftPortal sits in the middle of every transaction. Recipe rented? We’re there. Part printed? We’re there. That’s the value.

Monetization options:

  • Subscription – monthly/annual access to the platform
  • Transaction-based – percentage per recipe rental, per tender, per print job

Or both. Base subscription for access, transaction fee for volume.
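
The two options (and the hybrid) can be sketched with a few lines of Python – all prices, volumes, and fee rates below are illustrative assumptions, not real CraftPortal pricing:

```python
# Hypothetical revenue sketch for the monetization options above.
# Numbers are illustrative assumptions, not real pricing.

def subscription_revenue(customers: int, monthly_fee: float) -> float:
    """Subscription: flat recurring access fee."""
    return customers * monthly_fee

def transaction_revenue(jobs: int, avg_job_value: float, fee_rate: float) -> float:
    """Transaction-based: a percentage per recipe rental / tender / print job."""
    return jobs * avg_job_value * fee_rate

def hybrid_revenue(customers: int, monthly_fee: float,
                   jobs: int, avg_job_value: float, fee_rate: float) -> float:
    """Or both: base subscription for access, transaction fee for volume."""
    return (subscription_revenue(customers, monthly_fee)
            + transaction_revenue(jobs, avg_job_value, fee_rate))

# Example: 100 customers at 50/month plus 2000 print jobs worth 30 each at a 5% fee.
print(hybrid_revenue(100, 50.0, 2000, 30.0, 0.05))
```

The hybrid model is attractive because the subscription covers platform access while the transaction fee scales with marketplace volume.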

LOW-CODE

Low-Code: The Redstone Behind CraftPortal

We built CraftPortal in 3 days. A marketplace. Tender flows. Vendor management. Document automation. AI agents and a Power Pages portal. How? Low-code.

The Building Blocks are as follows:

Power Pages for the Portal. Model Driven Apps for back office. Power Automate for every flow. Copilot Studio for autonomous agents. Dataverse for data. Generative Pages for dashboards.

We wired it together with clicks, not code.

The Low-Code Highlights

  • Autonomous agents – Copilot Studio detects Dataverse changes, posts to Teams, triggers RPA
  • RPA integration – Power Automate Desktop opens Bambu Lab Studio and clicks Print.
  • Teams + SharePoint + Dataverse – Fully automated channel and document location creation via Power Automate
  • Generative Pages – KPI dashboard pulling live Dataverse data
  • OneFlow contracts – Power Automate creates and sends contracts for signature
  • Link Mobility SMS – Automated bid notifications
  • Custom Connectors – ISV package API integration
  • FetchXML Builder – Low-code query generation (we click buttons, it writes XML)
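
Since the list mentions FetchXML Builder, here is a hedged sketch of the kind of query it generates for us – the table and column names (cp_tender, cp_status) are illustrative assumptions, not our actual schema:

```xml
<fetch top="50">
  <entity name="cp_tender">
    <attribute name="cp_name" />
    <attribute name="cp_status" />
    <filter>
      <condition attribute="cp_status" operator="eq" value="1" />
    </filter>
    <order attribute="createdon" descending="true" />
  </entity>
</fetch>
```

We click buttons in the builder; it writes XML like this, which Dataverse then executes.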

Deep Dives

Want the details? We documented everything here:

The Result

A full digital inventory platform. Built by a small team. In 3 days. Low-code made it possible.

CODE CONNOISSEUR

Low-code gets you far. But sometimes you hit a wall – a custom UI that doesn’t exist, real-time updates that Power Automate can’t handle, or an API that needs to be built from scratch. That’s when we switch gears. Pro-code fills the gaps.

We have various code projects and components in our solution:

  1. PCF control that allows customers to order the equipment to be printed. See: Client Side Salsa | Arctic Cloud Developer Challenge Submissions
  2. Power Pages portal that turns basic Minecraft resources into different tools, helps clients find an appropriate vendor for printing, and adapts to all devices and screen sizes. See: Chameleon | Arctic Cloud Developer Challenge Submissions
  3. Web portal for the 3D printer. Using this custom local web portal, we manage our printer device. See: Crafting, Crafting, Crafting…. Category: Pro Code | Arctic Cloud Developer Challenge Submissions
  4. Model Builder app, which helps customers recover lost recipes from a single shot. See: Right now – bring real-time data to the app | Arctic Cloud Developer Challenge Submissions
  5. Azure Function used to communicate with the external vendor API. See: ISV Package – the missing link | Arctic Cloud Developer Challenge Submissions

DATA, AI, ANALYTICS

The Dashboard

Our CraftPortal KPI Dashboard brings it all together. Built with Generative Pages and React, pulling live data from Dataverse:

  • Summary cards – Total Projects, Open Projects, Total Bids, Wandering Traders
  • Project Status Distribution – Donut chart showing lifecycle states
  • Bid Conversion – Submitted vs selected bids
  • Win Rate by Trader – Performance leaderboard
  • Projects per Month – Trend analysis over time
  • Top Wandering Traders – Gamified rankings

Light mode. Dark mode. Minecraft item icons from the official API. Business intelligence with a blocky twist.
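
Two of the KPIs above, Bid Conversion and Win Rate by Trader, reduce to simple aggregations over bid records. A hedged Python sketch (field names are assumptions, not our Dataverse schema):

```python
# Illustrative computation of two dashboard KPIs from bid records.
# The "trader"/"selected" field names are stand-ins for the real schema.

bids = [
    {"trader": "Alex", "selected": True},
    {"trader": "Alex", "selected": False},
    {"trader": "Kim",  "selected": True},
    {"trader": "Kim",  "selected": True},
    {"trader": "Sam",  "selected": False},
]

def bid_conversion(bids):
    """Share of submitted bids that were selected."""
    return sum(b["selected"] for b in bids) / len(bids)

def win_rate_by_trader(bids):
    """Performance leaderboard: wins / submissions per trader."""
    totals, wins = {}, {}
    for b in bids:
        t = b["trader"]
        totals[t] = totals.get(t, 0) + 1
        wins[t] = wins.get(t, 0) + b["selected"]
    return {t: wins[t] / totals[t] for t in totals}

print(bid_conversion(bids))       # 0.6
print(win_rate_by_trader(bids))   # {'Alex': 0.5, 'Kim': 1.0, 'Sam': 0.0}
```

The dashboard runs the equivalent aggregations over live Dataverse data rather than an in-memory list.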

We built the foundation. Dataverse as our core. Power BI dashboards for KPIs – tender status, bid conversion, vendor performance, projects per month. Live telemetry streaming from our IoT-connected Crafting Tables via Azure IoT Hub. Real-time monitoring of print jobs, temperatures, and device health.

Last year we went deep on Microsoft Fabric – Medallion architecture, Data Activator triggers, the whole pipeline. We didn’t want to repeat ourselves.

The Vision

The vision was fun: a custom knowledge-based Copilot powered by Fabric. Pull external data from the official Minecraft API (GitHub – PrismarineJS/minecraft-data) via Azure Data Factory. Deploy a proper RAG pipeline – chunking strategies, metadata filtering, semantic search, hybrid search, custom retrievers. Do RAG the right way.

Unfortunately, we barely finished the data platform in time. The RAG adventure stays on the roadmap.

Sometimes three days isn’t enough. But the foundation is solid. The diamonds are waiting to be mined.

GOVERNANCE & BEST PRACTICES

Essence

Our goal during the hackathon was to show a complex implementation of the project from several different angles.

Even as the industry focus shifts to AI-related topics, it still takes an advanced level of solution design to enable existing services for the LLM.

That is why we included advanced technologies such as Azure Local, Azure Lighthouse, and IoT Hub.

At the same time, complex solutions usually require more implementation effort. By following best practices for each piece of the puzzle, we increase the overall success rate of the delivery. See: ISV Package – the missing link | Arctic Cloud Developer Challenge Submissions

REDSTONE REALM

We created our business solution using Microsoft technologies – Azure OpenAI, Azure DevOps, LLMs, Azure Functions, Outlook, and Microsoft Teams. Check out the article: ISV Package – the missing link | Arctic Cloud Developer Challenge Submissions

By deliberately meeting and exceeding every requirement across Digital Transformation, Low-Code, Pro-Code, Data & AI, Redstone Realm, and Governance & Best Practices – while continuously refining our solution through your direct feedback – we believe CraftPortal represents the complete ACDC vision, and we thank the judges for challenging us, guiding us, and inspiring us to build something truly worthy of this win.

Thank you for ACDC 2026!

With love, team LogiQraft.

PS! We really wanted to share a PREMIERE of our final movie with you before the official release, so here you go: https://www.youtube.com/watch?v=KV2p3FNTLNI

Event-Driven Autonomous Agent Solution

The challenge described in our blog post – Who is the King of Integrations? Or what? (Add link?) – was that existing integration libraries for the Bambu Lab local API provided very limited and not fully optimized functionality for interacting with our 3D printer.

Here is our second game plan using low-code tools, where we turned to the RPA capabilities in Power Automate. This is also a valid option because, as a solution provider, LogiQraft wants multiple options available for customers who decide to use our app but might have problems related to networking or other issues.

We combined Copilot Studio autonomous agent with Power Automate Desktop. When a user updates a recipe status to “Ready to Print” in the Model-Driven App, the autonomous agent detects the change in Dataverse, posts a notification to Teams, and triggers the RPA flow. The RPA then opens Bambu Lab Studio and clicks Print. The physical printer starts the job.
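
The chain above can be sketched in plain Python. This is a stand-in for the Copilot Studio agent and Power Automate Desktop flow, which are of course not Python – the callables here are assumptions for illustration only:

```python
# Sketch of the event-driven chain: Dataverse change -> Teams post -> RPA print.
# notify_teams / trigger_rpa are stand-ins for the real connectors.

def on_recipe_updated(recipe, notify_teams, trigger_rpa):
    """React only when a recipe becomes ready to print."""
    if recipe["status"] != "Ready to Print":
        return False
    notify_teams(f"Recipe '{recipe['name']}' is ready to print")
    trigger_rpa(recipe["name"])  # RPA opens Bambu Lab Studio and clicks Print
    return True

events = []
on_recipe_updated(
    {"name": "Diamond Pickaxe", "status": "Ready to Print"},
    notify_teams=lambda msg: events.append(("teams", msg)),
    trigger_rpa=lambda name: events.append(("rpa", name)),
)
print(events)
```

The real agent subscribes to Dataverse row changes, so the status check happens declaratively rather than in code.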

ISV Package – the missing link

We’ve shown you the concept. The roles. The flows. The tech stack. Now let’s talk about how this actually lands in enterprise.

The simple Steve logs into an external portal. Rents recipes. Publishes tenders for printing to manufacturing vendors. Monitors production and delivery status. All from the portal interface.

Works great.

But enterprise Steve, the one working at Equinor, IKEA, or Siemens, doesn’t just browse marketplaces. Enterprise Steve has:

  • An ERP system
  • A CRM system
  • Procurement workflows
  • Approval chains
  • Compliance requirements
  • IT policies

Enterprise Steve needs CraftPortal connected to his world. His tenant. His systems. His processes.

The ISV package bridges these two worlds. Customer data stays in customer tenant. CraftPortal handles the marketplace, IP Owners, Manufacturers, the recipe catalog.

No workflow fragmentation. No copy-paste between systems. No “let me check the other portal.” One flow. Connected.

What Customers Get

Power Platform Components

Component – CraftPortal examples:

  • Tables – Tenders, Recipes, Vendors, Manufacturers, Bids, Projects, Parts
  • Dashboards – Tender Overview, Production Status, Vendor Performance
  • PCF Widgets – Recipe Viewer, 3D Model Preview, Status Tracker
  • Power Automate templates – “New Tender → Notify Vendors”, “Bid Awarded → Create Project”, “Part Printed → Update Inventory”
  • Security Model – Roles: Procurement Manager, Vendor, Manufacturer, Viewer

What We Get

  • AppSource listing = discoverability, Microsoft co-sell, enterprise credibility
  • Do this right and CraftPortal becomes invisible infrastructure — always there, impossible to replace

That’s not a customer. That’s a permanent relationship.

The tech stuff

Our ISV package includes two data integration flows. Custom Connector for on-demand requests — direct calls to CraftPortal API when you need real-time actions. Cached Core Data for near real-time sync — we push core data into the customer’s environment via Azure EventGrid into Azure SQL and Dataverse. Why both? Cached data enables full delegation in Power Apps. No query limits. Instant performance. Citizen developers query local tables instead of external APIs. CraftPortal data that feels like their own.
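
The cached-core-data idea boils down to: an event arrives, a local table gets upserted, and Power Apps then queries the local copy with full delegation. A hedged sketch (the event shape and table are illustrative assumptions, not our actual Event Grid schema):

```python
# Sketch of cached core data sync: an Event Grid-style event upserts a record
# into a local table so citizen developers can query it like their own data.

local_recipes = {}  # stand-in for the customer's Dataverse / Azure SQL cache

def handle_core_data_event(event):
    """Upsert the pushed record into the local cache, keyed by id."""
    record = event["data"]
    local_recipes[record["id"]] = record

handle_core_data_event({"eventType": "Recipe.Updated",
                        "data": {"id": "r1", "name": "Gear", "version": 2}})
handle_core_data_event({"eventType": "Recipe.Updated",
                        "data": {"id": "r1", "name": "Gear", "version": 3}})
print(local_recipes["r1"]["version"])  # 3
```

Because the cache is idempotently keyed by id, replayed or out-of-order events simply overwrite the same row instead of duplicating it.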

The Code

Azure Function Trigger & Interface

Trigger Type

1) HTTP-triggered Azure Function (API Gateway)

  • Used for synchronous operations and for publishing events
  • Secured via Azure AD authentication

2) Event Grid–triggered Azure Function (Event Processor)

  • Subscribes to Event Grid topic events
  • Processes vendor integration asynchronously
  • Updates Dataverse with final status/result

Interface Characteristics

  • REST-style endpoints
  • JSON request/response payloads
  • Versioned route (example):

/api/v1/vendor/operation
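
A hedged sketch of what the HTTP-triggered gateway does with that versioned route: parse the path, reject unknown versions, validate the JSON payload, and acknowledge for asynchronous processing. The route segments and the vendorId field are assumptions for illustration:

```python
# Minimal sketch of the versioned REST interface on /api/v1/vendor/operation.
# Field names and status codes are illustrative, not the deployed contract.
import json

def handle_request(path: str, body: str):
    parts = path.strip("/").split("/")  # e.g. ["api", "v1", "vendor", "operation"]
    if parts[:2] != ["api", "v1"]:
        return 404, {"error": "unknown API version"}
    payload = json.loads(body)
    if "vendorId" not in payload:
        return 400, {"error": "vendorId is required"}
    # Synchronous path: publish an event for the Event Grid processor,
    # then acknowledge immediately.
    return 202, {"status": "accepted", "vendorId": payload["vendorId"]}

print(handle_request("/api/v1/vendor/operation", '{"vendorId": "v-42"}'))
```

Returning 202 rather than 200 signals that the vendor integration completes asynchronously; the Event Grid-triggered function later writes the final status back to Dataverse.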

The Release

Power Platform Environment Strategy

Source control strategy

Repo structure (example)

  • /solutions/<SolutionName>/ (exported source using Power Platform CLI/PAC)
  • /pipelines/ (YAML for CI/CD)
  • /tests/ (integration test scripts, Postman collections, Playwright scripts, etc.)
  • /docs/ (release notes templates, runbooks)

CI Pipeline

CD Pipeline

Building the ISV package deliverable

An ISV-style deliverable usually includes:

Managed solution ZIP(s)

  • Core solution (managed)

Installation guide

  • Required licenses and prerequisites
  • Import steps
  • How to set environment variables
  • How to create/bind connections
  • Security roles to assign

Configuration workbook

  • List of env vars, defaults, required values
  • Connection references mapping
  • Any URLs/endpoints

Release notes + known issues

  • What changed, what to verify

Support / troubleshooting

  • Common import errors
  • How to re-run failed flows
  • Health check steps

Infrastructure as Code

In our solution, we use both a managed Power Platform ISV package and dedicated cloud infrastructure for each customer.

This means that each customer deploys their own Azure infrastructure to unlock the Power Platform solution distributed via AppSource.

We introduce this via a one-click Azure infra deployment process. The model-driven app has a dedicated admin app that allows users to run the deployment themselves after the main package is installed.

Reference: https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/deploy-to-azure-button?WT.mc_id=IoT-MVP-5002324

Deployment

Note: The selection of the resource group is not part of the Bicep/ARM template – it expects the resource group to already exist. This selection is provided by the portal.

To support these scenarios, we need to provide all the required Bicep files, which are the blueprints for the Azure services.

Benefits of Bicep

Bicep provides the following advantages:

  • Support for all resource types and API versions
  • Orchestration
  • Repeatability

The following examples show the difference between a Bicep file and the equivalent JSON template. Both examples deploy a storage account:
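
As a hedged illustration of that comparison (the resource name and API version are assumptions, not taken from our repo), a minimal Bicep file deploying a storage account could look like:

```bicep
param location string = resourceGroup().location

resource storage 'Microsoft.Storage/storageAccounts@2023-01-01' = {
  name: 'craftportalstore'
  location: location
  sku: { name: 'Standard_LRS' }
  kind: 'StorageV2'
}
```

The equivalent ARM JSON template expresses the same resource with noticeably more ceremony – a $schema header, contentVersion, a resources array, and bracketed expressions like "[resourceGroup().location]" – which is exactly the verbosity Bicep removes.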

Directory Structure

Deployment

We use a couple of Docker Compose files for the prod and dev deployments.

This allows us to simplify the infrastructure for the local development setup.

Keep in mind the main rule of using docker-compose: no sensitive data should be hardcoded inside; all environment-specific details must be placed in the .env file.

We follow a classic approach with a private Azure container registry (ACR) to store the frontend and backend Docker images.
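
An illustrative compose fragment following that rule – service names, variables, and the ACR naming are assumptions, not our real files; every environment-specific value is resolved from .env:

```yaml
# Illustrative docker-compose fragment: no secrets inline,
# everything environment-specific comes from the .env file.
services:
  backend:
    image: ${ACR_NAME}.azurecr.io/craftportal-backend:${TAG}
    environment:
      - DB_CONNECTION_STRING=${DB_CONNECTION_STRING}
  frontend:
    image: ${ACR_NAME}.azurecr.io/craftportal-frontend:${TAG}
    ports:
      - "${FRONTEND_PORT}:80"
```

Swapping the .env file is then all it takes to switch between the dev and prod setups.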

How to convince IP Owners?

The idea is that IP Owners receive all the benefits of owning the infrastructure, plus extra features from the platform owner (LogiQraft), like dedicated, specifically designed AI services, a proactive monitoring system, etc.

To make it work, we are using Azure Lighthouse + Managed Identity.

How it works:

  1. The customer delegates a scope (subscription / resource group) to you via Azure Lighthouse.
  2. Your identity in your tenant – a Managed Identity or a service principal/app – is granted a role on their resources through that delegation.
  3. You then query their Application Insights / Log Analytics data using your identity, and Azure enforces the delegated RBAC.

Why Lighthouse is ideal:

  • No need to create/maintain identities in every customer tenant
  • Customers can revoke access easily 
  • Scales across many tenants
  • We assume that IP Owners would be careful about sharing their IP, and by following this approach we address their concerns
Categories: Redstone Realm , Governance & Best Practices , Code Connoisseur , Digital Transformation

Badges:  Power Of The Shell, ACDC Craftsman, Plug N’ Play

Link Mobility SMS Provider 

Linkedin link: https://www.linkedin.com/posts/filip-shamakoski-438a95147_just-shipped-a-neat-automation-when-a-bid-activity-7420772772953473024-CpE6?utm_source=li_share&utm_content=feedcontent&utm_medium=g_dt_web&utm_campaign=copy

To enable SMS notifications within the application, Link Mobility has been added as the SMS provider. This integration allows the system to send automated SMS messages to customers. 

Bid Creation 

A Bid gets created, for example, from the model-driven app, and the customer information is retrieved from the Project.

Customer Notification on Bid Creation 

When a Bid is created, a Power Automate flow is triggered automatically. The flow performs the following steps:  

  1. Detects that a new Bid has been added.
  2. Retrieves the related customer information.
  3. Sends an SMS notification to the customer using Link Mobility.

This ensures that customers are informed immediately when a Bid is created, improving communication and responsiveness. 

Democratization of the ancient model crafting

We are happy to make the first version of the API of our Model Builder solution publicly available. You can use it in your own way to reproduce items from a single shot.

The simple example of how to use it can be found here: https://api-logiqraft.futurein.cloud/api/swagger  

Usage instructions: Create an HTTP-triggered flow to receive a notification when the task is completed.

The payload will contain the address of the generated model: 

{
  "event": "task.completed",
  "taskId": "fd7ded63-542a-4c3c-93d0-90bff6954a14",
  "status": "completed",
  "modelUrl": "/uploads/models/fd7ded63-542a-4c3c-93d0-90bff6954a14.glb",
  "timestamp": "2026-01-22T17:42:20.666Z"
}
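
A hedged Python sketch of such an HTTP receiver: parse the callback, sanity-check the event type, and build the absolute model URL. Whether modelUrl is relative to the API host above is an assumption for illustration:

```python
# Minimal sketch of the task.completed callback receiver.
# BASE_URL joining is an assumption about how modelUrl resolves.
import json

BASE_URL = "https://api-logiqraft.futurein.cloud"

def handle_task_completed(body: str) -> str:
    payload = json.loads(body)
    if payload["event"] != "task.completed" or payload["status"] != "completed":
        raise ValueError("unexpected callback")
    return BASE_URL + payload["modelUrl"]

body = json.dumps({
    "event": "task.completed",
    "taskId": "fd7ded63-542a-4c3c-93d0-90bff6954a14",
    "status": "completed",
    "modelUrl": "/uploads/models/fd7ded63-542a-4c3c-93d0-90bff6954a14.glb",
    "timestamp": "2026-01-22T17:42:20.666Z",
})
print(handle_task_completed(body))
```

In a Power Automate flow, the same check maps to a condition on the event field followed by a compose action that concatenates the base URL and modelUrl.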

Azure API: minimal

Leveraging existing services effectively is key to successful implementation. By building on proven Azure capabilities, we can both reduce costs and introduce telemetry that enables proactive monitoring of our web application’s health.

Within the Model Builder portal, this approach is realized with the following Azure services:

  • Azure Blob Storage is used to store generated projections and models, providing a scalable and cost-efficient storage foundation.
  • Azure OpenAI is used to extract tags from the initial image uploaded by the user, enabling automated enrichment and downstream processing.
  • Azure Application Insights is used to collect telemetry from the application, giving visibility into performance, usage patterns, and potential issues so they can be identified and addressed proactively.

oneFlow Integration with Power Automate 

Linkedin link: https://www.linkedin.com/posts/logiqapps-as_acdc2026-activity-7420765826762571777-KXDn?utm_source=share&utm_medium=member_ios&rcm=ACoAAAGgArEBJX4FykL_X9lN_RmNWQBCgkXoBWQ

This document describes the integration between oneFlow and Power Automate used within a Dynamics 365 Model-Driven App. 

1. Selecting a Bid on the Project Form 

On the Project form, when a user selects and specifies a Bid, this action serves as the trigger for the Power Automate flow. 

2. Power Automate Flow Execution 

Once the Bid is selected, a Power Automate flow is automatically triggered. The flow creates the contract from a template in oneFlow, specifies the participant, and publishes it. 

3. Contract Creation and Sending for Signature 

After the contract is created from the template, the flow sends the contract to the Wandering Trader specified in the Bid via email. 

4. Sign Contract 

The contract can then be signed by the Wandering Trader.