What good is improved customer communication through chatbots and forums if the plumbers can't be notified in real time about relevant cases? Moreover, Mario and Luigi, as CEO and CTO respectively, want real-time data to improve decision support (e.g. plumber allocation) and to analyse PlumbQuest trends further.
Dataverse Webhook on Incident reports
To extract real-time data, we created a webhook using the plugin toolbox for Dataverse, which calls our Azure Function whenever a new PlumbQuest is created.
To ensure safe access, function-level authentication is applied: the toolbox allows for HTTP query parameters, so the function key can be passed securely to our Function, which uses a traditional HTTP trigger:
However – here is the hacky part. The webhook payload is too large, and the highly dynamic length and content of each PlumbQuest corrupts the traditional JSON payload. We therefore had to do some custom string manipulation to extract the values of most business value, de-corrupt the JSON and prepare it for analysis – almost a complete ETL pipeline (*cough*)!
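A minimal sketch of what such a function could look like (TypeScript, Azure Functions v4 Node.js programming model; the field names and the extraction heuristic are illustrative assumptions, not our exact code):

```typescript
import { app, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";

// Hypothetical helper: slice a field's value out of the mangled payload
// instead of trusting JSON.parse on the corrupted body.
function extractField(raw: string, key: string): string | undefined {
  const start = raw.indexOf(`"${key}":`);
  if (start === -1) return undefined;
  const valueStart = raw.indexOf(":", start) + 1;
  const valueEnd = raw.indexOf(",", valueStart);
  return raw
    .slice(valueStart, valueEnd === -1 ? undefined : valueEnd)
    .replace(/["{}\s]/g, "");
}

app.http("plumbQuestWebhook", {
  methods: ["POST"],
  authLevel: "function", // function-level key passed as an HTTP query parameter
  handler: async (req: HttpRequest, ctx: InvocationContext): Promise<HttpResponseInit> => {
    const raw = await req.text();
    // De-corrupt: pull out only the fields of most business value.
    const quest = {
      id: extractField(raw, "incidentid"),     // assumed field name
      severity: extractField(raw, "severity"), // assumed field name
    };
    ctx.log(`Extracted PlumbQuest ${quest.id}`);
    return { status: 202 };
  },
});
```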
But to access this real-time data in an analytics environment, Fabric is the way to go (riding Microsoft's huge hype wave). We created a custom app source for an Eventstream in Fabric with an Event Hub output binding, which can then map to many different destinations, including a Lakehouse for historisation and trend analysis, as well as Data Activator reflexes for reactive actions in real time.
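A sketch of the forwarding step (same v4 Node.js model; the hub name and connection setting are placeholders for the Eventstream custom-app endpoint details):

```typescript
import { app, output } from "@azure/functions";

// Event Hub output binding pointing at the Fabric Eventstream's
// custom-app (Event Hubs-compatible) endpoint. Names are assumptions.
const eventStream = output.eventHub({
  eventHubName: "plumbquest-stream",    // hypothetical hub name
  connection: "EVENTSTREAM_CONNECTION", // app setting holding the connection string
});

app.http("forwardPlumbQuest", {
  methods: ["POST"],
  authLevel: "function",
  extraOutputs: [eventStream],
  handler: async (req, ctx) => {
    const quest = await req.json();           // the de-corrupted PlumbQuest
    ctx.extraOutputs.set(eventStream, quest); // push it onto the Eventstream
    return { status: 202 };
  },
});
```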
With Data Activator's reflexes directly on the stream, one can e.g. trigger additional flows for highly acute PlumbQuests from members in distress, or highlight plumbers who did not provide proper service according to the PlumbQuest review.
In addition, we set up a Dataverse shortcut (link) to Fabric, allowing direct access to Dataverse without ETL or ingestion, providing ease of access and, down the line, deeper analysis of key business metrics, trends and community engagement.
Reproducible deployment
Although we are nasty hackers, we are reproducible hackers. As these were the only Azure resources used (directly), we deployed them using Bicep and the Azure CLI. Sensitive variables are marked as secure and are not included in the scripts, but parameterised.
So if you want to do it hacky, at least make it traceable.
To make our amazing service Tubi work, a lot of cloud is needed. We aim to make the plumber's job easier by recommending the best layout for where the pipes should go, and for that we need AI. We have trained a model in Custom Vision to recognise all components in a bathroom that need water. So, when the plumber uploads a floor plan to our Static Web App, the image is sent through our own API to our Azure Function App backend in C# ASP.NET. But both the image and the equipment list must be stored somewhere, so we have also connected Azure Blob Storage. Then, last but not at all least: the people working in the back office have instant interactive reports available through Power BI to help them with filing and billing, with alerting through an automated flow. (Badges: Feature Bombing)
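For illustration, here is roughly what the prediction call against the published Custom Vision endpoint looks like – sketched in TypeScript for consistency with the other snippets, although our backend is C# ASP.NET; the environment variable names are ours to invent:

```typescript
// POST the uploaded floor plan to the published Custom Vision iteration
// and get back tagged bounding boxes for each water-consuming fixture.
async function detectFixtures(image: Buffer): Promise<unknown> {
  const url =
    `${process.env.CV_ENDPOINT}/customvision/v3.0/Prediction/` +
    `${process.env.CV_PROJECT_ID}/detect/iterations/` +
    `${process.env.CV_ITERATION}/image`;
  const res = await fetch(url, {
    method: "POST",
    headers: {
      "Prediction-Key": process.env.CV_PREDICTION_KEY!,
      "Content-Type": "application/octet-stream",
    },
    body: image,
  });
  if (!res.ok) throw new Error(`Prediction failed: ${res.status}`);
  return res.json();
}
```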
Sometimes it works, and that’s plenty
Databases are good, but sometimes it's easier to just dump everything in one place until you need it again. Yes, it might not be very scalable or very normalised. SQL became too heavy, and we already needed Blob Storage to store the images, so we dump the order data into the same blob storage as JSON files. It's an old-fashioned kind of server storage, and a bit dirty, but it works! (Badges: Nasty Hacker, Retro Badge)
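The dump itself is nearly a one-liner with the Azure Storage SDK – a minimal sketch, assuming a container named "orders" and our own connection-string setting:

```typescript
import { BlobServiceClient } from "@azure/storage-blob";

const service = BlobServiceClient.fromConnectionString(process.env.STORAGE_CONNECTION!);
const container = service.getContainerClient("orders"); // assumed container name

// Dump an order as a JSON blob next to the uploaded floor-plan images.
async function saveOrder(order: { id: string; items: string[] }): Promise<void> {
  const blob = container.getBlockBlobClient(`order-${order.id}.json`);
  const body = JSON.stringify(order);
  await blob.upload(body, Buffer.byteLength(body), {
    blobHTTPHeaders: { blobContentType: "application/json" },
  });
}
```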
Power the back office
As the final list of components is decided, it still has to be approved by the accounting team in the office. To make sure they have all the information they require, we have developed a Power BI dashboard that crawls through our registered data and makes sure the orders are handled properly (Badges: Crawler, Dash It Out, Dataminer). And to make sure the orders are handled easily and quickly, the dashboard is embedded into Teams, and an alert is automated using a Logic App so that the workers can receive and cooperate in real time (Badges: Embedding Numbnuts, Go with the Flow, Pug N' Play, Power User Love, Right Now, Stairway to Heaven).
Client Side Salsa – for the React code for the PCF control
Hipster – for using Node.js for secure communication
Right Now – for using socket.io for real-time communication with the active users in the model-driven app
Nasty Hacker – for an incredibly awesome solution with a dirty hack on the model-driven side
Power User Love – for using socket.io with an Azure Function in low-code Power Apps
High Level Diagram
Implementation
All real-time communication is implemented via socket.io hosted on Azure.
Project Update: Real-Time Communication and Enhanced User Interaction!
We’re excited to share the latest update on our project, featuring a significant enhancement in real-time communication capabilities. Now, administrators and backend integrations can seamlessly send messages to users who have opened specific records. This means that when there’s a change in the record status (e.g., case completion, modification), users will receive instant notifications prompting them to update their page for the latest changes.
This feature not only streamlines communication but also serves as a handy tool for various onboarding processes. Additionally, it helps users understand what has changed after a new release has been deployed.
Behind the scenes, we’ve implemented Clippy PCF, built upon the ClippyJs project, to empower this real-time communication. Leveraging the power of socket.io hosted on Azure, our solution ensures swift and efficient message delivery.
Moreover, the backbone of the real-time communication lies in Azure Functions, written in Node.js. These functions diligently send out notifications to the socket.io instance hosted on Azure, creating a seamless and responsive user experience.
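A condensed sketch of both halves (room and event names are assumptions, not our exact code):

```typescript
// --- socket.io server hosted on Azure ---
import { createServer } from "http";
import { Server } from "socket.io";

const httpServer = createServer();
const io = new Server(httpServer, { cors: { origin: "*" } });

io.on("connection", (socket) => {
  // Each user joins a room named after the record they have open.
  socket.on("openRecord", (recordId: string) => socket.join(recordId));
  // Relay backend notifications to everyone viewing that record.
  socket.on("recordChanged", (msg: { recordId: string; status: string }) =>
    io.to(msg.recordId).emit("refreshNeeded", msg)
  );
});

httpServer.listen(process.env.PORT ?? 8080);

// --- Azure Function side (Node.js, acting as a socket.io client) ---
import { io as connect } from "socket.io-client";

export function notifyRecordChanged(recordId: string, status: string): void {
  const client = connect(process.env.SOCKET_URL!);
  client.on("connect", () => {
    client.emit("recordChanged", { recordId, status });
    client.disconnect();
  });
}
```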
Exciting times lie ahead as we continue to innovate and refine our project to meet the evolving needs of our users. Stay tuned for more updates, and thank you for being a part of our journey!
Business scenario
It supports real-time communication, so an administrator or a backend integration can send a message to the users who have a given record open. For instance, when a backend integration changes the status of a record (the case has been completed, changed, etc.), all users who have that record open receive a notification that they need to refresh the page to get the latest changes.
Picture yourself as a pirating entrepreneur: you are eager to get rich, be feared and get results faster than the old generation. We know you are well versed in apps such as Uber, Fiverr, Porterbuddy and AirBnB. In short, you want things on time, at the right location and as convenient as possible. This is why you need Plundrr.
“I never thought I could loot this much of coin in such a few moons. Yarr!”
– Pirate Captain Beta testing Plundrr
Plundrr is a Raid Planner
At first glance Plundrr is a simple app, but it does a lot of the heavy lifting for you. You set your desired course – from one cove to another – and Plundrr uses machine learning, data scraping, powerful automation, advanced trigonometry and vector calculations to pinpoint prime targets and map intercept courses for you. All you need to do is show up in time and keep the sails full. However, for every raid there are key factors for business success:
Find the Best Opportunities to Plunder
Steering Clear of Threats
Intercept all opportunities in the most cost-efficient manner
Challenge 1: Find the Best Opportunities to Plunder
Based on your ship's course and destination, Plundrr can estimate which boats will be in your path. But it is not enough to know what boats are there; you also want to know which one has the better booty. This is where BarentsWatch comes into play: we scrape data about all vessels on your route and calculate the estimated value of each ship. We use the ship classification and the size of the vessel to estimate the value of the ship's cargo.
All opportunities are ranked and given a carrot score 🥕. The more carrots, the better. Optimally we want to strike all triple-carrot targets in our path, and also deviate if the total yield is better pursuing triple carrots over single carrots. In short, 🥕🥕🥕 is better than 🥕.
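A taste of how such a scoring could look – a minimal sketch where the class multipliers and thresholds are made-up placeholders, not our calibrated values:

```typescript
type Target = { classification: string; size: number };

// Assumed value weights per ship classification (placeholders).
const classValue: Record<string, number> = {
  tanker: 3.0,
  cargo: 2.0,
  fishing: 1.0,
  pleasure: 0.5,
};

// Estimate cargo value from classification and size, then bucket into carrots.
function carrotScore(t: Target): 1 | 2 | 3 {
  const value = (classValue[t.classification] ?? 1) * t.size;
  if (value > 200) return 3; // 🥕🥕🥕
  if (value > 50) return 2;  // 🥕🥕
  return 1;                  // 🥕
}
```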
“Using carrots for value! Haha, that’s stupid but also so correct”
– Øystein from @in1
Challenge 2: Steering Clear of Threats
There will be competition out there. No worries, Plundrr has your stern. From BarentsWatch we also get classification types for warships and police, and even, with some tinkering and estimation, other pirates.
We label all ships as opportunities, friendlies or threats, and highlight in the UI which to avoid and what strength they have. We call this the Skull Score 💀. As with carrots, the more skulls, the more danger. We always recommend avoiding triple-skull boats; however, we give you the intel and the freedom to choose to attack triple-carrot ships even when they are close to a single skull. This risk/reward judgement is for you to decide.
“Wow, I love how the UI is really clear. Also the boats are fun!”
– Håvard from @in1
Challenge 3: Intercept all Opportunities in the most Cost-Efficient Manner
Once the user has chosen a route, we plot that route and use machine learning to simulate optimal target vectors. The problem we are solving is the travelling salesman problem: a person travels from point A to point N on a two-dimensional plane. In between their origin and destination there are multiple opportunities for sales scattered in every direction. Our challenge is to plot the optimal course between all opportunities while travelling the shortest total distance. For sea travel, though, we need to adapt the model slightly to account for sea currents and the movement of other ships.
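The classic greedy baseline looks like this – a nearest-neighbour sketch of the routing idea (our actual solution uses machine learning and adjusts for currents and moving ships):

```typescript
type Point = { x: number; y: number };

const dist = (a: Point, b: Point): number => Math.hypot(a.x - b.x, a.y - b.y);

// Greedy nearest-neighbour route: from start, always sail to the closest
// unvisited opportunity, then finish at the destination.
function planRoute(start: Point, targets: Point[], end: Point): Point[] {
  const remaining = [...targets];
  const route: Point[] = [start];
  let current = start;
  while (remaining.length > 0) {
    let best = 0;
    for (let i = 1; i < remaining.length; i++) {
      if (dist(current, remaining[i]) < dist(current, remaining[best])) best = i;
    }
    current = remaining.splice(best, 1)[0];
    route.push(current);
  }
  route.push(end);
  return route;
}
```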
Exploring the Technical Side
The technical architecture, although straightforward, leverages Power Automate and its flows to achieve more with less.
Mining Data
Data is predominantly gathered from BarentsWatch, an open API for ship traffic on the Seven Seas. For each ship we get heading, size, global position and weight. We store the data in Dataverse for further use and manipulation.
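The ingestion step boils down to polling the API and mapping the fields we care about – a sketch with a hypothetical endpoint and schema (BarentsWatch's real API requires authentication and has its own shapes):

```typescript
type Vessel = {
  name: string;
  heading: number; // degrees
  lat: number;
  lon: number;
  size: number;    // metres
  weight: number;  // tonnes
};

async function fetchVessels(): Promise<Vessel[]> {
  const res = await fetch("https://example.org/ais/vessels"); // hypothetical URL
  if (!res.ok) throw new Error(`AIS request failed: ${res.status}`);
  return (await res.json()) as Vessel[];
}
```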
One of the ways we used the mined data during this hackathon was to visualise ships as opportunities based on weight and size. The data was rendered in a Power App dashboard, enabling the team to examine the data and better understand how we should define high-opportunity targets for Plundrr.
The data mining and visualization granted us badges in:
Dash it Out
Dataminer
Storing Data Using Dataverse
The backend functions of the solution are mainly built on the Power Platform, allowing an excellent user experience when it comes to accessing and managing data. A model-driven Power App is used as a killer app for administrators to work with saved information about raids, pirates and ships with travel logs. This data is saved in Dataverse to make sure a database is available for all kinds of applications created by users of rock-solid geekness.
The tables in Dataverse can be accessed through the Power Automate API, which allows us to manipulate and control the data flow, making sure each request is handled based on where it comes from and what it is used for. This way, a varying amount of data can be returned based on the input value, creating extreme business value where customers can retrieve amounts of data defined by how much coin they invest in the solution.
Use of Dataverse for the backend granted us badges in:
Go with the Flow
Power User Love
Plotting Paths using Python and Bokeh
We use the coveted algorithm "shameless extrapolation", taking into account a ship's current speed to estimate its future route. The 20 closest ships are considered opportunities and are highlighted with a bright turquoise icon. Hovering over a ship on the map gives information about its name, type and destination.
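"Shameless extrapolation" is essentially dead reckoning – here sketched in TypeScript for consistency with the other snippets, even though our implementation lives in Python:

```typescript
type Ship = { lat: number; lon: number; heading: number; speedKnots: number };

// Project a ship's position forward from its current speed and heading.
function extrapolate(ship: Ship, hoursAhead: number): { lat: number; lon: number } {
  const distanceNm = ship.speedKnots * hoursAhead; // nautical miles travelled
  const rad = (ship.heading * Math.PI) / 180;
  // 1 nautical mile ≈ 1 minute of latitude; longitude scaled by cos(latitude).
  const dLat = (distanceNm * Math.cos(rad)) / 60;
  const dLon =
    (distanceNm * Math.sin(rad)) / (60 * Math.cos((ship.lat * Math.PI) / 180));
  return { lat: ship.lat + dLat, lon: ship.lon + dLon };
}
```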
Paths are plotted using Bokeh – a Python framework for creating graphs and plots.
Initial research into plotting and vector calculation led us to explore prediction algorithms taking additional factors into account to predict opportunities' locations even further in the future. Luckily, after working on this we discovered we could achieve the same results with less complex algorithms like "shameless extrapolation", meaning we can decrease the cost of our service to the end user: granting even more bang for the theoretical buck.
“This is ridiculous. So much effort must have gone into this!”
– Rune from @in1
Solveig’s ☀️ amazing work awarded us badges in:
Crawler
The Existential Risk
Retro
Automating Deployment and Infrastructure
We built a SPA application with .NET 7 and React. With these tools, we have built a sturdy vessel that'll weather any storm.
.NET 7 provides us with a robust backend framework, while React keeps the front end shipshape. Together they make a formidable team that can take on any challenge on the high seas that is web development.
With GitHub Actions our ship gets the latest features and bug fixes in no time. Our ship is deployed with Pulumi, so we have infrastructure as code, and it runs in Azure.
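A minimal Pulumi sketch of the kind of setup described (resource names and SKU are assumptions, not our production stack):

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as resources from "@pulumi/azure-native/resources";
import * as web from "@pulumi/azure-native/web";

// A resource group, an app service plan and a web app for the SPA.
const rg = new resources.ResourceGroup("plundrr-rg");

const plan = new web.AppServicePlan("plundrr-plan", {
  resourceGroupName: rg.name,
  sku: { name: "F1", tier: "Free" }, // assumed SKU
});

const app = new web.WebApp("plundrr-app", {
  resourceGroupName: rg.name,
  serverFarmId: plan.id,
});

export const url = pulumi.interpolate`https://${app.defaultHostName}`;
```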
Use of these technologies awarded us the badges
Sharing is Caring
Power User Love
ACDC Craftsman
Client Side Salsa
Packaging the User Experience
Good design is never an afterthought nor a last feature. Good design comes from understanding human needs and quality experiences. We wanted to appeal to an audience that grew up in the 90s and is accustomed to all the services the 20s afford, like Uber, Tinder, Porterbuddy and Oda. Following an initial brainstorming we landed on neon retro as inspiration (cyberpunk music was playing at the time, which might have sparked the idea).
“Wow! You can do this with Figma? I never knew. I’m installing Figma now.”
– Guy at lunch whose name I can't remember
The first step was to find a palette and create some sort of brand. We used Adobe Color to mix a colour palette based on a selected base colour. This base was used to make custom vector art inspiring the rest of the app's design.
The colour palette and the main logo's look and feel are used throughout the UI to make a modern yet subtly eerie retro design. Icons for the app are altered versions of existing icons from the Noun Project. Since they are heavily modified from the originals, we claim fair use.
Prototyping using Figma awarded us these badges:
Retro
Glossy Pixels
Summarizing Three Amazing Days
During the hackathon we've all been forced to take a step back and reconsider the direction of the project and our solutions. We tried connecting an Arduino to a Raspberry Pi to use as a notification system for when the ship was in danger, but with (almost) faulty equipment and limited knowledge, we had to scrap the idea. A lot of hours spent, and one could say wasted. But it's not really wasted at all, as we definitely learned something along the way.
We are truly happy we got to test our limits, share our ideas with new friends and discover how awesome it is to be a happy camper and spread joy.
“MMM.. is the ultimate tool for pirate captains looking to maximize their raids. With our comprehensive system, you’ll be able to plan, organize, and execute the perfect plunder. No more manual logs, no more disorganized crew. Just a streamlined, efficient process for taking what you want from the high seas. Upgrade your pirate game today with Maritime Mischief Manager.”
Pirates are met with a friendly user interface where they can manage all aspects of pillaging.
Creating a Raid
Under the “New Raid” menu, pirates with the rank of Captain can take a look at possible raid prospects and get an overview of the best possible strategy, as well as an estimation of the loot potential in the selected location.
The results seen above come from a query against the ChatGPT API, made using Power Automate.
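Under the hood, that flow performs an HTTP call along these lines – sketched here in TypeScript, with the model and prompt as assumptions:

```typescript
// Ask the ChatGPT API for a raid strategy and loot estimate for a location.
async function estimateLoot(location: string): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo", // assumed model
      messages: [
        {
          role: "user",
          content: `Estimate the best raid strategy and loot potential for ${location}.`,
        },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```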
Satisfied with the current target, the Captain selects the vessel for the raid, chooses to create it, and the raid enters planning mode.
Raid planning
From the raid list, pirates can view raids in the planning stage. Simple pirates can choose to join a raid if they want to, but Captains can also choose to invite their best men. Pirates receiving an invitation have to accept or reject it through their Teams account.
Once a pirate has received and accepted an invitation, they have to scan their provided RFID card when they board the ship. They then enter the status of Checked In.
Once the RFID card is scanned at the ship, an HTTP request is sent to the raid check-in flow, and the flow updates the crew status.
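The check-in request itself is a plain POST to the flow's HTTP trigger – a sketch where the payload fields are assumptions:

```typescript
// Send the scanned card to the raid check-in flow (URL generated by
// Power Automate's HTTP trigger, stored as an environment variable).
async function checkIn(cardId: string, raidId: string): Promise<void> {
  const res = await fetch(process.env.CHECKIN_FLOW_URL!, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ cardId, raidId, scannedAt: new Date().toISOString() }),
  });
  if (!res.ok) throw new Error(`Check-in failed: ${res.status}`);
}
```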
When the Captain is happy with his band of criminals, he chooses to start the raid. The app conveniently simulates the battle as well, together with some nifty battle music. This is of course to save the hassle of actually raiding a place.
Reporting
The pirate management team of course also has access to beautiful numbers covering the entire operation.
Arrr matey! Welcome to the world of reporting on port ships, pirates and fleets!
As ye might already know, Power BI is a powerful data visualization tool that can help ye track and monitor yer fleet, port ships and even pirate activities. Ye can easily gain insights into yer fleet’s performance, inventory & pirate attacks, and so this is what we have done as shown in the images below.
Arrr matey, we’ve created not just 4 graphs, but 4 report pages to address some of the various business needs a pirate may face on the high seas. Thar be a pirate HRGhh report, a Ship o’hoy report for fleet management, a map displayin’ our location and foes in real-time usin’ the DirectQuery capabilities in Power BI, and lastly a raid report utilizin’ Azure ML to predict which pirate ships we may loot successfully. This’ll aid us in makin’ informed decisions about the future of our fleet and help us chart our next course.
Below be a snapshot of the real-time map, allowin' captains and pirates to collaborate from different ports. This way the captain can signal which pirate should attack or stand down based on their proximity to the enemy, makin' the collaboration smoother, even from a distance. This map be based on the location data from other ships and our own (Raspberry Pi) and displays the last recorded location through longitude and latitude.
When it be comin' to pirate raids, there be other booty that be worth a king's ransom, and that be understandin' the chances of a successful pillage. To be doin' this, we scallywags created a simple model usin' Azure ML that be givin' us the lay of the land, predictin' our loot based on the target's worth, such as their ship's maker, the distance they've sailed, the power of their engine, and their current state.
We be feedin’ th’ AutoML model int’ th’ Power BI report, savvy? This be allowin’ us t’ do predictions on new data ’bout our enemies, which we be displayin’ in our report fer th’ captains t’ make well informed decisions, specially when coupled with th’ map. By clickin’ on a license plate number as ye see below, th’ captain will be taken t’ our model driven app and shown more information ’bout th’ possible target, aye aye!
Our PRT is doing real-time object detection to spot pirates and bottles, but the data doesn't leave the terminal. As the captain wants to know how many recruits we are attracting, we needed to expose it to his "weapon of choice": Power BI.
Luckily, as the app is using a modern data store 🍍, this is easily achievable by inserting a listener and sending the data to Power BI on change 👍.
The way we connected to Power BI was to go to "My workspace" and create a new dashboard with a streaming dataset that connects through an API. We then inserted the right values from the stream in order to get the right data.
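Pushing rows is then a single POST to the dataset's push URL – a sketch where the row fields (faces, bottles) are assumptions matching the visuals described below:

```typescript
// Push one row of real-time metrics to the Power BI streaming dataset.
async function pushMetrics(faces: number, bottles: number): Promise<void> {
  await fetch(process.env.POWERBI_PUSH_URL!, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify([
      { timestamp: new Date().toISOString(), faces, bottles },
    ]),
  });
}
```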
Now that the real-time data is flowing into Power BI, we just need to add visuals to the dashboard. We go to "Edit" and add a tile using the real-time dataset that we created. We added two gauges and two line charts.
These visuals now give real-time metrics from the parrot whenever faces or bottles are detected by the camera.
Canvas apps "recently" introduced the ability for two people to work on a single canvas app. This has not been available before and is absolutely a game changer for app development!
We started out by turning on the feature in preview from the settings.
After this, we have to connect the app to a GitHub source. I chose to connect this to my personal GitHub for simplicity.
At this point we were now able to work together on canvas apps without having to close the app between changes!!
This is a big change for canvas apps – being able to work at the same time in the same app 👏
In the Oslofjord, sensors have been deployed to detect pollution.
The sensors are connected to a PC and communicate over port COM3. We have therefore developed a service that runs locally on the PC and listens on COM3. When the sensors detect an event, a message is sent over port COM3; the service wakes up, validates the input and sends the information to Azure Event Hub.
The reasons we chose Azure Event Hub: 1) we expect to receive a lot of sensor data in real time over time, and 2) the communication only goes one way, i.e. from the sensors to the Event Hub.
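A sketch of the listener (Node.js with the `serialport` and `@azure/event-hubs` packages; the hub name, baud rate and message format are assumptions):

```typescript
import { SerialPort, ReadlineParser } from "serialport";
import { EventHubProducerClient } from "@azure/event-hubs";

const producer = new EventHubProducerClient(
  process.env.EVENTHUB_CONNECTION!,
  "pollution-events" // hypothetical hub name
);

// Listen on COM3; the service wakes up when a sensor message arrives.
const port = new SerialPort({ path: "COM3", baudRate: 9600 });
const parser = port.pipe(new ReadlineParser({ delimiter: "\n" }));

parser.on("data", async (line: string) => {
  const reading = line.trim();
  if (reading.length === 0) return; // basic input validation
  const batch = await producer.createBatch();
  batch.tryAdd({ body: { reading, receivedAt: new Date().toISOString() } });
  await producer.sendBatch(batch); // one-way: sensors → Event Hub
});
```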
A validation is performed to decide whether the Turtles should be sent out to save the environment.
When the Turtles are out on a mission, we want them to be able to communicate with each other. We have therefore taken an example app as a starting point and modified it to work for our case.
We started this morning with a call to a company called Oneflow. We have had some conversations with them before, and decided that we wanted to try to get something done together with them today. We wanted to build an API integration from Dynamics 365 HR to Oneflow, so that we can send out employment agreements for digital signature directly from Dynamics! If this is not huge business value, then we don't know what is! This is what we did together with them right here and now, and how it works:
The contract is generated in Oneflow and sent to the company representative as well as the employee.
When the contract has been signed electronically by both parties, Power Automate gets a callback with information about which employee has signed, and we can tick off "contract signed" in HR and add a URL link to the contract in Oneflow.
The solution is set up to support Swedish BankID, Danish NemID, Finnish BankID, Norwegian BankID, and advanced e-signature for other countries with e-mail/phone verification.