Real-time Dataverse data for real-time business overview

What good is improved customer communication through chatbots and forums if the plumbers can't be notified of relevant cases in real time? Moreover, Mario and Luigi, as CEO and CTO respectively, want real-time data to improve decision support (e.g. plumber allocation) and PlumbQuest trends for further analysis.

Dataverse Webhook on Incident reports

To extract real-time data, we created a webhook for Dataverse using XrmToolBox, which calls our Azure Function whenever a new PlumbQuest is made.

XrmToolBox used to add a webhook to Dataverse for real-time PlumbQuest analysis

To ensure secure access, function-level authentication is applied: XrmToolBox allows HTTP query parameters, so the function key can be passed along when calling our Function, which uses a traditional HTTP trigger:
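As a hedged sketch (illustrative names, not our exact code), such a function-level HTTP trigger looks roughly like this; callers must supply the function key, e.g. via the ?code=... query parameter configured in XrmToolBox:

```csharp
// Hedged sketch of the receiving Function (in-process model; names illustrative).
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class PlumbQuestWebhook
{
    [FunctionName("PlumbQuestWebhook")]
    public static async Task<IActionResult> Run(
        // AuthorizationLevel.Function rejects requests without a valid key.
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        string payload = await new StreamReader(req.Body).ReadToEndAsync();
        // The oversized payload is cleaned up before analysis (see below).
        return new OkResult();
    }
}
```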

However – here is the hacky part. The webhook payload is too large, which corrupts the traditional JSON payload given the highly dynamic length and content of each PlumbQuest. We therefore had to do some custom string manipulation to extract the most business-critical values, de-corrupt the JSON and prepare it for analysis – almost a complete ETL pipeline (*cough*)!
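A minimal sketch of that string surgery, assuming hypothetical field names (our real code targets the actual PlumbQuest attributes):

```csharp
// Hedged sketch: slice attribute values out of the corrupted payload with
// plain string operations instead of a JSON deserialiser (field names are
// hypothetical).
static class PayloadSurgery
{
    public static string ExtractValue(string payload, string key)
    {
        string marker = $"\"{key}\":\"";
        int start = payload.IndexOf(marker);
        if (start < 0) return null;
        start += marker.Length;
        int end = payload.IndexOf('"', start);
        return end < 0 ? null : payload.Substring(start, end - start);
    }
}

// Usage: rebuild a small, valid JSON object for downstream analysis, e.g.
// var title = PayloadSurgery.ExtractValue(payload, "title");
// var severity = PayloadSurgery.ExtractValue(payload, "severity");
```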

But to access this real-time data in an analytics environment, Fabric is the way to go (riding Microsoft's huge hype wave). We created a Custom App source for an Event Stream in Fabric, fed from our Function through an Event Hub output binding. The stream can then map to many different destinations, including a Lakehouse for historisation and trend analysis, as well as Data Activator Reflexes for reactive actions in real time.
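A hedged sketch of that hand-off (hub and connection names are illustrative): the Custom App source hands out an Event Hub-compatible connection string, and the Function simply returns the cleaned-up JSON to it.

```csharp
// Hedged sketch: forward the de-corrupted PlumbQuest JSON to the Event Hub
// behind the Fabric Event Stream custom app source (names are illustrative).
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class ForwardToEventStream
{
    [FunctionName("ForwardToEventStream")]
    [return: EventHub("plumbquest-stream", Connection = "EventStreamConnection")]
    public static async Task<string> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        string payload = await new StreamReader(req.Body).ReadToEndAsync();
        return CleanUp(payload); // the string surgery from the previous step
    }

    static string CleanUp(string payload) => payload; // placeholder
}
```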

With Data Activator's Reflexes directly on the stream, one can, for example, trigger additional flows for highly acute PlumbQuests from members in distress, or highlight plumbers who did not provide proper service according to the PlumbQuest review.

Our Fabric Event Stream with the Custom app as Source and the Lakehouse for historisation and down-the-line processing and analysis

In addition, we set up a Dataverse shortcut (link) to Fabric, allowing direct access to Dataverse without ETL or ingestion, providing easy access for deeper downstream analysis of key business metrics, trends and community engagement.

Our PlumbQuests in the Fabric Lakehouse, using a Dataverse connection for e.g. a more complete 365 customer view built with Fabric items

Reproducible deployment

Although we are nasty hackers, we are reproducible hackers. As these were the only Azure resources used (directly), we deployed them using Bicep and the Azure CLI. Sensitive variables are marked as secure and are not included in the scripts, but parameterised.

The main Bicep deployment definitions for our Azure Function App and related resources; the resource group naturally had a separate Bicep definition.

So if you want to do it hacky, at least make it traceable.

Nasty Hack

While working with the tiles in the game we kept getting an error message due to delegation. We didn't have time to solve the underlying warning, but the game still worked all of the time.

Having to redesign the whole data structure just to satisfy a delegation warning was not something we wanted to do, so we found a brilliant way onwards.

We simply created our own message on top of this message in the OnLoad of the game 😉

As stated before, the game still works as it should, but now the user is informed that the map is ready.

Crashing the Azure AI Studio and Copilot

Badges to claim:

  • Nasty Hacker – for retrieving data from 3rd party services and syncing it to blob storage to connect to Azure AI Studio
  • Data miner – for retrieving the case information from Dataverse and calculating the average score using AI
  • Embedding numbnut – for embedding Copilot in a model-driven app
  • Stairway to heaven – for using Azure AI Studio, Copilot and Blob Storage, and, in previous articles, also Azure Functions and a Web App

Our solution combines the latest features of the Power Platform and Azure, connecting the low-code and pro-code approaches to let you boost the performance of case resolution using insights from Copilot.

  • The first business use case concerns complicated cases where we need external consultancy and assistance: the task is to find suitable Mario consultants for the customer request by searching for professionals on Indeed, comparing their background and experience, and finding the best matches.
  • The second business use case is about simplifying the KYC (Know Your Customer) process by using a unified workspace for all operations. From Indeed we can understand the company background and do semantic analysis of the comments, gaining insights into how technicians can approach the customer (princess).
  • The third business use case is about analysing the current princess (customer) and her needs based on her Indeed profile. The data we were able to retrieve contains open job postings, company info, recent news and personal profile information. Based on this, we can suggest further services and provide information to the sales and marketing departments.

We deployed a GPT-4 model to our personal instance using AI Studio, then fine-tuned it with the company's internal data and data from open sources like Indeed, Glassdoor and proff.no.

Then we used the Retrieval-Augmented Generation (RAG) technique to inject generative responses from the large LLM into the Copilot's answers.

With RAG, the external data used to augment your prompts can come from multiple data sources, such as document repositories, databases or APIs. The first step is to convert your documents and any user queries into a compatible format so a relevancy search can be performed. To make the formats compatible, the document collection (or knowledge library) and user-submitted queries are converted to numerical representations using embedding language models. Embedding is the process by which text is given a numerical representation in a vector space.
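As a hedged illustration of that relevancy search (not our production code), precomputed embedding vectors can be ranked against a query embedding using cosine similarity:

```csharp
// Illustrative sketch: rank documents by cosine similarity between a query
// embedding and precomputed document embeddings (all vectors are assumed to
// come from the same embedding model, e.g. one deployed in AI Studio).
using System;
using System.Linq;

static class RelevancySearch
{
    public static double CosineSimilarity(float[] a, float[] b)
    {
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.Length; i++)
        {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        return dot / (Math.Sqrt(normA) * Math.Sqrt(normB));
    }

    // Return the indices of the top-k most relevant documents.
    public static int[] TopK(float[] query, float[][] docs, int k) =>
        docs.Select((d, i) => (Score: CosineSimilarity(query, d), Index: i))
            .OrderByDescending(x => x.Score)
            .Take(k)
            .Select(x => x.Index)
            .ToArray();
}
```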

AI Studio has a seamlessly integrated feature that uses the Azure Blob indexer to implement the search functionality for the AI. This makes it possible to simplify access to the data lake from the LLM side.
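For illustration, querying such a blob-backed index from code could look roughly like this (endpoint, key, index and field names are assumptions, not our actual configuration):

```csharp
// Hedged sketch: query the blob-backed search index with Azure.Search.Documents
// (endpoint, key, index and field names are illustrative placeholders).
using System;
using Azure;
using Azure.Search.Documents;
using Azure.Search.Documents.Models;

var client = new SearchClient(
    new Uri("https://<search-service>.search.windows.net"),
    "consultant-index",
    new AzureKeyCredential("<query-key>"));

SearchResults<SearchDocument> results = client.Search<SearchDocument>("plumbing consultant");
foreach (SearchResult<SearchDocument> result in results.GetResults())
{
    // "content" is an assumed field name populated by the blob indexer.
    Console.WriteLine(result.Document["content"]);
}
```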

By implementing multiple connectors to third-party services and data sources, combined with Copilot's dynamic chaining feature, we give the user a cleaner experience, with only one tool needed for analysis.

How it works:

Piping our way through the Azure Cloud

To make our amazing service Tubi work, a lot of cloud is needed. We aim to make the plumber's job easier by recommending the best layout for where the pipes should go, and for that, we need AI. We have trained a model in Custom Vision to recognise all components in a bathroom that need water. So, when the plumber uploads a floor plan to our Static Web App, the image is sent through our own API to our Azure Function App backend in C# ASP.NET. But both the image and the equipment list must be stored somewhere, so we have also connected Azure Blob Storage. Then, last but not at all least: the people working in the back office have instant interactive reports available through Power BI to help them with filing and billing, with alerting handled by an automated flow. (Badges: Feature Bombing)
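As a hedged sketch (endpoint, project id, iteration name and key are placeholder assumptions), the hand-off from the Function App to the Custom Vision prediction endpoint could look like this:

```csharp
// Hedged sketch: send the uploaded floor plan to Custom Vision for object
// detection (endpoint, project id, iteration name and key are placeholders).
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

static class FloorPlanDetector
{
    static readonly HttpClient Http = new HttpClient();

    public static async Task<string> DetectComponentsAsync(byte[] image)
    {
        var url = "https://<resource>.cognitiveservices.azure.com/" +
                  "customvision/v3.0/Prediction/<project-id>/detect/" +
                  "iterations/<iteration>/image";

        var content = new ByteArrayContent(image);
        content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");

        using var request = new HttpRequestMessage(HttpMethod.Post, url) { Content = content };
        request.Headers.Add("Prediction-Key", "<prediction-key>");

        var response = await Http.SendAsync(request);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync(); // JSON with tags and bounding boxes
    }
}
```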

Sometimes it works, and that’s plenty

Databases are good, but sometimes it's easier to just dump everything in one place until you need it again. Yes, it might not be very scalable or very normalised. SQL became too heavy, and we already needed a blob storage to store the images, so we also dump the order data into the same blob storage as JSON files. It's an old-fashioned way of server storage, and a bit dirty, but it works! (Badges: Nasty hacker, Retro badge)
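A minimal sketch of that dump, assuming Azure.Storage.Blobs and an illustrative container name and order shape:

```csharp
// Hedged sketch: serialise an order to JSON and drop it into the same blob
// container as the images (container name and Order shape are illustrative).
using System;
using System.Text.Json;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

record Order(string Id, string Customer, string[] Components);

static class OrderStore
{
    public static async Task SaveAsync(Order order, string connectionString)
    {
        var container = new BlobContainerClient(connectionString, "tubi-data");
        await container.CreateIfNotExistsAsync();

        // One JSON file per order id; no schema, no migrations, no worries.
        string json = JsonSerializer.Serialize(order);
        await container.UploadBlobAsync($"orders/{order.Id}.json",
                                        BinaryData.FromString(json));
    }
}
```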

Power the backoffice

As the final list of components is decided, it still has to be approved by the accounting team in the office. To make sure they have all the information they require, we have developed a Power BI dashboard that crawls through our registered data and makes sure the orders are handled properly (Badges: Crawler, Dash it Out, Dataminer). And to make sure the orders are handled easily and fast, the dashboard is embedded into Teams and an alert is automated using a Logic App, so the workers can receive and cooperate in real time (Badges: Embedding Numbnuts, Go with the flow, Plug N' Play, Power user love, Right Now, Stairway to heaven).

Five badges in one post?

Toad, a character widely recognized as Mario’s loyal companion, assumes a different role in our scenario. Instead of assisting Mario, Toad will be guiding Princess Peach through the perilous landscapes of the Mushroom Kingdom. Known for his extensive knowledge, Toad has documented everything in his personal wiki, which serves as a comprehensive guide to their world. In our application, Toad will utilize this knowledge to answer Princess Peach’s queries via a chat interface. This interactive feature will allow Princess Peach to navigate her journey with greater confidence and understanding.

Toad, an AI bot powered by Copilot Studio, learns from the web and adapts to new data. It uses advanced machine learning algorithms to understand and interpret information. This continuous learning makes Toad a versatile tool for tasks like data analysis, content creation, and problem-solving. It represents the next generation of AI, ready to assist with its ever-growing knowledge. (New text added here)

Live demo of Toad:

Who can resist the thrill of a captivating game? Our app takes you on an exhilarating journey, brimming with elements of gamification. It features a sophisticated scoring system for our beloved Princess Peach, tracking her progress throughout the game.

Live demo of score:

But that's not all! Princess Peach has access to a dynamic Power BI dashboard, allowing her to review her progress and identify her strengths and areas for improvement. This interactive tool serves as a constant motivator, pushing her to strive for excellence.

Last night, we celebrated a significant achievement as we won the award for ‘Excellent User Experience’ for Thursday. This accolade is a testament to the hard work of our experienced UX designer, who has crafted a fabulous Figma prototype to assist our developers. This victory leads us to the final chapter of our journey.

Last but not least, we invite you to step into the world of our application. Here, a vibrant pink palette, reminiscent of Peach’s dress, meets a shiny and glossy user interface, crowned with a touch of yellow, much like Peach’s crown. Our design choices, carefully considered for ease of navigation, make interacting with our app not just user-friendly, but a truly delightful experience. (New text added here)

Color palette from Figma

Live demo of shiny and glossy design (the design is not 100% complete in this video):

This post was edited 03.02 12:40

New text is marked with: (New text added here)

  • Added live demo of bot and some text
  • Added live demo of gamification
  • Added live demo of glossy design, picture and some text

Nasty hacker; import Power Pages when provisioning fails

With this blog post we aim for the badge “Nasty Hack”

Lots of teams have struggled provisioning new Power Pages sites in newly created environments. We have experienced the same problem.

Provisioning in older environments still works, so the idea was to provision a blank new Power Pages site in an old environment in another tenant, put it in a solution, export the solution and import it into the new environment.

Export the solution

Unmanaged, of course, to be able to edit the site in the destination environment.

Import

Success

Reactivate site

All good so far..

But then..

Provisioning is still in progress, just like the first one..

DOH!

Maybe this should qualify for the DOH! Badge instead..

The Orange Nasty Hackers

In simple words, we thought that this below wasn't a very sustainable solution!

We thought we could build a more sustainable solution using our mobile phones and Power Apps. Anyone can get location coordinates from the mobile app in Power Apps. However, Power Apps canvas doesn't have a looping mechanism (a function) that can keep sending location coordinates. The hack here is to use a timer control that calls the location tracking service every 5 seconds, and you end up with a tracking app on your mobile 🙂

Once we are in position, the admiral clicks "start loot" and the tracking starts right away.

Pretty simple but nasty! 😉

It even works on the operations management tracking live map.

Controlling a Canvas app using a USB Gamepad

In Pirates 365 we want to control the awesome Canvas App Game with either a USB Gamepad or a Steering wheel.

Seemed like a great idea

Controlling a Power App using a wheel or gamepad is not that easy. Power Apps doesn't have event listeners for keypress events. Several forum and blog posts mention this, but it seems like nobody had any really good solutions for it.

We had several ideas to make this work:

  • Map gamepad clicks to keyboard events
  • Map gamepad clicks to mouse clicks
  • Create a browser add-in that listens to the keyUp event and, via JavaScript, finds the Power Apps left/right buttons in the DOM and clicks them.

The solution we went for was mapping the gamepad to keyboard presses. We found an application, AntiMicroX, where it was possible to map gamepad presses to key presses on the keyboard.

We mapped left click on the gamepad to "Space", and up to "Tab" – making it possible to toggle between the different Power Apps buttons.

Demo of the solution

Here we can see that we are moving the ship with the gamepad – emulating clicks on Power Apps buttons.

Badges 🤩🥳

  • We think this may be awarded the "Nasty Hacker" badge because it is… hacky
  • We think this may be awarded "Embedding numbnut" since we are attaching "things that can flip bits"

🏎 Try to Beat this RPM. We Dare You! ⚡️

These engines turn at the whopping speed of 0.000001 m/s. Not only is the engine slow, the technology is old. The transistors that make up the circuitry and the coils that make the engine revolve are the most retro data components of them all. Unfortunately we already claimed the retro badge, but we would like to show you what we do with physical components and how we do small hacks to make things work.

Captain Mats working on why the motor is not revolving at sufficient speed for what we require. Look at that focus! He is a legend! Girls and boys, he is unfortunately taken 💔

Unfortunately we had to hack our solution in the end. Even with stellar help from Marius and @bastards, we could not get the engine revolving fast enough for our needs. The voltage output and oscillation are stuck, making it perform suboptimally. We think the motor or voltage output needs to be adjusted, but we lacked the tools and time, so the hack we ended up doing was to swap the Arduino with the laptop and run a virtual Arduino instead of the real deal. Simulating something easy with something complex to make stuff work. We feel this is a hack in the spirit of making up lost time.

Meme about using expensive, powerful computers to perform mundane tasks meant for matchbox computers