Bonus to ALM Magic, Digital Transformation and Low Code!

Kudos to one of our judges, @benedikt bergmann, for all the work shared on GitHub. We strongly recommend that all fellow magic makers here check it out: GitHub – BenediktBergmann/PCFIntro. Would it be too much to say we really want to win a book about Application Lifecycle Management on Microsoft Power Platform?

We also released the Power Platform custom connector, which is available in our GitHub repo. The repository also includes a GitHub pipeline implementation for releasing a fresh version of the connector.

So, you can now use the connector to process the emotional recognition reports by uploading the prepared dataset to SharePoint and notifying the Admin that the artifact is available.

The repo structure is simple enough. 

The repo also contains release pipelines based on GitHub Actions.

How to build a new release? 

Follow these steps to manually create a new release on GitHub: 

  1. Create a New Tag: Use the naming convention v.x.x.x.x, where x is a digit. For example: v.1.0.0.0
  2. Generate Artifacts: Ensure that the artifact name is automatically derived from the UniqueName value in package/Other/Solution.xml (see the sketch below).
  3. Update Documentation: Make sure all relevant documentation is up to date with the new release details.
  4. Publish the Release: On GitHub, draft a new release, add the appropriate tag, and upload the managed and unmanaged solution files.
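
To give an idea of how the artifact name can be derived in the pipeline, here is a minimal Python sketch that reads the UniqueName value from the solution manifest; the file layout follows the standard Power Platform solution package, and the script is an illustration rather than our exact pipeline step:

# read_solution_name.py - illustrative sketch, not our exact pipeline code
import xml.etree.ElementTree as ET

# Solution.xml is the manifest inside the unpacked solution package
tree = ET.parse("package/Other/Solution.xml")
root = tree.getroot()

# The unique solution name lives at ImportExportXml/SolutionManifest/UniqueName
unique_name = root.findtext("./SolutionManifest/UniqueName")
print(unique_name)  # used as the artifact name for the release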

Artifacts 

Artifacts generated during the release process include two types of solutions: 

  • Managed: This contains the finalized version of the connector, which is ready for deployment. It ensures that all customizations are locked and can be used in production environments. 
  • Unmanaged: This includes the editable version of the connector for further development and customization. It is ideal for testing and development environments. 

The secret key to productivity: Low Code Magic Powers our solution

In the world of Microsoft tech, much like Hogwarts, the real magic happens when all the pieces work together.

We take all of the feedback from our honorable judges as a gift (THANK YOU!!! We hope you like the morning sweets, the healthy snacks from Spain brought by Yurii the Wise, and we do accept orders and offer free delivery). Please don’t take this the wrong way, we are just trying our best!

Thanks for yesterday’s recognition!

Since yesterday evening, we have not been able to stop thinking about low code.

So today we aim to publish a series of updates sharing our efforts in the Low Code category. We know we might not be the best in there, but we are exploring, listening around, and learning as we speak. Would a low code dance or something similar help improve our image in the low code field?

Getting down to business, we are using Power Automate in our solution. The process we automate is the creation of the activities each student needs to complete in order to be allocated to a faculty. We have a template of the activities, and after the student starts the assessment process, the guidance is available for them.


For the survey, we are using the survey responses table; the drafts are prepared in advance using Power Automate:

We will share more in the coming posts. Please excuse us for the long start of this one; it was a clumsy attempt to make it feel like a morning read after a late night yesterday. We had a lot of fun and are still having it!

More Magic with the Power of the Shell: Automating Speech-to-Text Setup with ARM Templates

Here we go: some more details we promised yesterday. On our Logiquill platform, where we analyze the data from students’ applications and interviews (check out one of our earlier posts if you want a reminder of the whole flow: Even the most enchanted hats can make a wrong call… | Arctic Cloud Developer Challenge Submissions), we built a project combining Azure Speech-to-Text and automation to claim the Power of the Shell badge in the ALM category. Here’s how we used Azure (WE LOVE IT!!) and scripting to streamline our workflow.

What We Did

We used Azure’s Speech-to-Text service to convert spoken words into text, perfect for automating meeting transcripts or capturing dynamic input. To make deployments repeatable and fast, we created an ARM template that defines the service setup.

How It Works

The following ARM template automatically deploys the Speech-to-Text service:

{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "accounts_sp_la_powerpages_name": {
      "defaultValue": "sp-la-powerpages",
      "type": "String"
    }
  },
  "resources": [
    {
      "type": "Microsoft.CognitiveServices/accounts",
      "name": "[parameters('accounts_sp_la_powerpages_name')]",
      "location": "northeurope",
      "sku": { "name": "F0" },
      "kind": "SpeechServices",
      "properties": {
        "publicNetworkAccess": "Enabled"
      }
    }
  ]
}

This template deploys the Speech-to-Text service with the F0 pricing tier, ensuring cost efficiency.

Automating with PowerShell

To deploy the service, we used a simple PowerShell script:

New-AzResourceGroup -Name "ACDC-RG" -Location "northeurope"

New-AzResourceGroupDeployment -ResourceGroupName "ACDC-RG" `
  -TemplateFile "speech-to-text-template.json" `
  -TemplateParameterFile "parameters.json"

This script creates a resource group and deploys the template, making setup quick and error-free.

Why This Matters

Using the ARM template and PowerShell together:

  • Saves time by automating resource creation.
  • Ensures every environment is consistent.
  • Fits neatly into CI/CD pipelines for ALM workflows.

This approach shows how scripting and Azure services can simplify tasks, letting us focus on building great solutions.

United under the Gryffindor House and having Fun!!!

https://www.kaggle.com/datasets/omrfaruk92/harry-potter-hogwarts-student-data

The Fabricator weaves once more, blending magic and technology.

Here lies a compendium of Hogwarts students, their traits, and achievements—crafted with care, or perhaps conjured from thin air. Gryffindor bravery, Slytherin cunning, Ravenclaw wisdom, and Hufflepuff loyalty, all woven into this digital tapestry.

But beware, dear wizards and witches: is this a truth etched into the annals of Hogwarts or a clever fabrication of the Fabricator? Dive in, but remember—what you find may enlighten… or enchant.

Whether this data is fabricated or real, only the Fabricator knows.

Here is photo proof that our fellow Gryffindor team, Pass me a sock, did use our dataset.

PS:

Community Champion: goes out of their way to help and encourage other teams. We went out first to find all the Gryffindor teams (thanks to @scott we managed to find them all, or almost all…) and then offered the MVP team “Pass me a sock” our help and contributions, which, as far as we could tell, they highly welcomed 🙂

Sharing is Caring: code, dataset, or API is made available for other teams, and you make a sensible contribution (pull request, integration) to a competing team’s solution.

PS2: we did not stop there; we also went over to another neighbor team, Dumbledore’s Developers, as we had some synergies in our solutions in terms of their relation to the Sorting Hat… And although they are from a different house, #ravenclaw, and we compete between houses as well, we are here to have fun, aren’t we? That is why we decided to share some of our data with them too, and they succeeded with their use case, which they described in the following post: The Power of Collaboration | Arctic Cloud Developer Challenge Submissions

Faruk the Fabricator: Fun with Fabric

Do you remember the Fabricator and his efforts described in the earlier post?

So now the Fabricator has created deployment pipelines to keep his changes in sync across multiple workspaces. No change will go unnoticed, no pipeline will flow alone, no developer will overwrite another. That was the promise the Fabricator gave… and he delivered.

We are using deployment pipelines inside Fabric to deploy Fabric components between workspaces.

PS: this article is part 1 of the Power Of The Shell badge claim, since we are using Fabric pipelines. We have started working on CI/CD pipelines for Dataverse solutions and Power Pages. A second article is coming when we finish the rest of the ALM solutions.

LinkMobility & Logiquill Love story

We loved getting this challenge from the sponsors! And while getting to know them, we got an immediate idea. Kudos to @linkmobility!

In our workflow, Wayfinder Academy offers the Logiquill portal. That portal is used to make an application, upload documents, and schedule and run an interview, and all of that is used to allocate a student to a new faculty. Once that is done, we still believe there is a huge asset in all of that information, and therefore we want to use it further to follow up with the student on his or her integration/adaptation journey.

We store all the data in SharePoint (we love it!) and have also set up some automated workflows (we describe them in a different article).

There are several things we do, and one of those is personalized SMS.

As a notification service, our platform uses Link Mobility to send SMS messages to the students with daily reminders, encouragement, motivation, and tips and tricks (based on all the info we collected about the student). For this purpose, we are using an Azure Function that receives the message and the students’ numbers.

Our Azure Function is written in C# and has an HTTP-triggered function that receives the data necessary for sending SMS messages. Link Mobility supports Basic and OAuth2 authentication, and we used the latter as it is more secure and reliable.

In addition, we will create a timer-triggered function that has its own schedule and will refine all the student records stored in SharePoint.

If you also want to use this wonderful and easy-to-use API, then in addition to the username and password you also need to specify the correct Uri, PlatformId, and PlatformPartnerId:

"platformId": "SMS",
"platformPartnerId": "lHGdgewX"

The simplest message looks like this:

In the documentation for the Link Mobility API we found these URLs:

So, we were the first ones to try it, and it was successful 😊

In addition, we will try to receive delivery reports. The delivery report Uri can also be configured in the message as a property:

"deliveryReportGates": ["{our response URL}"]
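
Our real implementation lives in the C# Azure Function, but to show the shape of a send request, here is a minimal Python sketch; the endpoint URL and field names reflect our reading of the LINK Mobility API documentation and should be treated as assumptions, and the credentials and phone number are placeholders:

# send_sms_sketch.py - illustrative only; the production code is a C# Azure Function
import requests

# Assumed LINK Mobility send endpoint (verify against the gateway documentation)
SEND_URL = "https://wsx.sp247.net/sms/send"

message = {
    "source": "Logiquill",                # sender name shown to the student
    "destination": "+4712345678",         # student's phone number (placeholder)
    "userData": "Good morning! Remember today's reflection exercise.",
    "platformId": "SMS",
    "platformPartnerId": "lHGdgewX",
    "deliveryReportGates": ["{our response URL}"],  # optional delivery report gate
}

# Basic auth shown for brevity; our function uses OAuth2 instead
response = requests.post(SEND_URL, json=message, auth=("username", "password"))
response.raise_for_status()
print(response.json())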

Unveiling Hidden Feelings with the Magic of AI

NOTE TO THE JURY: we have taken your comment on board and added details at the bottom of this article.

In our Wayfinder Academy, we take a comprehensive and magical approach to understanding the student’s history, aspirations, and potential to recommend the best possible new school. The process is detailed, thorough, and personalized, ensuring the student is matched with an environment where they can thrive.

Just to recap the process: here we assume a student who didn’t feel right about their current faculty and filed an application. Immediately after that, we request the table from their current faculty (historical data), ask the student to upload some photos from their most memorable moments, and then invite them to an interview. While we are still working on the interview step and will share the details later, with this article we want to add more details about one of our approaches to mining extra insight from the student’s interview by analysing their emotions.

We use this emotion recognition along with the interview to get a 360-degree insight into the student’s reactions to the questions, which are designed to figure out their values, aspirations, fears, etc. We can use this to calculate the probability of their fit with each faculty and identify the one with the highest score (the scoring approach will be shared in a different post).

So, we are using a video stream capture to record an interview session and extract the emotional dataset.  

It allows us to capture one more dimension that extends the student’s standard datasets, such as feedback, historical data from previous schools, etc.

We use the imentiv.ai API to analyze the video and grab the final report. We then make the final dashboard in Power BI (we love it) and embed it into OneLake.

Imentiv AI generates emotion recognition reports using different types of content, such as video, photos, text, and audio.  

We implemented a single-page application to create an interactive experience by recognizing the emotions in an image captured via the webcam on our tablet. The analysis of the video stream takes more time, so we will demonstrate it later.

The app consists of two parts: a PoC to recognize the emotions in a photo from a webcam and an example of an emotion recognition report. 

To build that PoC application, we decided to use the Node.js stack. The engine is based on Bun, which is a modern and highly efficient alternative to Node.js; unlike Node.js, Bun is written in Zig.

For the front end, we are using React and Chart.js. We are hosting the PoC on our laptop, and to make it available on the public internet, we are using Cloudflare Tunnels. They also cover SSL certificate termination, so your service is secured by default without any significant effort.

The app server and the client app run inside a Docker container, so you can deploy easily with a single command: docker-compose up --build.

To optimize the final container size and improve the build speed, we are using a Dockerfile with two stages: one to build the app and a second one to run the final artifacts.

PS:

Badges we claim:

Thieving Bastards – we are using a third-party platform to recognize emotions in video and photos.

Hipster – we use Bun to run the application.

Hogwarts Enchanter – we use the mystical AI imentiv.ai API to grab the emotion reports and visualize them in a user-friendly way (see the screenshot above). Our enchanted workflow uses the data and makes it available in OneLake. The wizarding world becomes closer when we see AI-based deep insights from various data sources in one place, in an easy-to-read-and-interpret format.

Right now – we are using a WebSocket server to implement real-time communication between the client and server side.

Client side salsa – we use React to implement the front end.

PS2: please come over to our camp and test it out! We want to know how you feel! 🙂

“The fabric doesn’t call me anything. He grimly does his work, then he sits motionless until it’s time to work again. We could all take a page from his book”

(c) The Fabricator and the Silicon Valley series.

(c) Faruk The Fabricator inspired by the Silicon Valley series.

If you think a student’s story begins when they enroll at Hogwarts, you could not be more wrong.
The Fabricator is evil and does not care about privacy. The Fabricator is guileful and does not care about truth. He will do everything in his power to gather or fabricate every detail of their lives and use it to achieve his goals.

At the moment, the Fabricator uses Fabric to access previous data of the students wishing to enroll at Hogwarts.
We call the Kaggle API within notebook code to retrieve data from Kaggle and write it as a CSV file.
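
For the curious, here is a minimal sketch of such a notebook cell using the public kaggle package; the credential handling and the lakehouse path are assumptions for illustration, not our exact code:

# Hypothetical notebook cell - a sketch of the Kaggle download step
import os
from kaggle.api.kaggle_api_extended import KaggleApi

# Kaggle reads credentials from environment variables (or ~/.kaggle/kaggle.json)
os.environ["KAGGLE_USERNAME"] = "<kaggle-username>"
os.environ["KAGGLE_KEY"] = "<kaggle-api-key>"

api = KaggleApi()
api.authenticate()

# Download and unzip the Hogwarts dataset as CSV into the lakehouse Files area
api.dataset_download_files(
    "omrfaruk92/harry-potter-hogwarts-student-data",
    path="/lakehouse/default/Files/hogwarts",
    unzip=True,
)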

Python code in another notebook is then used to transform this data and divide it into clusters.
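
As an illustration, the transformation and clustering could look like the following pandas and scikit-learn sketch; the column names, file names, and number of clusters are assumptions, not the Fabricator’s actual parameters:

# Hypothetical clustering cell - a sketch, not the exact notebook code
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("/lakehouse/default/Files/hogwarts/students.csv")

# Cluster on a few numeric traits (assumed column names)
features = df[["bravery", "intelligence", "loyalty", "ambition"]].fillna(0)
scaled = StandardScaler().fit_transform(features)

# Divide the students into clusters
kmeans = KMeans(n_clusters=4, random_state=42, n_init=10)
df["cluster"] = kmeans.fit_predict(scaled)

# Write the enriched table back for the Copy Data activity to move on
df.to_csv("/lakehouse/default/Files/hogwarts/students_clustered.csv", index=False)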


Finally, a “Copy Data” activity moves the data to its final destination. But is this truly the end?

Follow the Fabricator for more—if you can, that is.

In the coming days, the Fabricator plans to:

  • Show clustered data in Power BI reports.
  • Use insights to plan interventions or recommendations for students.
  • Perform behavioral predictions: Use the clusters as labels for supervised learning models to predict future performance.
  • Trigger emails or alerts for specific clusters needing attention.

Data is born into Fabric, molded by it. Data does not see the light until it is ready to face users. And when it is finally presented, it is blinding.

(c) The Fabricator and The Batman.

PS: with this article we claim the following badges:

Thieving Bastards – we use an online data source from Kaggle.

Dataminer – we are doing data transformation for better reporting, and we are using external data.

Go With The Flow – we created a pipeline that can be used to retrieve any data from Kaggle. We plan to use Data Activator to send alerts based on the processed data.

Power User Love – in Fabric we created the pipeline as a low code solution; inside the pipeline we are using Python code for advanced operations.