In addition to our API, the number one howler service also has a website. It is built with React Router, offering an impeccable developer experience with hot reloading, React hooks, Sass, and more!
Using GitHub Secrets and Workflows ensures secure handling of sensitive information through encryption and access control. For example, from our GitHub Actions YAML file: `azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_API_TOKEN_THANKFUL_HILL_0CC449203 }}`
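As a hedged sketch of how that secret is consumed, a deploy step might look like the following. Only the secret name is taken from our real workflow; the job layout, action version, and app location are illustrative assumptions:

```yaml
# Illustrative GitHub Actions job; only the secret name comes from our real workflow.
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Deploy to Azure Static Web Apps
        uses: Azure/static-web-apps-deploy@v1
        with:
          azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_API_TOKEN_THANKFUL_HILL_0CC449203 }}
          action: upload
          app_location: "/"   # assumed app root
```

The secret never appears in the repository or logs; GitHub injects it at runtime, which is the encryption-and-access-control point made above.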
In the world of Application Lifecycle Management (ALM), being a hipster isn’t just about wearing glasses and drinking artisanal coffee. ☕ It’s about using the latest tech and best practices to ensure your solutions are compliant, governed, and respectful of privacy. Here’s how we’ve upgraded our ALM setup to be as sleek and modern as a barista’s latte art. 🎨
🌟 Delegated Deployment with SPNs
The backbone of our ALM setup is delegated deployment using a Service Principal (SPN). This ensures that our deployment process is:
Secure: Using the SPN credentials, we minimize risks of unauthorized access.
Streamlined: Delegated deployment allows automation without compromising governance.
Key configuration steps:
Set “Is Delegated Deployment” to Yes.
Configure the required credentials for the SPN.
UAT:
PROD:
📜 Approval Flows for Deployment
Governance is king 👑, and we’ve built a solid approval process to ensure that no deployment goes rogue:
Triggering the Approval:
On deployment start, a flow triggers the action “OnApprovalStarted”.
An approval request is sent to the administrator in Microsoft Teams.
Seamless Collaboration:
Once approved, the deployment process kicks off.
If rejected, the process halts, protecting production from unwanted changes.
📢 Teams Integration: Keeping Everyone in the Loop
Adaptive Cards are the stars of our communication strategy:
Monitoring Deployments:
Each deployment sends an adaptive card to the “ALM” channel in Microsoft Teams.
Developers and stakeholders can easily follow the deployment routine in real-time.
Error Alerts:
If a deployment fails, an error flow triggers and sends a detailed adaptive card to the same channel.
This ensures transparency and swift troubleshooting.
Release Notes:
Upon a successful production deployment, a flow automatically posts an adaptive card in the “Release Notes” channel.
End users can see the latest updates and enhancements at a glance.
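The deployment and release-notes cards above can be sketched in code. This is a minimal illustration of posting an Adaptive Card to a Teams incoming webhook: the webhook URL and card layout are assumptions, not our exact flow, and only the channel idea comes from our setup.

```typescript
// Sketch: post a deployment-status Adaptive Card to a Teams incoming webhook.
// The webhook URL and card layout are illustrative assumptions.

type DeploymentStatus = "started" | "succeeded" | "failed";

// Build the webhook payload wrapping an Adaptive Card.
export function buildDeploymentCard(solution: string, status: DeploymentStatus) {
  return {
    type: "message",
    attachments: [
      {
        contentType: "application/vnd.microsoft.card.adaptive",
        content: {
          type: "AdaptiveCard",
          version: "1.4",
          body: [
            { type: "TextBlock", size: "Large", weight: "Bolder", text: `Deployment ${status}` },
            { type: "TextBlock", text: `Solution: ${solution}` },
          ],
        },
      },
    ],
  };
}

// Send it to the "ALM" channel's incoming webhook (hypothetical URL).
export async function notifyAlmChannel(solution: string, status: DeploymentStatus) {
  await fetch("https://example.webhook.office.com/webhookb2/...", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildDeploymentCard(solution, status)),
  });
}
```

In our actual setup the equivalent happens inside Power Automate flows, but the payload shape is the same idea.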
🛡️ Why This Matters
Our upgraded ALM setup doesn’t just look cool—it delivers real business value:
Compliance: Ensures all deployments follow governance policies.
Privacy: Protects sensitive credentials with SPN-based authentication.
Efficiency: Automates processes, reducing manual intervention and errors.
Transparency: Keeps all stakeholders informed with real-time updates and error reporting.
💡 Lessons from the ALM Hipster Movement
Automation is Key: From approvals to error handling, automation reduces the risk of human error.
Communication is Power: Integrating Teams with adaptive cards keeps everyone in the loop, fostering collaboration.
Governance is Non-Negotiable: With SPNs and approval flows, we’ve built a secure and compliant deployment pipeline.
🌈 The Cool Factor
By blending cutting-edge tech like adaptive cards, Power Automate flows, and Teams integration, we’ve turned a routine ALM process into a modern masterpiece. It’s not just functional—it’s stylish, efficient, and impactful.
The era of AI and Copilot has arrived and has finally reached the wizarding world. The Ministry of Magic is using several AI-based features to automate their letter-heavy processes.
And now it's also time for schools like Hogwarts to use AI to support students and prospective students with their application process.
This follows certain guidelines to protect magical users' privacy and ensure the correct handling of sensitive data. We don't want to spill the tea about Death Eaters, Unforgivable Curses, or You-Know-Who, do we?
First of all, we want to protect the privacy of our users and our internal business data, such as information about our students and their applications.
This sensitive data is excluded from our externally accessible HermioneAI. We only make it available as a knowledge base for specific teachers on Microsoft Teams.
Additionally, authentication is based on Microsoft Entra ID, limiting the risk of leaking sensitive data.
This makes general and internal knowledge available to selected teachers of the school:
In this example, we use AI to search for specific information in Dataverse and display internal data that has been summarized and collected for easy access by the end-user wizard.
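Under the hood, internal lookups like this ultimately go through the Dataverse Web API with OData queries. As a hedged sketch (the org URL, table, and column names such as `cr123_applications` are hypothetical, not our schema):

```typescript
// Sketch: build an OData query URL for the Dataverse Web API.
// Org URL, table, and column names below are hypothetical examples.

export function buildDataverseQuery(
  orgUrl: string,
  table: string,
  select: string[],
  filter?: string
): string {
  // $select limits the returned columns; $filter narrows the rows.
  let url = `${orgUrl}/api/data/v9.2/${table}?$select=${select.join(",")}`;
  if (filter) url += `&$filter=${encodeURIComponent(filter)}`;
  return url;
}

// Usage (token acquisition via Entra ID not shown):
// const url = buildDataverseQuery("https://hogwarts.crm4.dynamics.com",
//   "cr123_applications", ["cr123_name", "cr123_status"], "cr123_status eq 1");
// const res = await fetch(url, { headers: { Authorization: `Bearer ${token}` } });
```

The Entra ID authentication mentioned above is what supplies the bearer token, so only authorized teachers ever reach this data.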
The handling of information and authentication is different in our web application: https://owlexpress.app/chat
As this is publicly available, mainly for students going through the application process, no authentication is necessary. These users don't have access to internal data and rely only on Hermione's general knowledge, which is already vast.
Similar to the internal chat experience, the external one relies on security measures to avoid certain topics and harmful content. This covers, for example, Death Eaters, You-Know-Who, and the dark side of magic in general, as well as jailbreak attempts.
So, if a potential student or any other user of the chatbot on the website asks about joining the dark side, forbidden spells, or details on You-Know-Who, we won't help.
Hip is usually said about things or people that are new, unconventional, or non-mainstream. However, hipsters have in modern times become more of an old-school thing: people dressing up in vintage clothing, listening to old music, and living a lifestyle more similar to how it was before (and usually living in the center of Grunerløkka). Being hip is open to interpretation. 🤔
Our team has a lot of junior resources, new to the magical world of Power Platform. All of these different functions are considered new and cool, and therefore also hip. One especially hip and up-and-coming concern is privacy. With everything going digital, it is much more important to be aware of security in our solutions, especially in the digital transformation of the Harry Potter universe, where danger awaits around every turn. Maybe Voldemort somehow finds a way to hack into Hermione's app so he can locate Harry Potter. That would be catastrophic.
To prevent disaster, DLP policies, audit logs, and environment-level security are some of the important steps that need to be taken:
Data Loss Prevention (DLP) Policies help prevent the accidental or unauthorized sharing of sensitive information by controlling which connectors can be used in apps and flows. This ensures that sensitive data remains protected and compliant with regulatory requirements.
Audit logs track user activities and changes within the Power Platform. They provide a detailed record of actions, which is crucial for monitoring, compliance reporting, and investigating any suspicious activities or breaches.
Environment-Level Security. Setting up security roles and permissions at the environment level ensures that only authorized users can access specific data and resources. This helps maintain data integrity, protects sensitive information, and ensures that users only have access to the data necessary for their roles.
These are just some of the things that we want to focus on in the further development of our solution.
In our app, we have already implemented a secure login process using Microsoft account authentication to ensure your user information is accurately identified and protected.
As shown below, when opening the app, it checks the user's credentials and name.
The user credentials are also used in a feedback flow that gives users the opportunity to provide the Weasley Twins with feedback on the app. That way, they can continuously make the app better and more user-friendly.
And speaking of cool functions: POWER AUTOMATE. Go with the flow!
This is an especially magical feature in our app. With the press of a button, it triggers a flow that sends out an email through Outlook. The input is dynamic, coming from the user's input in the app and the user information gathered as shown above.
Take notice of the tidy and neat naming standards, following best practice.
The outcome, to the user:
The feedback sent to the twins (service-user):
Keep it HIP and cool, and always go with the flow! 😘
A first time for everything. I want to learn new spells, so I'm trying Fabric and Power BI for the first time.
Testing data import to Power BI Desktop, with both data import and DirectQuery.
Setting the data source credentials to get the queried data into the Power BI Service.
The test is working. Now let's wave the wand and build!
Fabric
HACK:
Got help from a team in the same house – HUFFLEPUFF POWER.
We could not get the trial to work in the tenant we have for ACDC, so I had to create a service principal user in the ACDC tenant and make it available as multitenant. I then used this service principal in Fabric in my work tenant to get the data in.
We want to build a lakehouse with Fabric so that, once the data is clean, we can use it in Power BI and also share it with other instances that need to use the data.
Made a new Lakehouse: WizardData
Made the connection to the ACDC tenant
Cleaned the data:
Did this for all 7 tables.
I could not get Power BI compliant for my work tenant, so I decided to use Power BI Desktop with DirectQuery to get the data from Dataverse and build a dashboard.
Start of dashboard: To be continued.
One last comment – We helped another team with the HACK to get the ACDC data into another tenant. COMMUNITY! – SHARING IS CARING!
NOTE TO THE JURY: we have taken your comment in and added details in the bottom of this article.
In our Wayfinder Academy, we take a comprehensive and magical approach to understanding the student’s history, aspirations, and potential to recommend the best possible new school. The process is detailed, thorough, and personalized, ensuring the student is matched with an environment where they can thrive.
To recap the process: here we assume a student who didn't feel right about their current faculty and filed an application. Immediately after that, we request the tabelle from their current faculty (historical data), ask the student to upload some photos from their most memorable moments, and then invite them to an interview. While we are still working on the interview step and will share the details later, in this article we want to add more detail about one of our approaches to mining extra insight from the student's interview: analysing their emotions.
We use this emotion recognition alongside the interview to get a 360-degree insight into the student's reactions to the questions, which are designed to figure out their values, aspirations, fears, and so on. We can use these to calculate the probability of their affinity to each faculty and identify the one with the highest score (the scoring approach will be shared in a different post).
So, we are using a video stream capture to record an interview session and extract the emotional dataset.
It gives us one more dimension that extends the standard datasets about the student, such as feedback, historical data from previous schools, etc.
We use the imentiv.ai API to analyze the video and grab the final report. We then build the final dashboard in Power BI (we love it) and embed it into OneLake.
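Once the report is back, we condense the per-frame emotion scores before they hit the dashboard. The following is a hedged sketch of that summarization step: the report shape (a list of per-frame score maps) is an assumption, not imentiv.ai's exact response format.

```typescript
// Sketch: summarize an emotion report before visualizing it.
// The report shape (per-frame score maps) is an assumed format,
// not imentiv.ai's documented response.

type EmotionScores = Record<string, number>; // e.g. { happy: 0.6, sad: 0.1 }

// Pick the dominant emotion from a single frame's score map.
export function dominantEmotion(scores: EmotionScores): string {
  return Object.entries(scores).sort((a, b) => b[1] - a[1])[0][0];
}

// Average per-emotion scores across all frames of the interview.
export function averageScores(frames: EmotionScores[]): EmotionScores {
  const sums: EmotionScores = {};
  for (const frame of frames) {
    for (const [emotion, score] of Object.entries(frame)) {
      sums[emotion] = (sums[emotion] ?? 0) + score;
    }
  }
  for (const emotion of Object.keys(sums)) sums[emotion] /= frames.length;
  return sums;
}
```

The averaged scores are what end up as one row per student in the Power BI dataset.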
Imentiv AI generates emotion recognition reports using different types of content, such as video, photos, text, and audio.
We implemented a single-page application to create an interactive experience by recognizing the emotions in an image captured via the webcam on our tablet. Analyzing the video stream takes more time, so we will demonstrate it later.
The app consists of two parts: a PoC to recognize the emotions in a photo from a webcam and an example of an emotion recognition report.
To build the PoC application, we decided on a Node.js-style stack. The engine is Bun, a modern and highly efficient alternative to Node.js; unlike Node.js, Bun is written in Zig.
For the front end, we are using React and Chart.js. We are hosting the PoC on our laptop; to make it available on the public internet, we use Cloudflare Tunnel, which also handles SSL certificate termination, so the service is secured by default without any significant effort.
The app server and the client app run inside a Docker container, so you can deploy easily with a single command: docker-compose up --build.
To optimize the final container size and improve build speed, we use a Dockerfile with two stages: one to build the app and a second to run the final artifacts.
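A hedged sketch of that two-stage setup follows. The base images, paths, and script names are assumptions for illustration, not our exact files:

```dockerfile
# Stage 1: build the client bundle (paths and script names are illustrative)
FROM oven/bun:1 AS build
WORKDIR /app
COPY package.json bun.lockb ./
RUN bun install --frozen-lockfile
COPY . .
RUN bun run build

# Stage 2: ship only the final artifacts, keeping the image small
FROM oven/bun:1-slim
WORKDIR /app
COPY --from=build /app/dist ./dist
COPY --from=build /app/server.ts ./
EXPOSE 3000
CMD ["bun", "server.ts"]
```

The build tools and source tree stay in stage one; the runtime image only carries what the server actually needs.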
PS:
Badges we claim:
Thieving bastards – we are using a third-party platform to recognize emotions in video and photos.
Hipster – we use Bun to run the application.
Hogwarts Enchanter – we use the mystical AI imentiv.ai API to grab the emotion reports and visualize them in a user-friendly way (see the screenshot above). Our enchanted workflow uses the data and makes it available in OneLake. The wizarding world comes closer when we see AI-based deep insight from various data sources in one place, in an easy-to-read format.
Right now – we are using a WebSocket server to implement real-time communication between the client and server side.
Client side salsa – we use React to implement the front end.
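The real-time communication mentioned above needs an agreed message shape on both ends. This is a hedged sketch of such a JSON envelope; the message types and fields are illustrative, not our exact wire format.

```typescript
// Sketch: a JSON envelope for real-time emotion updates over a WebSocket.
// Message types and fields are illustrative, not our exact wire format.

export interface EmotionMessage {
  type: "emotion-update" | "analysis-complete";
  timestamp: number;
  scores?: Record<string, number>;
}

// Serialize a message for ws.send(...).
export function encodeMessage(msg: EmotionMessage): string {
  return JSON.stringify(msg);
}

// Parse and validate an incoming frame; returns null for anything malformed,
// so a garbage frame can never crash the UI.
export function decodeMessage(raw: string): EmotionMessage | null {
  try {
    const msg = JSON.parse(raw);
    if (msg.type !== "emotion-update" && msg.type !== "analysis-complete") return null;
    if (typeof msg.timestamp !== "number") return null;
    return msg as EmotionMessage;
  } catch {
    return null;
  }
}
```

Validating on decode keeps the client and server tolerant of version skew between them.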
PS2: pls come over to our camp and test it out! We want to know how you feel! 🙂
All data and communication in our app stays within our own tenant, using services within our tenant. The rules we have created for compliance and security also apply to the game we have created.
Governance should and can be done using the tools you already use to manage and audit security, privacy, and compliance for Microsoft 365, Azure, and Dynamics.
If the solution is used on other tenants, the rules for that tenant will be applied.
Azure OpenAI keeps the information within the tenant and adheres to the privacy rules and ethics that have been applied to it.
Users are all authenticated and granted permission to the solution and tables.
Data is stored in Dataverse. We govern the data in this solution and will be alerted if any personal information is stored. We do not store anything that users put into the prompt.
Only a few administrators have access to grant other people permission to use the app.
Our dazzling frontend application is implemented with the coolest of the coolest technologies. React allows us to run our application seamlessly in the client's browser. Together with our static code analysis pipeline, these technologies make sure we are compliant and protect the privacy of the user.
We used:
React
Node.js
Typescript
SASS
Fluent UI for React
Not only are they hippety hip, they are third-party libraries that are super useful for our application! Weehoo!
Our PlumbBot is essential for helping our potential and existing plumbers solve plumbing issues, and it needs to be highly trusted in the advice it gives. Therefore, we've grounded it with plumbing domain knowledge and limited AI generation with trigger phrases. We've also provided a system prompt to make sure it answers within its limits:
How does this relate to the Hipster-badge?
With Copilot Studio, and the launch of Copilot in general, governance and content moderation are important along several dimensions:
Unauthorized knowledge access – users of the chatbot, if not secured or governed, might get access to proprietary data or PII, which could be catastrophic.
Misapplication of the copilot to other domains – grounded, domain-specific copilots, and copilots in general, should not be applied to domains or topics they have no knowledge of or cannot provide reliable answers to.
Misleading, ungrounded, or untrue responses – more generally, there should be high confidence in the chatbot's answers before using it in customer-facing applications. It should also not be led astray by misleading prompts.
Most importantly for PlumbBot: by default, Copilot applies high content moderation to uploaded files, as shown above. This makes sure the Copilot stays grounded when prompted and will not answer requests outside its scope and knowledge domain. Combined with the trigger phrases described in "PlumbBot is your first line of defence against a clogged toilet!", we have high confidence that our PlumbBot will not deviate from its goal of helping customers with their wet problems.
Know your limits – Get professional Help
More importantly, some issues are too difficult to handle yourself, especially if you have not obtained PL-600 (stop at nothing to get certified; the bravest might face the wrath of JV Kong). Therefore, PlumbBot will helpfully suggest creating a PlumbQuest instead, redirecting the customer to the request form to get in contact with price-aware plumbers better qualified to solve the issue.