Our Potions examination interface exemplifies the standards of the Glossy Pixels badge through its exceptional implementation of responsive design principles. The interface maintains visual integrity and functionality across all device sizes, from desktop monitors in the Hogwarts library to mobile devices in the dungeons.
The parchment-style containers and medieval typography demonstrate sophisticated scaling, ensuring readability without compromising the magical aesthetics. The interface sits magically and consistently above our three-column ingredient list and gracefully adapts to different screen sizes while keeping crucial examination elements clearly visible.
Most importantly, the interface achieves this responsiveness without sacrificing the mystical atmosphere essential to a Hogwarts examination. Like a perfectly brewed potion, each visual element maintains its properties across all viewing conditions, creating an experience that is both functional and authentically magical.
In our application, we use a plugin for a specific part of our challenges. One of our games involves potion brewing, where potion ingredients are added to an inventory table that is related to the contact navigating our tasks.
For our students, we provide guidance when we start working on their request.
After they have submitted a form, the activities are created for them.
They can find them on the My Activities page:
They then have to finish all the activities to be allocated to a faculty.
For example, when a user clicks the Interview Scheduled activity, they are navigated to the Survey page and can start answering the questions.
It’s EVIDIosa, not Leviosaaaa – thoughts on ALM and the way we work.
Claiming the ACDC Craftsman badge. 🚀
Workflow – Azure DevOps Boards
We use boards to keep track of badges and tasks, for better collaboration, progress and overview.
Scrum
Given the very short length of the project, there has been a need for tight collaboration and check-ins. In a way we have practised a super-compressed variation of Scrum, with several standups each day to secure progress. In a project like this, it is particularly important to avoid banging your head against the wall for too long on one task. Regular meetings have made sure that the team has been able to maintain the best possible progress at all times, while also being a platform for contributing new ideas.
Naming conventions
Tables (Entities)
General
Change tracking should be enabled by default for all “business” entities where history might be needed.
This is because change tracking is often indispensable for troubleshooting.
Change tracking can be disabled for specific fields if necessary, e.g., when automation frequently changes irrelevant fields.
Forms
Never modify the standard form. Clone it (save as) and hide the original form from users.
Standard forms can be updated by the vendor, which may overwrite changes or disrupt updates.
Having the original form for comparison is also useful for troubleshooting.
This does not apply to custom tables.
Ensure fields like status, status reason, created by/on, modified by/on, and owner are visible on the form.
These are useful for both administrators and end-users.
Option Sets
Do not reuse “Default” option set values by giving them new names.
When Should I Use Configuration, Business Rules, Processes, Flows, JavaScript, Plugins, or Other Tools?
It is important to choose the right tool for the job.
Considering administration, maintenance, and documentation, we prioritize tools in the following order:
Standard functionality/configuration
Many things can be solved with built-in functionality, e.g., setting a field to read-only doesn’t require a Flow. 😉
Business Rules
Processes/Workflows
Can run synchronously.
JavaScript
Flows
Suitable for querying Dataverse data.
Plugins
Solutions
Unmanaged in development:
We have not yet decided on a solution model but will likely use a base package initially and then move to feature-based solutions.
Managed in test and production.
Deployment
All changes are deployed via pipelines in DevOps.
NO MANUAL STEPS/ADJUSTMENTS IN TEST OR PRODUCTION.
All data referenced by automation (e.g., Flow) must be scripted and inserted programmatically (preferably via pipeline) to ensure GUID consistency across environments.
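A minimal sketch of what such a step could look like with the Power Platform CLI, assuming the reference data has been exported with the Configuration Migration tool; the variable names, file path and the client-secret authentication shown here are purely illustrative:
steps:
  # Illustrative only: import reference data that flows depend on, so records
  # keep the same GUIDs in every environment.
  - script: |
      dotnet tool install --global Microsoft.PowerApps.CLI.Tool
      pac auth create --environment $(EnvironmentUrl) --applicationId $(ClientId) --clientSecret $(ClientSecret) --tenant $(TenantId)
      pac data import --data $(Build.SourcesDirectory)/ReferenceData/data.zip
    displayName: "Import reference data (GUIDs stay consistent)"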
Application Lifecycle Management
Why ALM?
Application Lifecycle Management (ALM) enables structured development, testing, and deployment processes, ensuring quality and reducing risks.
It is a consistent way of deploying features
You get identical environments
Better collaboration
History
Common way of working
The whole team is always up to date with changes in development
Overall solution
A diagram showing the overall deployment cycle for our CRM Solution.
Solution Strategy
We use a single-solution strategy: all completed changes are added to the solution. Fewer dependencies and less complexity are preferable when working within such a time span, and it allows for smoother collaboration. The solution is still not overly complex, so it makes sense to gather all components in a single solution. As mentioned earlier, moving to feature-based solutions should be considered over time.
Service Principals
Service principals are used between Azure DevOps and each environment to ensure an industry-standard connection to Dataverse without exposing credentials in the code. These are the names of our service connections:
hogverse-dev
hogverse-validation
hogverse-test
hogverse-prod
In honor of the great ALM Wizard 👉Benedikt Bergmann👈, the service connections are configured with the new Workload Identity federation feature for the ADO connection. This eliminates the need for managing and renewing secrets, reducing the risk of pipeline failures due to expired credentials. Credentials are not exposed in code or stored in repositories.
Setup of Workload Identity federation for ADO Connection
App Registration: Creating an app registration in Microsoft Entra ID.
Dataverse: Adding the app registration as an application user in our target Dataverse environment and assigning it the System Administrator role.
ADO Service Connection: Creating a service connection in Azure DevOps, linking it to the Dataverse instance using Workload Identity Federation.
Adding Federated Credentials: Configuring the app registration to recognize the ADO service connection by setting up federated credentials with the correct issuer and subject identifier.
Entra ID:
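For reference, the federated credential on the app registration ends up looking roughly like this (illustrative values: the organization id is a placeholder, the project name is assumed, and the service connection name follows our list above):
# Federated credential on the app registration – illustrative values
issuer: https://vstoken.dev.azure.com/<azure-devops-organization-id>
subject: sc://<organization-name>/<project-name>/hogverse-prod
audience: api://AzureADTokenExchange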
Environments
DEV: The Room of Requirement – the development environment for Hogverse.
Validation: The Chamber of Truth – the validation environment used to check changes before merging to our main branch.
TEST: The Restricted Section – the UAT test environment for Hogverse.
PROD: The Great Hall – the production environment for Hogverse.
Pipelines
Our ALM solution for deploying changes in Power Platform and CRM is the following:
Export – The Export Pipeline retrieves the changes from the “DEV The Room of Requirement” environment and creates a pull request.
Build – The Build Pipeline packs the changes in the pull request branch and deploys them to “Validation – The Chamber of Truth” to see if something breaks before merging to our main branch. When this completes successfully, it creates a release that can be deployed with the Release Pipeline (a minimal sketch of these pack-and-validate steps is shown right after this overview).
Release – The Release Pipeline deploys the changes to “TEST: The Restricted Section” and “PROD: The Great Hall” after a user in the team has approved the release, using Approvals and checks in Azure DevOps Pipelines.
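As referenced above, here is a minimal sketch of what the pack-and-validate part of the Build Pipeline could look like, assuming the unpacked solution source lives under Solutions/HogverseBasis in the repository; the PR trigger, paths and artifact name are illustrative:
trigger: none
pr:
  branches:
    include:
      - main

pool:
  vmImage: "windows-latest"

steps:
  - task: PowerPlatformToolInstaller@2
    displayName: "Power Platform Tool Installer"
  # Pack the unpacked solution source from the pull request branch
  - task: PowerPlatformPackSolution@2
    inputs:
      SolutionSourceFolder: "$(Build.SourcesDirectory)/Solutions/HogverseBasis"
      SolutionOutputFile: "$(Build.ArtifactStagingDirectory)/Solutions/HogverseBasis.zip"
  # Import into the validation environment to see if anything breaks
  - task: PowerPlatformImportSolution@2
    inputs:
      authenticationType: "PowerPlatformSPN"
      PowerPlatformSPN: "hogverse-validation"
      SolutionInputFile: "$(Build.ArtifactStagingDirectory)/Solutions/HogverseBasis.zip"
  # Publish the packed solution as the artifact the Release Pipeline downloads
  - publish: "$(Build.ArtifactStagingDirectory)"
    artifact: drop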
Approval and checks for “TEST: The Restricted Section”
Approval and checks for “PROD: The Great Hall”
Repository Settings:
Settings for the Build Service User.
Settings and requirements for merging changes to our main branch, which is what goes to production.
Azure DevOps Pipelines for CRM Solution
Export and Release Pipelines
The Export Pipeline retrieves the changes from the “DEV The Room of Requirement” environment and creates a pull request.
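A minimal sketch of what the export stage could look like, assuming the Power Platform Build Tools tasks and the Azure DevOps CLI are available on the agent; the branch naming, git identity and pull request handling are illustrative:
trigger: none # started manually when we want to pull changes out of DEV

pool:
  vmImage: "windows-latest"

steps:
  - checkout: self
    persistCredentials: true
  - task: PowerPlatformToolInstaller@2
    displayName: "Power Platform Tool Installer"
  # Export the unmanaged solution from DEV (The Room of Requirement)
  - task: PowerPlatformExportSolution@2
    inputs:
      authenticationType: "PowerPlatformSPN"
      PowerPlatformSPN: "hogverse-dev"
      SolutionName: "HogverseBasis"
      SolutionOutputFile: "$(Build.ArtifactStagingDirectory)/HogverseBasis.zip"
  # Unpack to source so the changes are reviewable in the pull request
  - task: PowerPlatformUnpackSolution@2
    inputs:
      SolutionInputFile: "$(Build.ArtifactStagingDirectory)/HogverseBasis.zip"
      SolutionTargetFolder: "$(Build.SourcesDirectory)/Solutions/HogverseBasis"
  # Commit to a work branch and open a pull request towards main
  - script: |
      git config user.email "pipeline@hogverse.local"
      git config user.name "Export Pipeline"
      git checkout -b export/$(Build.BuildId)
      git add Solutions/HogverseBasis
      git commit -m "Export from DEV - The Room of Requirement"
      git push origin export/$(Build.BuildId)
      az repos pr create --source-branch export/$(Build.BuildId) --target-branch main --title "Solution export $(Build.BuildId)"
    displayName: "Create pull request"
    env:
      AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
The YAML below is the Release Pipeline that takes over from there: it is triggered by a successful Build on the main branch and imports the packed solution into Test and Production.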
trigger: none
pr: none
resources:
  pipelines:
    - pipeline: Build # Reference to the build pipeline
      source: Build # Name of the build pipeline to trigger from
      trigger:
        branches:
          include:
            - main
stages:
  - stage: DeployTest
    displayName: "Deploy to Test"
    jobs:
      - deployment: deployTest # Use deployment job for environment reference
        environment: "The Great Hall - Test" # Reference the 'The Great Hall - Test' environment
        pool:
          vmImage: "windows-latest"
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - download: Build # pipeline resource identifier.
                  artifact: drop
                - task: PowerPlatformToolInstaller@2
                  displayName: "Power Platform Tool Installer"
                - task: PowerPlatformImportSolution@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-test"
                    SolutionInputFile: '$(Pipeline.Workspace)\Build\drop\Solutions\HogverseBasis.zip'
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"
                - task: PowerPlatformPublishCustomizations@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-test"
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"
  - stage: DeployProd
    displayName: "Deploy to Production"
    dependsOn: DeployTest # Depends on successful deployment to Test
    condition: succeeded()
    jobs:
      - deployment: deployProd # Use deployment job for environment reference
        environment: "The Restricted Section - Prod" # Reference the 'The Restricted Section - Prod' environment
        pool:
          vmImage: "windows-latest"
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - download: Build # pipeline resource identifier.
                  artifact: drop
                - task: PowerPlatformToolInstaller@2
                  displayName: "Power Platform Tool Installer"
                - task: PowerPlatformImportSolution@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-prod"
                    SolutionInputFile: '$(Pipeline.Workspace)\Build\drop\Solutions\HogverseBasis.zip'
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"
                - task: PowerPlatformPublishCustomizations@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-prod"
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"
Issues
We had a problem during the event, and unfortunately we did not get to run our pipelines the way we wanted.
We tried a workaround, but that ended up with… a self-hosted agent.
It looked good all the way through the installation, aaaaand then the final step was blocked by admin rights at the company… You need admin rights…
ALM for Fabric
We have implemented deployment pipelines for deploying changes between workspaces so that the reports can be tested and verified before going into production.
Hogverse Deployment Pipelines for deploying items between workspaces.
ALM for Svelte and Front end
The Svelte project is for now hosted in a private GitHub repository shared between the developers. Each developer creates their own branch for each new feature. When a feature is ready, a pull request is created and approved by others on the team. On approval, the branches are merged and the feature branch is normally deleted to ensure a clean project at all times.
With more time on our hands, we would have preferred to import the repository into Azure DevOps and create pipelines for Dev, Validation, Test and Prod, as for the CRM solution.
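As a starting point, a minimal sketch of such a build pipeline for the Svelte front end, assuming a standard Node/Vite setup; the Node version and the dist output folder are assumptions:
trigger:
  branches:
    include:
      - main

pool:
  vmImage: "ubuntu-latest"

steps:
  - task: NodeTool@0
    inputs:
      versionSpec: "20.x"
    displayName: "Install Node.js"
  - script: |
      npm ci
      npm run build
    displayName: "Build the Svelte app"
  # Publish the compiled front end as an artifact for later release stages
  - publish: "dist"
    artifact: frontend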
Bicep honourable mention and hot tip
The Azure resources used in the project have mostly been created on the fly as part of experimentation, but we would most definitely have created Bicep files for deploying them to each environment as well. Microsoft MVP 👉Jan Vidar Elven👈 has created a super useful public repository with templates for deploying resources on his GitHub account: https://github.com/JanVidarElven/workshop-get-started-with-bicep
It’s been five years since the whole world realized what we’ve known for years – the future is digital, and we all need to get on the wagon. Unfortunately, the Hogwarts staff and board have been lagging behind, not realizing the immense capabilities that lie in intelligent automation.
💡 The Solution 💡
First of all, we needed to create a digital platform for staff and students. We decided to store all information about who and what’s going on at Hogwarts in Dataverse, making it easily accessible for both pro- and low-code apps and automations.
In the Hogverse Student Management App, Hogwarts staff can keep track of enrolled students and subjects, create teams for teaching, and send out notifications to students. In this way, time-consuming administrative tasks are cut down to a minimum, and each teacher has more time for magical moments with their students.
With a single click to create a team, teachers and students are all set for online teaching in case of trolls, especially bad weather, a really good party, or a pandemic. Teams will also be used for teacher–student communication, as well as automatic notifications in the form of an adaptive card in Teams in case a student is about to fail a subject:
We send out similar notifications if a student has done something particularly stupid, which removes the need for detention and other time-consuming, medieval methods of punishment. Contract signing is automated using OneFlow connectors in Power Automate, with the signed contracts stored in the safe “Secret of Chambers” (SharePoint). The flow is triggered if a student is alarmingly interested in death spells or the doctrine of Dark Magic.
The daily life of a student is also getting a lot easier with this digital transformation. No more carrying around heavy, ancient and outdated books. You will get all the information you need for studying by asking Dobby in the Student Portal.
The Student Portal is always up to date, since the AI assistant is instructed to query the Harry Potter DB (api.potterdb.com) for spells and potions. The Portal also gives students a full overview of test results and grades, so that they always know which subject to focus on.
The Student App, to bring with you wherever you go, takes teaching out of the classroom and enables learning on the go. Here the students have their teachers in their pockets, and they will be encouraged to always keep learning. It is also nice not to have to stand in line in front of Prof. Sybill’s office to get a reading of your fortune after having a coffee.
The implementation of Teams will also bring a whole new world to the students. Getting in-app notifications for urgent matters, being able to communicate with teachers without searching the whole school premises, and having the ability to take that Monday morning 8 o’clock History of Magic lesson from the comfort of your warm bed will hopefully increase student satisfaction.
✨ The Outcome ✨
Everyone who has taken a pedagogy class or two knows that a student’s well-being is crucial for their ability to learn. These steps of digital transformation will therefore play an important role in educating the future Magicians of the World.
We are also hopeful that the new digital tools provided to Hogwarts’ staff will decrease turnover and long-term sick leave. We have seen too many new hires at Hogwarts leading to the return of Voldemort, and this might mitigate the risk of a new Dark Age.
All in all, a more digital magical world is a safer place for all of us Muggles. 🪄
In our Hogwarts App, we’ve conjured up some magical tools to make life even more exciting for our students and teachers. These enchanting features are designed to inspire, motivate, and bring a touch of wonder to everyday activities. Here is a list of our Magical Insights:
🎯 House Points Tracker
The battle for house glory just got more thrilling! We’ve introduced a House Points Chart, where students can keep track of the current standings and see which house is leading the charge. The bars shimmer in the house colors—scarlet for Gryffindor, emerald for Slytherin, sapphire for Ravenclaw, and gold for Hufflepuff—bringing the spirit of friendly competition to life.
🏆 Golden Snitch Leaderboard
Catch the Snitch, and claim your glory! The Golden Snitch Leaderboard highlights the top players in the beloved Snitch-chasing game. The rankings sparkle with magical hues: gold for first place, silver for second, and bronze for third. Will you be the next Seeker extraordinaire?
📚 Class Attendance Tile
At Hogwarts, striving for 100% class attendance is a noble goal! To aid in this quest, we’ve added an Attendance Tile to the report. It reveals how each class fares in attendance, with fascinating insights. However, beware—the data shows a troublingly low turnout in Defense Against the Dark Arts! Could a bit of extra encouragement—or perhaps a strong Patronus—be the solution?
📊 Students by House Chart
Curious about the balance of Hogwarts houses? The Students by House Pie Chart offers a delightful overview, showing the distribution of students across the four houses. Each slice of the pie is enchanted with its house color, providing a clear and colorful glimpse into Hogwarts’ magical diversity. This tool fosters greater understanding and unity among the houses.
With these mystical updates, we’re ensuring that every student and professor can dive deeper into the wonders of Hogwarts life. Keep exploring, keep competing, and most importantly—keep the magic alive! 🌟
Greetings, magical technologists! ✨ At Team PowerPotters, we’ve combined the power of Bing Search API, OpenAI GPT-4, and Power Automate to create a truly innovative web-crawling solution. Our workflow dynamically discovers, analyzes, and integrates external APIs, transforming how potion-related data is sourced and utilized. Here’s how we earned the Crawler Badge by blending search, AI, and automation into one seamless process.
The Crawler Workflow: Step-by-Step Magic
Discovering APIs with Bing Search API
Purpose: To dynamically find public APIs related to Harry Potter spells or magical data.
Execution:
Query: "public API Harry Potter spells".
Filters: Results are restricted to recent entries with keywords like /api or “documentation”.
Analyzing APIs with OpenAI GPT-4
Purpose: To validate URLs as APIs and extract relevant schemas, field mappings, and example data.
Execution: For each URL, OpenAI determines whether it links to an API or its documentation. If valid, it provides (an illustrative example follows this list):
API schema.
Example JSON response.
Field mappings for key data (e.g., Name for spell name, Description for spell effects).
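To make this concrete, the mapping we expect back for a valid URL looks roughly like this (illustrative values based on the PotterDB spells endpoint; the exact attribute names may differ):
# Illustrative example of a validated API mapping, not the literal flow schema
isApi: true
baseUrl: https://api.potterdb.com/v1/spells
fields:
  Name: attributes.name          # e.g. "Accio"
  Description: attributes.effect # e.g. "Summons an object"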
Integrating with Power Automate
Purpose: To process, validate, and integrate the data into our system.
Workflow Steps:
Parse Bing Results: Extract relevant URLs using JSON parsing.
Validate URLs: OpenAI determines if the URL links to a valid API and provides field mappings.
Dynamic Integration: Call validated APIs and use extracted data to:
Create new product entries in D365FO.
Enrich existing products with spell names (Name) and effects (Description).
Automation: Run schema validations dynamically, ensuring data consistency.
A Scenario in Action
A potion master requests information on new magical spells for potion research. Using this workflow:
Search: Bing Search API identifies APIs like Potterhead API.
Validation: OpenAI ensures the API provides valid spell data, extracting fields like Name (e.g., Accio) and Description (e.g., Summons an object).
Integration: Power Automate dynamically updates the potion master’s research database with enriched spell information, saving hours of manual effort.
Why This Deserves the Crawler Badge
Innovative Use of Search:
Bing Search API dynamically finds and filters public APIs, extending its use beyond static results.
AI-Powered Validation:
OpenAI GPT-4 dynamically analyzes URLs, validates APIs, and generates schemas and field mappings for seamless integration.
Solving Real Business Problems:
Potion masters gain enriched, real-time product data without manual intervention, enabling informed decisions.
Scalability:
The workflow is adaptable for future needs, such as integrating potions, artifacts, or even non-magical domains.
Crawling the Web for Magical Insights
This dynamic web-crawling solution exemplifies how search, AI, and automation can revolutionize the way data is discovered and integrated. With this innovation, we humbly submit our case for the Crawler Badge, showcasing how Team PowerPotters continues to push boundaries: acdc.blog/category/cepheo25.
The orders are flying in like owls in the morning, and you are losing track. Our solution gives the back office of your shop an easy overview of new orders and sales, integrated into your CRM system.
Data is synchronized through Dataverse to Fabric with the Fabric Link.
Anyone can run Power Automate flows from Dataverse, but only the Fabricator can trigger them from Fabric lakehouse data changes using Data Activator in a workspace.
Here we combined the power of Fabric with the flexibility of Power Automate to manage our image collection process.
We first created a Data Activator item and selected our data sources, so the activator knows when and where to trigger.
We configured it to trigger only when an image file is added to our Images folder for students.
Define action and create flow to run.
Here we define an action, which gives us an endpoint to be used as the flow trigger. We will come back to this in the last step.
We need to create a rule that calls the action we defined. The rule allows us to add additional conditions to our filter if needed and lets us choose which action to call. We can also add additional parameters to be sent to the Power Automate flow.
And lastly, our Power Automate flow: the endpoint we received earlier needs to be set in the trigger’s connection.
We use Power Platform’s AI Builder to recognize the data and categorize it for further usage.
We send our response to SharePoint for further operations.
As a Fabricator, it is important to automate our business and keep it tidy and neat. This is the way of the Fabricator.
Cool looking stuff! Ever wanted to get into the Gryffindor common room? Well, let me introduce you to the Fat Lady:
She even moves around! (Come see us for a live demo.) What better way to introduce new children to their school-life-long companions?
When no person is detected in the mirror, the Fat Lady invites you in to get your photo taken. She is welcoming and greets you. Entering the mirror, you are awarded your animal after the magic has a quick think. I myself got an owl. No wonder, since my middle name is Harry!
Luckily, I can also get food and a cage for my animal pretty easily, but this flow isn’t as magical.