It’s EVIDIosa’s final blog post – HOGVERSE

🎯 Goal: To equip Hogwarts to seize opportunities in the future.

🛠️ Method: Implement a wide range of Microsoft technologies, from low-code apps and automations to pro-code portals and AI models.

💡 Solution:

Fabric Fera Verto

Workspaces set up with Dev: Room of Requirement, Test: Restricted Section, Prod: Great Hall. We utilized deployment pipelines to manage seamless transitions between workspaces.

Medallion Architecture Implementation with Bronze, Silver and Gold data.

Data Ingestion: Used Dataflows Gen2 to retrieve data from another Dataverse tenant, leveraging a service principal for authentication. Utilized Data Pipelines to fetch data from Harry Potter APIs (spells, potions) and store it in the Bronze Lakehouse.

Semantic Model and Reporting – Built a semantic model on the Gold Layer for reporting. Aimed for a star schema for optimal reporting but noted incomplete implementation. Generated reports showcasing aggregated data and transformed columns.

An in-depth blog post has been written here.

Low Code Charms

  • Automated Teams channel creation using Power Automate.
  • Canvas App that enables on-the-go learning for enrolled subjects, breaking reliance on fixed class schedules.
  • In the subject of Divination, students analyze tea or coffee cup patterns using AI Builder, ensuring interactive learning experiences.
  • Adaptive cards simplify communication between teachers and students, using Power Automate.
  • Fields in Dataverse with Formula as the data type, using Power Fx.

If you want to know more, come by and get a fortune reading 🔮

Pro-Code Potions

Our pro-code solution is a student portal made with Svelte 5 and SvelteKit.

Its main features are:

  1. Dobby – AI-assistant for students to learn new spells and potions
  2. Student Dashboard (Sorcerers Central)
  3. Fellow Students – Call your classmates using Azure Communication Services

Dobby – AI-assistant

  • Dobby is trained to recognize whether the student is looking for a spell or a potion that induces a certain effect.
  • Based on its analysis of the request, it calls the respective endpoint in the Harry Potter DB API, using the desired effects as filters (sketched below this list).
  • The data is returned to the student, who can then add the spells and potions to their spell book or potion book.
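
To make that flow concrete, here is a minimal Python sketch of the idea. The endpoint and attribute names match PotterDB, but the Ransack-style filter[effect_cont] parameter and the effect-matching step are our assumptions, not a copy of Dobby’s actual implementation:

# Hedged sketch of Dobby's lookup: route the request to the spells or
# potions endpoint and filter on the desired effect. The filter syntax
# is an assumption about PotterDB's Ransack-style query parameters.
import requests

BASE_URL = "https://api.potterdb.com/v1"

def find_by_effect(kind: str, effect: str) -> list[dict]:
    """Query the spells or potions endpoint, filtering on the desired effect."""
    assert kind in ("spells", "potions")
    params = {"filter[effect_cont]": effect}
    response = requests.get(f"{BASE_URL}/{kind}", params=params, timeout=10)
    response.raise_for_status()
    return response.json()["data"]

# Example: a student asks for something that induces sleep.
for potion in find_by_effect("potions", "sleep"):
    print(potion["attributes"]["name"], "-", potion["attributes"]["effect"])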

Student Dashboard (Sorcerers Central)

  • The student dashboard is the main hub for the students.
  • Here they can get a quick glance at their personal progress at Hogwarts.
  • The dashboard has sections for Mischievements, Spellbook, Potions and Subjects.

Fellow Students – Call your classmates

  • This page contains a list of all students in your year.
  • If you ever need to call a friend, you can do that right from this page using Azure Communication Services in the background.

Read about searching in natural language here.

Read about the ACS and phone solution here.

Digital Transformation

  • Replaced manual processes with automated workflows for subject management, teacher-student communication, and achievement tracking.
  • Enabled online learning to maintain education during crises, like school closures or emergencies.
  • Integrated adaptive cards in Teams for real-time updates on tests, achievements, and behavioral notifications.
  • Used Power Automate to trigger parent notifications and contract signing for mischievements, ensuring accountability.
  • Leveraged AI (GPT-4 and AI Builder) to create interactive and engaging learning experiences for students.

More detailed explanation here.

ALM Magic

  • Utilized Azure DevOps for boards, repos, pipelines, and Service Principal setup with Workload Identity Federation for the ADO connection, along with an in-depth overview of all code.
  • Team Collaboration – Ensured seamless teamwork across CRM environments, Fabric pipelines, and GitHub collaboration for the Svelte project.
  • Code and Solution Management – Maintained an organized approach with naming conventions, solution strategies, and in-depth code overviews.
  • Streamlined Deployment – Automated pipelines supported structured deployments across Dev, Test, and Prod, with detailed diagrams for clarity.
  • Issue Resolution – Tracked and resolved challenges transparently through DevOps boards and GitHub issues.

An in-depth blog post has been written here.

Magic Matrix

  • Combined Microsoft 365 tools like Teams, SharePoint, and Power Apps for seamless collaboration and learning management.
  • Integrated GPT-4-powered Dobby Chat to fetch data from Harry Potter APIs, providing instant access to spells, potions, and more.
  • Designed a Canvas App for on-the-go learning, utilizing AI Builder for interactive activities like tea leaf analysis in Divination.
  • Prioritized user experience with real-time dashboards, responsive portals, and accessible tools for students and teachers.
  • Ensured privacy, scalability, and innovation with a strong ALM strategy and secure integrations across platforms.

More detailed explanation here.

ACDC Craftsman

It’s EVIDIosa, not Leviosaaaa’s thoughts on ALM and our way of working.

Claims the badge ACDC Craftsman.🚀

Claims the badge Power Of The Shell.🚀 – please have a look at the “Azure DevOps Pipelines for CRM Solution” section.

Workflow – Azure DevOps Boards

We use boards to keep track of badges and tasks, for better collaboration, progress and overview.

Scrum

Given the very short length of the project, there has been a need for tight collaboration and check-ins. In a way, we have practised a super-compressed variation of Scrum, with several standups each day to secure progress. In a project like this, it is particularly important to avoid banging your head against the wall for too long on one task. Regular meetings have ensured that the team could maintain the best possible progress at all times, while also serving as a platform for contributing new ideas.

Naming conventions

Tables (Entities)

General

  • Change tracking should be enabled by default for all “business” entities where history might be needed.
    • This is because change tracking is often indispensable for troubleshooting.
    • Change tracking can be disabled for specific fields if necessary, e.g., when automation frequently changes irrelevant fields.

Forms

  • Never modify the standard form. Clone it (save as) and hide the original form from users.
    • Standard forms can be updated by the vendor, which may overwrite changes or disrupt updates.
    • Having the original form for comparison is also useful for troubleshooting.
  • This does not apply to custom tables.
  • Ensure fields like status, status reason, created by/on, modified by/on, and owner are visible on the form.
    • These are useful for both administrators and end-users.

Option Sets

Views

When Should I Use Configuration, Business Rules, Processes, Flows, JavaScript, Plugins, or Other Tools?

  • It is important to choose the right tool for the job.
  • Considering administration, maintenance, and documentation, we prioritize tools in the following order:
    • Standard functionality/configuration
      • Many things can be solved with built-in functionality, e.g., setting a field to read-only doesn’t require a Flow. 😉
    • Business Rules
    • Processes/Workflows
      • Can run synchronously.
    • JavaScript
    • Flows
      • Suitable for querying Dataverse data.
    • Plugins

Solutions

  • Unmanaged in development:
    • We have not yet decided on a solution model but will likely use a base package initially and then move to feature-based solutions.
  • Managed in test and production.

Deployment

  • All changes are deployed via pipelines in DevOps.
  • NO MANUAL STEPS/ADJUSTMENTS IN TEST OR PRODUCTION.
  • All data referenced by automation (e.g., Flow) must be scripted and inserted programmatically (preferably via pipeline) to ensure GUID consistency across environments – see the sketch after this list.
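
To illustrate what “scripted with stable GUIDs” can look like, here is a minimal Python sketch using the standard Dataverse Web API upsert, where a PATCH against a known ID creates the row if it does not exist yet. The table and column names are hypothetical, not taken from our solution:

# Hedged sketch: upsert reference data under a fixed GUID so that Flows can
# reference the same ID in every environment. "hog_subjects"/"hog_name" are
# hypothetical names; the PATCH-on-key upsert is standard Dataverse Web API.
import requests

ENV_URL = "https://hogverse-dev.crm4.dynamics.com"   # varies per environment
TOKEN = "<bearer token acquired for the service principal>"
FIXED_ID = "11111111-2222-3333-4444-555555555555"    # the same GUID everywhere

response = requests.patch(
    f"{ENV_URL}/api/data/v9.2/hog_subjects({FIXED_ID})",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
    },
    json={"hog_name": "Defence Against the Dark Arts"},
)
response.raise_for_status()  # 204 No Content whether the row was created or updated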

Application Lifecycle Management

Why ALM?

Application Lifecycle Management (ALM) enables structured development, testing, and deployment processes, ensuring quality and reducing risks.

  • It is a consistent way of deploying features
  • You get identical environments
  • Better collaboration
  • History
  • A common way of working
  • The whole team is always up to date with changes in development

Overall solution

A diagram showing the overall deployment cycle for our CRM Solution.

Solution Strategy

We use a single-solution strategy. All completed changes are added to the solution. Fewer dependencies and less complexity are preferable when working within such a time span, and allow for smoother collaboration. The solution is still not too complex, so it makes sense to gather all components in a single solution. As mentioned earlier, moving to feature-based solutions should be considered over time.

Service Principals

Service Principals are used between Azure DevOps and each environment to ensure an industry-standard connection to Dataverse without exposing credentials in the code. These are the names of our Service Connections:

  • hogverse-dev
  • hogverse-validation
  • hogverse-test
  • hogverse-prod

In honor of the great ALM Wizard 👉Benedikt Bergmann👈, the Service Connections are configured with the new Workload Identity Federation feature for the ADO connection. This eliminates the need for managing and renewing secrets, reducing the risk of pipeline failures due to expired credentials. Credentials are not exposed in code or stored in repositories.

Setup of Workload Identity federation for ADO Connection

  1. App Registration: Registering app registrations in Microsoft Entra ID.
  2. Dataverse: Adding the app registration as an application user in our target Dataverse environment, assigning it the System Administrator role.
  3. ADO Service Connection: Creating a service connection in Azure DevOps, linking it to the Dataverse instance using Workload Identity Federation.
  4. Adding Federated Credentials: Configuring the app registration to recognize the ADO service connection by setting up federated credentials with the correct issuer and subject identifier (scripted example below).
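
Step 4 can also be scripted. Here is a minimal Python sketch against Microsoft Graph’s federatedIdentityCredentials endpoint; the issuer and subject values follow the pattern Azure DevOps displays for the service connection, and every ID below is a placeholder:

# Hedged sketch: add a federated credential to the app registration so the
# ADO service connection can authenticate without client secrets.
# All IDs are placeholders; use the issuer/subject values ADO shows you.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<Graph token with Application.ReadWrite.All>"
APP_OBJECT_ID = "<object id of the app registration>"

credential = {
    "name": "hogverse-dev-ado",
    "issuer": "https://vstoken.dev.azure.com/<ado-organization-id>",
    "subject": "sc://hogverse/Platform9¾Hub/hogverse-dev",
    "audiences": ["api://AzureADTokenExchange"],
}

response = requests.post(
    f"{GRAPH}/applications/{APP_OBJECT_ID}/federatedIdentityCredentials",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=credential,
)
response.raise_for_status()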

Entra ID:

Environments

  • DEV: The Room of Requirement
    The development environment for Hogverse.
  • Validation – The Chamber of Truth
    The validation environment to check changes before merging to our main branch.
  • TEST: The Restricted Section
    The UAT test environment for Hogverse.
  • PROD: The Great Hall
    The production environment for Hogverse.

Pipelines

Our ALM solution for deploying changes in Power Platform and CRM is the following:

  • Export
    The Export Pipeline retrieves the changes from the “DEV: The Room of Requirement” environment and creates a pull request.
  • Build
    The Build Pipeline packs the changes in the pull request branch and deploys them to “Validation – The Chamber of Truth” to see if anything breaks before merging to our main branch. When this completes successfully, it creates a Release that can be deployed with the Release Pipeline.
  • Release
    The Release Pipeline deploys the changes to “TEST: The Restricted Section” and “PROD: The Great Hall” after a user in the team has approved the release, using Approvals and checks in Azure DevOps Pipelines.

Approval and checks for “TEST: The Restricted Section”

Approval and checks for “PROD: The Great Hall”

Repository Settings:

Settings for the Build Service User.

Settings and requirements for merging changes to our main branch, which goes to production.

Azure DevOps Pipelines for CRM Solution

Export Pipeline

The Export Pipeline retrieves the changes from the “DEV: The Room of Requirement” environment and creates a pull request. Below is the YAML code for the Export Pipeline.

trigger: none

parameters:
  - name: SolutionName
    displayName: Solution to export
    type: string
    default: HogverseBasis
  - name: TypeOfTask
    displayName: "Select task type"
    type: string
    default: "feature"
    values:
      - feature
      - bug
  - name: WorkItems
    displayName: Work Item ID needs to be attached to Automated Pull Request. Multiple work items are space separated.
    type: string
    default:
  - name: Description
    displayName: Pullrequest description
    type: string
    default: "This Pull request is generated automatically through build"

# Define branch name dynamically based on build number and solution name
variables:
  - name: BranchName
    value: "${{ parameters.TypeOfTask }}/${{ parameters.SolutionName }}-$(Build.BuildNumber)"

pool:
  vmImage: "windows-latest"

steps:
  - checkout: self
    persistCredentials: true
  - task: PowerPlatformToolInstaller@2
    displayName: "Power Platform Tool Installer"

  # Create New Branch and Commit Changes
  - script: |
      echo checkout source branch
      echo git config user.email $(Build.RequestedForEmail)
      echo git config user.name $(Build.RequestedFor)
      git config user.email "$(Build.RequestedForEmail)"
      git config user.name "$(Build.RequestedFor)"
      git fetch origin
      git checkout main
      git checkout -b "${{ variables.BranchName }}" main
      git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin "${{ variables.BranchName }}"
      echo git checkout -b "${{ variables.BranchName }}"
      git fetch origin
      git checkout ${{ variables.BranchName }}
    displayName: "Checkout GIT Automated CRM Build Branch"

  - task: PowerPlatformPublishCustomizations@2
    displayName: Publish Customizations
    inputs:
      authenticationType: "PowerPlatformSPN"
      PowerPlatformSPN: "hogverse-dev"
      AsyncOperation: true
      MaxAsyncWaitTime: "60"
  - task: PowerPlatformSetSolutionVersion@2
    displayName: "PowerApps Set Solution Version - ${{ parameters.SolutionName }}"
    inputs:
      authenticationType: "PowerPlatformSPN"
      PowerPlatformSPN: "hogverse-dev"
      SolutionName: "${{ parameters.SolutionName }}"
      SolutionVersionNumber: "$(Build.BuildNumber)"
  - task: PowerPlatformExportSolution@2
    displayName: Export ${{ parameters.SolutionName }} - Unmanaged
    inputs:
      authenticationType: "PowerPlatformSPN"
      PowerPlatformSPN: "hogverse-dev"
      SolutionName: "${{ parameters.SolutionName }}"
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\\${{ parameters.SolutionName }}.zip'
      AsyncOperation: true
      MaxAsyncWaitTime: "60"
  - task: PowerPlatformExportSolution@2
    displayName: Export Solution - Managed
    inputs:
      authenticationType: "PowerPlatformSPN"
      PowerPlatformSPN: "hogverse-dev"
      SolutionName: "${{ parameters.SolutionName }}"
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\\${{ parameters.SolutionName }}_managed.zip'
      Managed: true
      AsyncOperation: true
      MaxAsyncWaitTime: "60"
  - task: PowerPlatformUnpackSolution@2
    displayName: Unpack unmanaged solution ${{ parameters.SolutionName }}
    inputs:
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)\\${{ parameters.SolutionName }}.zip'
      SolutionTargetFolder: '$(build.sourcesdirectory)\PowerPlatform\Solutions\${{ parameters.SolutionName }}'
      SolutionType: "Both"

  - script: |
      echo git push "${{ variables.BranchName }}"
      git push origin ${{ variables.BranchName }}
      echo git add -all
      git add --all
      echo git commit -m "Solutions committed by build number $(Build.BuildNumber)"
      git commit -m "Solutions committed by build number $(Build.BuildNumber)"
      echo push code to ${{ variables.BranchName }}
      git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin ${{ variables.BranchName }}
    displayName: "Commit CRM solutions to Automated CRM Build Branch"

  # Install Azure DevOps extension
  - script: az --version
    displayName: "Show Azure CLI version"

  # Install Azure DevOps Extension
  - script: az extension add -n azure-devops
    displayName: "Install Azure DevOps Extension"

  # Login to Azure DevOps Extension
  - script: echo $(System.AccessToken) | az devops login
    env:
      AZURE_DEVOPS_CLI_PAT: $(System.AccessToken)
    displayName: "Login Azure DevOps Extension"

  # Configure Azure DevOps Extension
  - script: az devops configure --defaults organization=https://dev.azure.com/hogverse project="Platform9¾Hub" --use-git-aliases true
    displayName: "Set default Azure DevOps organization and project"

  - script: |
      az repos pr create --repository "CRM.Solutions" --title "Automated Pull Request for solution HogverseBasis from branch ${{ variables.BranchName }}" --auto-complete false --bypass-policy false --description "${{parameters.Description}}" --detect --source-branch "${{ variables.BranchName }}" --target-branch "main" --work-items ${{parameters.WorkItems}}
    displayName: "Create automated Pull request for merging CRM solutions to master build list and PRs"

We use work items to track the changes for each pull request and release.

Build Pipeline

The YAML code for the Build Pipeline.

trigger:
  branches:
    include:
      - main # Trigger build pipeline on changes to main
pr:
  branches:
    include:
      - "*" # Trigger on all PR branches

pool:
  vmImage: "windows-latest"

steps:
  - script: |
      echo "Validating pull request from source branch: $(System.PullRequest.SourceBranch)"
      echo "Target branch: $(System.PullRequest.TargetBranch)"
    displayName: "Validate Pull Request Source Branch"
  - task: PowerPlatformToolInstaller@2
    displayName: "Power Platform Tool Installer "

  - task: PowerPlatformPackSolution@2
    inputs:
      SolutionSourceFolder: '$(build.sourcesdirectory)\PowerPlatform\Solutions\HogverseBasis'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\Solutions\HogverseBasis.zip'
      SolutionType: "Managed"

  - task: PublishPipelineArtifact@1
    displayName: Publish Artifacts
    inputs:
      targetPath: "$(Build.ArtifactStagingDirectory)"
      artifact: "drop"
      publishLocation: "pipeline"

  - task: PowerPlatformImportSolution@2
    displayName: "Power Platform Import HogverseBasis to Validation"
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: "hogverse-validation"
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)\Solutions\HogverseBasis.zip'
      StageAndUpgrade: false
      ActivatePlugins: false
      SkipLowerVersion: true

Release Pipeline

The YAML code for the Release Pipeline.

trigger: none
pr: none

resources:
  pipelines:
    - pipeline: Build # Reference to the build pipeline
      source: Build # Name of the build pipeline to trigger from
      trigger:
        branches:
          include:
            - main

stages:
  - stage: DeployTest
    displayName: "Deploy to Test"
    jobs:
      - deployment: deployTest # Use deployment job for environment reference
        environment: "The Great Hall - Test" # Reference the 'The Great Hall - Test' environment
        pool:
          vmImage: "windows-latest"
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - download: Build # pipeline resource identifier.
                  artifact: drop
                - task: PowerPlatformToolInstaller@2
                  displayName: "Power Platform Tool Installer"
                - task: PowerPlatformImportSolution@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-test"
                    SolutionInputFile: '$(Pipeline.Workspace)\Build\drop\Solutions\HogverseBasis.zip'
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"
                - task: PowerPlatformPublishCustomizations@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-test"
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"
  - stage: DeployProd
    displayName: "Deploy to Production"
    dependsOn: DeployTest # Depends on successful deployment to Test
    condition: succeeded()
    jobs:
      - deployment: deployProd # Use deployment job for environment reference
        environment: "The Restricted Section - Prod" # Reference the 'The Restricted Section - Prod' environment
        pool:
          vmImage: "windows-latest"
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - download: Build # pipeline resource identifier.
                  artifact: drop
                - task: PowerPlatformToolInstaller@2
                  displayName: "Power Platform Tool Installer"
                - task: PowerPlatformImportSolution@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-prod"
                    SolutionInputFile: '$(Pipeline.Workspace)\Build\drop\Solutions\HogverseBasis.zip'
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"
                - task: PowerPlatformPublishCustomizations@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-prod"
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"

Issues

We had this problem during the event, and unfortunately we did not get to run our pipelines the way we wanted.

We tried a workaround, but that ended up with… a self-hosted agent.

The installation looked good all the way, aaaaand in the final step it was blocked by admin rights at the company… You need admin rights…

ALM for Fabric

We have implemented deployment pipelines for deploying changes between workspaces so that the reports can be tested and verified before going into production.

Hogverse Deployment Pipelines for deploying items between workspaces.

ALM for Svelte and Front end

The Svelte project is, for now, hosted in a private GitHub repository shared between the developers. Each developer creates their own branch for each new feature. When a feature is ready, a pull request is created and approved by others on the team. On approval, the branches are merged and the feature branch is normally deleted to keep the project clean at all times.

With more time on our hands, we would have preferred to import the repository to Azure DevOps and create pipelines for Dev, Validation, Test and Prod, as for the CRM solution.

Bicep honourable mention and hot tip

The Azure resources used in the project have mostly been created on the fly as part of experimentation, but we would most definitely have created Bicep files for deploying them to each environment as well. Microsoft MVP 👉Jan Vidar Elven👈 has created a super useful public repository with templates for deploying resources on his GitHub account: https://github.com/JanVidarElven/workshop-get-started-with-bicep

Digital Transformation: Welcome Hogverse!

🧙🏻‍♂️ The Challenge 🧙🏻‍♂️

It’s been five years since the whole world realized what we’ve known for years – the future is digital, and we all need to get on the wagon. Unfortunately, Hogwarts staff and board have been lagging behind, not realizing the immense capabilities that lie in intelligent automation.

💡 The Solution 💡

First of all, we needed to create a digital platform for staff and students. We decided to store all information about who and what’s going on at Hogwarts in Dataverse, making it easily accessible for both pro- and low-code apps and automations.

In the Hogverse Student Management App, Hogwarts staff can keep track of enrolled students and subjects, create teams for teaching, and send out notifications to students. In this way, time-consuming administrative tasks are cut to a minimum, and each teacher has more time for magical moments with their students.

With one click to create a team, teachers and students are all set for online teaching in case of trolls, especially bad weather, or a really good party… er, a pandemic. Teams will also be used for teacher-student communication, as well as automatic notifications in the form of an adaptive card in Teams, in case a student is about to fail a subject:

We send out similar notifications if a student has done something particularly stupid, which removes the need for detention and other time-consuming, medieval methods of punishment. Contract signing is automated using OneFlow connectors in Power Automate, and the signed contracts are stored in the safe “Chamber of Secrets” – SharePoint – triggered if a student is alarmingly interested in death spells or the doctrine of Dark Magic.


The daily life of a student also gets a lot easier with this digital transformation. No more carrying around heavy, ancient and outdated books. You get all the information you need for studying by asking Dobby in the Student Portal.

The Student Portal is always up to date, since the AI assistant is instructed to query the Harry Potter DB (api.potterdb.com) for spells and potions. The Portal also gives students full oversight of test results and grades, so they always know which subject to focus on.

The Student App, which you bring with you wherever you go, takes teaching out of the classroom and enables learning on the go. Here, students have access to their teachers in their pockets, and they are encouraged to always keep learning. It is also nice not to have to stand in line in front of Professor Sybill’s office to get a fortune reading after having a coffee.

The implementation of Teams will also open a whole new world to the students. Getting in-app notifications for urgent matters, being able to communicate with teachers without searching the whole school premises, and having the ability to take that Monday morning 8 o’clock History of Magic lesson from the comfort of your warm bed will hopefully increase student satisfaction.


✨ The Outcome ✨

Everyone who has taken a pedagogy class or two knows that a student’s well-being is crucial for their ability to learn. These steps of digital transformation will therefore play an important role in educating the future Magicians of the World.

We are also hopeful that the new digital tools provided to Hogwarts’ staff will decrease turnover and long-term sick leave. We have seen too many new hires at Hogwarts lead to the return of Voldemort, and this might mitigate the risk of a new Dark Age.

All in all, a more digital magic world is a safer place for all of us Muggles. 🪄


Fabric Fera Verto

“It’s EVIDIosa, not Leviosaaaa” – a contribution to the delivery for the Fabric Fera Verto category. Here we explain our Fabric setup for this category.

Workspaces

We are using three workspaces: Dev for development, Test for UAT testing, and Prod for production.

  • Dev: The Room of Requirement
  • Test: The Restricted Section
  • Prod: The Great Hall

Deployment Pipelines

We have implemented deployment pipelines for deploying changes between workspaces so that the reports can be tested and verified before going into production.

Hogverse Deployment Pipelines for deploying items between workspaces.

Medallion Architecture

We use the medallion architecture to organize data into three layers: Bronze (raw, unprocessed data), Silver (cleaned and enriched data), and Gold (aggregated and analytics-ready data), enabling a structured, scalable approach to data processing and analytics.

Bronze layer
The bronze or raw layer of the medallion architecture is the first layer of the lakehouse. It’s the landing zone for all data, whether it’s structured, semi-structured, or unstructured. The data is stored in its original format, and no changes are made to it.

Silver layer
The silver or validated layer is the second layer of the lakehouse. It’s where you’ll validate and refine your data. Typical activities in the silver layer include combining and merging data and enforcing data validation rules like removing nulls and deduplicating. The silver layer can be thought of as a central repository across an organization or team, where data is stored in a consistent format and can be accessed by multiple teams. In the silver layer you’re cleaning your data enough so that everything is in one place and ready to be refined and modeled in the gold layer.
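
To make those activities concrete, here is a small PySpark sketch of silver-layer validation. The rules and the output table name are illustrative; the flatten-and-rename code we actually used is shown under “Medallion implementation” below:

# Illustrative silver-layer validation: enforce a required key and
# deduplicate before writing. The rules and table names are examples only.
import pyspark.sql.functions as F

bronze_df = spark.read.table("Lakehouse_Bronze.potions")

validated_df = (
    bronze_df
    .withColumn("Id", F.col("`data.id`"))  # lift the key out of the raw schema
    .filter(F.col("Id").isNotNull())       # drop rows that have no key
    .dropDuplicates(["Id"])                # keep one row per potion
)

validated_df.write.format("delta").mode("overwrite") \
    .saveAsTable("Lakehouse_Silver.potions_validated")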

Gold layer
The gold or enriched layer is the third layer of the lakehouse. In the gold layer, data undergoes further refinement to align with specific business and analytics needs. This could involve aggregating data to a particular granularity, such as daily or hourly, or enriching it with external information.

Data ingestion

We used Data Pipelines and Dataflows Gen2 to retrieve data into Fabric and store it in our Bronze Lakehouse. Below is an image of the items we use to ingest data into the platform.

Dataflows Gen 2

We use Dataflows Gen2 to retrieve data from another Dataverse tenant using a Service Principal that allows “Accounts in any organizational directory (Any Microsoft Entra ID tenant – Multitenant)” to connect, since there was an error creating a free Fabric capacity in our CDX tenant. The data is stored in our Bronze Lakehouse.

Example:

Step 1: Retrieving data using the Dataverse connector in Dataflows Gen2.

Step 2: Storing the data in our Bronze Lakehouse.

Data pipeline

We use Data Pipelines to retrieve data from the open Harry Potter APIs and store it, likewise, in our Bronze Lakehouse.

Example:

Step 1: Using a Copy data activity in the Data Pipeline to retrieve data from an external source:

Step 2: Retrieves data from the external source using the endpoint
https://api.potterdb.com/v1/potions

Step 3: Stores the data in our lakehouse in a table called Potions.

Step 4: Maps all fields from the external source to the destination table.

The data in the Potions table of our Lakehouse, now updated from the external source, is later imported into Dataverse.
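
For completeness, the same endpoint can also be walked page by page in code. A small Python sketch, assuming PotterDB’s JSON:API responses expose a links.next URL for pagination:

# Hedged sketch: fetch every page of the potions endpoint.
# Assumes a JSON:API response with a "links.next" URL per page.
import requests

def fetch_all_potions() -> list[dict]:
    url = "https://api.potterdb.com/v1/potions"
    records = []
    while url:
        payload = requests.get(url, timeout=10).json()
        records.extend(payload["data"])
        url = payload.get("links", {}).get("next")  # None on the last page
    return records

potions = fetch_all_potions()
print(f"Fetched {len(potions)} potions")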

Medallion implementation

We use Notebooks and PySpark to implement the data transformations between the medallion layers. Below we go through some examples for each layer.

Bronze layer

The bronze layer consists of raw data: data imported “as is”, without any transformation. In the image below we can see how the data retrieved from the Harry Potter API looks without any transformation.👇

Silver layer

In the silver layer, we remove and rename columns and store a cleaner table in the Silver Lakehouse. Below is the code for transforming the Potions table from the bronze layer shown in the picture above. The remaining tables use the same structure to transform the columns and data.

# Table Potions

import pyspark.sql.functions as F

# 1. Read Bronze table
bronze_df = spark.read.table("Lakehouse_Bronze.potions")

# 2. Flatten & rename columns (and remove unneeded ones)
silver_df = bronze_df.select(
    F.col("`data.id`").alias("Id"),
    F.col("`data.type`").alias("Type"),
    F.col("`data.attributes.name`").alias("Name"),
    F.col("`data.attributes.slug`").alias("Slug"),
    F.col("`data.attributes.difficulty`").alias("Difficulty"),
    F.col("`data.attributes.effect`").alias("Effect"),
    F.col("`data.attributes.ingredients`").alias("Ingredients")
)

# 3. Write to Silver
silver_df.write \
    .format("delta") \
    .mode("overwrite") \
    .saveAsTable("`Lakehouse_Silver`.`potions`")

After the transformation, the Silver Lakehouse and the Potions table look like this👇

Gold layer

In the gold layer we create dimension tables, fact tables and aggregated data.

Below is the code for creating the fact table from the Potions table, as seen in the previous examples:

Fact Table

# Create a Fact Table (FactPotions)
# 1. Join potions data to dimDifficulty so each potion references a numeric DifficultyID.
# 2. (Optional) add a PotionKey if you want a unique fact table key.
# 3. Write the result to a Gold “fact” table.

import pyspark.sql.functions as F
from pyspark.sql.window import Window

# Re-read silver potions to keep original schema
silver_potions_df = spark.read.table("Lakehouse_Silver.potions")

# Read the newly created dimension to get DifficultyID
dimDifficulty_df = spark.read.table("Lakehouse_Gold.dimDifficulty")

# 1) Join on the Difficulty column
factPotions_df = (silver_potions_df.join(dimDifficulty_df, on="Difficulty", how="left"))

# 2) (Optional) add an auto-increment surrogate key for each row
factPotions_df = factPotions_df.withColumn("PotionKey", F.row_number().over(Window.orderBy("Id")))

# Reorder columns for clarity
factPotions_df = factPotions_df.select(
    "PotionKey",
    "Id",
    "Name",
    "Slug",
    "DifficultyID",
    "Difficulty",
    "Effect",
    "Ingredients",
    "Type",
)

# 3) Write to Gold as a fact table
factPotions_df.write \
    .format("delta") \
    .mode("overwrite") \
    .option("overwriteSchema", "true") \
    .saveAsTable("Lakehouse_Gold.factPotions")

Aggregated Table

# Aggregated Summary Table

import pyspark.sql.functions as F

# Read the Gold fact and dimension tables created in the other steps
factPotions_df = spark.read.table("Lakehouse_Gold.factPotions")
dimDifficulty_df = spark.read.table("Lakehouse_Gold.dimDifficulty")

# Count potions per difficulty
agg_potions_df = factPotions_df.groupBy("DifficultyID").agg(F.count("*").alias("CountOfPotions"))

# Join to the dimension to get the difficulty name
agg_potions_df = agg_potions_df.join(dimDifficulty_df, on="DifficultyID", how="left")

# Write as a separate summary table
agg_potions_df.write \
    .format("delta") \
    .mode("overwrite") \
    .option("overwriteSchema", "true") \
    .saveAsTable("Lakehouse_Gold.potionsByDifficulty")

Difficulty Dimension

# Create a Difficulty Dimension (DimDifficulty)
# 1. Read the silver potions.
# 2. Extract unique difficulty values.
# 3. Assign a numeric DifficultyID.

import pyspark.sql.functions as F
from pyspark.sql.window import Window

# Read Silver Potions
silver_potions_df = spark.read.table("Lakehouse_Silver.potions")

# 1) Create a distinct list of difficulties
dimDifficulty_df = (silver_potions_df.select("Difficulty").distinct().filter(F.col("Difficulty").isNotNull()))

# 2) Generate a numeric key (DifficultyID) using row_number
windowSpec = Window.orderBy("Difficulty")
dimDifficulty_df = dimDifficulty_df.withColumn("DifficultyID", F.row_number().over(windowSpec))

# 3) Reorder columns so the ID is first
dimDifficulty_df = dimDifficulty_df.select("DifficultyID", "Difficulty")

# 4) Write to Gold dimension table
dimDifficulty_df.write \
    .format("delta") \
    .mode("overwrite") \
    .option("overwriteSchema", "true") \
    .saveAsTable("Lakehouse_Gold.dimDifficulty")

After the transformation we have dimension, fact and aggregated tables👇

Semantic Model

The semantic model for reporting is built on the Gold layer. The best result would be a star schema, but unfortunately that was not fully implemented.

Aaand the report with the aggregated data and the transformed columns ended up looking like this.

Let’s show and tell about our KnowItAll chat

Let’s be honest—Harry and Ron wouldn’t have made it through their years at Hogwarts without Hermione. Her knowledge of spells and magical theory saved the day more times than we can count. But let’s not forget, it’s not just about knowing the spell—it’s about saying it correctly too.

Here comes the KnowItAll chat, available to all students at Hogwarts. Now everyone can have their own Hermione at hand in desperate, and not so desperate, times.

We have utilized the Azure OpenAI endpoint to deliver the user’s message to the “magificial” Hermione, sending the response on to the Azure AI Text to Speech model. The speech model provides the voice returned to the canvas app, in a British, lady-like Hermione voice. Explained in more detail here.
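
A condensed Python sketch of that chain, with keys, region, deployment and voice names as placeholders (we are assuming a GPT-4 deployment on Azure OpenAI and one of the British neural voices in Azure AI Speech):

# Hedged sketch of the KnowItAll chain: Azure OpenAI answers as Hermione,
# then Azure AI Speech reads the answer aloud. All credentials and names
# below are placeholders, not our actual configuration.
import azure.cognitiveservices.speech as speechsdk
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<azure-openai-key>",
    api_version="2024-02-01",
)

answer = client.chat.completions.create(
    model="<gpt-4-deployment-name>",
    messages=[
        {"role": "system", "content": "You are Hermione Granger. Be right, always."},
        {"role": "user", "content": "How do I pronounce Wingardium Leviosa?"},
    ],
).choices[0].message.content

speech_config = speechsdk.SpeechConfig(subscription="<speech-key>", region="<region>")
speech_config.speech_synthesis_voice_name = "en-GB-SoniaNeural"  # British, lady-like
speechsdk.SpeechSynthesizer(speech_config=speech_config).speak_text_async(answer).get()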

While a lot already works, we reeeeeally wanted to get audio into the Canvas App so that the user is told how to pronounce the spells. However, we must admit defeat: we haven’t made it work. Maybe if the hackathon was a little bit longer…

Everyone shall pass!

The worst part of being a teacher? Having to spend time dragging lazy, unmotivated students through a course they show no interest in – just to tick off that KPI of a 90% pass rate. ✅

Leave that to the magic cards! Sent out by owls (well, a Power Automate flow), students at risk of failing a subject are automatically notified directly in Teams, two weeks before the semester ends.

A neat little reminder and the possibility to pass the subject by showing your magic in a quick quiz!

The hipsters are here with no intentions of going anywhere!

At It’s EVIDIosa, not Leviosaaaa, we never allow ourselves to rest on our laurels, and this wonderful event at the mighty Soria Moria is no exception. While being extremely reasonable people, we are still in constant opposition to mainstream stuff unless we are given a good and well-thought-out justification. That leads us to…

Power Pages or NOT Power Pages

We all have a relationship with Power Pages, and it is quite complex, as we recognize the great OOB capabilities in terms of forms, lists, Web API and so on. But the developer experience can often feel quite limited: you often have to work in the same environment, clear the cache at all times, and struggle to use the newest and hippest technology.

With our complete template set up using Svelte, each developer can easily work locally, on their own git branch and their own machine, with new and flexible front-end technology, while never having to worry about getting in the way of other developers. After the initial setup it is mostly quite basic HTML and TypeScript, allowing developers with some experience from working with templates in Power Pages to have a smoooooth experience and crafty performance.

Let’s have a look at some crucial setup. The authentication process for the signed-in user is handled with Microsoft’s MSAL library, allowing for single sign-on and silent token handling.

We define types server-side from Dataverse, retrieving just the information we want, which lets us use each record’s columns easily instead of resorting to FetchXML and Liquid. (Extremely hip!!)

But enough code for now! Let’s look at the beauty of the page itself! It’s so hip we feel younger just by looking at it! And by the way: we have not yet mentioned licences, which can be a huge factor.


The stairway of music to the sound of heaven’s voice… or something like that

While working on a feature in our beloved Canvas App – featuring a helper who is slightly annoying for always being right, but very valuable, in Hermione Granger – we decided that it was necessary to combine a couple of Microsoft Cloud APIs to get things going.

We have utilized the Azure OpenAI endpoint to deliver the user’s message to the “magificial” Hermione, sending the response on to the Azure AI Text to Speech model. The speech model provides the voice returned to the canvas app.

As you can see, we have also decided to store the audio files from the conversation in Azure Blob Storage. Why, you might ask? Well, paranoia, given all the chaos every year. Anyway, they still promise to never listen to the files.

Thieving Bastards

Harry Potter APIs

For our Hogverse solution, we’re tapping into multiple open Harry Potter APIs as our database for potions, spells, spell books, and more. This approach is brilliant because it eliminates the need to maintain our own database, ensuring our app is always updated with the latest magical content and stays in sync with the rest of the wizarding world! ✨📚

Stolen the wand cursor from the event committee themselves

We are of course so sorry for stealing this from the good people who have made this hackathon possible. Stealing an idea hurts more – it’s like swiping someone’s spark, not just their stuff!

However, we could not resist adding something so fun and magical to our portal.

Large design code from The Golden Snitches

We also got our hands on Canvas App code from The Golden Snitches. It is a real art to create a visually pleasing and responsive design. The code snippet covers a whole screen with multiple vertical and horizontal containers, shadows, glossy containers, shadow boxes, HTML controls – eeeeverything.

From this code we stole some very useful tricks to create our responsive and eye-pleasing design in our canvas app here.