Enchanting Dashboards: Claiming the Dash It Out Badge

Dash It Out: In the spirit of the Marauder’s Map, we have conjured a dashboard that is both visually stunning and incredibly informative. This dashboard is not just a collection of graphs and KPIs; it is a powerful tool designed to provide valuable insights.

  1. Current Total Points Handed Out for the Semester: Much like the House Points Hourglasses in the Great Hall, this graph shows the total points awarded throughout the semester, giving us a clear view of the academic achievements and contributions of our students.
  2. Who Awarded the Most Points: This chart reveals the professors who have awarded the most points. It highlights the dedication and encouragement provided by our esteemed faculty members, fostering a competitive yet supportive environment.
  3. Sum of Points Awarded by House: This graph, reminiscent of the House Cup standings, displays the total points awarded to each house. It provides a visual representation of the friendly rivalry between Gryffindor, Hufflepuff, Ravenclaw, and Slytherin, motivating students to strive for excellence.
  4. Statistics About the Professors Who Awarded Points: This report, much like the meticulous notes of Hermione Granger, details the statistics of the professors who have awarded points. It includes insights into their teaching styles, frequency of awarding points, and the impact of their encouragement on student performance.

To create this dashboard, we utilized our preferred data visualization framework, leveraging its capabilities to build a solution that is both robust and user-friendly.

Mischief managed! 🧙‍♂️✨

RESCO

We used a Resco Kanban board PCF to visualize the Activities assigned to the students in our model-driven apps.

A Kanban board PCF control visualizes workflow progress by displaying tasks in different stages of completion. It can be used for project management, process optimization, and enhancing team collaboration.

We added a new tab to the Activity Overview form.

Advantages of Kanban Board PCF Components:

  1. Visual Task Management
    • A Kanban board provides a clear and intuitive visual representation of tasks and their statuses. This helps users quickly understand workload distribution and progress at a glance.
  2. Drag-and-Drop Functionality
    • PCF components support interactive features like drag-and-drop, making it easier to update task statuses without manually editing fields or navigating between forms.
  3. Real-Time Updates
    • The Kanban board can fetch and display data in real-time from the Dataverse, ensuring that users are always working with the latest information.
  4. Improved Collaboration
    • Teams can use the board to assign, track, and prioritize tasks collaboratively, leading to better alignment and accountability.
  5. Increased Efficiency
    • By reducing the need for context-switching (e.g., switching between forms or views), a Kanban board improves task management efficiency within the Power Apps environment.
  6. Enhanced User Experience
    • The interactive and user-friendly interface of a PCF-based Kanban board enhances user engagement and adoption, especially for non-technical users.
  7. Task Prioritization and Tracking
    • The ability to sort tasks into columns (e.g., “To Do,” “In Progress,” “Done”) helps prioritize work and ensures nothing falls through the cracks.
  8. Supports Agile Methodologies
    • Ideal for teams using Agile or Scrum methodologies, allowing them to visualize backlogs, sprints, and task progress directly in the Dataverse.

Automating the deployment of an IoT device isn’t just YAML.

We need things to happen after our build, and it is more complicated than a simple script. We have a server running that needs to receive a message, then make sure all new dependencies are installed and the code is restarted.

We have our usual deploy.yaml in our GitHub Actions. It makes a POST request to our own API, which needs to run some OS commands on-device for everything to update properly.

In addition, our IoT device needs to capture this request and ensure it is updated. Also, please don’t take more time than what is required!
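To make this concrete, here is a minimal sketch of the on-device side, assuming a small Node.js/TypeScript listener (the endpoint path, port, and commands are illustrative placeholders, not our exact setup):

import { createServer } from "node:http";
import { execSync } from "node:child_process";

// Listen for the deploy webhook and update the device in place.
createServer((req, res) => {
	if (req.method === "POST" && req.url === "/deploy") { // hypothetical endpoint
		execSync("git pull && npm ci", { stdio: "inherit" }); // fetch new code and install new dependencies
		execSync("pm2 restart app", { stdio: "inherit" }); // restart the running code (assuming pm2)
		res.writeHead(200).end("updated");
	} else {
		res.writeHead(404).end();
	}
}).listen(8080);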

House Points for Slytherin: Helping “The Name Who Must Be Named” with Link Mobility 📞⚡

In the dark corridors of tech, where even the most powerful wizards and witches face challenges, it’s not just about pulling off the most daring magic, but about offering a hand to those in need. Sometimes, the most unexpected of allies can step in to help, and in this case, Slytherin has found itself not only achieving greatness but helping others do the same.

One of the lesser-known struggles for The Name Who Must Be Named was obtaining the right phone number from Link Mobility—a crucial piece for triggering their desired process. But as with many such hurdles, it’s not about the problem but how you solve it. That’s where we, the Slytherins, stepped in.


Lending a Helping Hand

When we discovered that The Name Who Must Be Named did not have the necessary phone number from Link Mobility, we saw this as an opportunity to lend a hand. Rather than just letting them continue their journey without the proper tools, we reached out to Link Mobility and provided our own phone number and code number. This seemingly small act paved the way for something much larger.


The Magic Behind the Flow

But, of course, magic can’t just stop there. Once we shared the phone number and code number, the next step was ensuring the flow worked seamlessly in their environment. This is where the true magic happened. 🔮

We designed and triggered a cloud flow within our tenant. Here’s how it works:

  1. Triggering the Process: Once a message arrives on the phone number, our cloud flow takes over. It acts as a proxy between Link Mobility and The Name Who Must Be Named’s flow, ensuring that all data is properly transmitted.
  2. Sending the Body: The body of the received message from Link Mobility is forwarded to their flow via the cloud flow (conceptually sketched below), making sure everything runs smoothly and automatically.
  3. Achieving Greatness: With the data now flowing effortlessly, they are able to achieve what they set out to do. What once appeared to be a stumbling block has now been transformed into an opportunity for success. ⚡
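The actual proxy is a Power Automate cloud flow, but conceptually it boils down to forwarding the received body (the URL below is a placeholder for their flow’s HTTP trigger):

// Conceptual sketch of the proxy step, not the actual cloud flow definition.
export async function forwardLinkMobilityMessage(body: unknown): Promise<void> {
	await fetch("https://<their-flow-http-trigger-url>", {
		method: "POST",
		headers: { "Content-Type": "application/json" },
		body: JSON.stringify(body),
	});
}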

A Lesson in Collaboration

This small act serves as a reminder that the world of tech and magic isn’t just about competition or looking out for yourself. It’s about collaboration, and lending a helping hand when others need it. By sharing our phone number and providing the cloud flow proxy, we helped The Name Who Must Be Named overcome their challenge and achieve their goals. After all, Slytherin doesn’t just win—it helps others win too.


House Points for Slytherin! 🐍✨

In the end, it wasn’t just about solving a problem. It was about showing that even in a world where magical and technological solutions reign supreme, it’s always the people who make the most difference. By using cloud flows and proxy magic, we’ve not only enhanced our own capabilities but helped our fellow wizards and witches soar.

Just remember: It’s not about the victory, but about making sure everyone’s journey is as seamless as possible. And Slytherin is always here to help. 🖤

Virtual potion ingredients: XR in PCF components

We already have access to MR (a.k.a. Magic Reality) components in Canvas Apps. Implementation is straightforward, but as expected they come with a limited set of features. The components are based on Babylon.js and make for a quick and easy way to place and view a 3D model in an augmented reality context.

For our solution, we wanted the user to also be able to interact with the virtual objects in front of them, which is not an OOB feature, so, expressing our power user love, we decided to explore the possibilities around custom XR-enabled PCF components.

Being ACDC craftsmen, and knowing the potential issues of going too far down the wrong path, we decided to do some proofs of concept, creating custom PCF components with third-party XR libraries, acting like proper thieving bastards along the way.

First off, we had a look at AR.js, which is built on ARToolKit, a relatively old library. It could give us wide device support, which really didn’t have much value considering the component would be running inside the Power Apps mobile app. We would also be forced to use either image-target or marker tracking, with no modern AR spatial tracking.

Looking closer at the OOB components, we tried to find a way to leverage the OOB Babylon.js logic, hopefully being able to hook into the React Native part of the implementation, which would give great benefits in terms of access to device specific XR features (ARCore for Android and ARKit for iOS). We did, however, decide to leave this path, and focus elsewhere.

Wizard Tracking: 3D geolocation in Canvas App

In our solution, users will be gathering ingredients using object detection in a Canvas App. The AI model used for this has been trained on objects around the conference venue, so we wanted to enhance the connection between the app and the real world. Already having access to the user’s geolocation through the geolocation Web API inside the Canvas App and any PCF components, we decided to use these data to place the active users on a 3D representation of the venue, expressing our power user love by merging 3D graphics with the OOB Canvas App elements.

We were able to find a simple volume model of the buildings on the map service Kommunekart 3D, but these data seem to be provided by Norkart and are not freely available.

Like the thieving bastards we are, we decided to scrape the 3D model off of the site by fetching all the resources that looked like binary 3D data. The data was in B3DM format, and we found the buildings in one of these files. We used Blender to clean up the model, removing surrounding buildings, and exported it to the glTF 3D file format for use in a WebGL 3D context.

We rendered the 3D model with Three.js, which let us create an HTML canvas element inside the PCF component and use its WebGL context to render the model in 3D. The canvas is continuously rendered using requestAnimationFrame under the hood, making it efficient in a browser context. The glTF model was loaded using a data URI, as a workaround for the web resource file format restrictions.
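A minimal sketch of that setup (the container element comes from the PCF control’s init, and the data URI is truncated here):

import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader.js";

declare const container: HTMLDivElement; // provided by the PCF control on init

const renderer = new THREE.WebGLRenderer({ antialias: true });
container.appendChild(renderer.domElement);

const scene = new THREE.Scene();
scene.add(new THREE.AmbientLight(0xffffff));
const camera = new THREE.PerspectiveCamera(60, 1, 0.1, 1000);
camera.position.set(0, 50, 100);

// Load the venue model from a data URI (workaround for web resource file type restrictions).
new GLTFLoader().load("data:model/gltf-binary;base64,...", (gltf) => scene.add(gltf.scene));

// setAnimationLoop uses requestAnimationFrame under the hood.
renderer.setAnimationLoop(() => renderer.render(scene, camera));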

The coordinates from the user’s mobile device come in as geographic coordinates: longitude, latitude, and altitude. The next step was to map these values relative to a known coordinate in the building, which we chose to be the main entrance. Using the main entrance’s geographic coordinates, we could convert them to Cartesian coordinates (X, Y, Z), do the same with the real-time coordinates from the user, and subtract the origin to get the offset in meters. The conversion from geographic to geocentric coordinates was done like so:

// eslint-disable-next-line @typescript-eslint/consistent-type-definitions
export type CartesianCoordinates = { x: number; y: number; z: number };

// eslint-disable-next-line @typescript-eslint/consistent-type-definitions
export type GeographicCoordinates = { lat: number; lon: number; alt: number };

// Conversion factor from degrees to radians
const DEG_TO_RAD = Math.PI / 180;

// Constants for WGS84 Ellipsoid
const WGS84_A = 6378137.0; // Semi-major axis in meters
const WGS84_E2 = 0.00669437999014; // Square of eccentricity

// Function to convert geographic coordinates (lat, lon, alt) to ECEF (x, y, z)
export function geographicToECEF(coords: GeographicCoordinates): CartesianCoordinates {
	// Convert degrees to radians
	const latRad = coords.lat * DEG_TO_RAD;
	const lonRad = coords.lon * DEG_TO_RAD;

	// Calculate the radius of curvature in the prime vertical
	const N = WGS84_A / Math.sqrt(1 - WGS84_E2 * Math.sin(latRad) * Math.sin(latRad));

	// ECEF coordinates
	const x = (N + coords.alt) * Math.cos(latRad) * Math.cos(lonRad);
	const y = (N + coords.alt) * Math.cos(latRad) * Math.sin(lonRad);
	const z = (N * (1 - WGS84_E2) + coords.alt) * Math.sin(latRad);

	return { x, y, z };
}
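
To make the subtraction step concrete, a small helper could look like this (a hypothetical addition built on the function above, not part of the shipped component):

export function offsetFromOrigin(user: GeographicCoordinates, origin: GeographicCoordinates): CartesianCoordinates {
	const u = geographicToECEF(user);
	const o = geographicToECEF(origin);
	// For an area as small as a single venue, the ECEF difference is a good
	// approximation of the local offset in meters.
	return { x: u.x - o.x, y: u.y - o.y, z: u.z - o.z };
}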

This gave us fairly good precision, but not without the expected inaccuracy caused by being indoors.

In our solution, the current position is then represented by an icon moving around the 3D model based on the current GPS data from the device.

To connect this representation to real-time data from all the currently active users, we decided to set up an Azure SignalR Service, with an accompanying Azure Storage account and Azure Function App for the backend, bringing it all to the cloud, almost like a stairway to heaven. With this setup, we could use the @microsoft/signalr package inside the PCF component, receiving connection, disconnection, and location update messages broadcast from all other users, showing where they are right now.
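A minimal sketch of the client side inside the PCF component (the hub URL and message name are placeholders for our actual setup):

import { HubConnectionBuilder } from "@microsoft/signalr";

const connection = new HubConnectionBuilder()
	.withUrl("https://<function-app>.azurewebsites.net/api") // negotiate endpoint exposed by the Function App
	.withAutomaticReconnect()
	.build();

// "locationUpdate" is a hypothetical message name; the payload mirrors GeographicCoordinates.
connection.on("locationUpdate", (userId: string, coords: { lat: number; lon: number; alt: number }) => {
	// Convert to ECEF, subtract the origin, and move this user's icon on the 3D model.
});

connection.start().catch(console.error);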

Keep calm and ALM on: Managing your apps from spark to finish – ACDC Craftsman

In our journey to streamline our development and deployment processes, we embraced the principles of low-code development. Our goal was to align our CI/CD processes with these principles, and we found the perfect solution in Microsoft Power Platform Pipelines, enhanced by the new Git integration feature. This approach provided us with business value, offering an easily maintainable ALM solution that ensured compliance, security, and privacy protection.

Setting Up Our Environment

To begin, we created a dedicated production environment for hosting our pipeline app. This step was crucial in segregating our production workloads from development and testing activities, ensuring a stable and secure environment for our live applications.

Next, we installed the Power Platform Pipelines Model-Driven App and started configuring our pipeline:

Once the pipeline was fully configured, we verified our solution by checking its connection to the pipeline. This verification step was essential to ensure that our setup was correct and that we could initiate the first deployment to our TEST environment without any issues.

Leveraging Source Control

One of the standout features of our implementation was the integration with Git. By utilizing the source control menu, we could easily view the change log and track modifications made to our solutions. This transparency was invaluable for facilitating change validation and code reviews among our developers.

In Azure DevOps, we created a new project with branches related to the respective environments.

Establishing Naming Conventions

To maintain clarity, we used prefixes on our canvas app screens and for other components within the solution. This practice helped ensure that each component was easily identifiable and organized, facilitating better management and reducing confusion.

For example, as illustrated in the image below, we aimed to standardize the naming convention for screens, containers, and other controls in general. The purpose was to make it easier to reference these components later using Power Fx. We applied this practice to our canvas app, as it is a common best practice elsewhere when working with Dynamics 365 modules and related components (e.g., forms, security roles).

Adhering to Best Practices

Throughout this process, we adhered to several best practices for ALM in Power Platform:

  • Environment Strategy: We used separate environments for development and production, ensuring that changes were tested in DEV before deployment.
  • Solutions: We utilized managed solutions for production environments and unmanaged solutions for development, aligning with industry guidelines.
  • Source Control: Our integration with Git and the implementation of a branching strategy ensured effective version control and collaboration.
  • Automation: By configuring Power Platform Pipelines, we automated our deployment processes, reducing manual errors and ensuring consistency.
  • Governance and Security: We implemented role-based access control and ensured compliance with security protocols, protecting our data and applications.

Application lifecycle management (ALM) basics with Microsoft Power Platform – Power Platform | Microsoft Learn

Keep your school under control with the Marauder’s Map, including a mobile app and searching in multiple ways

14:41: Updated to include additional information around embedding the map in a mobile app and searching in multiple ways.

Want to see if other professors are around the school? We found a magical map in the School form of our OwlExpress app. The Marauder’s Map uses WebSocket magic to track everyone on the school premises Right Now.

It also uses device-embedded voice recognition to understand your spells. To reveal the map, you need to say: “I solemnly swear that I’m up to no good.”

When you are done and want to hide the map, you must say: “Mischief Managed!”
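For the curious: voice toggling like this can be wired up with the browser’s Web Speech API. A minimal sketch (showMap/hideMap are hypothetical helpers, not our exact implementation):

// The Web Speech API is vendor-prefixed in some browsers.
const SpeechRecognitionImpl =
	(window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;

const recognition = new SpeechRecognitionImpl();
recognition.continuous = true;

recognition.onresult = (event: any) => {
	const last = event.results[event.results.length - 1];
	const transcript = last[0].transcript.toLowerCase();
	if (transcript.includes("solemnly swear")) showMap(); // reveal the map
	if (transcript.includes("mischief managed")) hideMap(); // hide it again
};

recognition.start();

function showMap() { /* reveal the Marauder's Map overlay */ }
function hideMap() { /* hide it again */ }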

Do you like the glossy pixels of this map?

Mobile Map

We have also managed to embed this into a canvas app for mobile delivery, for sneaky students up to no good.

Searching in many ways

We have also introduced a glossy new feature for searching our student database in multiple magical ways: via standard text boxes, by scanning a business card, by drawing a student’s name, and by utilising Copilot.

ALM Magic

It’s EVIDIosa, not Leviosaaaa – thoughts on ALM and the way of work.

Claims the badge ACDC Craftsman.🚀

Claims the badge Power Of The Shell.🚀 – Please have a look at the “Azure DevOps Pipelines for CRM Solution” section

Workflow – Azure DevOps Boards

We use Azure DevOps Boards to keep track of badges and tasks, for better collaboration, progress, and overview.

Scrum

Given the very short length of the project, there has been a need for tight collaboration and check-ins. In a way, we have practised a super compressed variation of Scrum, with several daily standups each day to secure progress. In a project like this, it is particularly important to avoid banging your head against the wall for too long on one task. Regular meetings have made sure that the team has been able to maintain the best possible progress at all times, while also being a platform for contributing new ideas.

Naming conventions

Tables (Entities)

General

  • Change tracking should be enabled by default for all “business” entities where history might be needed.
    • This is because change tracking is often indispensable for troubleshooting.
    • Change tracking can be disabled for specific fields if necessary, e.g., when automation frequently changes irrelevant fields.

Forms

  • Never modify the standard form. Clone it (save as) and hide the original form from users.
    • Standard forms can be updated by the vendor, which may overwrite changes or disrupt updates.
    • Having the original form for comparison is also useful for troubleshooting.
  • This does not apply to custom tables.
  • Ensure fields like status, status reason, created by/on, modified by/on, and owner are visible on the form.
    • These are useful for both administrators and end-users.

Option Sets

Views

When Should I Use Configuration, Business Rules, Processes, Flows, JavaScript, Plugins, or Other Tools?

  • It is important to choose the right tool for the job.
  • Considering administration, maintenance, and documentation, we prioritize tools in the following order:
    • Standard functionality/configuration
      • Many things can be solved with built-in functionality, e.g., setting a field to read-only doesn’t require a Flow. 😉
    • Business Rules
    • Processes/Workflows
      • Can run synchronously.
    • JavaScript
    • Flows
      • Suitable for querying Dataverse data.
    • Plugins

Solutions

  • Unmanaged in development:
    • We have not yet decided on a solution model but will likely use a base package initially and then move to feature-based solutions.
  • Managed in test and production.

Deployment

  • All changes are deployed via pipelines in DevOps.
  • NO MANUAL STEPS/ADJUSTMENTS IN TEST OR PRODUCTION.
  • All data referenced by automation (e.g., Flow) must be scripted and inserted programmatically (preferably via pipeline) to ensure GUID consistency across environments (see the sketch below).
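
As an illustration of that last point, inserting a reference record with a fixed GUID through the Dataverse Web API could look like this (table and column names are hypothetical):

// Insert reference data with a fixed GUID so automation can reference the
// same record in every environment (hog_* names are hypothetical).
async function insertReferenceData(orgUrl: string, accessToken: string): Promise<void> {
	await fetch(`${orgUrl}/api/data/v9.2/hog_houses`, {
		method: "POST",
		headers: {
			"Content-Type": "application/json",
			Authorization: `Bearer ${accessToken}`,
		},
		body: JSON.stringify({
			hog_houseid: "00000000-0000-0000-0000-000000000001", // same GUID across DEV/TEST/PROD
			hog_name: "Gryffindor",
		}),
	});
}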

Application Lifecycle Management

Why ALM?

Application Lifecycle Management (ALM) enables structured development, testing, and deployment processes, ensuring quality and reducing risks.

  • It is a consistent way of deploying features
  • You get equal environments
  • Better collaboration
  • History
  • A common way of working
  • The whole team is always up to date with changes in development

Overall solution

A diagram showing the overall deployment cycle for our CRM Solution.

Solution Strategy

We are using a single-solution strategy: all ready changes are added to the solution. Fewer dependencies and less complexity are preferable when working within such a time span, and allow for smoother collaboration. The solution is still not too complex, so it makes sense to gather all components in a single solution. As mentioned earlier, moving to feature-based solutions should be considered over time.

Service Principals

Service Principals are used between Azure DevOps and each environment to ensure an industry-standard connection to Dataverse without exposing credentials in the code. These are the names of our Service Connections:

  • hogverse-dev
  • hogverse-validation
  • hogverse-test
  • hogverse-prod

In honor of the great ALM Wizard 👉Benedikt Bergmann👈, the Service Connections are configured with the new Workload Identity federation feature for ADO connections. This eliminates the need for managing and renewing secrets, reducing the risk of pipeline failures due to expired credentials. Credentials are not exposed in code or stored in repositories.

Setup of Workload Identity federation for ADO Connection

  1. App Registration: Registering app registrations in Microsoft Entra ID.
  2. Dataverse: Adding the app registration as an application user in our target Dataverse environment and assigning it the System Administrator role.
  3. ADO Service Connection: Creating a service connection in Azure DevOps, linking it to the Dataverse instance using Workload Identity Federation.
  4. Adding Federated Credentials: Configuring the app registration to recognize the ADO service connection by setting up federated credentials with the correct issuer and subject identifier.
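
For reference, the federated credential on the app registration ends up in roughly this shape (values are placeholders; the issuer/subject format follows the ADO Workload Identity Federation convention):

// Shape of the federated credential (placeholder values, shown as a TypeScript literal):
const federatedCredential = {
	name: "hogverse-prod-federation", // any descriptive name
	issuer: "https://vstoken.dev.azure.com/<organization-id>",
	subject: "sc://<organization>/<project>/<service-connection-name>",
	audiences: ["api://AzureADTokenExchange"],
};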

Entra ID:

Environments

  • DEV: The Room of Requirement
    The development environment for Hogverse.
  • VALIDATION: The Chamber of Truth
    The validation environment to check changes before merging to our main branch.
  • TEST: The Restricted Section
    The UAT test environment for Hogverse.
  • PROD: The Great Hall
    The production environment for Hogverse.

Pipelines

Our ALM solution for deploying changes in Power Platform and CRM is the following:

  • Export
    The Export Pipeline retrieves the changes from the “DEV: The Room of Requirement” environment and creates a pull request.
  • Build
    The Build Pipeline packs the changes in the pull request branch and deploys them to “VALIDATION: The Chamber of Truth” to see if something breaks before merging to our main branch. When this completes successfully, it creates a release that can be deployed with the Release Pipeline.
  • Release
    The Release Pipeline deploys the changes to “TEST: The Restricted Section” and “PROD: The Great Hall” after a user in the team has approved the release, using Approvals and checks in Azure DevOps Pipelines.

Approval and checks for “TEST: The Restricted Section”

Approval and checks for “PROD: The Great Hall”

Repository Settings:

Settings for the Build Service User.

Settings and requirements for merging changes to our main branch, which goes to production.

Azure DevOps Pipelines for CRM Solution

Claims the badge Power Of The Shell.🚀

Export Pipeline

The Export Pipeline retrieves the changes from the “DEV: The Room of Requirement” environment and creates a pull request. Below is the YAML code for the export pipeline.

trigger: none

parameters:
  - name: SolutionName
    displayName: Solution to export
    type: string
    default: HogverseBasis
  - name: TypeOfTask
    displayName: "Select task type"
    type: string
    default: "feature"
    values:
      - feature
      - bug
  - name: WorkItems
    displayName: Work Item ID needs to be attached to Automated Pull Request. Multiple work items are space separated.
    type: string
    default:
  - name: Description
    displayName: Pullrequest description
    type: string
    default: "This Pull request is generated automatically through build"

# Define branch name dynamically based on build number and solution name
variables:
  - name: BranchName
    value: "${{ parameters.TypeOfTask }}/${{ parameters.SolutionName }}-$(Build.BuildNumber)"

pool:
  vmImage: "windows-latest"

steps:
  - checkout: self
    persistCredentials: true
  - task: PowerPlatformToolInstaller@2
    displayName: "Power Platform Tool Installer"

  # Create New Branch and Commit Changes
  - script: |
      echo checkout source branch
      echo git config user.email $(Build.RequestedForEmail)
      echo git config user.name $(Build.RequestedFor)
      git config user.email "$(Build.RequestedForEmail)"
      git config user.name "$(Build.RequestedFor)"
      git fetch origin
      git checkout main
      git checkout -b "${{ variables.BranchName }}" main
      git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin "${{ variables.BranchName }}"
      echo git checkout -b "${{ variables.BranchName }}"
      git fetch origin
      git checkout ${{ variables.BranchName }}
    displayName: "Checkout GIT Automated CRM Build Branch"

  - task: PowerPlatformPublishCustomizations@2
    displayName: Publish Customizations
    inputs:
      authenticationType: "PowerPlatformSPN"
      PowerPlatformSPN: "hogverse-dev"
      AsyncOperation: true
      MaxAsyncWaitTime: "60"
  - task: PowerPlatformSetSolutionVersion@2
    displayName: "PowerApps Set Solution Version - ${{ parameters.SolutionName }}"
    inputs:
      authenticationType: "PowerPlatformSPN"
      PowerPlatformSPN: "hogverse-dev"
      SolutionName: "${{ parameters.SolutionName }}"
      SolutionVersionNumber: "$(Build.BuildNumber)"
  - task: PowerPlatformExportSolution@2
    displayName: Export ${{ parameters.SolutionName }} - Unmanaged
    inputs:
      authenticationType: "PowerPlatformSPN"
      PowerPlatformSPN: "hogverse-dev"
      SolutionName: "${{ parameters.SolutionName }}"
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\\${{ parameters.SolutionName }}.zip'
      AsyncOperation: true
      MaxAsyncWaitTime: "60"
  - task: PowerPlatformExportSolution@2
    displayName: Export Solution - Managed
    inputs:
      authenticationType: "PowerPlatformSPN"
      PowerPlatformSPN: "hogverse-dev"
      SolutionName: "${{ parameters.SolutionName }}"
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\\${{ parameters.SolutionName }}_managed.zip'
      Managed: true
      AsyncOperation: true
      MaxAsyncWaitTime: "60"
  - task: PowerPlatformUnpackSolution@2
    displayName: Unpack unmanaged solution ${{ parameters.SolutionName }}
    inputs:
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)\\${{ parameters.SolutionName }}.zip'
      SolutionTargetFolder: '$(build.sourcesdirectory)\PowerPlatform\Solutions\${{ parameters.SolutionName }}'
      SolutionType: "Both"

  - script: |
      echo git push "${{ variables.BranchName }}"
      git push origin ${{ variables.BranchName }}
      echo git add --all
      git add --all
      echo git commit -m "Solutions committed by build number $(Build.BuildNumber)"
      git commit -m "Solutions committed by build number $(Build.BuildNumber)"
      echo push code to ${{ variables.BranchName }}
      git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin ${{ variables.BranchName }}
    displayName: "Commit CRM solutions to Automated CRM Build Branch"

  # Install Azure DevOps extension
  - script: az --version
    displayName: "Show Azure CLI version"

  # Install Azure DevOps Extension
  - script: az extension add -n azure-devops
    displayName: "Install Azure DevOps Extension"

  # Login to Azure DevOps Extension
  - script: echo $(System.AccessToken) | az devops login
    env:
      AZURE_DEVOPS_CLI_PAT: $(System.AccessToken)
    displayName: "Login Azure DevOps Extension"

  # Configure Azure DevOps Extension
  - script: az devops configure --defaults organization=https://dev.azure.com/hogverse project="Platform9¾Hub" --use-git-aliases true
    displayName: "Set default Azure DevOps organization and project"

  - script: |
      az repos pr create --repository "CRM.Solutions" --title "Automated Pull Request for solution HogverseBasis from branch ${{ variables.BranchName }}" --auto-complete false --bypass-policy false --description "${{parameters.Description}}" --detect --source-branch "${{ variables.BranchName }}" --target-branch "main" --work-items ${{parameters.WorkItems}}
    displayName: "Create automated Pull request for merging CRM solutions to master build list and PRs"

We use work items to track the changes for each pull request and release.

Build Pipeline

The YAML code for the Build Pipeline.

trigger:
  branches:
    include:
      - main # Trigger build pipeline on changes to main
pr:
  branches:
    include:
      - "*" # Trigger on all PR branches

pool:
  vmImage: "windows-latest"

steps:
  - script: |
      echo "Validating pull request from source branch: $(System.PullRequest.SourceBranch)"
      echo "Target branch: $(System.PullRequest.TargetBranch)"
    displayName: "Validate Pull Request Source Branch"
  - task: PowerPlatformToolInstaller@2
    displayName: "Power Platform Tool Installer "

  - task: PowerPlatformPackSolution@2
    inputs:
      SolutionSourceFolder: '$(build.sourcesdirectory)\PowerPlatform\Solutions\HogverseBasis'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\Solutions\HogverseBasis.zip'
      SolutionType: "Managed"

  - task: PublishPipelineArtifact@1
    displayName: Publish Artifacts
    inputs:
      targetPath: "$(Build.ArtifactStagingDirectory)"
      artifact: "drop"
      publishLocation: "pipeline"

  - task: PowerPlatformImportSolution@2
    displayName: "Power Platform Import HogverseBasis to Validation"
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: "hogverse-validation"
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)\Solutions\HogverseBasis.zip'
      StageAndUpgrade: false
      ActivatePlugins: false
      SkipLowerVersion: true

Release Pipeline

The code for the release pipeline.

trigger: none
pr: none

resources:
  pipelines:
    - pipeline: Build # Reference to the build pipeline
      source: Build # Name of the build pipeline to trigger from
      trigger:
        branches:
          include:
            - main

stages:
  - stage: DeployTest
    displayName: "Deploy to Test"
    jobs:
      - deployment: deployTest # Use deployment job for environment reference
        environment: "The Great Hall - Test" # Reference the 'The Great Hall - Test' environment
        pool:
          vmImage: "windows-latest"
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - download: Build # pipeline resource identifier.
                  artifact: drop
                - task: PowerPlatformToolInstaller@2
                  displayName: "Power Platform Tool Installer"
                - task: PowerPlatformImportSolution@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-test"
                    SolutionInputFile: '$(Pipeline.Workspace)\Build\drop\Solutions\HogverseBasis.zip'
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"
                - task: PowerPlatformPublishCustomizations@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-test"
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"
  - stage: DeployProd
    displayName: "Deploy to Production"
    dependsOn: DeployTest # Depends on successful deployment to Test
    condition: succeeded()
    jobs:
      - deployment: deployProd # Use deployment job for environment reference
        environment: "The Restricted Section - Prod" # Reference the 'The Restricted Section - Prod' environment
        pool:
          vmImage: "windows-latest"
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - download: Build # pipeline resource identifier.
                  artifact: drop
                - task: PowerPlatformToolInstaller@2
                  displayName: "Power Platform Tool Installer"
                - task: PowerPlatformImportSolution@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-prod"
                    SolutionInputFile: '$(Pipeline.Workspace)\Build\drop\Solutions\HogverseBasis.zip'
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"
                - task: PowerPlatformPublishCustomizations@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-prod"
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"

Issues

We had this problem during the event, and unfortunately we did not get to run our pipelines the way we wanted.

We tried a workaround, but that ended up with… a self-hosted agent.

It looked good all the way through the installation aaaaand in the final step it was blocked by admin rights at the company… You need admin rights…

ALM for Fabric

We have implemented deployment pipelines for deploying changes between workspaces so that the reports can be tested and verified before going into production.

Hogverse Deployment Pipelines for deploying items between workspaces.

ALM for Svelte and Front end

The Svelte project is for now hosted in a private GitHub repository shared between the developers. Each developer creates their own branch for each new feature. When a feature is ready, a pull request is created and approved by others on the team. On approval, the branches are merged, and the feature branch is normally deleted to ensure a clean project at all times.

With more time on our hands, we would have preferred to import the repository to Azure DevOps and create pipelines for Dev, Validation, Test, and Prod, as for the CRM solution.

Bicep honourable mention and hot tip

The Azure resources used in the project have mostly been created on the fly as part of experimentation, but we would most definitely have created Bicep files for deploying them to each environment as well. Microsoft MVP 👉Jan Vidar Elven👈 has created a super useful public repository with templates for deploying resources on his GitHub account: https://github.com/JanVidarElven/workshop-get-started-with-bicep

Hogwarts Enchantment – Cast A Spell To Win

As part of the gamification experience of our app, we wanted to incorporate the ability to cast spells. The purpose of these spells was to gain an advantage over what was seemingly a superior opponent. Using spellcasting in a creative way enabled the user to experiment with various spells and see their impact on the game.

This feature was solved by creating a Power Automate flow. The flow utilized the Azure Cognitive Services API and its speech-to-text service. The user would be prompted to use the phone microphone to record a spell, and the flow would then send back a text version and a response based on the chosen spell. In addition, the flow utilized an open API for converting the sound file to the appropriate format for further processing. Below is the outlined flow:
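For the speech-to-text call at the heart of the flow, a minimal sketch of hitting the Azure Speech short-audio REST API directly (the flow itself uses an HTTP action; region, key, and the already-converted WAV are placeholders):

// Send a short WAV recording to the Azure Speech service and get the
// recognized text back (placeholder region and key).
async function recognizeSpell(wav: ArrayBuffer, key: string): Promise<string> {
	const response = await fetch(
		"https://westeurope.stt.speech.microsoft.com/speech/recognition/conversation/cognitiveservices/v1?language=en-US",
		{
			method: "POST",
			headers: {
				"Ocp-Apim-Subscription-Key": key,
				"Content-Type": "audio/wav; codecs=audio/pcm; samplerate=16000",
			},
			body: wav,
		}
	);
	const result = await response.json();
	return result.DisplayText; // e.g. "Expelliarmus!"
}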

Below is also a demo video, illustrating the feature in effect: