Glossy Potion Exam Pixels

Our Potions examination interface exemplifies the standards of the Glossy Pixels badge through its exceptional implementation of responsive design principles. The interface maintains visual integrity and functionality across all device sizes, from desktop monitors in the Hogwarts library to mobile devices in the dungeons.

The parchment-style containers and medieval typography demonstrate sophisticated scaling, ensuring readability without compromising the magical aesthetics. The interface sits magically and consistently above our three-column ingredient list and gracefully adapts to different screen sizes while keeping the crucial examination elements clearly visible.

Most importantly, the interface achieves this responsiveness without sacrificing the mystical atmosphere essential to a Hogwarts examination. Like a perfectly brewed potion, each visual element maintains its properties across all viewing conditions, creating an experience that is both functional and authentically magical.

Wow, look at that responsiveness!

#GlossyPixels

Go With The Flow

We provide our students with guidance from the moment we start working on their request.

After they submit a form, activities are created for them.

They can find these on the My Activities page:

They then have to finish all the activities before being allocated to a faculty.

For example, when a user clicks on the Interview Scheduled activity, they are navigated to the Survey page and can start answering the questions.

#GoWithTheFlow

ACDC Craftsman

It’s EVIDIosa, not Leviosaaaa! Our thoughts on ALM and our way of working.

We claim the ACDC Craftsman badge. 🚀

Workflow – Azure DevOps Boards

We use boards to keep track of badges and tasks, for better collaboration, progress tracking, and overview.

Scrum

Given the very short length of the project, there has been a need for tight collaboration and frequent check-ins. In a way, we have practised a super-compressed variation of Scrum, with several stand-ups each day to secure progress. In a project like this, it is particularly important to avoid banging your head against the wall for too long on one task. Regular meetings have ensured that the team could maintain the best possible progress at all times, while also serving as a platform for contributing new ideas.

Naming conventions

Tables (Entities)

General

  • Change tracking should be enabled by default for all “business” entities where history might be needed.
    • This is because change tracking is often indispensable for troubleshooting.
    • Change tracking can be disabled for specific fields if necessary, e.g., when automation frequently changes irrelevant fields.

Forms

  • Never modify the standard form. Clone it (save as) and hide the original form from users.
    • Standard forms can be updated by the vendor, which may overwrite changes or disrupt updates.
    • Having the original form for comparison is also useful for troubleshooting.
  • This does not apply to custom tables.
  • Ensure fields like status, status reason, created by/on, modified by/on, and owner are visible on the form.
    • These are useful for both administrators and end-users.

Option Sets

Views

When Should I Use Configuration, Business Rules, Processes, Flows, JavaScript, Plugins, or Other Tools?

  • It is important to choose the right tool for the job.
  • Considering administration, maintenance, and documentation, we prioritize tools in the following order:
    • Standard functionality/configuration
      • Many things can be solved with built-in functionality, e.g., setting a field to read-only doesn’t require a Flow. 😉
    • Business Rules
    • Processes/Workflows
      • Can run synchronously.
    • JavaScript
    • Flows
      • Suitable for querying Dataverse data.
    • Plugins

Solutions

  • Unmanaged in development:
    • We have not yet decided on a solution model but will likely use a base package initially and then move to feature-based solutions.
  • Managed in test and production.

Deployment

  • All changes are deployed via pipelines in DevOps.
  • NO MANUAL STEPS/ADJUSTMENTS IN TEST OR PRODUCTION.
  • All data referenced by automation (e.g., Flow) must be scripted and inserted programmatically (preferably via pipeline) to ensure GUID consistency across environments.
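To make the GUID requirement concrete, here is a minimal sketch using the Dataverse Web API's upsert semantics, where a reference record is created with a fixed id so it gets the same GUID in every environment. The org URL, table name, and column name below are placeholders, not our actual schema.

```python
import json
import uuid

def build_upsert_request(base_url: str, entity_set: str, record_id: str, attributes: dict):
    """Return (method, url, headers, body) for a GUID-stable Dataverse upsert.

    A PATCH to /EntitySet(id) without If-Match acts as an upsert: it creates
    the record with exactly this id if it does not exist yet, which keeps
    GUIDs consistent across dev, test, and prod.
    """
    uuid.UUID(record_id)  # fail fast on a malformed GUID
    url = f"{base_url}/api/data/v9.2/{entity_set}({record_id})"
    headers = {"Content-Type": "application/json"}
    return "PATCH", url, headers, json.dumps(attributes)

# Placeholder org URL and hypothetical table/column names:
method, url, headers, body = build_upsert_request(
    "https://hogverse-dev.crm4.dynamics.com",
    "new_housepointcategories",
    "11111111-2222-3333-4444-555555555555",
    {"new_name": "Gryffindor"},
)
```

In a pipeline, the same request would be issued with a bearer token obtained via the service connection, so every environment ends up with identical record ids.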

Application Lifecycle Management

Why ALM?

Application Lifecycle Management (ALM) enables structured development, testing, and deployment processes, ensuring quality and reducing risks.

  • It is a consistent way of deploying features
  • You get identical environments
  • Better collaboration
  • History
  • A common way of working
  • The whole team is always up to date with changes in development

Overall solution

A diagram showing the overall deployment cycle for our CRM Solution.

Solution Strategy

We use a single-solution strategy: all completed changes are added to one solution. Fewer dependencies and less complexity are preferable when working within such a time span, and allow for smoother collaboration. The solution is still not overly complex, so it makes sense to gather all components in a single solution. As mentioned earlier, moving to feature-based solutions should be considered over time.

Service Principals

Service principals are used between Azure DevOps and each environment to provide an industry-standard connection to Dataverse without exposing credentials in the code. These are the names of our Service Connections:

  • hogverse-dev
  • hogverse-validation
  • hogverse-test
  • hogverse-prod

In honor of the great ALM Wizard 👉Benedik Bergmann👈, the Service Connections are configured with the new Workload Identity federation feature for ADO connections. This eliminates the need for managing and renewing secrets, reducing the risk of pipeline failures due to expired credentials. Credentials are not exposed in code or stored in repositories.

Setup of Workload Identity federation for ADO Connection

  1. App Registration: Registering app registrations in Microsoft Entra ID.
  2. Dataverse: Adding the app registration as an application user in our target Dataverse environment, assigning it System Administrator role.
  3. ADO Service Connection: Creating a service connection in Azure DevOps, linking it to the Dataverse instance using Workload Identity Federation.
  4. Adding Federated Credentials: Configuring the app registration to recognize the ADO service connection by setting up federated credentials with the correct issuer and subject identifier.
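As a sketch of step 4, the federated credential ties the app registration to the service connection through a fixed issuer/subject pair. The organization id below is a placeholder GUID; the issuer and subject formats follow the documented Workload Identity Federation convention for Azure DevOps service connections.

```python
import json

def federated_credential(org_id: str, org_name: str, project: str, connection: str) -> dict:
    """Build the parameters for an Entra ID federated credential that
    trusts tokens issued to a specific ADO service connection."""
    return {
        "name": f"{connection}-federation",
        # Issuer: the Azure DevOps token service for this organization (by id).
        "issuer": f"https://vstoken.dev.azure.com/{org_id}",
        # Subject: sc://<organization>/<project>/<service connection name>.
        "subject": f"sc://{org_name}/{project}/{connection}",
        "audiences": ["api://AzureADTokenExchange"],
    }

cred = federated_credential(
    "00000000-0000-0000-0000-000000000000",  # placeholder organization id
    "hogverse", "Platform9¾Hub", "hogverse-dev",
)
# Saved to a file, this could be passed to:
#   az ad app federated-credential create --id <appId> --parameters cred.json
params = json.dumps(cred, indent=2, ensure_ascii=False)
```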

Entra ID:

Environments

  • DEV The Room of Requirement
    The development environment for Hogverse.
  • Validation – The Chamber of Truth
    The validation environment to check changes before merging to our main branch.
  • TEST: The Restricted Section
    The UAT test environment for Hogverse.
  • PROD: The Great Hall
    The production environment for Hogverse.

Pipelines

Our ALM solution for deploying changes in Power Platform and CRM is the following:

  • Export
    The Export Pipeline retrieves the changes from the “DEV The Room of Requirement” environment and creates a pull request.
  • Build
    The Build Pipeline packs the changes in the pull request branch and deploys them to “Validation – The Chamber of Truth” to see if something breaks before merging to our main branch. When this completes successfully, it creates a Release that can be deployed with the Release Pipeline.
  • Release
    The Release Pipeline deploys the changes to “TEST: The Restricted Section” and “PROD: The Great Hall” after a user in the team has approved the release, using Approvals and checks in Azure DevOps Pipelines.

Approval and checks for “TEST: The Restricted Section”

Approval and checks for “PROD: The Great Hall”

Repository Settings:

Settings for the Build Service User.

Setting and requirements for merging changes to our main branch that goes to production.

Azure DevOps Pipelines for CRM Solution

Export Pipeline

The Export Pipeline retrieves the changes from the “DEV The Room of Requirement” environment and creates a pull request. Below is the YAML code for the export pipeline.

trigger: none

parameters:
  - name: SolutionName
    displayName: Solution to export
    type: string
    default: HogverseBasis
  - name: TypeOfTask
    displayName: "Select task type"
    type: string
    default: "feature"
    values:
      - feature
      - bug
  - name: WorkItems
    displayName: Work Item ID needs to be attached to Automated Pull Request. Multiple work items are space separated.
    type: string
    default: ""
  - name: Description
    displayName: Pullrequest description
    type: string
    default: "This Pull request is generated automatically through build"

# Define branch name dynamically based on build number and solution name
variables:
  - name: BranchName
    value: "${{ parameters.TypeOfTask }}/${{ parameters.SolutionName }}-$(Build.BuildNumber)"

pool:
  vmImage: "windows-latest"

steps:
  - checkout: self
    persistCredentials: true
  - task: PowerPlatformToolInstaller@2
    displayName: "Power Platform Tool Installer"

  # Create New Branch and Commit Changes
  - script: |
      echo checkout source branch
      echo git config user.email $(Build.RequestedForEmail)
      echo git config user.name $(Build.RequestedFor)
      git config user.email "$(Build.RequestedForEmail)"
      git config user.name "$(Build.RequestedFor)"
      git fetch origin
      git checkout main
      git checkout -b "${{ variables.BranchName }}" main
      git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin "${{ variables.BranchName }}"
      echo git checkout -b "${{ variables.BranchName }}"
      git fetch origin
      git checkout ${{ variables.BranchName }}
    displayName: "Checkout GIT Automated CRM Build Branch"

  - task: PowerPlatformPublishCustomizations@2
    displayName: Publish Customizations
    inputs:
      authenticationType: "PowerPlatformSPN"
      PowerPlatformSPN: "hogverse-dev"
      AsyncOperation: true
      MaxAsyncWaitTime: "60"
  - task: PowerPlatformSetSolutionVersion@2
    displayName: "PowerApps Set Solution Version - ${{ parameters.SolutionName }}"
    inputs:
      authenticationType: "PowerPlatformSPN"
      PowerPlatformSPN: "hogverse-dev"
      SolutionName: "${{ parameters.SolutionName }}"
      SolutionVersionNumber: "$(Build.BuildNumber)"
  - task: PowerPlatformExportSolution@2
    displayName: Export ${{ parameters.SolutionName }} - Unmanaged
    inputs:
      authenticationType: "PowerPlatformSPN"
      PowerPlatformSPN: "hogverse-dev"
      SolutionName: "${{ parameters.SolutionName }}"
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\\${{ parameters.SolutionName }}.zip'
      AsyncOperation: true
      MaxAsyncWaitTime: "60"
  - task: PowerPlatformExportSolution@2
    displayName: Export Solution - Managed
    inputs:
      authenticationType: "PowerPlatformSPN"
      PowerPlatformSPN: "hogverse-dev"
      SolutionName: "${{ parameters.SolutionName }}"
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\\${{ parameters.SolutionName }}_managed.zip'
      Managed: true
      AsyncOperation: true
      MaxAsyncWaitTime: "60"
  - task: PowerPlatformUnpackSolution@2
    displayName: Unpack unmanaged solution ${{ parameters.SolutionName }}
    inputs:
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)\\${{ parameters.SolutionName }}.zip'
      SolutionTargetFolder: '$(build.sourcesdirectory)\PowerPlatform\Solutions\${{ parameters.SolutionName }}'
      SolutionType: "Both"

  - script: |
      echo git add --all
      git add --all
      echo git commit -m "Solutions committed by build number $(Build.BuildNumber)"
      git commit -m "Solutions committed by build number $(Build.BuildNumber)"
      echo push code to ${{ variables.BranchName }}
      git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin ${{ variables.BranchName }}
    displayName: "Commit CRM solutions to Automated CRM Build Branch"

  # Install Azure DevOps extension
  - script: az --version
    displayName: "Show Azure CLI version"

  # Install Azure DevOps Extension
  - script: az extension add -n azure-devops
    displayName: "Install Azure DevOps Extension"

  # Login to Azure DevOps Extension
  - script: echo $(System.AccessToken) | az devops login
    env:
      AZURE_DEVOPS_CLI_PAT: $(System.AccessToken)
    displayName: "Login Azure DevOps Extension"

  # Configure Azure DevOps Extension
  - script: az devops configure --defaults organization=https://dev.azure.com/hogverse project="Platform9¾Hub" --use-git-aliases true
    displayName: "Set default Azure DevOps organization and project"

  - script: |
      az repos pr create --repository "CRM.Solutions" --title "Automated Pull Request for solution HogverseBasis from branch ${{ variables.BranchName }}" --auto-complete false --bypass-policy false --description "${{parameters.Description}}" --detect --source-branch "${{ variables.BranchName }}" --target-branch "main" --work-items ${{parameters.WorkItems}}
    displayName: "Create automated Pull request for merging CRM solutions to master build list and PRs"

We use work items to track the changes for each pull request and release.

Build Pipeline

The YAML code for the Build Pipeline.

trigger:
  branches:
    include:
      - main # Trigger build pipeline on changes to main
pr:
  branches:
    include:
      - "*" # Trigger on all PR branches

pool:
  vmImage: "windows-latest"

steps:
  - script: |
      echo "Validating pull request from source branch: $(System.PullRequest.SourceBranch)"
      echo "Target branch: $(System.PullRequest.TargetBranch)"
    displayName: "Validate Pull Request Source Branch"
  - task: PowerPlatformToolInstaller@2
    displayName: "Power Platform Tool Installer "

  - task: PowerPlatformPackSolution@2
    inputs:
      SolutionSourceFolder: '$(build.sourcesdirectory)\PowerPlatform\Solutions\HogverseBasis'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\Solutions\HogverseBasis.zip'
      SolutionType: "Managed"

  - task: PublishPipelineArtifact@1
    displayName: Publish Artifacts
    inputs:
      targetPath: "$(Build.ArtifactStagingDirectory)"
      artifact: "drop"
      publishLocation: "pipeline"

  - task: PowerPlatformImportSolution@2
    displayName: "Power Platform Import HogverseBasis to Validation"
    inputs:
      authenticationType: PowerPlatformSPN
      PowerPlatformSPN: "hogverse-validation"
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)\Solutions\HogverseBasis.zip'
      StageAndUpgrade: false
      ActivatePlugins: false
      SkipLowerVersion: true

Release Pipeline

The YAML code for the Release Pipeline.

trigger: none
pr: none

resources:
  pipelines:
    - pipeline: Build # Reference to the build pipeline
      source: Build # Name of the build pipeline to trigger from
      trigger:
        branches:
          include:
            - main

stages:
  - stage: DeployTest
    displayName: "Deploy to Test"
    jobs:
      - deployment: deployTest # Use deployment job for environment reference
        environment: "The Restricted Section - Test" # Reference the 'The Restricted Section - Test' environment
        pool:
          vmImage: "windows-latest"
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - download: Build # pipeline resource identifier.
                  artifact: drop
                - task: PowerPlatformToolInstaller@2
                  displayName: "Power Platform Tool Installer"
                - task: PowerPlatformImportSolution@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-test"
                    SolutionInputFile: '$(Pipeline.Workspace)\Build\drop\Solutions\HogverseBasis.zip'
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"
                - task: PowerPlatformPublishCustomizations@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-test"
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"
  - stage: DeployProd
    displayName: "Deploy to Production"
    dependsOn: DeployTest # Depends on successful deployment to Test
    condition: succeeded()
    jobs:
      - deployment: deployProd # Use deployment job for environment reference
        environment: "The Great Hall - Prod" # Reference the 'The Great Hall - Prod' environment
        pool:
          vmImage: "windows-latest"
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - download: Build # pipeline resource identifier.
                  artifact: drop
                - task: PowerPlatformToolInstaller@2
                  displayName: "Power Platform Tool Installer"
                - task: PowerPlatformImportSolution@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-prod"
                    SolutionInputFile: '$(Pipeline.Workspace)\Build\drop\Solutions\HogverseBasis.zip'
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"
                - task: PowerPlatformPublishCustomizations@2
                  inputs:
                    authenticationType: "PowerPlatformSPN"
                    PowerPlatformSPN: "hogverse-prod"
                    AsyncOperation: true
                    MaxAsyncWaitTime: "60"

Issues

We had this problem during the event, and unfortunately we did not get to run our pipelines the way we wanted.

So we tried a workaround, but that ended up with… a Self-Hosted Agent.

It looked good all the way through the installation, aaaand then the final step was blocked by admin rights at the company…. You need admin rights…

ALM for Fabric

We have implemented deployment pipelines for deploying changes between workspaces so that the reports can be tested and verified before going into production.

Hogverse Deployment Pipelines for deploying items between workspaces.

ALM for Svelte and Front end

The Svelte project is for now hosted in a private GitHub repository shared between the developers. Each developer creates their own branch for each new feature. When a feature is ready, a pull request is created and approved by others on the team. On approval, the branches are merged, and the feature branch is normally deleted to keep the project clean at all times.

With more time on our hands, we would have preferred to import the repository into Azure DevOps and create pipelines for Dev, Validation, Test, and Prod, as for the CRM solution.

Bicep honourable mention and hot tip

The Azure resources used in the project have mostly been created on the fly as part of experimentation, but we would most definitely have created Bicep files for deploying them to each environment as well. Microsoft MVP 👉Jan Vidar Elven👈 has created a super useful public repository with templates for deploying resources on his GitHub account: https://github.com/JanVidarElven/workshop-get-started-with-bicep

Digital Transformation: Welcome Hogverse!

🧙🏻‍♂️ The Challenge 🧙🏻‍♂️

It’s been five years since the whole world realized what we’ve known for years – the future is digital, and we all need to get on the wagon. Unfortunately, Hogwarts staff and board have been lagging behind, not realizing the immense capabilities that lie in intelligent automation.

💡 The Solution 💡

First of all, we needed to create a digital platform for staff and students. We decided to store all information about who and what is going on at Hogwarts in Dataverse, making it easily accessible for both pro- and low-code apps and automations.

In the Hogverse Student Management App, Hogwarts staff can keep track of enrolled students and subjects, they can create teams for teaching and send out notifications to students. In this way, time consuming administrative tasks are cut down to a minimum, and each teacher has more time for magical moments with their students.

With one click to create a team, teachers and students are all set for online teaching in case of trolls, especially bad weather, or a really good party (read: a pandemic). Teams will also be used for teacher-student communication, as well as automatic notifications in the form of an adaptive card in Teams in case a student is about to fail a subject:
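As an illustration of what such a notification could carry under the hood, here is a hedged sketch of an Adaptive Card payload of the kind the flow could send. The student, subject, and grade fields are assumptions for the example, not the exact card our flow produces.

```python
import json

def failing_subject_card(student: str, subject: str, grade: str) -> dict:
    """Build a minimal Adaptive Card warning that a student is at risk
    of failing a subject. Field names here are illustrative only."""
    return {
        "type": "AdaptiveCard",
        "version": "1.4",
        "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
        "body": [
            {"type": "TextBlock", "size": "Medium", "weight": "Bolder",
             "text": f"⚠️ {student} is at risk of failing {subject}"},
            {"type": "TextBlock", "wrap": True,
             "text": f"Current grade: {grade}. Consider scheduling extra tutoring."},
        ],
    }

card = failing_subject_card("Neville Longbottom", "Potions", "T (Troll)")
payload = json.dumps(card)  # what the flow would post to the Teams connector
```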

We send out similar notifications if a student has done something particularly stupid, which removes the need for detention and other time-consuming, medieval methods of punishment. Contract signing is automated using OneFlow connectors in Power Automate, with the signed contracts stored in the safe “Chamber of Secrets” – SharePoint – triggered if a student is alarmingly interested in death spells or the doctrine of Dark Magic.


The daily life of a student is also getting a lot easier with this digital transformation. No more carrying around heavy, ancient and outdated books. You will get all the information you need for studying by asking Dobby in the Student Portal.

The Student Portal is always up to date, since the AI assistant is instructed to query the Harry Potter DB (api.potterdb.com) for spells and potions. The Portal also gives students a full overview of test results and grades, so they always know which subject to focus on.
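PotterDB answers in JSON:API format, so extracting the useful bits is mostly a matter of walking `data[*].attributes`. A minimal sketch, where the sample payload is hand-written here to mirror that shape rather than copied from a live response:

```python
def extract_spell_names(document: dict) -> list[str]:
    """Pull spell names out of a JSON:API document like the ones
    returned by api.potterdb.com/v1/spells."""
    return [item["attributes"]["name"] for item in document.get("data", [])]

# Hand-written sample mirroring the JSON:API response shape:
sample = {
    "data": [
        {"type": "spell", "attributes": {"name": "Accio", "effect": "Summons an object"}},
        {"type": "spell", "attributes": {"name": "Lumos", "effect": "Creates light"}},
    ]
}
names = extract_spell_names(sample)  # → ["Accio", "Lumos"]
```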

The Student App, which you bring with you wherever you go, takes teaching out of the classroom and enables learning on the go. Here the students have their teachers in their pockets, and they will be encouraged to always keep learning. It is also nice not to have to stand in line in front of Prof. Sybill’s office to get a reading of your fortune after having a coffee.

The implementation of Teams will also bring a whole new world to the students. Getting in-app notifications for urgent matters, being able to communicate with teachers without searching the whole school premises, and having the ability to take that Monday morning 8 o’clock History of Magic lesson from the comfort of your warm bed, will hopefully increase the student satisfaction.


✨ The Outcome ✨

Everyone who has taken a pedagogy class or two knows that a student’s well-being is crucial for their ability to learn. These steps of digital transformation will therefore play an important role in educating the future Magicians of the World.

We are also hopeful that the new digital tools provided to Hogwarts’ staff will decrease turnover and long-term sick leave. We have seen too many new hires at Hogwarts leading to the return of Voldemort, and this might mitigate the risk of a new Dark Age.

All in all, a more digital magical world is a safer place for all of us Muggles. 🪄

✨ Dash It Out Badge✨

In our Hogwarts App, we’ve conjured up some magical tools to make life even more exciting for our students and teachers. These enchanting features are designed to inspire, motivate, and bring a touch of wonder to everyday activities. Here is a list of our Magical Insights:

🎯 House Points Tracker

The battle for house glory just got more thrilling! We’ve introduced a House Points Chart, where students can keep track of the current standings and see which house is leading the charge. The bars shimmer in the house colors—scarlet for Gryffindor, emerald for Slytherin, sapphire for Ravenclaw, and gold for Hufflepuff—bringing the spirit of friendly competition to life.

🏆 Golden Snitch Leaderboard

Catch the Snitch, and claim your glory! The Golden Snitch Leaderboard highlights the top players in the beloved Snitch-chasing game. The rankings sparkle with magical hues: gold for first place, silver for second, and bronze for third. Will you be the next Seeker extraordinaire?

📚 Class Attendance Tile

At Hogwarts, striving for 100% class attendance is a noble goal! To aid in this quest, we’ve added an Attendance Tile to the report. It reveals how each class fares in attendance, with fascinating insights. However, beware—the data shows a troublingly low turnout in Defense Against the Dark Arts! Could a bit of extra encouragement—or perhaps a strong Patronus—be the solution?

📊 Students by House Chart

Curious about the balance of Hogwarts houses? The Students by House Pie Chart offers a delightful overview, showing the distribution of students across the four houses. Each slice of the pie is enchanted with its house color, providing a clear and colorful glimpse into Hogwarts’ magical diversity. This tool fosters greater understanding and unity among the houses.

With these mystical updates, we’re ensuring that every student and professor can dive deeper into the wonders of Hogwarts life. Keep exploring, keep competing, and most importantly—keep the magic alive! 🌟

Crawling the Web for Magic: The PowerPotters’ Quest for the Crawler Badge

Greetings, magical technologists! ✨ At Team PowerPotters, we’ve combined the power of Bing Search API, OpenAI GPT-4, and Power Automate to create a truly innovative web-crawling solution. Our workflow dynamically discovers, analyzes, and integrates external APIs, transforming how potion-related data is sourced and utilized. Here’s how we earned the Crawler Badge by blending search, AI, and automation into one seamless process.


The Crawler Workflow: Step-by-Step Magic

  1. Discovering APIs with Bing Search API
  • Purpose: To dynamically find public APIs related to Harry Potter spells or magical data.
  • Execution:
    • Query: "public API Harry Potter spells".
    • Filters: Results are restricted to recent entries with keywords like /api or “documentation”.
  2. Analyzing APIs with OpenAI GPT-4
  • Purpose: To validate URLs as APIs and extract relevant schemas, field mappings, and example data.
  • Execution: For each URL, OpenAI determines if the URL links to an API or its documentation. If valid, it provides:
    • API schema.
    • Example JSON response.
    • Field mappings for key data (e.g., Name for spell name, Description for spell effects).
  3. Integrating with Power Automate
  • Purpose: To process, validate, and integrate the data into our system.
  • Workflow Steps:
    • Parse Bing Results: Extract relevant URLs using JSON parsing.
    • Validate URLs: OpenAI determines if the URL links to a valid API and provides field mappings.
    • Dynamic Integration: Call validated APIs and use extracted data to:
      • Create new product entries in D365FO.
      • Enrich existing products with spell names (Name) and effects (Description).
    • Automation: Run schema validations dynamically, ensuring data consistency.
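The "Parse Bing Results" and "Validate URLs" steps above can be sketched as a simple pre-filter that keeps only API-looking URLs before GPT-4 does the full validation. The keyword list matches the filters described earlier; the example URLs are illustrative.

```python
def candidate_api_urls(urls: list[str]) -> list[str]:
    """Heuristic pre-filter: keep URLs that look like APIs or API docs,
    so only promising candidates are sent to GPT-4 for validation."""
    keywords = ("/api", "documentation")
    return [u for u in urls if any(k in u.lower() for k in keywords)]

# Illustrative Bing results (the second URL is a made-up example domain):
results = [
    "https://api.potterdb.com/v1/spells",
    "https://en.wikipedia.org/wiki/Accio",
    "https://potterapi.example/documentation",
]
candidates = candidate_api_urls(results)  # Wikipedia page is filtered out
```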

A Scenario in Action

A potion master requests information on new magical spells for potion research. Using this workflow:

  1. Search: Bing Search API identifies APIs like Potterhead API.
  2. Validation: OpenAI ensures the API provides valid spell data, extracting fields like Name (e.g., Accio) and Description (e.g., Summons an object).
  3. Integration: Power Automate dynamically updates the potion master’s research database with enriched spell information, saving hours of manual effort.

Why This Deserves the Crawler Badge

  1. Innovative Use of Search:
    • Bing Search API dynamically finds and filters public APIs, extending its use beyond static results.
  2. AI-Powered Validation:
    • OpenAI GPT-4 dynamically analyzes URLs, validates APIs, and generates schemas and field mappings for seamless integration.
  3. Solving Real Business Problems:
    • Potion masters gain enriched, real-time product data without manual intervention, enabling informed decisions.
  4. Scalability:
    • The workflow is adaptable for future needs, such as integrating potions, artifacts, or even non-magical domains.

Crawling the Web for Magical Insights

This dynamic web-crawling solution exemplifies how search, AI, and automation can revolutionize the way data is discovered and integrated. With this innovation, we humbly submit our case for the Crawler Badge, showcasing how Team PowerPotters continues to push boundaries: acdc.blog/category/cepheo25.

#ACDC2025 #CrawlerBadge #PowerPotters #SearchAndAIInnovation

FABRIC CATEGORY: Data Activator and Power Automate AI Builder Usage

Anyone can run Power Automate flows from Dataverse, but only the Fabricator can trigger them from Fabric lakehouse data changes using Data Activator in a workspace.

Here we combined the power of Fabric with the flexibility of Power Automate to manage our image collection process.

We first created a Data Activator and selected our data sources, so the activator knows when and where to trigger.

We configured it to trigger only when an image file is added to our Images folder for students.
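The rule itself is configured in the Activator UI, but its condition boils down to a predicate along these lines. The folder name and extension list are assumptions based on the description above, not the exact configuration.

```python
# Assumed set of image extensions the rule should react to:
IMAGE_EXTENSIONS = (".png", ".jpg", ".jpeg", ".gif")

def should_trigger(file_path: str) -> bool:
    """True only for image files added under the students' Images folder."""
    path = file_path.lower()
    return "/images/" in path and path.endswith(IMAGE_EXTENSIONS)

should_trigger("Files/students/Images/harry.png")   # image in the right folder: triggers
should_trigger("Files/students/Images/notes.txt")   # not an image: ignored
```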

Define an action and create a flow to run.

Here we define an action, and it gives us an endpoint to be used as the flow trigger. We will come back to this in the last step.

We need to create a rule to call the action we defined. The rule allows us to add additional conditions to our filter if needed and lets us choose which action to call. We can also add additional parameters to be sent to the Power Automate flow.

And lastly, our Power Automate flow: the endpoint we received earlier needs to be set as the connection of the trigger.

We use Power Platform’s AI Builder to recognize the data and categorize it for further usage.

We send our response to SharePoint for further operations.

As the Fabricator, it is important to automate our business and keep it tidy and neat. This is the way of the Fabricator.

In the Hogwarts style

Cool looking stuff! Ever wanted to get into the Gryffindor common room? Well, let me introduce you to the Fat Lady:

She even moves around! (Come see us for a live demo.) What better way to introduce new children to their lifelong school companions?

When no person is detected in the mirror, the Fat Lady invites you in to get your photo taken. She is welcoming and greets you. Entering the mirror, you are awarded your animal after a quick bit of magical thinking. I myself got an owl. No wonder, since my middle name is Harry!

Luckily, I can also get food and a cage for my animal pretty easily, but this flow isn’t as magical.