– Azure Cognitive Services to implement speech-to-text;
– SharePoint API to invite users to the portal and work with SharePoint lists;
– Relevance Search API to implement search for upcoming events on the portal, helping students complete onboarding by including socialization activities at their new school.
Part 1: Automating an Azure Function App with Durable Functions and CI/CD Pipelines
In our cloud infrastructure, we have designed and implemented an Azure Function App that utilizes Azure Durable Functions to automate a checklist validation process. The function operates in a serverless environment, ensuring scalability, reliability, and efficiency.
To achieve this, we:
– Use Durable Functions for long-running workflows and parallel execution.
– Implement a timer-triggered function that regularly checks for missing documents.
– Deploy using Azure DevOps CI/CD Pipelines for automated deployments and testing.
This post covers Azure Function App architecture, Durable Functions, and our CI/CD pipeline implementation.
🔹 Azure Durable Functions: Why We Chose Them
Our workflow involves:
– Retrieving all checklists from SharePoint.
– Processing them in parallel to check for missing documents.
– Updating the checklist if documents are missing.
We use Azure Durable Functions because:
– Stateful Execution – remembers past executions.
– Parallel Execution – checks multiple users simultaneously.
– Resilient and Reliable – handles failures gracefully.
– Scales Automatically – no need to manage servers.
How Our Durable Function Works
Timer-Triggered Function: Initiates the Orchestrator
This function triggers every 5 minutes, calling the orchestrator.
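Our functions themselves are built with dotnet (see the pipeline below), but to make the flow concrete, here is a minimal sketch of the timer-triggered starter and the fan-out/fan-in orchestrator, written in the Durable Functions Python programming model purely for illustration; the function and activity names are placeholders, and the two activities it calls are sketched in the next sections.

```python
# function_app.py – illustrative Durable Functions sketch (Python v2 programming model)
import logging

import azure.functions as func
import azure.durable_functions as df

app = df.DFApp(http_auth_level=func.AuthLevel.FUNCTION)

# Timer trigger: fires every 5 minutes and starts a new orchestration run.
@app.timer_trigger(schedule="0 */5 * * * *", arg_name="timer")
@app.durable_client_input(client_name="client")
async def checklist_timer_start(timer: func.TimerRequest, client) -> None:
    instance_id = await client.start_new("checklist_orchestrator")
    logging.info("Started checklist orchestration %s", instance_id)

# Orchestrator: fan-out/fan-in over all checklists.
@app.orchestration_trigger(context_name="context")
def checklist_orchestrator(context: df.DurableOrchestrationContext):
    checklists = yield context.call_activity("get_all_checklists", None)

    # Fan out: process every checklist in parallel.
    tasks = [context.call_activity("process_checklist_item", item) for item in checklists]

    # Fan in: wait for all parallel activities to finish.
    results = yield context.task_all(tasks)
    return results
```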
Each activity function is responsible for a specific task.
📌 Get All Checklists
Retrieves all checklists from SharePoint.
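A sketch of the matching activity, continuing the same function_app.py file and assuming the checklists are read from a SharePoint list through the Microsoft Graph API; the site id, list name, and token helper are placeholders, not our actual configuration.

```python
# Continues the same function_app.py sketch as above (same `app` instance).
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
SITE_ID = "<sharepoint-site-id>"          # placeholder
CHECKLIST_LIST = "OnboardingChecklists"   # placeholder list name

@app.activity_trigger(input_name="ignored")
def get_all_checklists(ignored: str) -> list[dict]:
    """Read every checklist item (with its fields) from the SharePoint list."""
    token = acquire_graph_token()  # hypothetical helper, e.g. an MSAL client-credentials flow
    resp = requests.get(
        f"{GRAPH}/sites/{SITE_ID}/lists/{CHECKLIST_LIST}/items?expand=fields",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return [item["fields"] for item in resp.json()["value"]]
```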
📌 Process Individual Checklist Items
What It Does:
Retrieves missing documents for a user.
Updates the SharePoint checklist accordingly.
Handles errors and retries if needed.
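And a sketch of the per-checklist activity, plus the retry policy the orchestrator can attach to it; the field names and the update helper are again placeholders for illustration.

```python
# Continues the same function_app.py sketch; field names are illustrative.
@app.activity_trigger(input_name="checklist")
def process_checklist_item(checklist: dict) -> dict:
    """Check one user's checklist for missing documents and update SharePoint if needed."""
    required = checklist.get("RequiredDocuments", [])
    missing = [doc for doc in required if not checklist.get(doc)]
    if missing:
        update_sharepoint_checklist(checklist["Id"], missing)  # hypothetical helper
    return {"id": checklist.get("Id"), "missing": missing}

# Retries can be delegated to the framework: in the orchestrator, replace call_activity
# with call_activity_with_retry and a RetryOptions policy, e.g.
#   retry = df.RetryOptions(first_retry_interval_in_milliseconds=5000, max_number_of_attempts=3)
#   yield context.call_activity_with_retry("process_checklist_item", retry, item)
```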
PART 2: Automating Deployments with Azure DevOps CI/CD Pipelines
To ensure seamless deployment and updates, we use Azure DevOps Pipelines.
📌 CI/CD Pipeline Breakdown
– Build Stage – Runs dotnet build and dotnet test.
– Deploy Stage – Uses Bicep templates (main.bicep) for infrastructure-as-code deployment.
🔹 Azure DevOps Pipeline (azure-pipelines.yml)
We use the Azure CLI and Bicep for automated Azure Function deployment; the infrastructure is defined in main.bicep.
By leveraging Azure Durable Functions, we transformed a manual checklist validation process into an automated, scalable, and highly resilient system.
With Azure DevOps CI/CD, we now have a fully automated deployment pipeline, ensuring high reliability and faster releases. 💡 Next, we will discuss new business logic, SharePoint interactions, and integrations in a dedicated post. Stay tuned!
In a world where wizardry meets cutting-edge technology, even the darkest assignments get a modern twist. Welcome to “The Dark Side of Harry Potter” Canvas App, where users verify their “assignments” (yes, kills 🪦) using the enchanting power of Azure Face API and Azure OpenAI.
This isn’t just tech; it’s a spellbinding mix of mystical AI, intuitive designs, and enchanted workflows. Let’s dive into how this solution flips bits, turns heads, and perhaps, toys with an existential threat to the world.
🖼️ Capturing the Kill
Step into the dark arts of delivery confirmation:
Snap the Moment: Users submit photographic evidence of their completed assignments via the app.
Cast the Spell: The image triggers a Power Automate flow, sending it to Azure Face API for identity matching.
Instant Confirmation: With a calculated similarity score, the app declares success with magical flair: “Assignment Complete!”
🔍 How AI Face Recognition Works Its Magic
Azure Face API isn’t just a tool; it’s the wand wielded behind the curtain. Here’s its spellbook:
Facial Feature Analysis 🧙‍♂️ The “kill” image is analyzed for key facial landmarks: eye position, jawline curves, and more. Each unique marker is measured with surgical precision.
Image Comparison ⚖️ Uploaded images are cross-referenced with pre-stored profiling images, calculating a similarity score based on:
Alignment of facial landmarks.
Proportions and symmetry.
Subtle markers that make faces unique.
Real-Time Results ⚡ With wizard-like speed, Azure Face API returns results to Power Automate in mere seconds. If the similarity score passes the threshold, the dark deed is verified.
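In our solution these calls are made from a Power Automate flow, but the same two Face API operations (detect, then verify) can be sketched in a few lines of Python; the endpoint, key, and threshold below are placeholders, not our production values.

```python
import requests

# Placeholders: your Face resource endpoint and key.
FACE_ENDPOINT = "https://<your-face-resource>.cognitiveservices.azure.com"
FACE_KEY = "<face-api-key>"

def detect_face_id(image_bytes: bytes) -> str:
    """Detect the most prominent face in an image and return its transient faceId."""
    resp = requests.post(
        f"{FACE_ENDPOINT}/face/v1.0/detect",
        params={"returnFaceId": "true"},
        headers={"Ocp-Apim-Subscription-Key": FACE_KEY,
                 "Content-Type": "application/octet-stream"},
        data=image_bytes,
        timeout=30,
    )
    resp.raise_for_status()
    faces = resp.json()
    if not faces:
        raise ValueError("No face detected in the image")
    return faces[0]["faceId"]

def verify_kill(evidence: bytes, profile: bytes, threshold: float = 0.6) -> bool:
    """Compare the submitted 'kill' photo against the stored profiling photo."""
    body = {"faceId1": detect_face_id(evidence), "faceId2": detect_face_id(profile)}
    resp = requests.post(
        f"{FACE_ENDPOINT}/face/v1.0/verify",
        headers={"Ocp-Apim-Subscription-Key": FACE_KEY},
        json=body,
        timeout=30,
    )
    resp.raise_for_status()
    result = resp.json()  # e.g. {"isIdentical": true, "confidence": 0.87}
    return result["confidence"] >= threshold  # "Assignment Complete!" if True
```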
🧠 Adding AI Sorcery with Azure OpenAI
We’ve upped the ante by integrating Azure OpenAI to enhance verification. Here’s what makes it extra enchanting:
Landmark Precision: OpenAI uses facial attributes like eye spacing, nose position, and cheekbone structure to calculate distances between landmarks.
Magic Math: These distances are used to generate a similarity score with almost clairvoyant accuracy.
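To make the “magic math” a bit more concrete, here is an illustrative sketch of a landmark-based similarity score; the landmark names, made-up coordinates, and the cosine-similarity choice are assumptions for the example rather than the exact scoring the services use.

```python
import numpy as np

def landmark_vector(landmarks: dict[str, tuple[float, float]]) -> np.ndarray:
    """Turn named landmarks (e.g. eye corners, nose tip, chin) into a vector of all
    pairwise distances, normalised so the score is independent of image scale."""
    pts = np.array([landmarks[name] for name in sorted(landmarks)], dtype=float)
    dists = np.array([np.linalg.norm(pts[i] - pts[j])
                      for i in range(len(pts)) for j in range(i + 1, len(pts))])
    return dists / dists.max()

def similarity(a: dict, b: dict) -> float:
    """Cosine similarity between two faces' distance vectors (1.0 = identical geometry).
    Assumes both dictionaries use the same landmark names."""
    va, vb = landmark_vector(a), landmark_vector(b)
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

# Example with made-up coordinates:
kill_photo = {"eyeLeft": (120, 140), "eyeRight": (180, 142), "noseTip": (150, 180), "chin": (150, 240)}
profile    = {"eyeLeft": (118, 138), "eyeRight": (182, 141), "noseTip": (151, 182), "chin": (149, 243)}
print(f"similarity: {similarity(kill_photo, profile):.3f}")
```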
But wait… does this tech have a conscience? Does it think? Could it outsmart a 5th grader? Maybe even you? By embracing such advanced AI, we’ve tiptoed into an existential risk realm:
Risk or Reward? The tech is smarter, faster, and eerily close to independent thought.
Conscience in Code: What if it started deciding on its own? Could it be charmed—or is it the new Dark Lord in disguise?
The response from the AI after the image has been run through the magic.
📲 Integration: A Solution for Every Platform
This isn’t just an app—it’s an omnipresent force:
Embedded Everywhere: From Phone to PC, the app integrates seamlessly into every digital corner.
Flipping Bits with Power: Whether on a desktop, tablet, or phone, this solution works its magic across devices.
🌟 Casting a Spell with Technology
What makes this solution truly magical?
Intuitive Designs: The interface is sleek, responsive, and dripping with a mystical vibe.
Enchanted Workflows: Every process, from snapping the image to confirming the deed, flows like a well-rehearsed spell.
Business Value Meets Wizardry: By automating and verifying critical tasks, this app doesn’t just entertain—it delivers results.
⚡ The Bigger Picture: Wizardry Meets AI Risks
As we push the boundaries of AI and magic, we also recognize the need for vigilance. Azure OpenAI introduces risks we must respect:
Could this tech someday outthink its creators?
Are we summoning tools too powerful to control?
Yet, as any great wizard knows, power isn’t inherently evil—it’s how we wield it that matters. And wield it, we shall.
“With AI face recognition from Azure Face API and OpenAI, the lines between magic and technology blur into something truly extraordinary.” 🪄
Our solution utilizes three distinct environments within Power Platform: Development, Testing, and Production. The Development environment is where initial coding and development take place, supported by robust version control to ensure code integrity and traceability. The Testing environment is used for rigorous testing to ensure code quality and expected functionality. Finally, the Production environment hosts the live code that is actively used by students.
Power Platform Pipelines
To streamline the movement of code between these environments (DEV – TEST – PROD), we have implemented Power Platform Pipelines. This ensures a smooth and efficient transition of code through each stage, maintaining consistency and reducing the risk of errors. Automation tools and scripts are used to facilitate this process, enhancing efficiency and minimizing manual intervention.
Environment variables
We leverage environment variables to manage our SharePoint sites and lists across different environments. This approach provides us with precise control and flexibility, ensuring that each environment operates with the correct configurations. Environment variables support dynamic configuration, allowing for easy updates and changes without altering the codebase.
Connectors
Our solution employs a smart naming standard for connectors. This naming convention simplifies the process of tracking and reusing the appropriate connectors, enhancing maintainability and clarity. Comprehensive documentation of the naming standards and usage guidelines ensures consistency and ease of understanding for new team members.
Service Account
We have established a dedicated service account for the twins to use with connectors, enhancing security by isolating permissions and reducing credential exposure risks. It simplifies auditing and ensures consistent configuration across environments. Adhering to the principle of least privilege, it grants only necessary permissions, providing clear accountability and preventing identity spoofing. Monitoring tools track the service account’s activities to ensure compliance with security policies.
In any community, the balance between tradition and innovation is delicate. The interplay between nostalgia and progress often sparks debate. Sooo let me tell you what happened recently. A controversial decision was made: our old-school quiz form was refused the retro badge:
Bearing in mind that the judges are seasoned and wise tech veterans, we decided to come up with something even more retro! We’re all familiar with the power of nostalgia, but sometimes it takes a little extra magic to prove that something truly embodies the spirit of the past. So have a look at our /too much retro/ client:
After the student is allocated to their new Faculty, the Wayfinder Academy provides hyper-care by allowing students to verbally communicate with the voice digital twin of their mentor.
On the technology side, we are using:
1. LiveKit – to handle real-time audio communication. Students join a room via the LiveKit SDK embedded in the Next.js frontend.
2. ChatGPT Voice Assistant – to process voice inputs using a pipeline:
– STT/ASR (speech-to-text) to process the audio streams and convert them to text
– LLM (GPT-4) to generate intelligent responses
– TTS (text-to-speech) to turn the responses back into audio
3. Next.js application – serves as the frontend:
– SSR ensures fast loading and API integration
– connects students to LiveKit rooms and the assistant, displaying responses or playing them as audio
Here are more details on the backend part:
Transcription Handler:
an entrypoint function that connects to the LiveKit room, handles track subscriptions, and initialises the assistant:
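Here is a condensed sketch of that entrypoint using the livekit-agents Python framework with its Silero and OpenAI plugins; the exact plugin choices, model name, greeting, and system prompt are illustrative assumptions rather than our production values.

```python
from livekit.agents import AutoSubscribe, JobContext, WorkerOptions, cli, llm
from livekit.agents.pipeline import VoicePipelineAgent
from livekit.plugins import openai, silero

async def entrypoint(ctx: JobContext):
    # Connect to the LiveKit room and subscribe to audio tracks only.
    await ctx.connect(auto_subscribe=AutoSubscribe.AUDIO_ONLY)

    # Wait for the student to join before starting the assistant.
    participant = await ctx.wait_for_participant()

    # STT -> LLM -> TTS pipeline: speech in, GPT-4 reasoning, speech out.
    assistant = VoicePipelineAgent(
        vad=silero.VAD.load(),          # detects when the student starts/stops speaking
        stt=openai.STT(),               # speech-to-text (ASR)
        llm=openai.LLM(model="gpt-4"),  # generates the mentor twin's responses
        tts=openai.TTS(),               # text-to-speech
        chat_ctx=llm.ChatContext().append(
            role="system",
            text="You are the voice digital twin of the student's mentor at Wayfinder Academy.",
        ),
    )
    assistant.start(ctx.room, participant)
    await assistant.say("Welcome! How is your first week going?")

if __name__ == "__main__":
    cli.run_app(WorkerOptions(entrypoint_fnc=entrypoint))
```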
Stuck on code? Feeling down? Forget rubber ducking, rubber snaking is the new thing! Instead of explaining your code to a rubber duck, use a rubber snake, in true Slytherin ssssstyle.
“Building this solution has been a journey of passion and precision, where every element has been designed with purpose and care. We’re excited to showcase our work as a testament to quality and innovation in pursuit of the ACDC Craftsman badge.”
We have three environments, DEV, TEST, and PROD, with different approval flows.
So, the production environment can be deployed only after the release manager has approved it.
We implemented an Export pipeline to simplify the contribution process.
Users can decide which solution to export and place in Git to track the change history.
For the functional consultants, we implemented the following flow:
The export procedure includes exporting both the managed and unmanaged solution packages. All changes from the selected solution are included in the PR, and the Solution Checker starts the validation process. A clean report from the Solution Checker is a prerequisite for the next step of the PR review: a real human review.
In the case of pro-code customization, each component type has its own PR validation steps, such as:
– Run a build to produce the artifacts
– Run unit tests
– Scan the code base with SonarQube (Quality Gate)
The import pipeline deploys the appropriate package type for each environment, so deployment to PROD always includes only the managed version of the packages.
The import pipeline also includes extra steps to activate flows after deployment, including calling a custom PowerShell script to activate the disabled flows in specific solutions.
We also use a custom version number that reflects the build date, with the build number at the end: 2025.01.25.13. From our previous experience, this looks more meaningful to the users.
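In the real pipeline the value comes from pipeline variables, but for illustration the format can be composed like this (a hypothetical helper, not our actual pipeline code):

```python
from datetime import datetime, timezone

def build_version(build_number: int) -> str:
    """Compose a version like 2025.01.25.13: build date first, then the run's build number."""
    return f"{datetime.now(timezone.utc):%Y.%m.%d}.{build_number}"

print(build_version(13))  # -> e.g. "2025.01.25.13"
```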
Branching strategy:
We are using a trunk-based branching strategy. Short-lived branches contain changes that are as small as possible, to keep the review and validation process simple.
One of the inspirations behind the Parseltongue language (a.k.a. Python), and a close relative of the language used to write Python itself, is C++, created in 1985, the same year Barty Crouch Jr. was imprisoned in Azkaban :O
We are using an Arduino to control the physical howler, with code written in the classical language of C++:
And as Snape instructed us, we wrote the code in snake_case, of course! #HouseOfSlytherin