Conducting surveys often involves tedious typing, which can be challenging, especially for students. To make the process easier, we’re leveraging Azure Speech Recognition in Power Pages to transcribe spoken responses directly into text fields. Students can simply speak their answers instead of typing them.
How It Works
Connecting Azure Speech SDK
To enable speech recognition, we connect the Azure Cognitive Services Speech SDK to our Power Pages using this script.
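The script itself is not reproduced above, so as an illustration only, here is a minimal sketch of what such wiring typically looks like with the JavaScript Speech SDK. The `createRecognizer` helper, subscription key, and region are our own placeholders, not the project’s actual code; the SDK object is injected so the snippet stays testable.

```javascript
// Sketch only: creates a speech recognizer from an injected SpeechSDK object.
// In Power Pages the SDK is usually loaded from the official CDN bundle and
// exposed as window.SpeechSDK; the key and region are placeholders.
function createRecognizer(SpeechSDK, subscriptionKey, region) {
  const speechConfig = SpeechSDK.SpeechConfig.fromSubscription(subscriptionKey, region);
  speechConfig.speechRecognitionLanguage = "en-US"; // assumed survey language
  const audioConfig = SpeechSDK.AudioConfig.fromDefaultMicrophoneInput();
  return new SpeechSDK.SpeechRecognizer(speechConfig, audioConfig);
}
```

Injecting `SpeechSDK` rather than referencing the global directly also makes the helper easy to unit-test with a stub.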
HTML Setup for Speech Input
We added a microphone button and a text area to capture and display the transcribed response. Here’s the code for the interface:
Clicking the microphone button starts recording.
The spoken response is transcribed into the text area.
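The interface code is not shown above; as a sketch of the behaviour just described, the click-to-transcribe wiring might look like the following. The element ids and the recognizer object are assumptions, and the document is injected so the logic can be verified without a browser.

```javascript
// Sketch: wires a microphone button so that a click runs one recognition
// pass and appends the transcript to the text area.
// The ids "mic-button" / "answer-text" are our placeholders.
function wireMicButton(doc, recognizer) {
  const button = doc.getElementById("mic-button");
  const textArea = doc.getElementById("answer-text");
  button.addEventListener("click", () => {
    // recognizeOnceAsync(callback) mirrors the Speech SDK's single-shot API.
    recognizer.recognizeOnceAsync((result) => {
      textArea.value = (textArea.value + " " + result.text).trim();
    });
  });
}
```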
Saving Responses and Navigating
Once a student provides their answer, clicking the Save & Next button saves the response and moves to the next question. Here’s how it works:
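The actual handler is not included above, so here is a minimal sketch of the state transition it performs. The survey model and the `save` callback are hypothetical; in the real portal, persistence would go through something like the Power Pages web API.

```javascript
// Sketch: stores the current answer and advances to the next question.
// `survey` is a hypothetical model: { questions: [...], index, responses: {} }.
// `save` stands in for whatever persists the answer server-side.
function saveAndNext(survey, answerText, save) {
  const question = survey.questions[survey.index];
  const responses = { ...survey.responses, [question.id]: answerText };
  save(question.id, answerText); // persist before moving on
  const index = Math.min(survey.index + 1, survey.questions.length - 1);
  return { ...survey, responses, index };
}
```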
Benefits
Ease of Use: Students can focus on their answers without worrying about typing.
Efficiency: Responses are saved automatically, and the survey flows smoothly.
Accessibility: Ideal for students with typing difficulties or those who prefer speaking.
By combining Azure Speech Services with Power Pages, we’re simplifying the survey process and improving the overall experience for users. Speech technology makes surveys faster, easier, and more engaging!
– Azure Cognitive Services to implement speech-to-text
– SharePoint API to invite users to the portal and work with SharePoint lists
– Relevance Search API to implement search for upcoming events on the portal, helping students complete onboarding through socialization activities at the new school
Part 1: Automating Azure Function App with Durable Functions and CI/CD Pipelines
In our cloud infrastructure, we have designed and implemented an Azure Function App that utilizes Azure Durable Functions to automate a checklist validation process. The function operates in a serverless environment, ensuring scalability, reliability, and efficiency.
To achieve this, we:
– Use Durable Functions for long-running workflows and parallel execution.
– Implement a timer-triggered function that regularly checks for missing documents.
– Deploy using Azure DevOps CI/CD Pipelines for automated deployments and testing.
This post covers Azure Function App architecture, Durable Functions, and our CI/CD pipeline implementation.
🔹 Azure Durable Functions: Why We Chose Them
Our workflow involves:
– Retrieving all checklists from SharePoint.
– Processing them in parallel to check for missing documents.
– Updating the checklist if documents are missing.
We use Azure Durable Functions because:
– Stateful Execution – Remembers past executions.
– Parallel Execution – Checks multiple users simultaneously.
– Resilient and Reliable – Handles failures gracefully.
– Scales Automatically – No need to manage servers.
How Our Durable Function Works
Timer-Triggered Function: Initiates the Orchestrator
This function triggers every 5 minutes, calling the orchestrator.
Each activity function is responsible for a specific task.
📌 Get All Checklists
Retrieves all checklists from SharePoint.
📌 Process Individual Checklist Items
What It Does:
Retrieves missing documents for a user.
Updates the SharePoint checklist accordingly.
Handles errors and retries if needed.
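The function code itself is not reproduced above, and the project runs on .NET (the pipeline calls dotnet build). Purely to illustrate the fan-out/fan-in orchestration described in this section, here is a plain JavaScript simulation; the checklist data and helper names are invented for the example, and a real orchestrator would yield tasks through the Durable Functions runtime rather than awaiting promises directly.

```javascript
// Simulation of the Durable Functions fan-out/fan-in pattern:
// the "orchestrator" fetches all checklists, processes them in
// parallel, and collects the results.
async function getAllChecklists() {
  // Stand-in for the SharePoint query (invented sample data).
  return [
    { user: "alice", documents: ["id", "transcript"] },
    { user: "bob", documents: ["id"] },
  ];
}

async function processChecklist(checklist, required) {
  // Stand-in for the per-user activity: detect missing documents.
  const missing = required.filter((d) => !checklist.documents.includes(d));
  // A real activity would update the SharePoint item here and retry on failure.
  return { user: checklist.user, missing };
}

async function orchestrate(required) {
  const checklists = await getAllChecklists();              // fan-out input
  const tasks = checklists.map((c) => processChecklist(c, required));
  return Promise.all(tasks);                                // fan-in: wait for all
}
```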
Part 2: Automating Deployments with Azure DevOps CI/CD Pipelines
To ensure seamless deployment and updates, we use Azure DevOps Pipelines.
📌 CI/CD Pipeline Breakdown
– Build Stage – Runs dotnet build and dotnet test.
– Deploy Stage – Uses Bicep templates (main.bicep) for infrastructure-as-code deployment.
🔹 Azure DevOps Pipeline (azure-pipelines.yml)
We use Azure CLI and Bicep for automated Azure Function deployment.
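The pipeline definition itself is not reproduced here; as an illustration only, a trimmed-down azure-pipelines.yml matching the two stages above could look like this. The stage names, pool image, service connection, and resource-group name are our assumptions, not the team’s actual file.

```yaml
# Sketch of a two-stage pipeline: build/test, then deploy with Azure CLI + Bicep.
trigger:
  branches:
    include: [main]

stages:
  - stage: Build
    jobs:
      - job: BuildAndTest
        pool: { vmImage: "ubuntu-latest" }
        steps:
          - script: dotnet build --configuration Release
            displayName: Build
          - script: dotnet test --configuration Release
            displayName: Run unit tests

  - stage: Deploy
    dependsOn: Build
    jobs:
      - job: DeployInfra
        pool: { vmImage: "ubuntu-latest" }
        steps:
          - task: AzureCLI@2
            inputs:
              azureSubscription: "my-service-connection"  # placeholder
              scriptType: bash
              scriptLocation: inlineScript
              inlineScript: |
                az deployment group create \
                  --resource-group my-rg \
                  --template-file main.bicep
```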
main.bicep
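The template is not shown above; a minimal Bicep sketch of the resources a Function App deployment typically declares might look like the following. All names, locations, and SKUs here are placeholders, not the project’s actual template.

```bicep
// Minimal sketch: storage account, consumption plan, and function app.
// Every name and SKU below is a placeholder.
param location string = resourceGroup().location

resource storage 'Microsoft.Storage/storageAccounts@2022-09-01' = {
  name: 'stchecklistfunc'
  location: location
  sku: { name: 'Standard_LRS' }
  kind: 'StorageV2'
}

resource plan 'Microsoft.Web/serverfarms@2022-09-01' = {
  name: 'plan-checklist'
  location: location
  sku: { name: 'Y1', tier: 'Dynamic' } // consumption plan
}

resource functionApp 'Microsoft.Web/sites@2022-09-01' = {
  name: 'func-checklist-validation'
  location: location
  kind: 'functionapp'
  properties: {
    serverFarmId: plan.id
  }
}
```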
By leveraging Azure Durable Functions, we transformed a manual checklist validation process into an automated, scalable, and highly resilient system.
With Azure DevOps CI/CD, we now have a fully automated deployment pipeline, ensuring high reliability and faster releases. 💡 Next, we will discuss a new business logic, SharePoint interactions, and integrations in a dedicated post. Stay tuned!
After the student is allocated to the new Faculty, the Wayfinder Academy provides hyper care by allowing students to verbally communicate with a voice digital twin of their mentor.
From the technology side, we are using:
1. LiveKit – to handle real-time audio communication. Students join a room via the LiveKit SDK embedded in the Next.js frontend.
2. ChatGPT Voice Assistant – to process voice input through a pipeline:
– STT/ASR (speech-to-text) to process incoming audio streams and convert them to text
– LLM (GPT-4) to generate intelligent responses
– TTS (text-to-speech) to turn those responses back into audio
3. Next.js application – serves as the frontend:
– SSR ensures fast loading and API integration
– connects students to LiveKit rooms and the assistant, displaying responses or playing them as audio
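On the frontend, joining a room with the livekit-client SDK boils down to constructing a Room and calling connect; here is a small sketch with the Room class injected so it can be stubbed. The URL and the way the token is obtained (normally from a server-side token endpoint) are assumptions.

```javascript
// Sketch: connect a student to a LiveKit room.
// In the real app, RoomCtor would be the Room class from "livekit-client",
// and the token would be minted by a backend endpoint.
async function joinRoom(RoomCtor, wsUrl, token) {
  const room = new RoomCtor();
  await room.connect(wsUrl, token); // livekit-client's Room.connect(url, token)
  return room;
}
```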
Here are more details on the backend part:
Transcription Handler:
an entrypoint function to connect to LiveKit Room, track subscriptions and initialise the assistant:
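The entrypoint code is not included above; to show the shape of the STT → LLM → TTS pipeline it initialises, here is a plain JavaScript simulation with the three stages injected as functions. All stage implementations below are stubs, not the real ASR/GPT-4/TTS calls.

```javascript
// Simulation of the assistant pipeline: each incoming utterance is
// transcribed (STT), answered (LLM), and synthesised back (TTS).
// The stages are injected so real providers can be swapped in.
async function handleUtterance(audio, { stt, llm, tts }) {
  const transcript = await stt(audio);   // speech-to-text
  const reply = await llm(transcript);   // GPT-4-style response
  return tts(reply);                     // text-to-speech
}
```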
Building this solution has been a journey of passion and precision, where every element has been designed with purpose and care. We’re excited to showcase our work as a testament to quality and innovation in pursuit of the ACDC Craftsman badge.
We have three environments, DEV, TEST, and PROD, with different approval flows.
So, the production environment can be deployed only after the release manager has approved it.
We implemented an Export pipeline to simplify the contribution process.
Users can decide which solution to export and place in Git to track the change history.
For the functional consultants, we implemented the following flow:
The export procedure includes exporting both the managed and unmanaged solution packages. All changes from the selected solution are included in the PR, and the Solution Checker starts the validation process. A clean report from the Solution Checker is a prerequisite for the next step of the PR review, which requires an actual human review.
In the case of pro-code customization, each component type has its own steps for PR validation, such as:
Run build to produce the artifacts:
Run unit test
Scan the code base with SonarQube (Quality Gate)
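As an illustration, the three validation steps above might appear in a pipeline fragment like the following. The SonarQube task names come from the official Azure DevOps extension; the service connection, project key, and scanner mode are placeholders, not the team’s actual configuration.

```yaml
# Sketch of the pro-code PR validation steps: build, test, SonarQube quality gate.
steps:
  - task: SonarQubePrepare@5
    inputs:
      SonarQube: "sonar-connection"   # placeholder service connection
      scannerMode: MSBuild
      projectKey: "logiquill"         # placeholder project key
  - script: dotnet build --configuration Release
    displayName: Build artifacts
  - script: dotnet test
    displayName: Run unit tests
  - task: SonarQubeAnalyze@5
  - task: SonarQubePublish@5
    inputs:
      pollingTimeoutSec: "300"
```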
The Import pipeline deploys the package whose type depends on the environment, so deployment to PROD will always include only the managed version of the packages.
The import pipeline also includes extra steps to activate the flows after deployment, including calling a custom PowerShell script to activate the disabled flows in specific solutions.
We also use a custom version number that reflects the build date, with the build number at the end: 2025.01.25.13. From our previous experience, this format is more meaningful to users.
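A version string in that scheme can be generated mechanically; here is a small sketch (in practice the date and build counter would come from pipeline variables rather than being passed in directly):

```javascript
// Builds a version like 2025.01.25.13 from a build date and a build counter.
function buildVersion(date, buildNumber) {
  const pad = (n) => String(n).padStart(2, "0");
  return [
    date.getFullYear(),
    pad(date.getMonth() + 1), // getMonth() is zero-based
    pad(date.getDate()),
    buildNumber,
  ].join(".");
}
```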
Branching strategy:
We are using a trunk-based branching strategy. Short-lived branches contain changes as small as possible to keep the review and validation process simple.
As a team, we have a primary goal: We help magic happen by utilizing modern AI approaches.
We also understand that every student needs a mentor, but the number of available mentors today is minimal. Our idea is to introduce digital twins of the available mentors, which would allow us to help a much larger number of students. To achieve that, we use all available data about the mentors. But to keep it fresh and reliable, we must have the latest events in our OneLake.
So, we implemented social crawlers for LinkedIn profiles to track all the mentors’ activities and events and to suggest new potential mentors.
We do that in the most modern and most straightforward way, by utilizing Power Automate Desktop:
Then, the data will be collected and stored in an Excel spreadsheet, making it available for future processing with our Data Factory.
The primary consumers of the upcoming-events data are the students, who use a complex search request to find the right event by searching across different fields. Participation in community events potentially increases the chances of successful onboarding at the new school.
The Relevance Search API covers 98% of the required functionality. Unfortunately, this API is unavailable on the Power Pages side. To resolve that, we implemented a Power Automate Flow, called from the portal, as a wrapper around the original Dataverse API.
On the portal side, we are using the Select2 component to implement autocomplete functionality.
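Select2’s ajax transport expects results in a `{ results: [{ id, text }] }` shape, so the Flow’s response has to be adapted before it can feed the autocomplete. A sketch of that adapter follows; the field names on the Flow response (`eventId`, `name`, `date`) are our assumptions.

```javascript
// Maps a hypothetical Power Automate Flow response (an array of event rows)
// into the { results: [{ id, text }] } shape that Select2's ajax
// processResults callback must return.
function toSelect2Results(flowResponse) {
  return {
    results: flowResponse.map((event) => ({
      id: event.eventId,                      // assumed field name
      text: `${event.name} (${event.date})`,  // assumed field names
    })),
  };
}
```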
Meet Dobby – our digital assistant for the Wayfinder Academy employees. Isn’t it cute?
And this is how we low-code consultants edited the sitemap before the sitemap editor was added to the system.
While working on our solution, we’re taking a moment to celebrate the charm of retro tech—our trusty cabled mouse and earphones. 🎧🖱️
Why the throwback?
✅ Reliability: No low-battery warnings when we’re mid-sprint or presenting at events.
✅ Clarity: Whether it’s code reviews or customer conversations, the sound is always crisp, and the connection is solid.
✅ Control: Sometimes, the best way to stay connected is with a literal connection.
Retro tech may not be flashy, but it keeps us focused on what matters—delivering solutions that work.
And, let’s be real, there’s something nostalgic about untangling those earphones. Who’s with us? 🙌
We hope you liked our Logiquill platform, which we demonstrated yesterday. With it, we enable our Wayfinder Academy employees to manage their work in a more effective way, reducing operational overhead and manual work by using the technologies we describe below. By saving time on those time-consuming processes, they can allocate it to more meaningful and strategic tasks. This is what we call digital transformation.
On this portal, in the Admin view, we have features like the following (as displayed in the screenshot below):
Digital twins of our judges (mentors), generated by AI and connected to a database containing a custom survey tailored to guide the student through questions that help identify their values, aspirations, reactions to stress, etc.
Heart-rate readings – real-time data coming from the pulse oximeter device.