As true Slytherins we of course have to steal, and steal we did!
ngrok
We used the third-party tool ngrok to expose our local development server to the Internet!
SASS + REACT = <3
Clever clogs use existing frameworks to simplify styling: Sass – CSS with superpowers – in combination with React makes a perfect match.
Link Mobility
It’s good to know that your message has been delivered, so you get a confirmation when your message has been delivered to the Howler!
Stairway to HELL
In our solution we combine these three Cloud APIs to make everything floooow through the cloud!
We already have access to MR (a.k.a. Magic Reality) components in Canvas Apps. Implementation is straightforward, but as expected they come with a limited set of features. The components are based on Babylon.js and make for a quick and easy way to place and view a 3D model in an augmented reality context.
For our solution, we wanted the user to also be able to interact with the virtual objects in front of them, which is not an OOB feature, so, expressing our power user love, we decided to explore the possibilities around custom XR-enabled PCF components.
Being ACDC craftsmen, and knowing the potential issues of going too far down the wrong path, we decided to do some proofs of concept, creating custom PCF components with third-party XR libraries, acting like proper thieving bastards along the way.
First off, we had a look at AR.js, which is built on ARToolKit, a relatively old library. It could give us wide device support, which really didn’t add much value, considering the component would be running inside the Power Apps mobile app. We would also be forced to use either image-target or marker tracking, with no modern AR spatial tracking.
Looking closer at the OOB components, we tried to find a way to leverage the OOB Babylon.js logic, hoping to hook into the React Native part of the implementation, which would give great benefits in terms of access to device-specific XR features (ARCore for Android and ARKit for iOS). We did, however, decide to leave this path and focus elsewhere.
In our solution, users will be gathering ingredients using object detection in a Canvas App. The AI model used for this has been trained on objects around the conference venue, so we wanted to enhance the connection between the app and the real world. Already having access to the user’s geolocation through the geolocation web API inside the Canvas App and any PCF components, we decided to use these data to place the active users on a 3D representation of the venue, expressing our power user love by merging 3D graphics with the OOB Canvas App elements.
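Reading the position from inside a PCF component boils down to the standard geolocation web API. A minimal sketch (the callback name and options are illustrative, not our exact code):

// Watch the device position from inside the PCF component.
// onPosition is an illustrative placeholder for whatever updates the 3D scene.
function watchUserPosition(onPosition: (lat: number, lon: number, alt: number | null) => void): number {
  return navigator.geolocation.watchPosition(
    (position) => {
      const { latitude, longitude, altitude } = position.coords;
      onPosition(latitude, longitude, altitude);
    },
    (error) => console.warn("Geolocation error:", error.message),
    { enableHighAccuracy: true }
  );
}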
We were able to find a simple volume model of the buildings on the map service Kommunekart 3D, but these data seem to be provided by Norkart and are not freely available.
Like the thieving bastards we are, we decided to scrape the 3D model off of the site by fetching all the resources that looked like binary 3D data. The data turned out to be in B3DM format, and we found the buildings in one of these tiles. We used Blender to clean up the model, removing the surrounding buildings, and exported it to the glTF 3D file format for use in a WebGL 3D context.
We decided to render the 3D model with Three.js, which lets us create an HTML canvas element inside the PCF component and use its WebGL context to render the model in 3D. The canvas is continuously rendered using requestAnimationFrame under the hood, making it efficient in a browser context. The glTF model was loaded from a data URI, as a workaround for the web resource file format restrictions.
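A minimal sketch of that setup, assuming a plain Three.js scene (the camera values are arbitrary and the data URI is a truncated placeholder):

import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader";

const container = document.body; // in the PCF control this is the container passed to init()
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(640, 480);
container.appendChild(renderer.domElement);

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, 640 / 480, 0.1, 1000);
camera.position.set(0, 50, 100);
scene.add(new THREE.AmbientLight(0xffffff, 1));

// Load the venue model from a data URI to work around the web resource file format restrictions
const venueModelDataUri = "data:model/gltf-binary;base64,..."; // truncated placeholder
new GLTFLoader().load(venueModelDataUri, (gltf) => scene.add(gltf.scene));

// Render continuously
function animate(): void {
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
}
animate();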
The coordinates from the user’s mobile device come in as geographic coordinates: longitude, latitude and altitude. The next step was to map these values relative to a known coordinate in the building, which we chose to be the main entrance. Using the main entrance’s geographic coordinates, we could convert them to Cartesian coordinates (X, Y and Z), do the same with the real-time coordinates from the user, and subtract the origin to get the offset in meters. The conversion from geographic to geocentric coordinates was done like so:
// eslint-disable-next-line @typescript-eslint/consistent-type-definitions
export type CartesianCoordinates = { x: number; y: number; z: number };
// eslint-disable-next-line @typescript-eslint/consistent-type-definitions
export type GeographicCoordinates = { lat: number; lon: number; alt: number };
// Conversion factor from degrees to radians
const DEG_TO_RAD = Math.PI / 180;
// Constants for WGS84 Ellipsoid
const WGS84_A = 6378137.0; // Semi-major axis in meters
const WGS84_E2 = 0.00669437999014; // Square of eccentricity
// Function to convert geographic coordinates (lat, lon, alt) to ECEF (x, y, z)
export function geographicToECEF(coords: GeographicCoordinates): CartesianCoordinates {
// Convert degrees to radians
const latRad = coords.lat * DEG_TO_RAD;
const lonRad = coords.lon * DEG_TO_RAD;
// Calculate the radius of curvature in the prime vertical
const N = WGS84_A / Math.sqrt(1 - WGS84_E2 * Math.sin(latRad) * Math.sin(latRad));
// ECEF coordinates
const x = (N + coords.alt) * Math.cos(latRad) * Math.cos(lonRad);
const y = (N + coords.alt) * Math.cos(latRad) * Math.sin(lonRad);
const z = (N * (1 - WGS84_E2) + coords.alt) * Math.sin(latRad);
return { x, y, z };
}
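To illustrate the offset calculation described above, a small usage sketch (the coordinates below are made-up placeholders, not the venue’s actual entrance):

// Example usage: offset of the user relative to the main entrance, in meters.
// The coordinates are illustrative placeholders only.
const entrance: GeographicCoordinates = { lat: 63.4305, lon: 10.3951, alt: 10 };
const user: GeographicCoordinates = { lat: 63.4307, lon: 10.3954, alt: 12 };

const origin = geographicToECEF(entrance);
const position = geographicToECEF(user);

const offset: CartesianCoordinates = {
  x: position.x - origin.x,
  y: position.y - origin.y,
  z: position.z - origin.z,
};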
This gave us fairly good precision, but not without the expected inaccuracy caused by being indoors.
In our solution the current position is then represented by an icon moving around the 3D model based on the current GPS data from the device.
To connect this representation to real-time data from all the currently active users, we decided to set up an Azure SignalR Service, with an accompanying Azure Storage account and Azure Function App for the backend, bringing it all to the cloud, almost like a stairway to heaven. With this setup, we could use the @microsoft/signalr package inside the PCF component, receiving connection, disconnection and location update messages broadcast from all other users, showing where they are right now.
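A minimal sketch of the client-side wiring with @microsoft/signalr (the hub URL and the "locationUpdate" message name are illustrative assumptions, not our actual names):

import * as signalR from "@microsoft/signalr";

// The negotiate endpoint is exposed by the Azure Function App; the URL below is a placeholder.
const connection = new signalR.HubConnectionBuilder()
  .withUrl("https://<our-function-app>.azurewebsites.net/api")
  .withAutomaticReconnect()
  .build();

// "locationUpdate" is an assumed message name for illustration
connection.on("locationUpdate", (userId: string, lat: number, lon: number, alt: number) => {
  // move the corresponding user icon on the 3D model
});

connection.start().catch((err) => console.error(err));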
In our app we have made use of great open-source, ready-made components from the PowerApps Component Framework (PCF) Gallery. Inspired by the playful spirit of Fred and George Weasley from Harry Potter, these components add a touch of magic to the app: “Shaking Text” and “Confetti.”
Navigating all the lectures at Hogwarts can be tricky, but fear not! We’ve got a Time-Turner feature, just like Hermione’s, to ensure you never miss a single lesson. With this magical tool, you’ll be able to attend every class and ace your studies! 🕰️✨
Fred and George Weasley are known for their pranks and infectious energy. The “Shaking Text” component adds a dynamic, eye-catching effect to text, while the “Confetti” component brings a celebratory vibe, both reflecting the twins’ lively personalities.
The Confetti and Shaking Text components are published by Clavin Fernandes. They are built using TypeScript and React, and the Confetti component utilizes an NPM package. The developer on the team reviewed the source code and checked the dependencies before importing the components into our solution, making sure all dependencies are compatible and up to date. This prevents potential conflicts and integration issues and helps keep the solution stable and reliable, ensuring a smooth and error-free deployment.
The PCF components are built using TypeScript and React, allowing for custom functionality and appearance in Power Apps. They integrate seamlessly and are reusable across different applications, enhancing the user experience. By leveraging these technologies, developers can create highly interactive and visually appealing components that meet specific business needs.
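For context, a PCF control in this style is essentially a TypeScript class implementing the standard control interface and rendering a React tree. A simplified, generic sketch (not the published components’ actual source; IInputs/IOutputs come from the generated manifest typings, and the "text" parameter is an assumed property name):

import * as React from "react";
import * as ReactDOM from "react-dom";

export class ShakingTextControl implements ComponentFramework.StandardControl<IInputs, IOutputs> {
  private container: HTMLDivElement;

  public init(
    context: ComponentFramework.Context<IInputs>,
    notifyOutputChanged: () => void,
    state: ComponentFramework.Dictionary,
    container: HTMLDivElement
  ): void {
    this.container = container;
  }

  public updateView(context: ComponentFramework.Context<IInputs>): void {
    // Re-render the React tree whenever bound properties change
    const text = context.parameters.text.raw ?? ""; // "text" is an assumed property name
    ReactDOM.render(React.createElement("span", { className: "shaking" }, text), this.container);
  }

  public getOutputs(): IOutputs {
    return {};
  }

  public destroy(): void {
    ReactDOM.unmountComponentAtNode(this.container);
  }
}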
In the spirit of Hogwarts, where collaboration and resourcefulness reign supreme, we embarked on a quest to claim the coveted ‘Thieving Bastards’ badge. This badge celebrates the clever use of third-party solutions to enhance our magical creations. Just as the greatest wizards rely on ancient spells and enchanted artifacts, we too must harness the power of existing tools and APIs to weave our digital enchantments.
To bring our Hogwarts-inspired intranet to life, I delved into the vast realm of third-party APIs, selecting the most potent tools to aid students in their daily adventures.
The Entur API: The Floo Network of Transportation
Much like the Floo Network enables swift travel across the wizarding world, the Entur API provides real-time transportation data. By integrating this powerful API, students can easily plan their journeys to Diagon Alley with minimal hassle.
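A rough sketch of what such a call could look like against Entur’s JourneyPlanner GraphQL endpoint (the stop-place IDs, client name and exact field names are assumptions from memory and may need adjusting):

// Hedged sketch: querying Entur's JourneyPlanner GraphQL API.
// Stop-place IDs and the client name are placeholders.
async function planJourney(): Promise<unknown> {
  const query = `
    {
      trip(from: { place: "NSR:StopPlace:58366" }, to: { place: "NSR:StopPlace:59872" }) {
        tripPatterns { duration expectedStartTime }
      }
    }`;

  const response = await fetch("https://api.entur.io/journey-planner/v3/graphql", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "ET-Client-Name": "hogwarts-intranet", // Entur asks clients to identify themselves
    },
    body: JSON.stringify({ query }),
  });
  return response.json();
}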
Weather API: The Divination Crystal Ball
Professor Trelawney may have her crystal ball, but we prefer data-driven forecasting. With the weather API, students can prepare for their daily adventures, be it sunny strolls around the castle grounds or braving the rain on their way to Herbology class.
Harry Potter Database: The Restricted Section of Knowledge
No Hogwarts intranet would be complete without a comprehensive spellbook. By utilizing a Harry Potter-themed database, students can look up spells, potion recipes, and magical creatures with ease, ensuring they are always equipped for any magical challenge.
Oneflow API
Handling magical agreements and contracts has never been easier with the Oneflow API. Much like the enchanted scrolls used at Hogwarts, this API allows for the seamless management of digital contracts, ensuring that all agreements—from Hogsmeade permission slips to Quidditch team sign-ups—are securely handled and stored.
Mining for Gold: Claiming the ‘Dataminer’ Badge
Beyond integrating third-party solutions, we have also used these APIs to extract valuable insights and present them in an engaging way. By combining transportation schedules, weather forecasts, and magical data, our intranet transforms raw information into actionable intelligence. Students can now see the best routes to Diagon Alley considering the weather conditions or discover spell recommendations based on current atmospheric factors. This fusion of external data with our own enriches the user experience and adds real business value to our solution.
Configuring components in Power Apps isn’t always the most fun thing in the world, so whenever there is a chance to try some new ones that might give us a better experience, we take it. And when the creators help us towards a great ACDC experience, it is even better, and we loved trying out the Resco Kanban board and Datepicker components.
Texting with Link Mobility
In the modern day of technology, there are all kinds of communication methods everywhere, but there is still one way you can reach everyone (for the most part): text messages.
So thank you to Link Mobility for giving us a way to reach out to, and get a response from, all the customers of Diagon Alley.
With Ctelo on the phone
And since some things might be too complicated to handle over messages, it is sometimes necessary to speak directly with someone. Thanks to Ctelo, we have the option to give businesses the possibility to talk with their customers over the phone… from Teams.
PowerPotters of Cepheo: Brewing Badge-Winning Elixirs with Pro-Code Potions and Beyond!
Greetings, magical tech community! 🧙♂️✨ Team PowerPotters of Cepheo is thrilled to unveil our progress in automating elixir production for the ACDC 2025 hackathon. Our solution blends the powers of pro-code Python and low-code Power Platform to craft a system that’s both functional and badge-worthy. Today, we’ll reveal how we’re targeting the Pro-Code Potions category as well as these coveted badges:
Right Now
ACDC Craftsman
Thieving Bastards
Power User Love
Let’s dive into the details of how our Python-powered magic aligns with these badge aspirations!
1. Claiming the “Right Now” Badge: Python – The Heart of Our Brew
The “Right Now” badge rewards smart, clean, and efficient code that elevates solutions beyond the realm of low-code alone. For us, Python isn’t just an enhancement—it’s the engine of our magical elixir automation.
Python Scripts: Unlocking Advanced Functionality
Our Python scripts (sensor_script.py, voice_script.py, integration_script.py) are designed for tasks that Power Platform cannot handle natively:
Direct Hardware Interaction: Using RPi.GPIO, our sensor_script.py captures real-time data from the potion cauldron’s liquid level sensor. Power Platform simply cannot replicate this hardware integration.
AI-Powered Voice Recognition: Our voice_script.py leverages the OpenAI Whisper API for advanced speech-to-text processing, turning verbal commands into actionable automation triggers.
Intelligent Orchestration: The integration_script.py ties everything together—sensor readings, voice commands, and workflows via Power Automate.
With these examples, we demonstrate how Python serves as the lifeblood of our system, embodying the essence of “Right Now.”
2. The “ACDC Craftsman” Badge: Best Practices in Code
The “ACDC Craftsman” badge celebrates development and deployment excellence. Our commitment to best practices is reflected in every line of Python code we write.
Highlights of Craftsmanship:
Modular Structure: Each script (sensor_script.py, voice_script.py, etc.) has a single responsibility, ensuring clarity and maintainability.
Error Handling and Logging: Robust try...except blocks and detailed logging ensure stability and traceability.
Mocking for Testability: The Mock folder includes a GPIO.py mock module, allowing us to test sensor logic without a physical Raspberry Pi. This approach accelerates development while maintaining code quality.
By embracing modularity, testability, and robust error handling, we ensure our code stands as a shining example of “ACDC Craftsman” principles.
3. The “Thieving Bastards” Badge: Leveraging External Tools and APIs
The “Thieving Bastards” badge rewards the clever use of third-party tools to amplify solutions. Here’s how we “borrowed” brilliance:
Open-Source Libraries:
RPi.GPIO and requests for hardware and API interaction.
sounddevice and scipy for audio recording and processing.
python-dotenv for secure environment variable management.
OpenAI Whisper API: This external AI service powers our voice recognition functionality, enabling seamless integration of advanced speech-to-text capabilities without reinventing the wheel.
We’ve strategically combined these tools to accelerate development and expand functionality, earning our place as “Thieving Bastards” in the best sense!
4. The “Power User Love” Badge: Pro-Code and Low-Code Unite
The “Power User Love” badge highlights the magic that happens when pro-code customization enhances low-code platforms. Our project is a perfect example:
Power Platform for Low-Code Power: Power Automate orchestrates workflows, while Power BI visualizes potion progress.
Python for Pro-Code Power: Python bridges the physical and digital realms, enabling sensor integration and AI-driven voice commands.
Together, these platforms create a seamless, intelligent, and user-friendly potion production system.
Conclusion: Badge-Winning Elixir Automation!
With our meticulously crafted Python pro-code and Power Platform low-code synergy, we’re confident our solution is a contender for:
Right Now
ACDC Craftsman
Thieving Bastards
Power User Love
PowerPotters of Cepheo are proud to combine technical excellence with magical creativity. We look forward to seeing the results and continuing to share our journey. Stay tuned, fellow wizards!
NOTE TO THE JURY: we have taken your comment on board and added details at the bottom of this article.
In our Wayfinder Academy, we take a comprehensive and magical approach to understanding the student’s history, aspirations, and potential to recommend the best possible new school. The process is detailed, thorough, and personalized, ensuring the student is matched with an environment where they can thrive.
Just to recap the process: here we assume a student who didn’t feel right about their current faculty and filed an application. Immediately after that, we request their records from their current faculty (historical data), ask the student to upload some photos from their most memorable moments, and then invite them to an interview. While we are still working on the interview step and will share the details later, with this article we want to add more details about one of our approaches to mining extra insight from the student’s interview by analysing their emotions.
We use this emotion recognition alongside the interview to get a 360-degree insight into the student’s reactions to the questions, which are designed to figure out their values, aspirations, fears, etc. We can then use this to calculate the probability of their fit with each faculty and identify the one with the highest score (the scoring approach will be shared in a different post).
So, we are using a video stream capture to record an interview session and extract the emotional dataset.
This gives us one more dimension that extends the student’s standard datasets, such as feedback, historical data from previous schools, etc.
We use the imentiv.ai API to analyze the video and grab the final report. We then build the final dashboard in Power BI (we love it) and embed it into OneLake.
Imentiv AI generates emotion recognition reports using different types of content, such as video, photos, text, and audio.
We implemented the single-page application to create an interactive experience by recognizing the emotions in the image captured via the webcam on our tablet. The analysis of the video stream takes more time, so we will demonstrate it later.
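A rough sketch of the webcam capture flow in that single-page application (the /analyze route is a placeholder for our own server endpoint, which forwards the frame for emotion analysis; it is not imentiv.ai’s actual API):

// Hedged sketch: grab the current webcam frame and send it to the backend for analysis.
async function captureAndAnalyze(video: HTMLVideoElement): Promise<unknown> {
  const canvas = document.createElement("canvas");
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext("2d")?.drawImage(video, 0, 0);

  // Encode the frame as JPEG and post it to our server (placeholder route)
  const blob: Blob = await new Promise((resolve) => canvas.toBlob((b) => resolve(b as Blob), "image/jpeg"));
  const form = new FormData();
  form.append("image", blob, "frame.jpg");

  const response = await fetch("/analyze", { method: "POST", body: form });
  return response.json();
}

// Usage: attach the webcam stream to a <video> element first, e.g.
// video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });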
The app consists of two parts: a PoC to recognize the emotions in a photo from a webcam and an example of an emotion recognition report.
To build that PoC application, we decided to use the Node.js stack. The engine is based on Bun, which is a modern and highly efficient alternative to Node.js; unlike Node.js, Bun is written in Zig.
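A minimal sketch of what the Bun server can look like, including the WebSocket upgrade we use for real-time communication between client and server (the port, route and topic names are illustrative assumptions):

// Hedged sketch of a Bun HTTP + WebSocket server; route and topic names are placeholders.
const server = Bun.serve({
  port: 3000,
  fetch(req, srv) {
    // Upgrade requests to /ws to a WebSocket connection
    if (new URL(req.url).pathname === "/ws" && srv.upgrade(req)) {
      return;
    }
    return new Response("Emotion PoC server is running");
  },
  websocket: {
    open(ws) {
      ws.subscribe("emotions");
    },
    message(ws, message) {
      // Broadcast emotion analysis updates to all connected clients
      server.publish("emotions", message);
    },
  },
});

console.log(`Listening on http://localhost:${server.port}`);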
For the front end, we are using React and Chart.js. We are hosting the PoC on our laptop, and to make it available on the public internet, we are using Cloudflare tunnels. These also cover SSL certificate termination, so the service is secured by default without any significant effort.
The app server and the client app run inside a Docker container, so you can deploy easily with a single command: docker-compose up --build.
To optimize the final container size and improve the build speed, we are using a Dockerfile with two stages: one to build the app and a second one to run the final artifacts.
PS:
Badges we claim:
Thieving bastards – we are using a third-party platform to recognize emotions in video and photos.
Hipster – we use Bun to run the application.
Hogwarts Enchanter – we use the mystical AI imentiv.ai API to grab the emotion reports and visualize them in a user-friendly way (see the screenshot above). Our enchanted workflow takes the data and makes it available in OneLake. The wizarding world comes closer when we see AI-based deep insight from various data sources in one place, in an easy-to-read and easy-to-interpret format.
Right now – we are using a WebSocket server to implement real-time communication between the client and the server side.
Client side salsa – we use React to implement the front end.
PS2: pls come over to our camp and test it out! We want to know how you feel! 🙂
(c) Faruk The Fabricator inspired by the Silicon Valley series.
If you think a student’s story begins when they enroll at Hogwarts, you could not be more wrong. The Fabricator is evil and does not care about privacy. The Fabricator is guileful and does not care about truth. He will do everything in his power to gather or fabricate every detail of their lives and use it to achieve his goals.
At the moment, The Fabricator uses Fabric to access historical data on the students wishing to enroll at Hogwarts. We call the Kaggle API within notebook code to retrieve data from Kaggle and write it as a CSV file.
Python code in another notebook is then used to transform this data and divide it into clusters.
Finally, a “Copy Data” activity moves the data to its final destination. But is this truly the end?
Follow the Fabricator for more—if you can, that is.
In the coming days, the Fabricator plans to:
Show clustered data in Power BI reports.
Use insights to plan interventions or recommendations for students.
Perform behavioral predictions: Use the clusters as labels for supervised learning models to predict future performance.
Trigger emails or alerts for specific clusters needing attention.
Data is born into Fabric, molded by it. Data does not see the light until it is ready to face users. And when it is finally presented, it is blinding.
(c) The Fabricator and The Batman.
PS: with this article we claim the following badges:
Thieving Bastards – we use an online data source from Kaggle.
Dataminer – we are doing data transformation for better reporting and we are using external data.
Go With The Flow – we created a pipeline that can be used to retrieve any data from Kaggle. We plan to use data activators to send alerts based on the processed data.
Power User Love – in Fabric we created the pipeline as a low-code solution; inside the pipeline we are using Python code for advanced operations.
Uses several third-party solutions in the delivery – open-source or paid solutions made available by others – showing the importance of leveraging existing tools and APIs.
And the excitement doesn’t stop there! In the Mushroom Kingdom, Princess Peach must determine whether she can trust each inhabitant, preparing her for Mario World. This is brought to life through a Tinder-like swiping interface, where she makes her choices by swiping. The feature leverages the PCF Swipe component, and the swiped choices and characters are analysed with an AI-enabled feature by connecting to the OpenAI web service. This engaging feature adds a layer of suspense and fun to the game, making it an unforgettable swiping experience!
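A hedged sketch of how such an analysis call to OpenAI could look (the model choice, prompt and key handling are illustrative assumptions, not our exact implementation):

// Hedged sketch: ask OpenAI's chat completions API to comment on a swipe decision.
async function analyseSwipe(characterName: string, swipedRight: boolean): Promise<string> {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`, // key handling simplified for the sketch
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // assumed model choice
      messages: [
        { role: "system", content: "You analyse whether Princess Peach should trust a Mushroom Kingdom character." },
        { role: "user", content: `She swiped ${swipedRight ? "right" : "left"} on ${characterName}. Comment on this choice.` },
      ],
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;
}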
Toad Assistant (PVA with use of the AI web crawling feature to get information and context from the mariowiki.com webpage)