Our solution is all about making learning magic easier and more accessible for young wizards and witches out there. For this to succeed, the courses need to stay up to date and relevant. Updating subject curricula with new potions and spells every time something new comes out is time consuming, and not something the teachers can be bothered with between flying their brooms, watching Quidditch, and saving the world from Dark Magic, so we figured we needed to solve this problem for them.
Retrieving data in Fabric using Data Pipelines
We are using external data sources from Potter DB (potions and spells), importing them into Microsoft Fabric with Data Pipelines and storing them in a lakehouse.
A Copy data step in the Data Pipeline retrieves data from the external data source:
Retrieves the data from the external source using the endpoint https://api.potterdb.com/v1/potions:
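Under the hood, the Copy data activity simply walks the REST endpoint. Here is a minimal Python sketch of the same idea, assuming Potter DB returns a JSON:API-style payload with a `data` array and a `links.next` URL for paging (check the Potter DB docs before relying on this shape):

```python
import json
import urllib.request

API_URL = "https://api.potterdb.com/v1/potions"

def fetch_all_potions(url=API_URL):
    """Follow pagination links until every potion has been fetched.

    Assumes a JSON:API-style payload with a `data` array and a
    `links.next` URL; verify against the Potter DB documentation.
    """
    records = []
    while url:
        with urllib.request.urlopen(url) as resp:
            payload = json.load(resp)
        records.extend(payload.get("data", []))
        url = payload.get("links", {}).get("next")
    return records

def flatten(record):
    """Lift the nested `attributes` object up next to the record id."""
    return {"id": record["id"], **record.get("attributes", {})}

# Example with a stubbed record (no network call needed):
sample = {"id": "p1", "type": "potion",
          "attributes": {"name": "Polyjuice Potion", "difficulty": "Advanced"}}
print(flatten(sample))
# → {'id': 'p1', 'name': 'Polyjuice Potion', 'difficulty': 'Advanced'}
```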
Stores the data in our lakehouse in a table called Potions:
Maps all fields from the external source to the destination table:
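Conceptually, this mapping step is just a source-to-destination rename. A sketch of that idea in Python; the column names here are illustrative, not our actual pipeline configuration:

```python
# Illustrative source->destination column mapping; the real mapping is
# configured in the Copy data activity, and these names are assumptions.
FIELD_MAP = {
    "name": "Name",
    "effect": "Effect",
    "ingredients": "Ingredients",
    "difficulty": "Difficulty",
}

def map_fields(source_row, field_map=FIELD_MAP):
    """Rename source fields to the destination table's column names,
    dropping anything the destination does not know about."""
    return {dest: source_row[src]
            for src, dest in field_map.items() if src in source_row}

row = {"name": "Felix Felicis", "effect": "Luck", "brew_time": "6 months"}
print(map_fields(row))  # → {'Name': 'Felix Felicis', 'Effect': 'Luck'}
```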
Retrieves the rows in the Potions table in our lakehouse that have been updated from the external source, so they can later be imported into Dataverse:
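The incremental pickup can be thought of as a timestamp filter. A hedged sketch, assuming the lakehouse table carries an `updated_at` column (the real column name in our table may differ):

```python
from datetime import datetime, timezone

def rows_updated_since(rows, last_sync):
    """Keep only rows whose `updated_at` timestamp is newer than the
    last successful sync. The `updated_at` column name is an
    assumption for illustration."""
    return [r for r in rows
            if datetime.fromisoformat(r["updated_at"]) > last_sync]

rows = [
    {"id": "p1", "updated_at": "2024-05-01T10:00:00+00:00"},
    {"id": "p2", "updated_at": "2024-06-15T08:30:00+00:00"},
]
last_sync = datetime(2024, 6, 1, tzinfo=timezone.utc)
print([r["id"] for r in rows_updated_since(rows, last_sync)])  # → ['p2']
```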
Using a Service Principal to connect to our Dataverse environment and upserting (update or create) records in our Dataverse Potion table:
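For the curious, the Service Principal connection boils down to an OAuth 2.0 client-credentials token request against Microsoft Entra ID. A sketch of the request the connection makes, with placeholder identifiers (use your own app registration's values):

```python
from urllib.parse import urlencode

def build_token_request(tenant_id, client_id, client_secret, org_url):
    """Assemble the client-credentials token request used to
    authenticate as a service principal. All identifiers passed in
    below are placeholders."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # .default requests the app's pre-consented Dataverse permissions
        "scope": f"{org_url}/.default",
    })
    return url, body

url, body = build_token_request("contoso-tenant-id", "app-client-id",
                                "app-secret",
                                "https://contoso.crm.dynamics.com")
print(url)
```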
Mapping the fields from the lakehouse table to the Dataverse table and upserting the records:
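An upsert against the Dataverse Web API is a PATCH that addresses the row by an alternate key: Dataverse creates the row when the key does not exist and updates it when it does. A sketch of such a request, with made-up table and column logical names (the `cr123_` prefix is a placeholder):

```python
import json
from urllib.parse import quote

def build_upsert_request(org_url, entity_set, key_column, key_value, columns):
    """Build the pieces of a Dataverse Web API upsert: a PATCH that
    addresses the row by alternate key. Table and column logical
    names used below are made-up placeholders."""
    url = (f"{org_url}/api/data/v9.2/{entity_set}"
           f"({key_column}='{quote(key_value)}')")
    headers = {
        "Content-Type": "application/json",
        "OData-Version": "4.0",
    }
    return "PATCH", url, headers, json.dumps(columns)

method, url, headers, payload = build_upsert_request(
    "https://contoso.crm.dynamics.com", "cr123_potions",
    "cr123_name", "Felix Felicis",
    {"cr123_effect": "Luck", "cr123_difficulty": "Advanced"})
print(method, url)
```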
After running the pipeline, our Potions table in Dataverse is updated:
And there we have it: no need for teachers and staff to stay on top of the release waves for new spells and potions. Everything is updated automatically in the systems and arrives straight in everyone’s Mystic Mentor app.
That calls for the Dataminer badge, doesn’t it? 🪨⛏️