Processing geodata in Fabric

Finding natural resources is not a simple task, and requires enormous amounts of data to locate sources of every type. Luckily, Norway offers free and open geodata through “Geonorge”, available to everyone. Many different suppliers publish this kind of data, but as the core data source for our solution we went with the “N50 Kartdata” dataset supplied by “Kartverket”.

Using this as our source, we uploaded the XML file, with over 430,000 lines of data, to a Lakehouse in Fabric and then, using a pipeline and a dataflow, converted it into a table within a SQL analytics endpoint. Additional CSV files with descriptive support data were also uploaded and merged in the dataflow, so that all the data lives in one place and is easier to search.
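To make the transformation step concrete, here is a minimal sketch of the same idea in a notebook, using pandas: parse feature rows out of an XML document, load a descriptive CSV, and join the two on a shared key. The element names, the `type` key, and the sample data are all assumptions for illustration; the real N50 Kartdata file uses Kartverket's own schema, and in our solution this merge was done by the Fabric dataflow rather than hand-written code.

```python
import io
import xml.etree.ElementTree as ET

import pandas as pd

# Hypothetical miniature sample standing in for the N50 Kartdata XML
# (the real file is over 430,000 lines and uses a different schema).
sample_xml = """<?xml version="1.0"?>
<features>
  <feature><type>101</type><name>Gruve</name><x>10.4</x><y>63.4</y></feature>
  <feature><type>102</type><name>Myr</name><x>8.1</x><y>61.0</y></feature>
</features>"""

# Hypothetical CSV with descriptive support data, keyed on feature type.
sample_csv = "type,description\n101,Mine\n102,Bog"

def xml_to_frame(xml_text: str) -> pd.DataFrame:
    """Flatten each <feature> element into one row of a DataFrame."""
    root = ET.fromstring(xml_text)
    rows = [{child.tag: child.text for child in feature} for feature in root]
    df = pd.DataFrame(rows)
    df["type"] = df["type"].astype(int)  # align key dtype for the join
    return df

features = xml_to_frame(sample_xml)
support = pd.read_csv(io.StringIO(sample_csv))

# Left-join the descriptive data onto the features, mirroring the
# merge step performed in the Fabric dataflow.
merged = features.merge(support, on="type", how="left")
print(merged)
```

The resulting frame would then be written out as a single table, so downstream consumers only ever query one place.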

Within the Fabric workspace we also created an AI-driven data agent, specialized in the imported dataset and available as a supporting agent to other AI agents, such as those created in Copilot Studio. Since we planned to use this data within the Power Platform ecosystem, we gave the agent detailed instructions to ensure its output is usable by every resource that needs it. This makes it very effective at finding the necessary data in the table, providing a fast way to search for the resources needed at any time.
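As an illustration, the kind of detailed instructions given to such a data agent might look something like the sketch below. This is a hypothetical example, not the actual instructions we used; the table name and column names are assumptions.

```
You answer questions about natural-resource features in Norway.
Data source: the merged table in the SQL analytics endpoint
(assumed name: resource_features, joined from N50 Kartdata and
the descriptive CSV support files).

- Always include the feature type, name, coordinates, and description.
- Answer in a structured, consistent format so other agents
  (e.g. Copilot Studio agents) can consume the output directly.
- If no matching rows exist, say so explicitly instead of guessing.
```

Spelling out the output format like this is what lets other agents in the ecosystem rely on the data agent's answers programmatically.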