Download Brreg Data

To get more data for testing, we set up a new source in Azure Data Factory to download “Account Name” and “Orgnummer” from Brreg.

The result of downloading the first 1000 records from Brreg:

We kept it simple, using this URL:

https://data.brreg.no/enhetsregisteret/api/enheter?size=1000&page=1

https://data.brreg.no/enhetsregisteret/api/enheter?size=1000&page=4

There is a parameter on the ADF pipeline that says how many pages we should download. Simply change the number of pages to download, meaning batches of 1000 accounts. So, for example, we set it to 5.

ADF will then loop until 5 pages have been set in the API URL and downloaded to the source.

E.g.:

run 1: https://data.brreg.no/enhetsregisteret/api/enheter?size=1000&page=1

run 2: https://data.brreg.no/enhetsregisteret/api/enheter?size=1000&page=2

run x: https://data.brreg.no/enhetsregisteret/api/enheter?size=1000&page=x

For each iteration of the loop we download the data and dump it into the TEST SQL DB.
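
For reference, the paging logic is simple enough to sketch outside ADF as well. The Python snippet below mimics what the pipeline does, assuming the standard Enhetsregisteret JSON layout; the function name and the `pages` parameter are our own naming for the illustration, not part of the pipeline itself.

```python
import requests

BASE_URL = "https://data.brreg.no/enhetsregisteret/api/enheter"

def download_brreg(pages: int, size: int = 1000):
    """Download `pages` batches of `size` units, like the ADF loop does."""
    rows = []
    for page in range(1, pages + 1):
        resp = requests.get(BASE_URL, params={"size": size, "page": page})
        resp.raise_for_status()
        # Enhetsregisteret returns the units under _embedded.enheter
        for enhet in resp.json().get("_embedded", {}).get("enheter", []):
            rows.append({
                "AccountName": enhet.get("navn"),
                "Orgnummer": enhet.get("organisasjonsnummer"),
            })
    return rows

accounts = download_brreg(pages=5)  # same as setting the ADF parameter to 5
print(f"Downloaded {len(accounts)} accounts")
```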

Example now:

Datamining water level

To get a better picture of what is happening in the Oslofjord, we collect data from sensors and external sources.

In an earlier blog post we described the collection of sensor data in Dataverse and how we presented it in a nice way. This in itself should qualify for the Datamining badge.


There are many relevant external data sources, but a central one is sea level, or tides. We have built a Logic App that runs regularly to collect sea level data for a given place and time.

http://api.sehavniva.no/tideapi.php?lat=58.974339&lon=5.730121&fromtime=2022-02-10T00%3A00&totime=2022-02-11T00%3A00&datatype=all&refcode=cd&place=&file=&lang=nn&interval=10&dst=0&tzone=&tide_request=locationdata

The result of the API query is JSON with all sea level changes over time, as shown:

We then generate a CSV file from the JSON and use it as a data source for Power BI.
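
As an illustration, the JSON-to-CSV step could look roughly like this in Python. The field names (`time`, `value`) and file names are assumptions about the parsed response, not the exact keys returned by api.sehavniva.no, so they would need to be adjusted to the real payload.

```python
import csv
import json

def tide_json_to_csv(json_path: str, csv_path: str) -> None:
    """Flatten a list of water level measurements into a CSV for Power BI."""
    with open(json_path, encoding="utf-8") as f:
        measurements = json.load(f)  # assumed: a list of {"time": ..., "value": ...} entries

    with open(csv_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["time", "value"])
        writer.writeheader()
        for row in measurements:
            writer.writerow({"time": row.get("time"), "value": row.get("value")})

tide_json_to_csv("waterlevel.json", "waterlevel.csv")
```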

That the sea level at this location changed so drastically is an exciting observation. We have therefore combined the result with photographs from a motion-sensor camera at the same place and time.

Unfortunately we will not be able to follow up on this observation during the hackathon, but we have already, within the hackathon, clearly shown that collecting and combining data in our prototype gives high business and research value.

With this we claim the Dataminer badge.

Lazy chat bot makers  

Since Power Virtual Agents is all about creating chatbots on the fly, we decided to try the auto-generation of topics from external sources.

By providing links to the official museum website and lexica sources, we got a list of proposed topics. Surprisingly, most of the topics were useful to include in the chatbot. In combination with some information we extracted ourselves, the chatbot can now answer questions regarding Munch, opening hours, and the different exhibitions that are currently available.

The information is presented to the user in an intuitive and interactive manner (lol).

Test data

We needed test data to develop the functions, so we made a simple ADF pipeline that inserts “actions” every x seconds (if we want to). We also need a lot of data, since we are planning to run reports and AI; with this test-data function we can start developing.

The ADF pipeline takes a few parameters so we can control how many runs we get (a Python sketch of the random-record generation follows the list below).

Source data:

  • SQL server
    • Table with first Names
    • Table with last names
    • Table with IncidentTypes
    • View generating random:
      • name: mixing firstname and lastname
      • incident types (as we make more incident types)
      • latitude: Random – 59.9000 – 59.9999
      • longitude: Random – 10.1000 – 10.9999
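
The SQL view does this server-side; the Python sketch below just illustrates the same idea. The name lists are placeholders, and the field names are our own, but the ranges mirror the list above.

```python
import random

# Placeholder lookup data; in the pipeline these come from the SQL tables.
FIRST_NAMES = ["Kari", "Ola", "Per", "Anne"]
LAST_NAMES = ["Hansen", "Olsen", "Berg", "Dahl"]
INCIDENT_TYPES = ["Assault", "Theft", "Vandalism"]

def random_incident() -> dict:
    """Generate one random test record, mirroring what the SQL view produces."""
    return {
        "name": f"{random.choice(FIRST_NAMES)} {random.choice(LAST_NAMES)}",
        "incidentType": random.choice(INCIDENT_TYPES),
        "latitude": round(random.uniform(59.9000, 59.9999), 6),
        "longitude": round(random.uniform(10.1000, 10.9999), 6),
    }

# Example: one batch of 100 test rows for the pipeline to insert
batch = [random_incident() for _ in range(100)]
```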

Assaults

Planned Work

TBC

Completed Work

TBC

The Dataminer is here!

We have retrieved weather data from:
https://weather.visualcrossing.com/VisualCrossingWebServices/rest/services/timeline/manhattan?unitGroup=metric&key=G5NGC7CZTCJBVHKDTRGVD668G&contentType=json
in order to determine if we should stay in the sewers or hit the streets of New York!
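
A minimal Python version of that call is sketched below, using the Visual Crossing timeline URL from above. The `days`, `temp`, and `conditions` fields are how we read the response here and may need adjusting to the actual payload, and the "cold or rainy" rule is just an example decision.

```python
import requests

URL = ("https://weather.visualcrossing.com/VisualCrossingWebServices/rest/services/"
       "timeline/manhattan?unitGroup=metric&key=G5NGC7CZTCJBVHKDTRGVD668G&contentType=json")

def streets_or_sewers() -> str:
    """Fetch the Manhattan forecast and decide where to go."""
    data = requests.get(URL).json()
    today = data["days"][0]                 # assumed: first element is today's forecast
    temp = today.get("temp")
    conditions = today.get("conditions", "")
    # Simple rule of thumb: cold or rainy means staying below ground.
    if (temp is not None and temp < 5) or "Rain" in conditions:
        return "Stay in the sewers"
    return "Hit the streets of New York"

print(streets_or_sewers())
```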

Power BI

The fridge can see!

After a lot of fuss we have finally set up our Raspberry Pi with Python, VS Code, an SSH connection to the git repo, and of course a working webcam!

With the help of a small bash script and Azure’s own Python modules, we upload and analyze the images within a few seconds and get back a list of all objects in the image. After a bit of testing we are very impressed with the precision, even though Azure insists that our clementine is an apple. The result is passed on to a Power Automate flow that updates Dataverse.
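
For those curious, the analysis step can be done with the azure-cognitiveservices-vision-computervision package, roughly as below. The endpoint, key, and file name are placeholders, and the exact script running on the Pi differs a bit from this sketch.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

# Placeholders: endpoint and key come from the Computer Vision resource in Azure.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "<your-key>"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Analyze the latest webcam capture and list detected objects and tags.
with open("capture.jpg", "rb") as image:
    result = client.analyze_image_in_stream(
        image, visual_features=[VisualFeatureTypes.objects, VisualFeatureTypes.tags]
    )

for obj in result.objects:
    print(f"{obj.object_property}: {obj.confidence:.2f}")
for tag in result.tags:
    print(f"tag: {tag.name} ({tag.confidence:.2f})")
```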

Work completed. Pull request up for approval

When the pace of work is this high, it is easy to forget proper testing or linting where it is needed, so before anything is merged into the develop branch, the changes must be approved by one of the team members. Config files and keys, for example, should not end up in the source code.

Setting up Synapse Link to Dynamics 365

Middle-Age Mutable Ninja Tuples know there will be a lot of requests for help once the solution is released to the public. They plan on using Synapse to analyze all the data from the system. To get started they set up Synapse Link to Dynamics 365.

With the deprecation of Export Service, Synapse Link seems to be the go-to replacement for a simple, yet powerful tool to extract data from Dynamics 365.

Prerequisites to setup Synapse to Dynamics link:

  1. Azure subscription.
  2. Resource group
    • Synapse workspace
    • Storage account (Data lake gen. 2)

Setting up Dynamics link to Synapse

  1. Make sure you have completed the prerequisites (above)
  2. Go to https://make.powerapps.com/
  3. Navigate to your environment

Select the entities/tables you want to synchronize:

Click save, and your data will begin to show up in Synapse!

You can now run queries against Synapse on Dynamics 365 data:

Optionally, install Azure Data Studio to work with the data:

With data in Synapse, we can now start crossing data from several sources and make reports.
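
As an example of what querying the linked data can look like, the Synapse serverless SQL endpoint can also be reached from Python with pyodbc, roughly as below. The server, database, authentication details, and the `account` table/columns are placeholders for whatever tables were selected in the link.

```python
import pyodbc

# Placeholders: use your workspace's serverless SQL endpoint and lake database.
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=<workspace-name>-ondemand.sql.azuresynapse.net;"
    "Database=<lake-database>;"
    "Authentication=ActiveDirectoryInteractive;"
    "UID=<user@tenant.com>;"
)

# Hypothetical query against a synchronized Dynamics 365 table.
cursor = conn.cursor()
cursor.execute("SELECT TOP 10 accountid, name FROM account")
for row in cursor.fetchall():
    print(row.accountid, row.name)
```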

NOTE: One challenge we came across with Synapse is that all option sets are only available as int values. There is no way to export the entity to Synapse with both the option set int value and the language-dependent label. This may end up being fixed with a “dirty hack”; alternatively, MaMNT will find a prettier way to solve it going forward.

Badges claimed:

Crawl, Mine & Process

We have created a scheduled flow that crawls through the www.elskling.no results for electricity prices in my neighborhood, to find which providers I could choose in my area.

  1. We created a custom connector that gets data from www.elskling.no, so we could fetch this data using a Cloud Flow.
  2. Using this custom connector, we were able to get all the providers for the zipCode we sent with the request.

  3. After getting this data, we add the Current Rate of each supplier to our Account record in Dataverse. By running this flow hourly, we always get the latest price from each supplier (a rough sketch of this hourly pattern follows below).
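
Since the custom connector wraps elskling.no’s own endpoint, which isn’t public, the sketch below only illustrates the hourly pattern in Python terms: fetch the providers for a zip code, then write the current rate onto each matching record. The `fetch_providers` stub and the field names are placeholders, not the real connector or Dataverse schema.

```python
from typing import Dict, List

def fetch_providers(zip_code: str) -> List[Dict]:
    """Placeholder for the custom connector call against www.elskling.no."""
    raise NotImplementedError("handled by the custom connector in the Cloud Flow")

def update_current_rates(zip_code: str, accounts: Dict[str, dict]) -> None:
    """Hourly pattern: copy each supplier's current rate onto its Account record."""
    for provider in fetch_providers(zip_code):
        account = accounts.get(provider["name"])       # assumed match on supplier name
        if account is not None:
            account["currentRate"] = provider["rate"]  # hypothetical field names
```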

Search, Crawl and suggest

We have created an automatic daily web search crawler that gives families suggestions for kids’ activities.

First, we created a Bing Search API resource in Azure:

This Bing Search API is consumed by a Power Automate Flow which:

  1. Runs once every day
  2. Searches the web for kids’ activities
  3. Selects one of the returned suggestions at random (see the sketch below the list)
  4. Creates an activity suggestion in Dynamics CRM
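
A rough Python equivalent of steps 2 and 3 is shown below, using the Bing Web Search v7 REST endpoint. The subscription key is a placeholder and the search query is just an example; the actual flow does this with the Bing Search connector rather than code.

```python
import random
import requests

# Placeholder: key for the Bing Search resource created in Azure.
SUBSCRIPTION_KEY = "<your-bing-search-key>"
SEARCH_URL = "https://api.bing.microsoft.com/v7.0/search"

def random_kids_activity(query: str = "kids activities this weekend") -> dict:
    """Search the web and pick one of the returned results at random."""
    resp = requests.get(
        SEARCH_URL,
        headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
        params={"q": query, "count": 10},
    )
    resp.raise_for_status()
    pick = random.choice(resp.json()["webPages"]["value"])
    # The name, URL, and snippet are what the flow stores in Dynamics CRM.
    return {"name": pick["name"], "url": pick["url"], "description": pick["snippet"]}

print(random_kids_activity())
```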

Data is listed in Dynamics 365 with URL and description from the search:

We will claim the following badges based on this post:

  1. Data Miner – for enhancing our Dynamics CRM database with data from Bing for kids’ activity suggestions
  2. Crawler – for building a Bing Search API resource and utilizing it to search for activities