Full solution from PowerShredders of Axdata

We came into this Hackathon with the idea that we could make the onboarding process a lot less manual, a lot more secure, and significantly reduce the risk of human error. And we'll be honest: we are extremely happy with the solution we are now traveling home with! The best part? It works like a charm, every bit of it! We have tested it several times, and everything flows exactly as described!
You can see the full solution below, but we'll start with a summary of why we think we deserve a lot of points in the four main categories:

Excellent User Experience:

Let's start with the help this gives the employees already working in the company:
– HR now has a very streamlined onboarding process, where every task that can (and should) be automated is just that. They don't need to spend time sending out agreements, following up on signatures, entering lots of information about the new employee into the system, or chasing other employees who forget their onboarding tasks. They don't need to notify IT about a new employee coming in and then wait for the user and system access to be created. All of this happens automatically. Nothing is forgotten, and no sensitive information is sent by email or seen by anyone who shouldn't see it.
– IT never needs to think about an incoming employee anymore. Everything is automated and just happens. Isn't that the best user experience, when you don't have to make a single click and the process still works?
– Other employees with tasks related to the new hire are reminded of them, so nothing is forgotten. Automated and painless. And if they complete their task, no notification is sent; it really is as easy as it sounds.

And then there's the candidate who is starting. If the company implements this solution, everything will be ready for the new employee on his first day of work. He can even read about the company, his team, his manager, and more in the Onboarding Portal before he starts, so we keep up the positive energy people feel when they are about to start at a new company. The new employee will also feel that this company really takes care of its employees, and that it is truly up to date in the digital world we live in.

Most Extreme Business Value:

The value for companies here is so high that it's almost difficult to know where to start. First of all, this saves the HR department a lot of time. And we really mean A LOT of time. Not only is everything automated, so they no longer have to register a new employee by hand; they also don't have to chase everyone who hasn't done their part, or correct the human errors made along the way. They can spend their time on something much more valuable to the company: making sure the employees already working there get the best possible environment to do their jobs as well as they can! Let's face it, that is what we want the HR department to do; we don't want them spending time entering data into a system.
The IT department will also save a lot of time with this solution, time they can spend on other things. I don't work in our IT department, but I assume that creating users and assigning licenses and access isn't the most fun task they do at work. So I would think this will actually make their workday more fun!
Let's do an approximate calculation of the time saved. Create a contract for the new employee, upload it for e-signature, send it for signing, receive it, and ask the employee for the additional information that must be entered manually into the system. Let's say this takes 5 hours at best. Then someone needs to tell IT to create the user and access, and make sure it has been done: a total of 1 hour. Then someone needs to follow up with other employees to make sure everything is done and ready for the new hire. This is perhaps the most time-consuming part, and we don't think it's wrong to say it takes at best 8 hours in total between signing and the first day of work. That means the time saved for just one onboarding process is at least 14 hours. One of our customers typically hires between 10 and 20 employees every month, which means this solution will save them between 140 and 280 hours. Each month! For another example, another customer of ours plans to hire 1,500 employees by the end of 2023… You do the math!
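The back-of-the-envelope calculation above can be written out explicitly (the hours are the rough estimates from the text, not measured figures):

```python
# Rough per-onboarding estimates from the text (hours), not measured figures.
CONTRACT_AND_DATA_ENTRY = 5  # contract, e-signature round trip, manual data entry
IT_HANDOVER = 1              # telling IT to create user/access and verifying it
FOLLOW_UP = 8                # chasing colleagues between signing and day one

HOURS_PER_ONBOARDING = CONTRACT_AND_DATA_ENTRY + IT_HANDOVER + FOLLOW_UP  # 14

def monthly_savings(hires_per_month: int) -> int:
    """Hours saved per month for a given hiring volume."""
    return hires_per_month * HOURS_PER_ONBOARDING

print(monthly_savings(10), monthly_savings(20))  # 140 280
```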

Rock Solid Geekness:

There are so many parts in this solution that work perfectly together; we almost can't believe we actually made it work as well as it does. We honestly didn't think we would be able to automate as many parts of the process as we have.

Killer App:

We have built this as a complete solution, so that everything can be deployed and used by any company running Dynamics FO or Dynamics HR. With only a little personalization to adapt the wording, logo, and so on to each company, it can be set up quite fast. And, of course, each part can be implemented by itself if someone doesn't want the whole solution. We know this will be an easy solution to sell to our customers, because we know a lot of them already really want it!

The solution:

And now, finally, let's walk through the solution. You have been through a recruitment process in your company and picked a really awesome candidate you want to send an offer to. This is what you do:

1: Create the candidate in Dynamics, fill in name, start date, email address, and phone number, and connect him to the position:

Like magic, the candidate receives an email to view, update, and sign the agreement. The agreement is automatically filled with data from Dynamics before it is sent out for signing, through an API integration with OneFlow that we built right here at this Hackathon:

2: The candidate fills in the National ID number and signs the agreement using BankID:

3: The agreement is signed by the company as well:

4: You hire the candidate in Dynamics with two clicks, assign the onboarding checklist to use for this candidate, and tick the checkbox to create an Azure AD user for the new employee:

This triggers a Power Automate flow that first creates a work email for the new employee, sets it as the primary email address on the Employee workspace in Dynamics, creates the Azure AD user, assigns groups and licenses to it, and finishes by sending the login details to the employee by SMS, together with a link to the Onboarding Portal:
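The user-creation step maps onto the Microsoft Graph `POST /users` call. Here is a minimal sketch of the payload such a step would send; the `contoso.com` domain, the firstname.lastname convention, and the placeholder password are our assumptions, not the actual flow definition:

```python
# Microsoft Graph endpoint for creating Azure AD users.
GRAPH_USERS_ENDPOINT = "https://graph.microsoft.com/v1.0/users"

def build_new_user(first: str, last: str, domain: str = "contoso.com") -> dict:
    """Payload for creating an Azure AD user via Microsoft Graph.
    The domain and the firstname.lastname convention are assumptions."""
    nickname = f"{first}.{last}".lower()
    return {
        "accountEnabled": True,
        "displayName": f"{first} {last}",
        "mailNickname": nickname,
        "userPrincipalName": f"{nickname}@{domain}",
        "passwordProfile": {
            "forceChangePasswordNextSignIn": True,
            # A real flow would generate this and send it by SMS.
            "password": "<generated-initial-password>",
        },
    }
```

Posting this JSON to the endpoint with an application token creates the user; assigning groups and licenses are separate Graph calls.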

And of course we have created a Power Automate flow that reminds the employees in your company about their assigned onboarding tasks if they don't follow up by themselves:

Now the employee can log in to the Onboarding Portal (which works on all devices) and enter more information about himself. When submitted, the data is automatically updated directly in Dataverse and becomes visible in Dynamics within seconds:

And last, but absolutely not least: the night before the new employee's first day of work, a Power Automate flow runs that imports the user into Dynamics, connects the user to the correct employee, and assigns the Employee security role so the new employee has access to the Employee Self-Service workspace in Dynamics:

And, to wrap up for now, we have automated the email signature, so it is automatically appended to every email you send. Now the signatures will finally look the same across the whole company:

The signature itself is created with HTML, applied by a rule set up in Exchange.

Serverless SQL and Synapse love

After getting all the data from Dynamics into Synapse, Middle-Aged Mutable Ninja Tuples wanted to be able to create views and use all the data gathered in Synapse.

As a first step, we set up a serverless SQL pool.

This serverless SQL pool has no storage of its own; it only offers the ability to, for example, create Stored Procedures and Views. This lets us cross data from many sources and get a richer way of doing analytics without duplicating data.

First things first: creating an External Data Source for the serverless pool.

Then we need to create a file format.

(This is not the final file format we ended up with – remember to update it as you work.)

Next, create all the external tables we need.

Note: we only take the fields we need from the CSV file; the source has many fields we don't need, so let's get those out of the way.

This would be the "code" way, but considering the time available, we used the GUI way:

After creating all the external tables, we can now create a view in serverless SQL using normal T-SQL.

(Please don't mind the "dirty CASE"; we tried to fix it in several ways but could not find a solution.)

Now we can use this view in Power BI to get clean data from Synapse. We can even cross data from several sources without duplicating it.

Less work for those using Power BI, and more control of the data back-end. A win-win situation.

If this turtle had not been sick, he would have dived further in, but as of today he has to go back to bed again…

Plans (if not sick)

  • Getting data from outside Dynamics
    • Weather info
    • Traffic info
  • Taking that data into Power BI
    • Analysis of SOS alerts vs. weather, and the time used to resolve each alert
    • Analysis of time spent in traffic – optimization of resource scheduling
    • ++

Concerns: Synapse vs. Data Export Service and its deprecation

After looking at Synapse and comparing it with the previous (now deprecated) Data Export Service, I look forward to the future. Synapse offers an incredible number of new possibilities compared to the Data Export Service. However, one thing worries me a little: how to translate option set values to display names in Synapse. This is a known issue from the past, e.g. in Power BI. The Data Export Service had a separate function that created an OptionsetMetadata table you could join against to get the display names, in each language, for the option set values. As it stands today, Synapse only gives you access to the option values and no longer provides this support table to lean on. Ref: https://powerapps.microsoft.com/en-us/blog/do-more-with-data-from-data-export-service-to-azure-synapse-link-for-dataverse/

Download Brreg Data

To get more data for testing, we set up a new source in Azure Data Factory to download "Account Name" and "Orgnummer" from Brreg.

The result of downloading the first 1000 records from Brreg:

We kept it simple, using this URL:



With a parameter to ADF saying how many pages we should download, simply changing the number of pages changes how much is fetched, in batches of 1000 accounts. So, for example, we set the parameter to 5:

ADF will then loop until page 5 has been put into the API URL and downloaded to the source.


run 1: https://data.brreg.no/enhetsregisteret/api/enheter?size=1000&page=1

run 2: https://data.brreg.no/enhetsregisteret/api/enheter?size=1000&page=2

run 3: https://data.brreg.no/enhetsregisteret/api/enheter?size=1000&page=x

For each loop iteration we download this data and dump it into the TEST SQL DB.
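The paging logic of the pipeline can be sketched in Python (only the URL batching is shown; the actual download and the SQL sink happen in ADF):

```python
# Brreg open API for legal entities, as used in the runs above.
BRREG_API = "https://data.brreg.no/enhetsregisteret/api/enheter"

def brreg_page_urls(pages, size=1000):
    """One URL per loop iteration, each fetching a batch of `size` units."""
    return [f"{BRREG_API}?size={size}&page={page}" for page in range(1, pages + 1)]

for url in brreg_page_urls(5):
    # Each of these would be downloaded and dumped into the TEST SQL DB.
    print(url)
```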

Example now:

Datamining water level

To get a better picture of what is happening in the Oslo fjord, we collect data from sensors and external sources.

In an earlier blog post we described collecting sensor data into Dataverse, and how we presented it in a nice way. That alone should qualify for the Datamining badge.


There are many relevant external data sources, but a central one is sea level, or tides. We have built a Logic App that runs regularly to collect sea-level data for a given place and time.


The result of the API query is JSON with all the sea-level changes over time, as shown:

We then generate a CSV file from the JSON and use it as a data source for Power BI.
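That JSON-to-CSV step could look roughly like this in Python (the field names `time` and `level_cm` are placeholders; the real tide API's response format is not shown in the post):

```python
import csv
import io

def tide_json_to_csv(observations):
    """Flatten sea-level records into CSV for Power BI.
    Each record is assumed to look like {'time': ..., 'level_cm': ...};
    the real field names from the tide API may differ."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["time", "level_cm"])
    writer.writeheader()
    for obs in observations:
        writer.writerow({"time": obs["time"], "level_cm": obs["level_cm"]})
    return buf.getvalue()
```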

That the sea level at this location changed so drastically is an interesting observation. We have therefore correlated the result with photographs from a motion-triggered camera at the same place and time.

Unfortunately we don't have the opportunity to follow up on this observation during the hackathon, but we have already, during the hackathon, clearly demonstrated that collecting and combining data in our prototype delivers high business and research value.

With this, we claim the Dataminer badge.

Lazy chat bot makers  

Since Power Virtual Agents is all about creating chatbots on the fly, we decided to try auto-generating topics from external sources.

By providing links to the official museum website and lexical sources, we got a list of proposed topics. Surprisingly, most of the topics were useful to include in the chatbot. Combined with some information we extracted ourselves, the chatbot can now answer questions about Munch, opening hours, and the exhibitions currently available.

The information is presented to the user in an intuitive and interactive manner (lol).

Test data

We needed test data to develop the functions, so we made a simple ADF pipeline that inserts "actions" every x seconds (if we want it to). We also need a lot of data, since we plan to run reports and AI; with this test-data function we can start developing.

The ADF pipeline takes a few parameters so we can control how many runs we get.

Source data:

  • SQL server
    • Table with first names
    • Table with last names
    • Table with incident types
    • View generating random:
      • name: mixing first name and last name
      • incident type (as we add more incident types)
      • longitude: random between 59.9000 and 59.9999
      • latitude: random between 10.1000 and 10.9999
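The same generator can be sketched in Python (the name and incident-type lists are stand-ins for the SQL tables; the coordinate ranges and labels follow the list above):

```python
import random

# Stand-ins for the SQL tables of first names, last names and incident types.
FIRST_NAMES = ["Ola", "Kari", "Per"]
LAST_NAMES = ["Nordmann", "Hansen", "Olsen"]
INCIDENT_TYPES = ["SOS", "Assistance"]

def random_incident():
    """One synthetic incident, mirroring the SQL view: a random name plus
    a random position inside the bounding box from the list above."""
    return {
        "name": f"{random.choice(FIRST_NAMES)} {random.choice(LAST_NAMES)}",
        "incident_type": random.choice(INCIDENT_TYPES),
        "longitude": round(random.uniform(59.9000, 59.9999), 4),
        "latitude": round(random.uniform(10.1000, 10.9999), 4),
    }
```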


Planned Work


Completed Work


The Dataminer is here!

We have retrieved weather data from:
in order to determine whether we should stay in the sewers or hit the streets of New York!

Power BI

The refrigerator can see!

After a lot of fiddling, we have finally set up our Raspberry Pi with Python, VS Code, an SSH connection to the git repo, and of course a working webcam!

With the help of a small bash script and Azure's own Python modules, the images are uploaded and analyzed within a few seconds, returning a list of all the objects in the picture. After some testing we are very impressed with the precision, even though Azure insists that our clementine is an apple. The result is passed on to a Power Automate flow that updates Dataverse.
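Extracting the object list from the analysis result is a small parsing step. A sketch, assuming the Azure Computer Vision v3.2 `analyze` response shape with an `objects` array (endpoint and key handling are left out):

```python
def detected_objects(analysis):
    """Object names from an Azure Computer Vision 'analyze' response.
    Only the 'objects' array is used; each entry carries the detected
    class name under the 'object' key."""
    return [obj["object"] for obj in analysis.get("objects", [])]

# Sample response fragment, like the one where the clementine came back as an apple:
sample = {"objects": [{"object": "apple", "confidence": 0.77}]}
print(detected_objects(sample))  # ['apple']
```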

Work done. Pull request awaiting approval.

When the pace of work is this high, it is easy to forget proper testing, or linting where it is needed, so before anything is merged into the develop branch, the changes must be approved by one of the team members. Config files and keys, for example, must not end up in the source code.

Setting up Synapse Link to Dynamics 365

Middle-Age Mutable Ninja Tuples know there will be a lot of requests for help once the solution is released to the public. They plan on using Synapse to analyze all the data from the system. To get started they set up Synapse Link to Dynamics 365.

With the deprecation of Export Service, Synapse Link seems to be the go-to replacement for a simple, yet powerful tool to extract data from Dynamics 365.

Prerequisites for setting up the Synapse to Dynamics link:

  1. Azure subscription.
  2. Resource group
    • Synapse workspace
    • Storage account (Data lake gen. 2)

Setting up Dynamics link to Synapse

  1. Make sure you have completed the prerequisites (above)
  2. Go to https://make.powerapps.com/
  3. Navigate to your environment

Select the entities/tables you want to synchronize:

Click save, and your data will begin to show up in Synapse!

You can now run queries against the Dynamics 365 data in Synapse:

Optional if you want to use Azure Data Studio:

Install Azure Data Studio to work with the data:

With data in Synapse, we can now start crossing data from several sources and make reports.

NOTE: One challenge we came across with Synapse is that all option sets are only available as int values. There is no way to export an entity to Synapse with both the option set int value and the language-dependent label. This may end up being fixed with a "dirty hack"; alternatively, MaMNT will find a prettier way to solve it going forward.
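One shape such a "dirty hack" could take is a hand-maintained mapping table joined in at report time. A Python sketch (the table, column, and label values below are invented examples, not real Dataverse metadata):

```python
# Hand-maintained replacement for the old OptionsetMetadata support table.
# All keys and labels below are invented examples, not real Dataverse metadata.
OPTIONSET_LABELS = {
    ("incident", "statuscode"): {1: "Active", 2: "Resolved", 3: "Cancelled"},
}

def label_for(table, column, value):
    """Translate an option set int from Synapse into a display label,
    falling back to the raw value when no mapping exists."""
    return OPTIONSET_LABELS.get((table, column), {}).get(value, str(value))
```

The obvious downside is that this table must be kept in sync with the Dataverse metadata by hand, which is exactly why a prettier solution is worth looking for.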

Badges claimed:

Crawl, Mine & Process

We have created a scheduled flow that crawls the results from www.elskling.no for electricity prices in my neighborhood, to see which providers I could choose in my area.

  1. We created a custom connector that gets data from www.elskling.no, so we could fetch this data from a Cloud Flow.
  2. Using this custom connector, we were able to get all the providers for the zip code we sent with the request.

  3. After getting this data, we add each supplier's current rate to our Account record in Dataverse. By running this flow hourly, we always have the latest price from each supplier.
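The "keep the latest rate per supplier" step of the hourly flow can be sketched as a small reduction (the field names `supplier`, `rate`, and `fetched_at` are assumptions about the connector's response, not its documented schema):

```python
def latest_rate_per_supplier(quotes):
    """Keep the most recent rate per supplier. Each quote is assumed to be
    {'supplier': ..., 'rate': ..., 'fetched_at': ISO-8601 timestamp};
    the real connector's field names may differ."""
    latest = {}
    # Sorting by timestamp means later quotes overwrite earlier ones.
    for quote in sorted(quotes, key=lambda q: q["fetched_at"]):
        latest[quote["supplier"]] = quote["rate"]
    return latest
```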