Since Power Virtual Agents is all about creating chatbots on the fly, we decided to try auto-generating topics from external sources.
By providing links to the official museum website and lexicon sources, we got a list of proposed topics. Surprisingly, most of the topics were useful enough to include in the chatbot. Combined with some information we extracted ourselves, the chatbot can now answer questions about Munch, opening hours and the different exhibitions that are currently running.
The information is presented to the user in an intuitive and interactive manner (lol).
We needed test data to develop the functions, so we made a simple ADF pipeline that inserts "actions" every x seconds (if we want it to). We also need a lot of data, since we plan to run reports and AI; with this test-data function we can start developing.
The ADF pipeline takes a few parameters so we can control how many runs we get.
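The pipeline itself lives in ADF, but the idea can be sketched in Python (field names and action types below are made up for illustration; the real pipeline writes equivalent rows to Dataverse, with the count and interval as pipeline parameters):

```python
import datetime
import random
import time

def generate_test_actions(count, interval_seconds=0, sleep=time.sleep):
    """Generate `count` fake 'action' records, one every `interval_seconds`.

    Mirrors the two ADF pipeline parameters: how many runs, how often.
    """
    actions = []
    for i in range(count):
        actions.append({
            "action_id": i,
            "action_type": random.choice(["open", "close", "inspect"]),
            "timestamp": datetime.datetime.utcnow().isoformat(),
        })
        if interval_seconds:
            sleep(interval_seconds)
    return actions
```

Calling `generate_test_actions(10)` returns ten records immediately; passing `interval_seconds` spaces them out like the pipeline's "every x seconds" trigger.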
We have retrieved weather data from "https://weather.visualcrossing.com/VisualCrossingWebServices/rest/services/timeline/manhattan?unitGroup=metric&key=G5NGC7CZTCJBVHKDTRGVD668G&contentType=json" in order to determine if we should stay in the sewers or hit the streets of New York!
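Visual Crossing's timeline endpoint returns JSON with a `days` array containing fields such as `temp` and `precip` (metric units here). A minimal sketch of the sewers-or-streets decision; the thresholds are our own invention, not part of the API:

```python
import json
from urllib.request import urlopen

API_URL = ("https://weather.visualcrossing.com/VisualCrossingWebServices"
           "/rest/services/timeline/manhattan"
           "?unitGroup=metric&key=G5NGC7CZTCJBVHKDTRGVD668G&contentType=json")

def hit_the_streets(day):
    """Decide based on one entry of the response's `days` array.

    Our own rule of thumb: stay in the sewers when it rains or freezes.
    """
    return (day.get("precip") or 0) == 0 and day.get("temp", 0) > 0

def todays_verdict():
    # Live call -- requires network access and a valid API key.
    data = json.load(urlopen(API_URL))
    return "streets" if hit_the_streets(data["days"][0]) else "sewers"
```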
After a fair bit of fuss, we have finally set up our Raspberry Pi with Python, VS Code, an SSH connection to the git repo and, of course, a working webcam!
With the help of a small bash script and Azure's own Python modules, we upload and analyze the images within a few seconds, getting back a list of all the objects in the image. After some testing we are very impressed with the precision, even though Azure insists that our clementine is an apple. The result is then sent on to a Power Automate flow that updates Dataverse.
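In outline, the upload-and-analyze step looks roughly like this. This is a sketch against the Computer Vision "analyze" REST endpoint; the endpoint and key are placeholders, and `top_objects` is our own helper, not part of the API:

```python
import json
from urllib.request import Request, urlopen

ENDPOINT = "https://<your-region>.api.cognitive.microsoft.com"  # placeholder
KEY = "<subscription-key>"  # placeholder -- keep it out of source control!

def analyze_image(image_bytes):
    """Send one webcam frame to the Computer Vision analyze endpoint."""
    req = Request(
        ENDPOINT + "/vision/v3.2/analyze?visualFeatures=Objects",
        data=image_bytes,
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
    )
    return json.load(urlopen(req))

def top_objects(analysis, min_confidence=0.5):
    """Our own helper: object names above a confidence threshold."""
    return [o["object"] for o in analysis.get("objects", [])
            if o["confidence"] >= min_confidence]
```

The resulting list of object names is what gets forwarded to the Power Automate flow that updates Dataverse.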
Work done. Pull request up for approval.
When the pace of work is this high, it is easy to forget proper testing or linting where it is needed, so before anything is merged into the develop branch, the changes must be approved by one of the team members. Config files and keys, for example, must not end up in the source code.
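A lightweight guard that could back up that review rule (the patterns below are illustrative only; a real setup would lean on a dedicated secret scanner and a proper .gitignore):

```python
import re

# Illustrative patterns for hard-coded credentials.
SECRET_PATTERNS = [
    re.compile(r"(?i)(api[_-]?key|secret|password)\s*[:=]\s*['\"][^'\"]+['\"]"),
]

def find_secrets(text):
    """Return the lines that look like hard-coded keys or passwords."""
    return [line for line in text.splitlines()
            if any(p.search(line) for p in SECRET_PATTERNS)]
```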
Middle-Age Mutable Ninja Tuples know there will be a lot of requests for help once the solution is released to the public. They plan on using Synapse to analyze all the data from the system. To get started they set up Synapse Link to Dynamics 365.
With the deprecation of Export Service, Synapse Link seems to be the go-to replacement for a simple, yet powerful tool to extract data from Dynamics 365.
Prerequisites to set up the Synapse to Dynamics link:
Azure subscription
Resource group
Synapse workspace
Storage account (Data lake gen. 2)
Setting up Dynamics link to Synapse
Make sure you have completed the prerequisites (above)
Go to https://make.powerapps.com/
Navigate to your environment
Select the entities/tables you want to synchronize:
Click save, and your data will begin to show up in Synapse!
You can now run queries against the Dynamics 365 data in Synapse:
Optional: install Azure Data Studio if you prefer working with the data there.
With data in Synapse, we can now start crossing data from several sources and make reports.
NOTE: One challenge we came across with Synapse is that all option sets are only available as int values. There is no way to export an entity to Synapse with both the option set's int value and its language-dependent label. This may end up being fixed with a "dirty hack"; alternatively, MaMNT will find a prettier way to solve it going forward.
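The "dirty hack" could be as simple as carrying our own int-to-label lookup alongside the Synapse data. The column name and values below are hypothetical; the real labels live in Dataverse's option set metadata:

```python
# Hypothetical option set for an 'ic_status' column synced to Synapse.
STATUS_LABELS = {
    1: "Open",
    2: "In progress",
    3: "Closed",
}

def add_status_label(row, labels=STATUS_LABELS):
    """Enrich one Synapse row with a language-dependent label."""
    enriched = dict(row)
    enriched["ic_status_label"] = labels.get(row.get("ic_status"), "Unknown")
    return enriched
```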
We have created a scheduled flow that crawls www.elskling.no for electricity prices in my neighborhood, to see which providers I could choose in my area.
1. We created a custom connector that gets data from www.elskling.no, so we could fetch this data from a Cloud Flow.
2. Using this custom connector, we were able to get all the providers for the zipCode we sent with the request.
3. After getting this data, we add the current rate of each supplier to our Account record in Dataverse. By running this flow hourly, we always have the latest price from each supplier.
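The flow's logic can be sketched in Python; the response shape and field names are our assumption about what the custom connector returns for a zip code:

```python
def cheapest_provider(providers):
    """Pick the provider with the lowest current rate.

    `providers` mirrors the assumed connector response:
    a list of {"name": ..., "current_rate": ...} dicts.
    """
    return min(providers, key=lambda p: p["current_rate"])

def rates_by_provider(providers):
    """Shape the data for the hourly Dataverse update."""
    return {p["name"]: p["current_rate"] for p in providers}
```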