Crawling for those new deals

Since we rely on a single service for deals, our provider may not surface every deal out there. That's why we created a Web Search API resource in Azure. It lets us build a flow that searches Bing every day for new deals and notifies us if our provider's deal isn't the best one.

We started off by creating the Web Search API.

We then created a scheduled flow that runs every day to pick up deals we don't catch in our provider's API. Our query uses the keyword "Strømavtaler" (Norwegian for "electricity deals").

Parsing the JSON lets us structure the response and extract the data we need. Each deal we find is then added to a table in Dataverse, where we create follow-up tasks.
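The parsing step can be sketched in Python. The response shape follows the documented Bing Web Search v7 format ("webPages.value"); the flattened record fields are illustrative, not the team's actual Dataverse schema.

```python
# Sketch: flatten a Bing Web Search v7 response into simple deal records
# ready to be written to a Dataverse table. Field names in the output
# records are hypothetical examples.

def parse_deals(response: dict) -> list:
    """Extract name, url and snippet from each web result."""
    results = response.get("webPages", {}).get("value", [])
    return [
        {
            "name": hit.get("name", ""),
            "url": hit.get("url", ""),
            "description": hit.get("snippet", ""),
        }
        for hit in results
    ]

# Example response, trimmed to the relevant fields:
sample = {
    "webPages": {
        "value": [
            {"name": "Billig strømavtale", "url": "https://example.no/deal",
             "snippet": "Spotpris + 0 kr i påslag"}
        ]
    }
}

deals = parse_deals(sample)
```

Each record in `deals` then maps onto one row in the Dataverse table, with a follow-up task created per row.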

We think this qualifies as an excellent crawler: we search Bing for electricity deals every day, create a record in Dataverse for each deal we find, and follow up on them.

We also extended this with another flow that searches daily for each deal in our library of electricity deals, to see what the search engine returns for each stored deal.

Based on our contract names, we search Bing each day to look for possibly better deals. The contract data is captured from www.elskling.no, and we then use it to search Bing daily for updates on these contracts.
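The per-contract search step can be sketched as building one Bing query per stored contract name. The query phrasing below is a hypothetical example, not the team's actual query template.

```python
# Sketch: turn the library of stored contract names into one daily
# Bing query each. The added keywords are illustrative.

def build_daily_queries(contract_names: list) -> list:
    """One exact-phrase query per contract, plus Norwegian price keywords."""
    return [f'"{name}" strømavtale pris' for name in contract_names]

queries = build_daily_queries(["Spot Smart", "Topp 5 Garanti"])
```

The scheduled flow would then loop over `queries`, calling the Web Search API once per contract.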

Those pesky accountants are at our throats

We have dumped historical events into blob storage for simple long-term storage. The accountants at TMNT HQ want to know what expenses each event incurred.

To give them a simple search for cost by parameter, we have built an Azure Search index on top of the blob storage. The blobs are indexed every hour, so any new events are added to the index and the accountants can do their accounting on the cost facet of the events.

The image below shows what the backend search looks like; the query in question returns the cost facet for every event where the triggering factor was a loud sound (100 dB).

If the search had returned two events for 100 dB where the cost was equal, the count value would be 2.
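That facet behavior can be sketched in Python. The "@search.facets" shape follows the documented Azure Search REST response; the index field name "cost" is assumed here and may differ from the team's actual schema.

```python
# Sketch: read facet buckets out of an Azure Search response.
# Each bucket maps a facet value to the number of matching documents.

def facet_counts(search_response: dict, facet_field: str) -> dict:
    """Map each facet value to its document count."""
    facets = search_response.get("@search.facets", {}).get(facet_field, [])
    return {f["value"]: f["count"] for f in facets}

# Example: two events shared a cost of 500, one event cost 750.
sample = {
    "@search.facets": {
        "cost": [{"value": 500, "count": 2}, {"value": 750, "count": 1}]
    }
}

counts = facet_counts(sample, "cost")
```

A count of 2 for a cost value means two 100 dB events incurred exactly that expense.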

Simple searches can also give a quick overview of how a triggering event relates to the other data our sensors capture. Below we can see that all the events triggered at 100 dB occurred at a distance of 144 cm from the sensor rig.

Target crawler

Re-post for SOME blogging from the interwebs targeting the Crawler badge

Using the Microsoft Search Graph Connector and the Microsoft Graph Search APIs, we are indexing/injecting the ACDC blog posts into the TMNT intranet sphere so our turtles can stay up to date on what competitors are doing.
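The injection step can be sketched as shaping each blog post into a Graph connector external item. The top-level `acl`/`properties`/`content` structure follows the documented `externalItem` resource; the property names and the "grant everyone" ACL below are illustrative assumptions, not the team's actual connection schema.

```python
# Sketch: shape an ACDC blog post as a Microsoft Graph externalItem payload.
# The flow would PUT this to /external/connections/{connectionId}/items/{itemId}.
# Property names ("title", "url") are hypothetical.

def blog_post_to_external_item(title: str, url: str, body: str) -> dict:
    """Build the JSON body for one indexed blog post."""
    return {
        "acl": [{"type": "everyone", "value": "everyone", "accessType": "grant"}],
        "properties": {"title": title, "url": url},
        "content": {"value": body, "type": "text"},
    }

item = blog_post_to_external_item(
    "Crawling for those new deals",
    "https://example.com/acdc/post-1",
    "Since we rely on a single service for deals...",
)
```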

Graph Connector Setup
Graph Search in action

The Media dashboard has also been extended with a twitter feed.

SOME blogging from the interwebs

The turtles' IT department, named Pizza Time, is hard at work showcasing its awesome setup. To make it easy to follow what's happening, we've brought the blog posts back into the intranet.

The setup consists of crawling the ACDC blog, for Pizza Time posts only, via Graph Connectors.

Graph Connector Crawl Setup

To show these back on the home site we use the PnP Modern Search web parts (https://microsoft-search.github.io/pnp-modern-search/) with Graph connector support. The display uses a custom Handlebars template matching the ACDC blog schema.

The custom template uses HTML5 content tags for proper screen reading and navigation.

How it looks in Teams

Crawl, Mine & Process

We have created a scheduled flow that crawls the results on www.elskling.no for electricity prices in my neighborhood, to find out which providers I could choose in my area.

  1. We created a custom connector that gets data from www.elskling.no, so we could retrieve this data in a cloud flow.
  2. Using this custom connector, we were able to get all the providers for the zipCode we sent with the request.
  3. After getting this data, we add each supplier's current rate to our Account record in Dataverse. By running this flow hourly, we always have the latest price from each supplier.
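Step 3 can be sketched as a simple mapping from the connector's provider list onto Dataverse updates. The provider fields and the Dataverse column name (`acdc_currentrate`) are assumptions for illustration, not the real connector schema.

```python
# Sketch: turn the custom connector's provider response for a zipCode
# into one Dataverse-style account update per supplier. Field names
# are hypothetical.

def rate_updates(providers: list) -> list:
    """One update record per supplier, carrying its current rate."""
    return [
        {"name": p["name"], "acdc_currentrate": p["rate"]}
        for p in providers
    ]

# Example connector response, trimmed to the relevant fields:
sample = [{"name": "Tibber", "rate": 0.89}, {"name": "Fjordkraft", "rate": 0.95}]
updates = rate_updates(sample)
```

The hourly flow would then upsert each record in `updates` against the matching Account row.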

Lego City's crisis team has been given a canvas app with Relevance Search!

Claim for Crawler: Once the volcanic eruption is a fact, the authorities need a way to get an overview of the available resources in society that can help with the rebuilding. In the Lego City government's own Dataverse, this information is stored across many tables, such as Accounts, Contacts, Bookable Resources and Products.

To get an overview quickly, it is therefore useful to be able to search easily across the tables in Dataverse. We have solved this by using the Dynamics 365 Relevance Search API.

This API makes the Relevance Search functionality available outside Dynamics as well, so the search can be done in whatever type of application suits each user. That could be SharePoint, Teams, a canvas app, or something else entirely. In other words, you are not locked into running the search inside Dynamics 365. We have built a simple canvas app that everyone on Lego City's own crisis team has access to.

A search in this app triggers a Power Automate flow that searches using the Relevance Search API and returns results across the tables in Dataverse:
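The flow's call can be sketched in Python. The minimal `{"search": term}` request body and the `"@search.entityname"` field on each hit follow the documented Dataverse search endpoint (`POST {orgUrl}/api/search/v1.0/query`); the grouping helper and sample hits are illustrative.

```python
# Sketch: build the Relevance Search request body and bucket the
# cross-table results by their source table.

def build_query(term: str) -> dict:
    """Minimal request body for the Dataverse search endpoint."""
    return {"search": term}

def group_by_table(hits: list) -> dict:
    """Group search hits by the table each one came from."""
    grouped = {}
    for hit in hits:
        grouped.setdefault(hit["@search.entityname"], []).append(hit)
    return grouped

# Example hits, trimmed to the relevant fields:
sample_hits = [
    {"@search.entityname": "account", "name": "Emergency Inc"},
    {"@search.entityname": "contact", "fullname": "Emma Brick"},
]
```

The canvas app then renders the grouped results table by table.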

The mayor of Lego City is stressed after the volcanic eruption and simply searches for "Emergency":

The flow runs and searches across the tables. In the Account table it finds a company specializing in crisis management, perfect! The name of the company is "Emergency Inc".

Here is the company as it looks inside Dynamics 365:

The flow returns the search result back to the app, which displays it. Let the rebuilding begin!

Search, Crawl and suggest

We have created an automatic daily web search crawler that gives families suggestions for kids' activities.

First, we have created a Bing Search API in Azure:

This Bing Search API is consumed by a Power Automate Flow which:

  1. Runs once every day
  2. Searches the web for kids' activities
  3. Selects one of the returned suggestions at random
  4. Creates an activity suggestion in Dynamics CRM
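Steps 2–4 can be sketched in Python. The response shape follows the documented Bing Web Search v7 format; the suggestion record fields are illustrative, not the actual Dynamics CRM columns.

```python
import random

# Sketch: pick one random web result and shape it as a CRM
# activity suggestion. Record field names are hypothetical.

def pick_suggestion(response: dict, seed=None):
    """Return one random suggestion record, or None if no results."""
    hits = response.get("webPages", {}).get("value", [])
    if not hits:
        return None
    rng = random.Random(seed)
    hit = rng.choice(hits)
    return {"name": hit["name"], "url": hit["url"], "description": hit["snippet"]}

# Example response with a single result:
sample = {
    "webPages": {
        "value": [
            {"name": "Lekeland åpner", "url": "https://example.no/lekeland",
             "snippet": "Innendørs lekeland for barn"}
        ]
    }
}

suggestion = pick_suggestion(sample)
```

The flow would then create one Dynamics CRM record from `suggestion` each day.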

Data is listed in Dynamics 365 with URL and description from the search:

We will claim the following badges based on this post:

  1. Data Miner – for enhancing our Dynamics CRM database with kids' activity suggestions from Bing
  2. Crawler – for building a Bing Search API and using it to search for activities

Badge – Crawler

Search has come a long way in Dynamics in recent years. Recently, the Relevance Search UI was greatly improved, making it a great way to find information in Dynamics.

This is a feature you have to activate in Dynamics for it to work. Once you do, you will be warned about data leaving the Dynamics org; the reason for the warning is that data will now be indexed by Azure services outside the Dynamics sphere.

This search uses the standard Quick Find fields.

Quick Find view
Quick Find field selection

Relevance Search adds a search box at the top of Dynamics.

In this search you can use many parameters to refine the results:

You can also click Show more to see additional entities.

From here you can refine the results further to see entities and records across the system. All in all, a great experience in Dynamics.