Crawler

It was really cool to see that it’s now possible to search Dataverse data via Microsoft Search. We had never tried this before, and it worked like a charm💪

The initial indexing took many hours, but after that it keeps updating fairly quickly + it seems to respect the security model in Dynamics (VERY IMPORTANT) 👏

This is really nice to see as a function because it opens up Dataverse for the masses, and sharing is caring when it comes to info… Right? 🙂

🦀 Crawling the Seven Seas for Hidden Treasure 🏆

The waters run deep and dark when you are at sea. This is why intel like maps, a spyglass and knowing the tides helps. To give our pirates an advantage we crawl barenswatch.no for other ships’ locations. But, just as in dodgeball, it is not enough to know where something is: you need to know where it is going and when you need to be there to intercept. So how do we calculate the best opportunities and how to intercept them?

Step 1: Crawl and Hijack Valuable Data 🦀

From the system diagram we see our Functions App communicates with the Barenswatch API and stores all boat data, including current location. We poll this information several times, giving us multiple data points not only for ballast and weight, but also for speed, location and direction (this will be important later).
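Since each poll gives a timestamped position fix, speed and course can be derived from two consecutive fixes. A minimal sketch of that derivation using standard great-circle formulas (the helper names are our own illustration, not the actual Functions App code):

```python
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, degrees clockwise from true north."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlmb = math.radians(lon2 - lon1)
    x = math.sin(dlmb) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlmb)
    return math.degrees(math.atan2(x, y)) % 360

def speed_and_course(fix_a, fix_b):
    """Derive speed (km/h) and course from two timestamped fixes (lat, lon, hours)."""
    lat1, lon1, t1 = fix_a
    lat2, lon2, t2 = fix_b
    return (haversine_km(lat1, lon1, lat2, lon2) / (t2 - t1),
            bearing_deg(lat1, lon1, lat2, lon2))
```

Two fixes an hour apart, one degree of longitude along the equator, give roughly 111 km/h due east.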

Screen grab of system model for the Pirate Pillaging app

For ease of use while creating the service we compiled all the data using Power BI Dashboards to see what we were working with. The dashboard can also be used to help captains understand the state of the seven seas, although we appreciate that more data is not necessarily better data. It depends on how you visualize, present, and time the data you show your user.

Video of Solveig ☀️ explaining how we gather, structure, compile and visualize ship data in a way highlighting opportune targets.

Step 2: Knowing which Opportunities to Strike 🏆

From the ship’s length, type and destination we can estimate its value. Slow moving ships heading for freight docks are ripe for looting; fast moving ships heading for a dry dock are empty and better left alone. The sharp-eyed will also have spotted a Threat flag in our model: this is for marking known pirates, interceptors and the fuzz of the seven seas. Once again referencing Sun Tzu: You always win the wars you never fight (don’t search this quote).

Image displaying famous quote from Sun Tzu about fighting good

If we decide that this opportunity is ripe for plundering, we look at its course and whether it is feasible to intercept based on our current course. Good opportunities are heavy, slow moving ships. Bad opportunities are war ships of any kind, or any opportunity in the vicinity of a war ship.

There is also a certain Goldilocks of opportunities: Based on your own vessel, you do not want to attack another ship that is too large, since it will be a harder fight, or too small, since it carries less booty. We can use the dashboard to visually find the opportune boats and use this investigation to write code that targets these boats automatically.

Captain Mats looking at data
The Power BI Dashboard populated with crawled and manipulated data from Barenswatch.no, structured by ship type, weight and destination, highlighting the most opportune ships in the collection

When we flag ships that are both opportune and not adjacent to warships, we plot paths to figure out where they are going in relation to our course.

Step 3: Using Advanced Trigonometry to Pinpoint your Enemy 🎯

Youtube can teach you the basics about how to calculate a ship’s heading based on longitude, latitude, direction, true north and the course deviation.

Youtube is telling us what we need to know about complex

True north will change depending on where on the globe you are positioned, so for intercept courses it is paramount to get this stuff right. You want to be at the place where your target ship will be, not where it has been. And you want to be there before it gets there, not ride up behind it and trail it forever. This is where we use the latitude, longitude, direction and speed from earlier and _try_ to plot the paths of the ships. I stress try because Solveig just gave Sebastian the “do you mind not being an idiot” look when he asked how she was doing it and whether it was done, then just sort of sighed and continued clicking keys. So it works, trust us.

Solveig ☀️ working on triangulating and plotting ships path to calculate intercept courses.
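The core of the plotting step is dead reckoning: projecting a ship’s future position from its current latitude, longitude, course and speed. This is the standard great-circle formula, shown here as our own sketch rather than Solveig’s actual code:

```python
import math

EARTH_RADIUS_KM = 6371.0

def project_position(lat, lon, course_deg, speed_kmh, hours):
    """Dead-reckon a future position along a great circle from a current
    lat/lon (degrees), a course (degrees from true north), a speed and
    an elapsed time. Returns (lat, lon) in degrees."""
    d = speed_kmh * hours / EARTH_RADIUS_KM  # angular distance travelled
    phi1 = math.radians(lat)
    lmb1 = math.radians(lon)
    theta = math.radians(course_deg)
    phi2 = math.asin(math.sin(phi1) * math.cos(d) +
                     math.cos(phi1) * math.sin(d) * math.cos(theta))
    lmb2 = lmb1 + math.atan2(math.sin(theta) * math.sin(d) * math.cos(phi1),
                             math.cos(d) - math.sin(phi1) * math.sin(phi2))
    return math.degrees(phi2), math.degrees(lmb2)
```

To intercept, you evaluate this at the time you can reach the projected point, not at the ship’s current position.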

For our challenge we also need to estimate whether it is viable to change our current course to intercept the opportunity. As in all business cases there is a cost/effect probability calculation. For us this depends on added time to destination and added length of trip, both of which add to the total cost of provisions and consumables like petrol.
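That cost/effect check boils down to a simple threshold; the rates below are made-up illustrative numbers, not our real provisioning costs:

```python
def deviation_cost(extra_hours, extra_km, provisions_per_hour, fuel_per_km):
    """Total extra cost of a detour: provisions for the added time
    plus fuel for the added distance."""
    return extra_hours * provisions_per_hour + extra_km * fuel_per_km

def worth_intercepting(expected_loot, extra_hours, extra_km,
                       provisions_per_hour=50.0, fuel_per_km=2.0):
    """Intercept only when the expected loot beats the cost of the detour.
    The default rates are placeholder assumptions."""
    return expected_loot > deviation_cost(extra_hours, extra_km,
                                          provisions_per_hour, fuel_per_km)
```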

Step 4: Plot Optimal Course for Interception 🏴‍☠️

Finally, when we have a list of opportune targets and know which of them will be adjacent to our planned course from A to B, we plot intercept maneuvers. Which is a topic for another blog post – because this part is most certainly done… Trust us… 🤞🤞🤞

Let the machines work 🤖

We get satellite images from the Sentinel Hub API where we can get images from the coordinates we want. This makes it easy to compare images with AIS data from the same time period and coordinates.

Sattelite image of ships
Counting ships

To identify ships from the satellite images, we need more eyes than we have available. So what’s better than letting the machines do the work?
We have developed our own ML code in Python to recognize ships. This makes it much more efficient than manually reviewing all the images.
Currently, the machine has 99% confidence in recognizing ships. It then creates a heat map that shows where it thinks (with 99% probability) there are ships.
Then it counts the number of ships and sends the count on.
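The counting step can be illustrated like this: threshold the heat map at the confidence level, then flood-fill connected high-confidence regions, each region counting as one ship. A simplified stand-in for the actual ML pipeline:

```python
def count_ships(heatmap, threshold=0.99):
    """Count connected high-confidence regions (assumed to be ships) in a
    2-D probability heat map, using 4-connectivity flood fill."""
    rows, cols = len(heatmap), len(heatmap[0])
    seen = [[False] * cols for _ in range(rows)]
    ships = 0
    for r in range(rows):
        for c in range(cols):
            if heatmap[r][c] >= threshold and not seen[r][c]:
                ships += 1                      # new region found
                stack = [(r, c)]
                while stack:                    # flood-fill the region
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and heatmap[y][x] >= threshold and not seen[y][x]):
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return ships
```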

Training the model to gain confidence

When we know how many ships are within a given area at a given time, it is easy to compare this with AIS data.
Here we query the API of the Norwegian Coastal Administration, where we can count the number of ships within the same time and area.

JSON from AIS data

Using external data in this way provides great value for our customers as the threat at sea is significantly lower 💾🏴‍☠️

The Jackpot App :: Pirates fleet enlisting

The reason we are calling this the Jackpot app is that we believe it’s hitting multiple score-points!

The idea of this app is to enlist pirates’ fleets without a single character punch!!! Aided and infused by AI Builder, external APIs and Power Automate flows, this Power Apps canvas app is state-of-the-art geekiness!

Admittedly, registered vehicles are the last of our worries (as pirates), but hey, we thought of doing it anyway 🙂 The app starts by taking a picture of the registration plate of the vehicle. The app then sends the picture to AI Builder through a Power Automate flow that processes the image, reads the registration plate and shows it in the app for confirmation.

Once you are satisfied with the result, you can check the vehicle against the national registry and search for more registration information. This does check the car against the Norwegian car registry (Statens vegvesen) and shows whether the vehicle is approved for European roads (EU-godkjenning). Also the least of our concerns!

Actual data from Statens Vegvesen
Crawling and datamining data from external sources

Now, after confirming and inserting the car data into our system, we need to identify the vehicle type and compare it with the collected data in our backend system. Identifying the car is a simple process that doesn’t need any text input from the user. One picture will do, while AI Builder takes care of identifying the type of vehicle and inserting it into our backend system.

Vehicle plate identification flow (calls another flow through HTTP)
AI Builder made as an endpoint (all AI Builder models and flows are on a separate environment, as AI Builder can’t be included in a solution for deployment)

We are using a mix of built-in and custom AI models to achieve efficiency and productivity in an innovative way.
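For illustration, this is roughly what the HTTP hand-off between the flows could carry; the endpoint URL and the `image` field name are assumptions for the sketch, not the real flow definition:

```python
import base64
import json

# Hypothetical HTTP-trigger URL -- the real one lives in the Power Automate flow.
FLOW_URL = "https://prod-00.norway.logic.azure.com/workflows/example/invoke"

def build_plate_request(image_bytes):
    """Package a photo of a registration plate as a JSON body for the
    HTTP-triggered flow that fronts the AI Builder model.
    The 'image' field name is an assumption for illustration."""
    return json.dumps({"image": base64.b64encode(image_bytes).decode("ascii")})

def decode_plate_request(body):
    """What the receiving flow does: unwrap the JSON and decode the image."""
    return base64.b64decode(json.loads(body)["image"])
```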

AI Builder model

The Skeleton Crews Skeleton Sketches: Hack Sparrows Architectural Treasure Map

When first presenting our idea, the Hack Sparrows shrouded it in mystery. A battle tactic as old as the kraken herself. In the words of Sun Tzu: Never let your enemy know your plan, at least if you have one. Which we most certainly have…

Initial sketch of what The Hack Sparrows wants to achieve.

The Traveling Pirate

Ever heard of the traveling salesman problem? It is a mathematical problem of route optimization: A salesman travels from A to B. In between he has many opportunities for trade. How can we optimize the route in terms of monetary value for the salesman: Which path gives the most bang for the buck, or the most buck for the step?
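For a handful of targets the pirate variant can be brute-forced directly: try every subset and ordering of targets, and keep the route that maximizes loot minus sailing cost. A toy sketch under those assumptions, not production routing:

```python
from itertools import chain, combinations, permutations
from math import dist

def best_route(start, end, targets, cost_per_km=1.0):
    """Brute-force the 'traveling pirate' problem: pick the subset and order
    of targets (each a ((x, y), loot) pair) that maximizes total loot minus
    the cost of the sailed distance from start to end."""
    best_score, best_path = float("-inf"), ()
    all_subsets = chain.from_iterable(
        combinations(targets, k) for k in range(len(targets) + 1))
    for subset in all_subsets:
        for order in permutations(subset):
            waypoints = [start] + [pt for pt, _ in order] + [end]
            length = sum(dist(a, b) for a, b in zip(waypoints, waypoints[1:]))
            score = sum(loot for _, loot in order) - cost_per_km * length
            if score > best_score:
                best_score, best_path = score, order
    return best_path, best_score
```

A far-off, low-value target gets skipped automatically: its detour cost outweighs its loot.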

The Bare Bones

We are no salesmen, we are pirates! And based on the presentations, many of us are after the same treasures, leaving the best-planned pirating route the winner of pirates. The pirate king!

Our cleverly designed and thoroughly QA’d Architecture (do not double check) shows how we will use state of the (last century) art to enable better piracy.

System diagram showing what the user sees and what the system does. Proudly made with thieved licenses from our employer. Yarr! (we have licenses, we just act like we stole them)

On the seven seas the captain has access to a web interface. This web interface uses a Function App to talk to two main services:

  • We data mine the Barenswatch.io service, which keeps track of all commercial vessels on the seven seas, including the Privateers, and store the results in our Dataverse.
  • The Dataverse updates and stores the crawled data to keep track of ship movement, ongoing raids and opportunities (including flagging Privateers as hostile)

Outside the web interface we use tactile feedback systems powered by Arduino (or our Raspberry Pi if we can get it working) to warn the user of dangers or opportunities.

Henrik creating the Dataverse tables for the Power Platform that will give power to the platform, empowering the pirates to pirate commercial platform vessels

    Full solution from PowerShredders of Axdata

    We came into this Hackathon with the idea that we could make the onboarding process a lot less manual, a lot more secure, and significantly reduce the risk of human errors in the process. And we’ll be honest, we are extremely happy with the solution we are now traveling home with! The best thing, you ask? It works like a charm, every bit of it! We have tested it several times, and everything flows exactly as described!
    You can see the full solution below, but we’ll start with a summary of why we think we should get a lot of points in the four different main categories:

    Excellent User Experience:

    Let’s start with the help this will give to the employees already working in the company:
    – HR now has a very streamlined process for onboarding, where all tasks that can (and should) be automated are just that. They don’t need to spend time sending out agreements, following up on signatures, entering a lot of information about the new employee into the system, or chasing other employees who forget the tasks they have in an onboarding. They don’t need to notify IT about a new employee coming in and wait for the creation of the user and access to systems. All of this happens automatically. Nothing is forgotten, and no sensitive information is sent in emails or seen by someone who shouldn’t see it.
    – IT never needs to think about a new employee coming in anymore. Everything is automated and just happens. Isn’t that the best user experience? When you actually don’t have to do even one click, and the process still works?
    – Other employees in the company who have tasks regarding a new employee will be reminded of their tasks, so nothing is forgotten. Automated and nice. And if they complete their task, no notification will be sent; it really is as easy as it sounds

    And then, to the candidate starting. If the company implements this solution, everything will be ready for the new employee on his first day of work. He can even get information about the company, his team, his manager and more in the Onboarding Portal before he starts, so we can keep up the good energy people feel when they are about to start working at a new company. The new employee will also feel that this company really takes care of its employees, and that it really is up to date in the digital world we’re living in.

    Most Extreme Business Value:

    The value for companies here is so high that it’s almost difficult to know where to start. First of all, this saves a lot of time for the HR department. And we really mean A LOT of time. Not only is everything automated so they don’t need to do as much as they do now to register a new employee, they also don’t have to push and follow up with everyone else who hasn’t done their part, and they don’t have to correct human errors made during the process. They can spend their time on something much more valuable to the company: making sure that all the employees already working here get the best possible environment to do their jobs as well as they can! Let’s face it, this is what we want the HR department to do; we don’t want them to spend time entering data into a system.
    The IT department will also save a lot of time with this solution, which they can spend on other things as well. Not that I work in our IT department, but I assume that creating users and assigning licenses and access aren’t the most fun tasks they do at work. So I would think this will actually make their workday more fun!
    Let’s do an approximate calculation of time saved. Create a contract for the new employee, upload it for e-signature, send it for signing, receive it, ask the employee for more information so that it can manually be put into the system. Let’s say this takes 5 hours at best. Then someone needs to tell IT to create the user and access and make sure it has been done: a total of 1 hour. Then someone needs to follow up with other employees to make sure everything is done and ready for the new employee. This is maybe the most time consuming, and I don’t think it’s wrong to say that this takes at best 8 hours in total between signing and the first day of work. This means that the time saved for just one onboarding process is at least 14 hours. One of our customers typically hires between 10 and 20 employees every month, which means this solution will save them between 140 and 280 hours. Each month! As another example, another customer of ours plans to hire 1,500 employees by the end of 2023… You do the math here!
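The arithmetic above, as a quick sanity check:

```python
# Back-of-the-envelope from the estimates above: hours saved per onboarding,
# scaled by monthly hires.
HOURS_CONTRACT = 5    # draft, send for e-signature, receive, enter data manually
HOURS_IT = 1          # notify IT, verify user and access creation
HOURS_FOLLOW_UP = 8   # chase colleagues' onboarding tasks before day one

def hours_saved_per_month(hires_per_month):
    """Total hours saved per month for a given hiring rate."""
    per_onboarding = HOURS_CONTRACT + HOURS_IT + HOURS_FOLLOW_UP  # 14 hours
    return per_onboarding * hires_per_month
```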

    Rock Solid Geekness:

    There are so many parts in this solution that work perfectly together; we almost can’t believe that we actually made this work as well as it does. We seriously didn’t think we would be able to automate as many parts of the process as we have.

    Killer App:

    We have created this as a total solution, so that everything can be uploaded and used by all companies using Dynamics FO or Dynamics HR. With only a little personalization to make the wording, logo etc. fit the different companies, this can be set up quite fast. And, of course, each part can be implemented by itself too, if someone doesn’t want the whole solution. But we know this will be an easy solution to sell to our customers, as we know a lot of them already really want this!

    The solution:

    And now, finally, let’s go to the solution. You have been through a recruitment process in your company, and you have picked a really awesome candidate you want to send an offer to. This is what you do:

    1: Create the candidate in Dynamics, fill in name, start date, email address and phone number, and connect him to the position:

    2: Like magic, the candidate receives an email to view, update and sign the agreement. The agreement is automatically filled with data from Dynamics before it’s sent out for signing, done through an API we created with OneFlow right here at this Hackathon:

    The candidate fills in the National ID number and signs the agreement using BankID:

    3: Agreement is signed by the company as well:

    4: You hire the candidate in Dynamics with two clicks, assign the onboarding checklist that should be used for this candidate, and tick the checkbox to create a user in Azure AD for the new employee:

    This triggers a Power Automate flow that first creates a work email for the new employee, sets this new email as the primary email address on the Employee workspace in Dynamics, creates the Azure AD user for the new employee and assigns groups and licenses to it, and ends by sending the login details to the employee by SMS, together with the link to the Onboarding Portal:
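Under the hood, creating the Azure AD user amounts to a Microsoft Graph `POST /users` call. A sketch of the request body that call expects (the flow itself may use the built-in Azure AD connector instead, and the sample values are illustrative):

```python
def build_graph_user(display_name, mail_nickname, domain, temp_password):
    """Build the JSON body for Microsoft Graph 'POST /users' to create an
    Azure AD account with a temporary password the employee must change."""
    return {
        "accountEnabled": True,
        "displayName": display_name,
        "mailNickname": mail_nickname,
        "userPrincipalName": f"{mail_nickname}@{domain}",
        "passwordProfile": {
            "forceChangePasswordNextSignIn": True,
            "password": temp_password,
        },
    }
```

Group membership and license assignment are separate Graph calls made after the user exists.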

    And, of course, we have created a Power Automate flow that will remind the employees in your company about the tasks they are assigned regarding the onboarding of the new employee, if they don’t follow up by themselves:

    Now the employee can log in to the Onboarding Portal (which works on all devices) and enter more information about himself. When submitted, the data is automatically updated directly in Dataverse, and becomes visible in Dynamics within seconds:

    And, last but absolutely not least, the night before the new employee’s first day of work, a Power Automate flow runs and imports the user into Dynamics, connects the user to the correct employee in Dynamics, and assigns the security role Employee to the user, so that the new employee has access to the Employee Self-Service workspace in Dynamics:

    And, to end it all for now, we have automated the process for email signatures, so that one is automatically put on every email you send out. Now they will finally look the same for the whole company:

    The signature itself is created with HTML-code, after a rule is created in Exchange

    Crawling for those new deals

    Since we are using a specific service to get deals, it could happen that our provider doesn’t give us all the deals out there. That’s why we created a “Web Search API” in Azure. This allows us to create a flow that searches Bing every day for new deals and notifies us if our provider’s deal isn’t the best one.

    We started off by creating the Web Search API

    We then created a scheduled flow that runs every day, to get those deals we don’t catch in our provider’s API. In our query, we go for the keyword “Strømavtaler” (electricity deals).

    Parsing the JSON allows us to structure the data and get it out. Now we can add each deal we find into a table inside Dataverse and create tasks to follow up on them.
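The parsing step, sketched against the shape of a Bing Web Search API v7 response (`webPages.value` is that API’s result list; the flow’s actual Parse JSON schema may differ):

```python
def extract_deals(payload):
    """Pull (name, url, snippet) rows out of a Bing Web Search API v7
    response dict -- the same shape the flow's Parse JSON step handles."""
    hits = payload.get("webPages", {}).get("value", [])
    return [{"name": h.get("name"),
             "url": h.get("url"),
             "snippet": h.get("snippet")} for h in hits]
```

Each returned row would then become a Dataverse record with a follow-up task.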

    We think this qualifies as an excellent crawler because we search Bing every day for electricity deals, create a record in Dataverse for each deal we find, and follow up on these.

    We also extended this flow with another flow that searches based on our library of electricity deals, to see what the search engines give us if we run a daily search for each of our stored deals.

    So, based on our contract names, we search Bing each day to look for possibly better deals. We search based on the data we capture from www.elskling.no and then use this data to search Bing daily for updates on these contracts.

    Those pesky accountants are at our throats

    We have dumped historical events into blob storage for simple storage over time. The accountants at TMNT HQ want to know what expenses each event incurred.

    To give them a simple search for cost by parameter, we have built an Azure Search index on top of the blob storage. The blobs are indexed every hour, so that any new events are added to the search and the accountants can perform their accounting on the cost facet of the events.

    The image below demonstrates how the backend search looks; the query in question returns the cost facet for every event where the triggering factor has been a loud sound (100 dB).

    If the search had returned two events for 100dB where the cost was equal, the count value would be 2.

    Simple searching can also provide quick overviews of how a triggering event relates to the other data our sensors capture. Below we can see that all the events that triggered at 100 dB did so at a distance of 144 cm from the sensor rig.
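The facet semantics can be mimicked locally. This toy aggregation reproduces the behavior described above, where two 100 dB events with equal cost yield a count of 2 (the field names are illustrative, not the real index schema):

```python
from collections import Counter

def cost_facet(events, sound_db):
    """Count events per cost value among events triggered at the given
    sound level -- the same aggregation the cost facet query returns."""
    return Counter(e["cost"] for e in events if e["sound_db"] == sound_db)
```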

    Target crawler

    Re-post of SOME blogging from the interwebs, targeting the Crawler badge

    Using the Microsoft Search Graph Connector and the Microsoft Graph Search APIs, we are indexing/injecting the ACDC blog posts into the TMNT intranet sphere to allow our turtles to stay up to date on what competitors are doing.

    Graph Connector Setup
    Graph Search in action
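For reference, querying Graph connector content goes through Microsoft Graph `POST /search/query`, with the connection listed as a content source. A sketch of the request body (the connection id here is a placeholder, not our real connector name):

```python
def build_search_request(query, connection_id):
    """Build the body for Microsoft Graph 'POST /search/query' scoped to
    externalItem results from a single Graph connector connection."""
    return {
        "requests": [{
            "entityTypes": ["externalItem"],
            "contentSources": [f"/external/connections/{connection_id}"],
            "query": {"queryString": query},
        }]
    }
```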

    The Media dashboard has also been extended with a twitter feed.

    SOME blogging from the interwebs

    The turtles’ IT department, named Pizza Time, is hard at work showcasing their awesome setup. To make it easy to follow what happens, we’ve brought the blog posts back into the intranet.

    The setup consists of crawling the ACDC blog, for Pizza Time posts only, via Graph Connectors.

    Graph Connector Crawl Setup

    To show these back on the home site we use the PnP Modern Search web parts (https://microsoft-search.github.io/pnp-modern-search/) with Graph connector support. The display uses a custom Handlebars template matching the ACDC blog schema.

    The custom template uses HTML5 content tags for proper screen reading and navigation.

    How it looks in Teams