Recently I did a proof of concept with one of my customers and was excited to work more closely with the Enterprise Asset Management (EAM) module in Microsoft Dynamics 365 Supply Chain Management. It was an adventure for me to research and understand what is possible with EAM in a large company structure.
At the beginning of the PoC it was a challenge to handle the tens of thousands of assets that have to be administered and periodically maintained. Master data quality is a common struggle in any ERP system, so it was all the more challenging to document all maintenance work (calibration protocols, test results, counters, …) in a way that fulfills all industry-specific requirements and obligations of proof. I am now familiar with counters, checklists, assessments, …, which capture all outcomes of work orders as transactions, ensuring that the protocols have sustainable value.
The discussion then turned to measurement data in conjunction with the existing asset equipment. The measurement equipment produces up to hundreds of gigabytes of data per month while running tests on devices, which here are basically any kind of vehicle.
So basically two kinds of data sources were identified: on the one hand, protocols of the assets that are produced during asset maintenance work (calibration protocols, reference test measurements, …), and on the other hand, measurement data resulting from customer service orders, e.g. electronic or mechanical tests on devices covering material compliance, electromagnetic compatibility, emissions, product qualification and certification, or any other kind of industrial standard and regulation.
Both should be handled with D365FO: the first using the Asset Management module for the maintenance of devices, and the second using production orders to run and protocol well-defined test scenarios on customer devices, producing the documentation for evidence and certificates. While the maintenance work is usually done inside the facilities or at external suppliers, the vehicle tests mostly have to be driven under real-life conditions, which means either on public roads or on some kind of test track.
To cover all of these requirements we thought about an infrastructure with only one technique in place for managing all the data produced by the various pieces of measurement and sensor equipment. So the idea was to evaluate Helium – The People's Network for this kind of business use case (Helium for Business), bringing operational sensor data into the associated business process within Microsoft Dynamics 365 Supply Chain Management.
Helium Network is an open-source, blockchain-based network that uses LoRaWAN technology to transport small data packages from any devices, such as sensors, to any hosts, to be integrated into industrial and/or business processes. A data package on Helium is usually 24 bytes; besides the protocol overhead, there are usually 11 bytes available for the actual sensor data, such as temperature, humidity, GPS coordinates, or any other measurement value. The costs are quite transparent: a Data Credit (DC) on Helium is $0.00001, which means for $1 you can transfer 100,000 data packages. Sending one data package per minute costs ~$5 per year, assuming 100% uptime of the device.
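The cost figure above can be verified with a quick back-of-the-envelope calculation (assuming one 24-byte package per minute, each consuming one Data Credit at $0.00001):

```python
# Helium Data Credit (DC) cost estimate for a single sensor.
DC_PRICE_USD = 0.00001          # price of one Data Credit
PACKAGES_PER_MINUTE = 1         # one 24-byte package fits in one DC

packages_per_year = PACKAGES_PER_MINUTE * 60 * 24 * 365   # 525,600 packages
cost_per_year = packages_per_year * DC_PRICE_USD

print(f"Packages per year: {packages_per_year}")
print(f"Cost per year: ${cost_per_year:.2f}")             # ~$5.26
```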
Helium is constantly expanding the availability of antennas and hotspots that receive and forward data packages to the service providers' host systems. The coverage of Helium can be viewed in the Helium Explorer at https://explorer.helium.com
To transfer any kind of sensor data, the sensor needs to be registered with a unique Device-ID within one of the device management platforms. Besides the device-independent providers, such as Helium itself (Helium – Introducing The People's Network), The Things Network (The Things Network), or AWS IoT Device Management (IoT Device Management – AWS IoT Device Management – Amazon Web Services), some device manufacturers also provide services to manage their devices, e.g. Bosch with its own device management platform (IoT device management: the easy way to device management (bosch-iot-suite.com)).
Once a device is configured (using tools provided by the device manufacturer), registered at one of the IoT platforms, and connected (using the DevEUI (Device Extended Unique Identifier), AppEUI, and AppKey), data packages can be received from any location as long as it is covered by the Helium network.
To prepare the raw data for any business scenario, the payload of the data package needs to be extracted and "translated" into a format that business software can understand. This means removing any overhead and reducing the data to only the required measurement values, e.g. the level of a container, the counter of a machine, the number of clicks on a website, …; every capturable value is possible. The number of possible business application scenarios is unimaginable, as more and more processes go digital.
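As a sketch of what such a "translation" can look like, the following Python function decodes a hypothetical 11-byte payload carrying temperature, humidity, and GPS coordinates. The byte layout here is an assumption for illustration only – real layouts are defined by the device manufacturer, and decoder functions in the Helium console are typically written in JavaScript:

```python
import struct

def decode_payload(raw: bytes) -> dict:
    """Decode a hypothetical 11-byte sensor payload.

    Assumed layout (illustration only):
      bytes 0-1 : temperature in 0.1 degC, signed
      byte  2   : relative humidity in %
      bytes 3-6 : latitude  * 1e5, signed
      bytes 7-10: longitude * 1e5, signed
    """
    temp_raw, humidity, lat_raw, lon_raw = struct.unpack(">hBii", raw)
    return {
        "temperature_c": temp_raw / 10,   # 0.1 degC resolution
        "humidity_pct": humidity,         # integer percent
        "latitude": lat_raw / 1e5,
        "longitude": lon_raw / 1e5,
    }

# Example: 23.5 degC, 61 %, at 52.52000 N / 13.40500 E
packed = struct.pack(">hBii", 235, 61, 5252000, 1340500)
print(decode_payload(packed))
```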
Within the device management console of Helium (https://console.helium.com/) multiple connectors are available to extract, transform, and forward the raw payload to other platforms, where it can be integrated with commercial or technical processes. In the Helium console this is called a Flow: it connects the sensor data (here Dragino), a custom transformation function (here Dragino Decoder), and a subsequent system (here Azure IoT Hub).
After receiving the raw sensor data, the data package can be transformed by custom functions and forwarded to connected successor systems. Several integration options, including Azure IoT Hub and AWS IoT Core, are available in the Helium console, as well as core integrations via HTTP calls. This stage is not the level at which to collect and aggregate data from multiple data packages; it just processes the single raw data message. However, a Flow can have multiple outbound integrations, e.g. one for further technical processing (a temperature value goes to MES monitoring) and one for commercial processing (the temperature also goes as a counter to asset management or a production order).
Within the Helium Console we forward either the raw or the transformed/extracted data to the next level of enrichment.
Azure IoT Hub & Device
In Azure we need to create an Azure IoT Hub, which is the base point for capturing data flows from technical devices – or even simulators such as the Raspberry Pi Azure IoT Web Simulator (azure-samples.github.io).
Within the Azure IoT Hub we have to set up the sensor that we already created inside the Helium Console as a device.
The connection from the Helium network to the device within the Azure IoT Hub is made using the primary connection string of the device. Data can now be forwarded from the sensor management platform to the Azure IoT Hub, where it can be monitored and visualized. For monitoring we can use common Azure metrics as well as Azure Diagnostics, and even Visual Studio Code (using the IoT Hub Explorer) to analyze the flow of the sensor data through the IoT Hub.
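Helium's IoT Hub integration does this forwarding for us, but for testing the hub it can be useful to push a message manually. As a sketch, this is how sending a reading with the device's primary connection string could look using the azure-iot-device Python SDK (the connection string and payload below are placeholders):

```python
import json

def build_message_body(reading: dict) -> str:
    """Serialize a decoded sensor reading as JSON for IoT Hub."""
    return json.dumps(reading)

def send_reading(connection_string: str, reading: dict) -> None:
    """Send one reading to Azure IoT Hub (requires: pip install azure-iot-device)."""
    from azure.iot.device import IoTHubDeviceClient, Message  # imported lazily

    client = IoTHubDeviceClient.create_from_connection_string(connection_string)
    client.connect()
    msg = Message(build_message_body(reading))
    msg.content_type = "application/json"
    msg.content_encoding = "utf-8"
    client.send_message(msg)
    client.shutdown()

# Usage (placeholder values – use the device's "Primary connection string"):
# send_reading("HostName=<hub>.azure-devices.net;DeviceId=<id>;SharedAccessKey=<key>",
#              {"temperature_c": 23.5, "humidity_pct": 61})
```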
Azure Stream Analytics
After we have captured the data in an Azure IoT Hub, we can set up logic to handle a data package as a single message or to work with multiple data packages as a data stream. With Azure Stream Analytics we can set up code to handle, collect, aggregate, and analyze more than just one data package, depending on the frequency at which the sensor sends data. Basically, Azure Stream Analytics uses one (or more) input data sources and one (or more) output targets.
An input source can be the devices of an IoT Hub, an Event Hub, or even files from Blob storage. Target systems are defined as "Outputs" and can be different systems such as Azure Functions, Blob storage, Event Hub, Service Bus, Power BI, or SQL Database, among other options.
Between input and output we can set up logic with queries to prepare the data according to our business requirements. Using the Stream Analytics Query Language (Stream Analytics Query Language Reference – Stream Analytics Query | Microsoft Learn) we can apply time logic, transformations, aggregations, … to prepare the data stream for succeeding systems. Once the query is defined, the Stream Analytics job is executed for every received data package. Thanks to the time functions, data packages can be gathered over defined periods and transformed as required.
For example, we could collect the measurement values coming from the input data source over one minute and forward just the average value to the output. Or we could send a message to the output only if no sensor data package has arrived within a defined time interval. Besides the query, Azure Functions can also be used as an output, which allows more individual business logic to be implemented.
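The one-minute average described above corresponds to a tumbling-window aggregation (in Stream Analytics Query Language: GROUP BY TumblingWindow(second, 60)). As a sketch, the same logic can be expressed in plain Python; the event shape (timestamp, value) is an assumption for illustration:

```python
from collections import defaultdict

def tumbling_window_average(events, window_seconds=60):
    """Group (timestamp_seconds, value) events into fixed, non-overlapping
    windows and return the average value per window start time - the same
    logic a tumbling-window Stream Analytics query applies to the stream."""
    windows = defaultdict(list)
    for timestamp, value in events:
        windows[timestamp // window_seconds].append(value)
    return {w * window_seconds: sum(v) / len(v) for w, v in sorted(windows.items())}

# Three readings in the first minute, one in the second:
events = [(0, 20.0), (30, 22.0), (59, 24.0), (75, 30.0)]
print(tumbling_window_average(events))   # {0: 22.0, 60: 30.0}
```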
Dynamics 365 SCM Sensor Data Intelligence (SDI)
The Microsoft Dynamics 365 SCM SDI integration provides multiple scenarios to integrate IoT data with D365SCM business processes. For this, different Azure Stream Analytics jobs are used according to the business case (Sensor Data Intelligence home page – Supply Chain Management | Dynamics 365 | Microsoft Learn), e.g. reporting a resource outage, triggering a maintenance request, …, or even reporting the machine status.
This functionality is provided by running the Azure Sensor Data Intelligence deployment package (Deploy and connect Azure resources), which is available within the public preview of version 10.0.30.
This deployment creates all the required Azure resources (IoT Hub, Logic Apps, Azure Stream Analytics jobs, Azure Functions, …) to connect the Azure device properly to Microsoft Dynamics 365 SCM.
Of course, the Azure Stream Analytics input source has to be adapted to your real sensor data packages, as does the query that transforms the data into the expected output format for the Dynamics 365 SCM integration. With this Supply Chain integration, an Azure Cache for Redis is also used to keep the data temporarily in memory for fast access and to apply some business and time logic to the sensor data stream.
Finally, we need to configure the business case in which we want to use the sensor data in D365SCM.
Currently a couple of scenarios are in preview, such as Asset downtime, Asset management, Machine status, Product quality, and Production delay (Sensor Data Intelligence home page – Supply Chain Management | Dynamics 365 | Microsoft Learn).
Once we have finished all the steps – configuring a hardware sensor, then moving and modifying the sensor's payload through a device management platform, Azure IoT Hub, and Azure Stream Analytics – we can work with near-real-time sensor data within the standard business processes of Microsoft Dynamics 365 Supply Chain Management. The vision of integrating device data to automate business processes is tangible and limitless.
Thinking ahead, we can also use the Microsoft Power Platform to support more business processes based on SDI:
- a tank runs empty and is reordered by the automated creation of purchase orders – to reduce manual planning effort
- web advertisements are billed on the basis of user clicks by creating usage-based invoices – to avoid manual interaction
- a maintenance order is created because the asset has reached the counter value for maintenance – to avoid resource damage
- manufactured products automatically get a quality protocol of measured values attached – for uninterrupted, continuous quality assurance
No limits. Feel free to bring your business ideas to life using Sensor Data Intelligence within Microsoft Dynamics 365 Supply Chain Management.
This blog was first posted within Microsoft Dynamics 365 for Finance / Supply Chain / Retail – for sustainable business improvement (dynamics3654operations.de) in November 2022.