
USING .NET FOR YOUR IOT NEEDS PART 3 – THE CLOUD

Hopefully you saw and enjoyed Part 2, where we looked at the software side of things and showed how to set up and connect to Azure IoT Hub, meaning you as a .NET developer can use your existing C# skills to develop for IoT. In this post, we are going to build on that and cover the cloud elements, completing the picture of building an IoT system end to end.

If you missed the beginning of this series, you can get started with Part 1, where we look at the hardware side of things.

Let’s get started.

How can the cloud help?

Before we look too deeply into the various cloud tools, let’s think about the flow of data from the devices we set up in previous posts, and how we can make that data accessible and usable by the business. After all, unless the business can use the data, what’s the point of collecting it?

IoT flow of data from devices

There are three steps in the flow: 1) we gather the data from the DEVICES in the field, 2) the IoT Hub ingests that data and either stores it or analyses it for INSIGHTS, and 3) ACTION is taken if required.

I know this sounds simplistic but it’s a way to divide the system in your mind (well it is for me anyway!) and the reference design shown later in the blog follows this principle as well. However, I wanted to break it down with a block diagram first.

The idea is that devices are out in the big bad world collecting the data and streaming/sending this sensor data to your IoTHub which is your gateway into the cloud. From here you set up your cloud infrastructure to process this data and take insights from it using various tools covered in a bit. Once you have some insights, you can then take action on that data. It could be a PowerBI dashboard for staff to use or a logic app that will send an email or an SMS to staff if an anomaly is detected.

Simplifying Device Connection

Before we talk about the cloud section, though, we need to simplify the device connection workflow. You saw in the last blog post that there is a lot of set-up for each device: creating keys, registering the device and so on.

This is all perfectly simple stuff and takes just a few minutes per device, especially when you have done it a few times. But if you have, or plan to have, a large estate of devices, it becomes a full-time job to provision and manage them. And of course, this is time and money the project may not support.

This is where other Azure services can step in and help. The service we want here is called DPS or Device Provisioning Service.

Device Provisioning Service (DPS)

The best way to think of DPS is as a helper service that allows you to provision devices with zero touch: once the service is set up, you can just connect devices and it will auto-provision them for you and connect them to the required IoT Hub based on the rules you set. As always, a picture speaks a thousand words.

Device Provisioning Service

  1. The device manufacturer (or you) adds the device registration information to the enrolment list in the Azure portal.
  2. The device contacts the DPS endpoint that has been set in code/firmware, passing its identifying information to DPS to prove its identity.
  3. DPS validates the identity of the device by checking the registration ID and key against the enrolment list entry, using either a nonce challenge (for Trusted Platform Module attestation) or standard certificate verification (for X.509 attestation).
  4. DPS registers the device with an IoT hub and populates the device’s desired twin state.
  5. The IoT hub returns device ID information to DPS.
  6. DPS returns the IoT hub connection information to the device. The device can now start sending data directly to the IoT hub.
  7. The device connects to the IoT hub.
  8. The device gets the desired state from its device twin in the IoT hub.

This looks like a lot of work to set up, but the SDK helps here as well: it takes just a few lines of C# code to connect to the DPS service and be handed the IoT Hub connection, as sketched below. You can review some code from an event run by Microsoft in 2021 which showed off this service; the code is available on my GitHub here.
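
As a rough illustration, here is a minimal sketch of that flow using the Microsoft.Azure.Devices.Provisioning.Client SDK with symmetric-key attestation. The ID scope, registration ID and key are placeholders you would take from your own DPS enrolment:

using Microsoft.Azure.Devices.Client;
using Microsoft.Azure.Devices.Provisioning.Client;
using Microsoft.Azure.Devices.Provisioning.Client.Transport;
using Microsoft.Azure.Devices.Shared;

// Placeholder values - take these from your own DPS enrolment in the Azure portal.
const string GlobalDeviceEndpoint = "global.azure-devices-provisioning.net";
const string IdScope = "<your-dps-id-scope>";
const string RegistrationId = "<your-registration-id>";
const string PrimaryKey = "<your-enrolment-primary-key>";

// Authenticate against DPS using symmetric-key attestation.
using var security = new SecurityProviderSymmetricKey(RegistrationId, PrimaryKey, null);
using var transport = new ProvisioningTransportHandlerMqtt();
var provisioningClient = ProvisioningDeviceClient.Create(GlobalDeviceEndpoint, IdScope, security, transport);

// DPS validates the enrolment, registers the device and returns the assigned hub.
DeviceRegistrationResult result = await provisioningClient.RegisterAsync();
Console.WriteLine($"Assigned to hub: {result.AssignedHub} with device ID: {result.DeviceId}");

// Connect to the assigned IoT Hub and start sending telemetry as in Part 2.
var auth = new DeviceAuthenticationWithRegistrySymmetricKey(result.DeviceId, security.GetPrimaryKey());
using var deviceClient = DeviceClient.Create(result.AssignedHub, auth, TransportType.Mqtt);
await deviceClient.OpenAsync();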

If you want to read some more about Azure DPS and how it can help you and your project, head over to docs.microsoft.com.

Using the data in the Cloud for Insights and Action

The most important part of any IoT system is the data. In the last couple of posts, we have shown you how to take an IoT device and write code using .NET to connect that device to the cloud and stream the data from various sensors. We have even shown you how to simplify the connection of the devices using DPS above.

However, as mentioned at the top, the data is just arriving in the IoT Hub, and while we can see it arrive, how do we use it for analysis to gain insights and take action? We can now look at a detailed reference design from Microsoft showing the possibilities.

Microsoft reference design

As you can see from this, we have various paths we can take depending on the data stream coming in from our devices. Remember that an Azure IoT Hub scales to millions of simultaneously connected devices and millions of events per second, so that is a lot of telemetry data to handle. Of course, you could lessen this by adding more hubs, which is a nice way to break a large infrastructure into manageable or geographic elements, but be warned: you have a hard limit of 50 IoT Hubs per subscription.

Azure Stream Analytics

Now imagine we have a large estate of devices all streaming live data to our IoT Hub, meaning we could be receiving millions of telemetry events per second. We want to analyse these events live, pulling out moving averages or raising alarms for values that go outside set parameters. This is exactly the scenario that Azure Stream Analytics was designed for.

This hot path can deal with the millions of messages and analyse them live on the stream, raising an alarm if values go outside your chosen parameters, or it can feed a Power BI dashboard so the back-office team can monitor the system.

Hot path monitoring

You can use a simplified SQL-like language, which you can extend with C# or JavaScript code to detect anomalies, or you can detect anomalies using one of the multiple built-in machine learning functions.

These include Spike and Dip, which detects a rapid rise or fall in a data point; for example, a rapid rise in vibration could indicate a bearing failure. Changepoint detection spots slow, persistent changes in a value over time, something we humans are not very good at; for example, a slow rise in oil temperature and pressure could indicate an issue with a gearbox. Of course, you can use the data stream to train your own models, and there are tutorials to help with this on docs.microsoft.com.
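
To give a feel for the query language, here is a rough sketch of a Stream Analytics query that flags vibration spikes with the built-in Spike and Dip function. The input and output names and the vibration field are placeholder assumptions for whatever your devices actually send:

-- Score each vibration reading against a sliding 2-minute window of history.
-- 'IoTHubInput' and 'AlertOutput' are placeholder source/sink names.
WITH AnomalyDetectionStep AS
(
    SELECT
        EventEnqueuedUtcTime AS Time,
        CAST(vibration AS float) AS Vibration,
        AnomalyDetection_SpikeAndDip(CAST(vibration AS float), 95, 120, 'spikesanddips')
            OVER (LIMIT DURATION(second, 120)) AS SpikeAndDipScores
    FROM IoTHubInput
)
SELECT
    Time,
    Vibration,
    CAST(GetRecordPropertyValue(SpikeAndDipScores, 'Score') AS float) AS SpikeAndDipScore,
    CAST(GetRecordPropertyValue(SpikeAndDipScores, 'IsAnomaly') AS bigint) AS IsAnomaly
INTO AlertOutput
FROM AnomalyDetectionStep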

A Big Benefit of Azure Stream Analytics

The best part about the Stream Analytics tool is that we can push it out to IoT Edge. This means that our edge devices out in the field, near the sensors, can run these detection models and raise the alarm both locally and in the office, meaning a quicker response as we don’t have to wait for the data to reach the cloud service. This could, for example, save a machine tool from destruction, as it could be stopped before a bearing failed, saving costs and downtime.

Azure Data Explorer

Azure Data Explorer is a fast, fully managed data analytics service for real-time analysis of large volumes of data streaming from your IoT Hub. It’s a fully featured tool that allows you to ingest the data from IoT Hub in near real-time, then query that data and visualise the results, all in one tool. You can also point it at your cold path of stored data, say in a database, if you are using it for offline analysis.

Azure Data Explorer, or ADX as it’s known, is a newer service and a replacement for Time Series Insights (TSI), which is scheduled for end of service in March 2025. There is a migration path if you are already using TSI, but for new projects, ADX is your go-to. There is a free cluster you can register for that, amazingly, lasts a year and gives you a lot of storage and services, so it’s well worth a look: here.

Using Kusto with Azure Data Explorer

ADX uses yet another query language, called Kusto, which at first seems a bad decision by Microsoft. However, once you start using it, you realise it’s an inspired one: it’s like SQL but reads more like English, making it easier for non-developers to understand. And that is the idea, meaning that back-office staff can write a query to glean some insights from the data without too much special training.


StormEvents
| where StartTime between (datetime(2007-11-01) .. datetime(2007-12-01))
| where State == "FLORIDA"
| count

Kusto, or KQL as it’s known, is case-sensitive, so be careful when typing out that query; I have wasted far too much time debugging when the problem was just a mis-typed table column name. Once you have your queries set up, you can visualise them in ADX dashboards:

Visualise data in ADX dashboards
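
You can also chart results straight from a query with the render operator. Here is a quick sketch using the same StormEvents sample table; the chart type and aggregation are just illustrative:

// Count storms per state for November 2007 and chart the ten busiest states.
StormEvents
| where StartTime between (datetime(2007-11-01) .. datetime(2007-12-01))
| summarize EventCount = count() by State
| top 10 by EventCount
| render columnchart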

If Kusto scares you, then you can connect to a Power BI dashboard or even Excel to keep it in the Office family of products. However, one other very nice way to visualise the data is Grafana. It is open source and enables you to query, visualise, alert on, and explore your metrics, logs, and traces: super powerful.

Grafana

Business Integration

I have mentioned many times in this series how important it is that the business can use the data and insights provided to manage the business better. After all, this is what the project is all about, right? A way to give the business an edge over the competition by making faster and more insightful decisions, all based on real-time data.

In the reference architecture diagram above, you can see the many ways to utilise the data once it has been processed. From Power BI to a Power Automate flow or a Logic App, these are all low-code or no-code approaches to utilising the data streams.

Visualisation

PaaS v SaaS

One last point on Azure IoT, and something that needs to be done before we try to connect our devices to the cloud and worry about how they will connect: we first need to set up the cloud environment. Sadly, even here we have a choice to make.

Microsoft has two main offerings: Azure IoT Hub, which we covered in detail here, and another, Azure IoT Central.

The main difference between them is that IoT Central is a SaaS offering that includes Azure IoT Hub as part of the platform. It is where Microsoft suggests you start, as it removes a lot of the complexity of building out the platform yourself. Otherwise, you need to provision and configure all the individual parts (as we covered), which is time and money that could be spent elsewhere in the project.

IoT Central – the SaaS offering

You can see in this sales graphic that Azure IoT Central has all the core services we have discussed packaged nicely for us, and it can all be set up as an easy-to-deploy system, ready to use in a few simple clicks and under 10 minutes.

Azure IoT Central

There is, of course, a price difference between the two routes, and it isn’t just engineering time but the actual services. IoT Central is priced per device, so it’s cheaper when you have just a few devices and are setting up, whereas IoT Hub is priced per hub and you can have virtually unlimited devices for that one price. But you also have to pay for the other services, like Device Provisioning Service and Azure Stream Analytics, on top.

The other advantage of IoT Central (sorry, I know it sounds a bit salesy, but I want you to have the facts) is that when you create an IoT Central application in the Azure portal, you get the option to create one from a template. These templates cover what the Microsoft IoT team have seen as the most common scenarios. They may save you even more time and money, and if nothing else, they are a great starting point for your own system. You can read more about the templates here.

IoT Central templates

Decision on PaaS v SaaS

The best way I have found to tackle this decision is to sit down and draw out what the IoT infrastructure will look like: how many devices, what growth rate is planned, how many device messages will be required, and so on. From this initial design, you can pick what will work best for your business, adding in factors like engineering time and training.

Alternatively, you could start with IoT Central to get set up and running, test the system design, and check that the business wants to roll out the larger system before investing engineering time and money in setting up the PaaS version. Bear in mind that switching over would mean re-provisioning all the devices on the new system, so take that into account in any planning; as we all know, the business will ask why the extra costs and downtime if it’s already working.

Conclusion

In this third and last post of the series, we have looked at the cloud side of the system: from DPS to help provision all the devices, to tools like Azure Stream Analytics and Azure Data Explorer for gaining insights into our device data, and finally a brief look at the action step of business integration.

Lastly, we looked at IoT Central, which, if I am honest, is a perfect starting place for nearly all projects and where I often point people for any IoT roll-out, as it’s a low investment in time and money to get a system up and running before you fully commit to the amazing world of IoT.

I really hope you have enjoyed this deep-dive series into IoT and learnt a thing or two along the way. If there is anything you think I missed, anything you wish to see covered in more detail, or you want to show off your IoT system, please do reach out on Twitter @CliffordAgius or contact the Ballard Chalmers team. We are always more than happy to help.

Happy coding.


About the Author

Clifford Agius, writing here as a freelance blogger, is currently a two-time Developer Technologies MVP, specialising in Xamarin/.NET MAUI and IoT. By day, he is an airline pilot flying Boeing 787 aircraft around the world; when not doing that, Clifford freelances as a .NET developer. An active member of the .NET community, he is a regular speaker at conferences around the world.

Education, Membership & Awards

Clifford graduated as an engineer from the Ford Technical Training Centre in 1995. After 11 years as an electrical/mechanical engineer, he trained to become an airline pilot. Clifford became a Microsoft Most Valuable Professional (MVP) in 2020 and went on to earn the award again in 2021.

