If the Internet of Things (IoT) is the plumbing, then the potential value comes from what is flowing through those pipes: the data. IoT devices generate a massive amount of data: Statista forecasts that by 2025, the total data volume of connected IoT devices worldwide will reach 79.4 zettabytes (ZB). To realize the potential of all that data, you need to employ machine learning for IoT.

In 2015, McKinsey & Company published a report estimating that only 40% of IoT data (40% of 79 ZB is still 31.6 ZB) was being collected, and only 1% of it was being used. The report made the case that there is tremendous economic value in IoT: an estimated $11.1 trillion as of 2015. One of the keys to realizing this potential is for companies to harness the power of IoT data through predictive algorithms.

Companies need to learn to harness the value of the zettabytes of data flowing through the IoT in order to fully realize the trillions of dollars of global economic value!

Unlock the potential of IoT with Machine Learning

Algorithms play a crucial role in unlocking the value in IoT data because there is so. much. data. Without algorithms, there is no way for a human, or even a fancy dashboard, to make sense of all the noise flowing through the pipes.

We need to use machine learning (ML) to sift through the noise and uncover interesting events, anomalies, time periods, insights, and more. Furthermore, we can use ML to simulate possible future states, estimate probabilities of root causes, predict future events, and model entire systems.

This guide is a starting place for examining machine learning algorithms and how they apply to IoT data, how to optimize and revisit these models, and what considerations need to be made so that all stakeholders have a clear idea of what to expect from the ML efforts.

IoT use cases for Machine Learning

[Figure: "IoT and ML use cases": 1. Process/Asset Monitoring, 2. Anomaly Detection, 3. Predictive Maintenance, 4. Cyber Security, 5. Product Development, 6. Simulations, 7. System Modeling, 8. Route Optimization, 9. Resource Optimization]

Broadly speaking, there are some general groups of use cases for machine learning with IoT data, regardless of the algorithms used to achieve the result:

  1. Process and Asset Monitoring: Sensors can be built into or retrofitted onto any type of device imaginable, from small wearables to submarines. Sensor data enables real-time monitoring of processes or devices, such as in manufacturing or transportation.
  2. Anomaly detection: used to detect faults on devices, interesting events, security breaches, etc.
  3. Predictive Maintenance: Sensor data is used to predict failures before they occur, so maintenance can be scheduled based on actual equipment condition rather than a fixed interval.
  4. Cyber Security: IoT data can be a critical target for hackers. It’s important to monitor the data for signs of intrusion. This is often done by analyzing the data in the pipeline.
  5. Product Development: In Industry 4.0, the product lifecycle doesn’t end when the product leaves the assembly line. Connected products let companies stay connected and provide services to the device, but also learn from customer usage patterns and device failure patterns. The manufacturer can use this information to make better products.
  6. Simulations: IoT devices provide the data needed to build and run large-scale simulations. Physics models can provide a lot of information to a model, but real-world data provides information about the environment that is not captured in foundational physics models.
  7. Resource Optimization: IoT is used in supply chain management and can help minimize waste, anticipate supply chain issues, and monitor assets to determine what parts are needed.
  8. Route Optimization: Consumer delivery companies leverage GPS on their fleets of vehicles to unlock huge efficiency gains via route optimization. The famous traveling salesman problem is an example of this type of data problem (a minimal heuristic sketch follows this list).
  9. System Modeling: In addition to simulations, IoT can provide the necessary information for real-time system modeling, which is closely related to the idea of Digital Twins: digital representations of real-world processes or assets with a data link between the two.
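
To make the route-optimization idea above concrete, here is a minimal sketch of a nearest-neighbor heuristic for the traveling salesman problem. The stops and coordinates are hypothetical, and straight-line distance stands in for a real road network; a production system would use a dedicated solver.

```python
import math

# Hypothetical delivery stops as (x, y) coordinates.
stops = {"depot": (0, 0), "A": (2, 3), "B": (5, 1), "C": (1, 6), "D": (4, 4)}

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def nearest_neighbor_route(start="depot"):
    unvisited = set(stops) - {start}
    route, current = [start], start
    while unvisited:
        # Greedily hop to the closest unvisited stop.
        nxt = min(unvisited, key=lambda s: distance(stops[current], stops[s]))
        route.append(nxt)
        unvisited.remove(nxt)
        current = nxt
    route.append(start)  # return to the depot
    return route

print(nearest_neighbor_route())  # e.g. ['depot', 'A', 'D', 'B', ..., 'depot']
```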

Machine Learning algorithms applied to IoT use cases

Broadly speaking, there are two groups of machine learning algorithms: Supervised and Unsupervised. The difference between these groups is whether or not the data has labels for what is being predicted. For example, if you want an algorithm to predict failures and you have a dataset where failures are labeled, then you can potentially use supervised learning. If you don’t have labeled training data, then you would need to use unsupervised learning. Here is a general overview of supervised and unsupervised machine learning algorithms applied to IoT use cases.

Supervised Learning Algorithms

Supervised learning algorithms are trained on labeled data. The target, or the value that we are trying to predict, is provided for every example in the training data. You are relying on the algorithm to find patterns in labeled past behavior to predict future behavior. Examples of supervised learning algorithms include linear regression, logistic regression, and support vector machines (SVMs).

There are many different variations of supervised algorithms, but they generally perform two tasks: Regression and Classification. Generally, regression maps the input to a continuous output, whereas classification maps input to a discrete output.
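
As a minimal illustration of the difference, the sketch below fits a regression model to predict a continuous pressure value and a classification model to predict a discrete fault label from the same two inputs. The features, coefficients, and fault threshold are all invented for the example.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(42)
X = rng.uniform(20, 90, size=(200, 2))        # [temperature, engine_speed]
pressure = 0.5 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(0, 1, 200)
fault = (pressure > 45).astype(int)           # discrete label derived from pressure

reg = LinearRegression().fit(X, pressure)     # regression: predict a number
clf = LogisticRegression().fit(X, fault)      # classification: predict a class

print(reg.predict([[70.0, 50.0]]))            # continuous output, e.g. ~44.8
print(clf.predict([[70.0, 50.0]]))            # discrete output, e.g. 0 (no fault)
```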

With IoT data, the following are some examples of labeled data:

  • You have one variable (y) and you want to create a system model to predict y as a function of n other measured variables (x1,…, xn). For example, you want to create a model for pressure (y) as a function of temperature (x1) and engine speed (x2).
  • You have some continuous variables measured (temperature, engine speed, and pressure) and you collect event variables (fault occurrence). You want to use continuous variables to predict the occurrence of the event.
  • Satellite imagery of the ocean with bounding boxes labeling oil rigs.
  • Machine telemetry data and corresponding maintenance logs with timestamps of breakdown events (see the labeling sketch after this list).
  • Manually collected data recording the occurrence of a target event (e.g. loading/unloading of a machine) and telemetry data collected during the same time period.
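
One hedged sketch of how the telemetry-plus-maintenance-log pairing above can become labeled training data: join the two on asset and time, and label each telemetry row by whether a breakdown follows within some window. The column names and the 24-hour window are assumptions, not a standard.

```python
import pandas as pd

telemetry = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-01-01 00:00", "2024-01-01 01:00",
                                 "2024-01-01 02:00"]),
    "asset_id": ["pump-1"] * 3,
    "vibration": [0.21, 0.35, 0.80],
})
breakdowns = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-01-01 02:30"]),
    "asset_id": ["pump-1"],
})

# Label each telemetry row 1 if a breakdown occurs within the next 24 hours.
def label_row(row):
    future = breakdowns[(breakdowns.asset_id == row.asset_id) &
                        (breakdowns.timestamp > row.timestamp) &
                        (breakdowns.timestamp <= row.timestamp + pd.Timedelta("24h"))]
    return int(len(future) > 0)

telemetry["fails_within_24h"] = telemetry.apply(label_row, axis=1)
print(telemetry)
```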

What are some examples of Supervised Learning algorithms applied to IoT?

  • Linear regression can be used to predict the remaining useful life of equipment in a predictive maintenance application. The caveat is that the equipment must be known to degrade on a linear time scale. Most degradation follows a “bathtub curve” (see https://en.wikipedia.org/wiki/Bathtub_curve), but a linear timescale underlies traditional preventative maintenance paradigms, where you change your oil every “x” miles or “x” engine hours. If the machine is used at a consistent rate, you can use simple linear regression to estimate when maintenance needs to occur. Some other examples of Linear Regression applied to IoT data:
    • Development of Predictive Maintenance Interface Using Multiple Linear Regression [IEEE]
    • Machine Learning Predictive Maintenance on Data in the Wild [IEEE]
    • “Condition Monitoring for Predictive Maintenance – Towards Systems Prognosis within the Industrial Internet of Things.” [Chukwuekwe, 2020]
  • Logistic regression can be used to classify IoT device data as either normal or anomalous in a fault detection application (a minimal sketch follows this list). Logistic Regression has been used quite a bit in IoT security applications:

    • “Intrusion Detection in IoT With Logistic Regression and Artificial Neural Network: Further Investigations on N-BaIoT Dataset Devices”[Journal of Communications Security]
    • A method to detect Internet of Things botnets [IEEE]
  • Support vector machines (SVMs) can be used to classify sensor data as belonging to one of several predefined classes, such as different types of machinery or different types of failure modes. Additionally, they can be useful in anomaly detection. SVMs are well-known for being non-probabilistic binary classifiers that can create a linear or non-linear boundary between two classes. The method has been extended to include multi-class classification and semi-supervised methods.

    • Network Attack Classification in IoT Using Support Vector Machines [MDPI]
  • Decision Trees can be used to create simple (or complex) and easy-to-understand if/then rules. Decision trees are often used for classification tasks, where the final leaves of the tree correspond to the different categories; they can also be applied to regression problems, where each leaf outputs a numeric value. These algorithms are extremely useful for automating processes, such as deciding when to water crops based on moisture sensor data in a precision farming application:

    • “Precision Agriculture Design Method Using a Distributed Computing Architecture on Internet of Things Context” [MDPI]
  • Deep learning algorithms: These algorithms are artificial neural networks that can learn to recognize patterns in large datasets. Deep learning has been used in a wide range of IoT applications, including image and video analysis, natural language processing, and anomaly detection. Deep learning traditionally requires very large labeled datasets, and labeled data is hard to come by for some applications, such as predictive maintenance. However, the IoT is a massive generator of data, so if you can generate labeled data programmatically at the edge or in the cloud, then plenty of examples exist of Deep Learning being applied to IoT use cases:
    • “Survey on Machine Learning and Deep Learning Algorithms used in Internet of Things (IoT) Healthcare” [Link to doi]
    • “A Survey of Machine and Deep Learning Methods for Internet of Things (IoT) Security” [IEEE]
    • “Deep Learning in the Industrial Internet of Things: Potentials, Challenges, and Emerging Applications” [IEEE]
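
Here is a minimal sketch of the logistic-regression fault-detection idea from the list above: classify device readings as normal (0) or anomalous (1). The two features and the synthetic data are invented; real datasets such as N-BaIoT are far wider and need more careful evaluation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
normal = rng.normal(loc=[50, 0.2], scale=[5, 0.05], size=(500, 2))
faulty = rng.normal(loc=[70, 0.6], scale=[8, 0.10], size=(50, 2))
X = np.vstack([normal, faulty])               # [temperature, vibration]
y = np.array([0] * 500 + [1] * 50)            # labeled: 0 = normal, 1 = fault

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# class_weight="balanced" compensates for faults being rare.
clf = LogisticRegression(class_weight="balanced").fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```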

Unsupervised Learning Algorithms

Unsupervised learning algorithms are trained on unlabeled data; the “supervised” part of the name refers to whether the training data is labeled. This means that the target, or the value we are trying to predict, is not available in the training data. The algorithm is instead used to find patterns in unlabeled data, and unsupervised algorithms are most often used for clustering tasks.

Examples of unsupervised learning algorithms include k-means clustering, Gaussian Mixture Models (GMM), and Apriori rules or association rules.

With IoT data, the following are some examples of unlabeled data:

  • Raw telemetry data. Unless your data logger is programmed with business rules to identify events and send the occurrence of an event as a signal, raw telemetry data is unlabeled (see the clustering sketch after this list).
  • Raw GPS data. There are some tools available to help label GPS data with information like intersections, place names, proximity to geological features, etc. The process of cleaning and enriching GPS data can require a special skill set.
  • Satellite imagery. More and more satellite imagery is becoming available thanks to the reduction in cost and size of satellites.
  • Streaming video feeds. Many Commercial Off The Shelf (COTS) video cameras exist to stream low latency/bandwidth video to data centers. These can be used to monitor locations for security, research, safety, liability, and a variety of other use cases.
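
Even without labels, clustering can surface structure in data like the raw telemetry above. Below is a minimal k-means sketch that recovers two operating modes from synthetic engine readings; the features, cluster count, and data are all assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
idle = rng.normal([800, 30], [50, 2], size=(300, 2))    # [rpm, temperature]
load = rng.normal([2400, 75], [150, 5], size=(300, 2))
X = StandardScaler().fit_transform(np.vstack([idle, load]))  # scale features

# In practice you'd choose the cluster count with, e.g., silhouette scores.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(np.bincount(km.labels_))   # roughly 300 points per discovered mode
```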

What are some examples of Unsupervised Learning algorithms applied to IoT?

  • K-Means clustering: partitions observations into k clusters by distance to the nearest cluster centroid:
    • “Clustering for smart cities in the internet of things: a review” [Springer Link]
    • “Semantics and Clustering Techniques for IoT Sensor Data Analysis: A Comprehensive Survey” [Springer Link]
    • A Hybrid Approach: Utilizing Kmeans Clustering and Naive Bayes for IoT Anomaly Detection [Arxiv]

  • Gaussian Mixture Models (GMM): model the data as a mixture of Gaussian distributions, giving soft (probabilistic) cluster assignments:
    • “Clustering of Data Streams With Dynamic Gaussian Mixture Models: An IoT Application in Industrial Processes” [IEEE]
  • Apriori rules (association rules): used to find items that frequently occur together. Applied to IoT, they can be used to find groups of similar IoT devices based on telemetry:
    • “Association of IoT Devices Using Fuzzy C-Means Clustering and Apriori Algorithms” [IEEE]
    • “Review on Recent Research in Data Mining based IoT” [Serials Journal]
  • Anomaly detection: deep learning models such as autoencoders can be trained on unlabeled data to detect anomalies in IoT device data, flagging unusual patterns or deviations from normal behavior (a minimal sketch follows this list).

  • Reinforcement learning algorithms: Strictly speaking a third paradigm rather than an unsupervised method, reinforcement learning trains an agent to make decisions in an environment in order to maximize a reward. These algorithms are commonly used in robotics and control systems.
    • Reinforcement learning algorithms can be used to train an autonomous vehicle to navigate through a complex environment, such as a warehouse or a construction site.
    • Reinforcement learning algorithms can also be used to optimize the control of a process, such as a manufacturing process or a supply chain, by learning the optimal actions to take in different states.
  • Decision tree algorithms: As noted in the supervised section, these algorithms construct a tree-like model of decisions and their possible consequences and are used for classification and regression. In IoT they are commonly applied to fault diagnosis and predictive maintenance.
    • Decision tree algorithms can be used to classify sensor data as normal or anomalous in a fault detection application.
    • Decision tree algorithms can also be used to predict the likelihood of a failure occurring in a predictive maintenance application.
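
As a minimal sketch of the reconstruction-based anomaly detection mentioned above: train a model to reproduce normal telemetry and flag readings it reconstructs poorly. Deep autoencoders are typical; here a small scikit-learn MLP with a bottleneck layer stands in, and all data and thresholds are synthetic.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Synthetic "normal" telemetry: [temperature, vibration, rpm].
normal = rng.normal([50, 0.2, 1500], [3, 0.02, 80], size=(1000, 3))
scaler = StandardScaler().fit(normal)
Xn = scaler.transform(normal)

# Train the network to reconstruct normal telemetry through a 2-unit bottleneck.
ae = MLPRegressor(hidden_layer_sizes=(2,), max_iter=2000,
                  random_state=0).fit(Xn, Xn)

def anomaly_score(x):
    z = scaler.transform(x)
    return np.mean((ae.predict(z) - z) ** 2, axis=1)  # reconstruction error

threshold = np.percentile(anomaly_score(normal), 99)
print(anomaly_score(np.array([[49, 0.21, 1510],     # normal-looking reading
                              [80, 0.90, 3000]]))    # anomalous reading
      > threshold)                                   # expect roughly [False True]
```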

Considerations for using Supervised OR Unsupervised Learning methods in IoT

The biggest consideration when deciding between supervised and unsupervised methods for your IoT application is whether you can label your data. It is generally worth the effort to build labeling into the design of the product rather than needing to label the data after the fact.

Some ways to build labeling into your IoT product:

  • Create a rules-based trigger to add an event label (see the sketch after this list)
  • Capture a video or image based on a trigger to collect pictures for image detection
  • Install a physical button: seriously.
  • Create a feedback loop with humans in the process
  • Add additional sensors to your prototype product
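
A minimal sketch of the first idea, a rules-based trigger that attaches an event label to outgoing telemetry. The threshold, payload format, and field names are hypothetical.

```python
import json, time

VIBRATION_LIMIT = 0.75  # hypothetical threshold agreed with the SMEs

def make_message(device_id, vibration, temperature):
    # The label travels with the raw reading, so training data arrives pre-labeled.
    event = "high_vibration" if vibration > VIBRATION_LIMIT else None
    return json.dumps({
        "device_id": device_id,
        "ts": time.time(),
        "vibration": vibration,
        "temperature": temperature,
        "event": event,
    })

print(make_message("pump-1", 0.82, 61.5))
```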

What are the steps to creating a predictive algorithm with IoT data?

Let’s assume you already have an IoT pipeline in place and are collecting data from some assets. Ideally, some business cases would have been discussed prior to setting up these pipelines. Here’s the general process:

  1. Follow CRISP-DM (the Cross-Industry Standard Process for Data Mining) and start with a business use case.
  2. Identify the key requirements
    • What hardware, software, and people are needed to develop a proof of concept?
    • Do you need to build a prototype?
    • What are the requirements for the output from the algorithm?
  3. Collect the right data
    • With the use case in mind, and in collaboration with subject matter experts, whoever is designing the pipeline and the data collection effort can estimate what information is needed to solve the problem. It’s important to consider the granularity of the data as well.
    • Look to white papers and academic research to get a sense of what data is needed to solve the problem. If no one has worked on a similar problem, congratulations on being a pioneer! Designing experiments to understand what data and granularity are needed will be important. Having very granular data will create a huge overhead for data transmission, storage, and computation. Without enough granularity, you will not be able to get enough signal from your data to solve the problem.
    • Determine if a test environment or simulated data is needed to achieve the goal
  4. Collect enough data
    • Consider how long a timeframe you need to assess and predict the event you’re interested in
    • For example, in machine degradation use cases, a Failure Mode, Effects, and Criticality Analysis (FMECA) can provide insight into how often a certain part will fail
    • Talking to SMEs will help determine how long and for how many devices you will need to collect data to meet the requirements.
  5. Determine a baseline model
    • The first step in any modeling effort is to create a baseline model. If there is already a model in place, that’s your baseline. If there is a decision-making process in place, that’s your baseline too: you can re-create the business process as a decision tree and measure against it. Often, we choose a very simple model, such as a logistic or linear regression, to serve as the baseline. Future modeling efforts are measured against this baseline to determine the gains realized by the model, which is an important step for determining the ROI of your model (a minimal comparison sketch follows this list).
  6. Iterate through CRISP-DM to improve on the baseline model
  7. Test and validate the model
    • You will test either on actively used devices, in a test group, or in a simulated environment
    • This is where you will need to observe the model outputs and determine if the quality of the results is acceptable for production
    • It’s very important, as in any software testing, to test edge cases.
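
To make the baseline step concrete, here is a minimal sketch comparing a trivial majority-class baseline against a simple logistic regression on synthetic data; any gain over the baseline is what the modeling effort can claim.

```python
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 400) > 0).astype(int)

for name, model in [("baseline", DummyClassifier(strategy="most_frequent")),
                    ("logistic", LogisticRegression())]:
    # Cross-validation gives a fair comparison on the same data splits.
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean accuracy = {scores.mean():.2f}")
```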

Conclusion

Machine Learning for IoT use cases has the potential to unlock tremendous amounts of value and information hiding in the piles of data that companies collect. There are many approaches to teasing out interesting information, and the list of algorithms and examples here barely scratches the surface! The most important thing to consider is the business use case.

It’s critical to combine Data Scientists with Subject Matter Experts and Business Representatives to determine high-value and realistic data projects.

Let me know what you think!
