Examples of an ML workflow
Most of us have probably heard about machine learning algorithms: what they do and how powerful they can be at solving business and technological problems. This sparks a lot of interest in machine learning and, as a result, an ever-growing number of organizations across various industries and countries are considering applying the technology to improve their decision-making processes.
Even though sophisticated algorithms are key pieces of complex intelligent decision-support systems, they represent only the very tip of the iceberg. A fully functional machine learning system is normally made of several components, each serving its own role. These components may run in parallel, but more commonly they run sequentially, where the output of one component becomes the input of the next. Such a modular structure that acquires, processes, and analyzes data in a certain order is usually called a pipeline, and when machine learning components are involved, a machine learning pipeline.
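The idea of components feeding into each other can be sketched as plain function composition. The stage names below are purely illustrative, not our actual module names:

```python
# A minimal sketch of a sequential pipeline: each stage consumes the
# previous stage's output. Stage names are illustrative only.

def acquire():
    # stand-in for raw sensor readings: (timestamp, sensor_id, value)
    return [("2021-01-01T00:00:00", "temp", 21.5),
            ("2021-01-01T00:00:00", "power", 130.0)]

def clean(readings):
    # drop obviously invalid values
    return [r for r in readings if r[2] is not None and r[2] >= 0]

def model(dataset):
    # stand-in for training; here it just summarizes the data
    return {"rows": len(dataset)}

def run_pipeline(stages):
    result = None
    for i, stage in enumerate(stages):
        result = stage() if i == 0 else stage(result)
    return result

summary = run_pipeline([acquire, clean, model])
```

Each stage only needs to agree with its neighbor on the shape of the data it passes along, which is what makes individual modules easy to replace.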
In this blog post, we are going to talk about a pipeline that we built recently to meet one of our client's needs. Our client runs and maintains a complex production facility populated with sophisticated, power-consuming equipment. There is also a set of specific parameters used to manage and monitor the facility's performance and health; these parameters are measured by sensors placed all over the facility.
The client asked us to analyze which of the measured parameters affect power consumption the most, in order to understand how to manage the facility better and in a more energy-efficient way.
Modeling energy consumption given a set of parameters sounds very much like a machine learning problem. However, to get there, and to solve the end problem of finding the parameters that most strongly drive energy consumption, we had to build the entire pipeline shown in the picture. Each module performs a certain task whose output is consumed by the following module. Let's briefly go through what each module does:
Given a set of sensors serving as raw data sources, we had to fetch these data streams, then combine and aggregate them into a dataset that could be used downstream. We utilized the MQTT messaging protocol to connect to every sensor and acquire its readings, along with the timestamp at which each reading was taken.
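To make this concrete, a reading arriving over MQTT might be a small JSON payload. The topic layout and field names below are assumptions for illustration, not the client's actual schema, and the transport itself (e.g. an MQTT client's subscribe/on-message callbacks) is omitted so the sketch stays self-contained:

```python
import json
from datetime import datetime, timezone

# Hypothetical payload format; the field names "ts" and "value" and the
# topic layout "facility/sensors/<id>" are assumptions for this sketch.
# In the real pipeline, messages like this arrived via MQTT subscriptions.

def parse_reading(topic, payload_bytes):
    """Turn one MQTT message into a (timestamp, sensor_id, value) row."""
    data = json.loads(payload_bytes)
    ts = datetime.fromtimestamp(data["ts"], tz=timezone.utc)
    sensor_id = topic.rsplit("/", 1)[-1]  # last topic segment names the sensor
    return (ts, sensor_id, float(data["value"]))

row = parse_reading("facility/sensors/temp_01",
                    b'{"ts": 1609459200, "value": 21.5}')
```

Normalizing every stream into the same row shape early on is what lets the later aggregation step treat all sensors uniformly.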
The incoming data was processed by another module that aggregated it and cleaned it of errors and anomalies. Once a sufficient amount of data had been collected and processed for modeling, the corresponding module was ready to kick in.
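A minimal sketch of that cleaning and aggregation step, assuming rows in the shape described above: drop readings outside a plausible range, then average the survivors per minute. The valid range and window size here are made-up parameters for illustration:

```python
from collections import defaultdict
from datetime import datetime, timezone

# Illustrative cleaning + aggregation: filter out-of-range readings,
# then average per (minute, sensor) bucket. The bounds are assumptions.

def clean_and_aggregate(rows, lo=-50.0, hi=1000.0):
    buckets = defaultdict(list)
    for ts, sensor_id, value in rows:
        if not (lo <= value <= hi):  # crude anomaly filter
            continue
        minute = ts.replace(second=0, microsecond=0)
        buckets[(minute, sensor_id)].append(value)
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

t = datetime(2021, 1, 1, tzinfo=timezone.utc)
agg = clean_and_aggregate([
    (t, "temp_01", 21.0),
    (t, "temp_01", 23.0),
    (t, "temp_01", 9999.0),  # anomalous spike, filtered out
])
```

In a real deployment the plausible range would differ per sensor type, but the structure of the step stays the same.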
The modeling component was responsible for training machine learning models that accurately capture the dependencies between power consumption and the controlled parameters. Once we had ensured that the trained machine learning model was of acceptable quality, our pipeline was ready to interpret the resulting model in order to learn which parameters influence power consumption and, most importantly, in what way.
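The post does not name the specific models used, so as a stand-in, a one-variable least-squares fit shows the shape of the step: learn a relation between a controlled parameter and power consumption from data. The data below is synthetic:

```python
# Stand-in for the modeling step: fit power ~ slope * setting + intercept
# with ordinary least squares. The real pipeline presumably used more
# capable models; this only illustrates "learn a dependency from data".

def fit_ols(xs, ys):
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Synthetic data generated from power = 2 * setting + 10
settings = [1.0, 2.0, 3.0, 4.0]
power = [12.0, 14.0, 16.0, 18.0]
slope, intercept = fit_ols(settings, power)
```

The "acceptable quality" gate in the prose corresponds to checking a held-out error metric before the model is handed to the interpretation step.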
We applied modern machine learning interpretation techniques to learn the facility's different modes of operation, in which certain parameters become more influential than in other modes. We also learned under which circumstances power consumption goes up or down, and detected how the parameters depend on each other.
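The post does not specify which interpretation techniques were used. One common choice, permutation importance, can be sketched in a few lines: shuffle one feature's column and measure how much the model's error grows. The model and data below are synthetic stand-ins:

```python
import random

# Permutation importance, sketched in pure Python: breaking an important
# feature should increase the model's error; an unused feature should not.

def mse(model, X, y):
    return sum((model(row) - target) ** 2 for row, target in zip(X, y)) / len(X)

def permutation_importance(model, X, y, feature, seed=0):
    base = mse(model, X, y)
    rng = random.Random(seed)
    col = [row[feature] for row in X]
    rng.shuffle(col)
    X_perm = [row[:feature] + [v] + row[feature + 1:]
              for row, v in zip(X, col)]
    return mse(model, X_perm, y) - base  # error increase, >= 0 in expectation

# Toy model that only uses feature 0: power = 3 * x0
model = lambda row: 3.0 * row[0]
X = [[1.0, 5.0], [2.0, 1.0], [3.0, 9.0], [4.0, 2.0]]
y = [3.0, 6.0, 9.0, 12.0]
imp0 = permutation_importance(model, X, y, feature=0)
imp1 = permutation_importance(model, X, y, feature=1)
```

Since the toy model ignores feature 1, its importance comes out as zero, while feature 0 carries all of the signal, which is exactly the kind of ranking the client question called for.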
Finally, the derived insights were communicated to the end user of the pipeline by the visualization component, which presented the results of the analysis performed by the pipeline.
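The visualization component itself is not described in detail; as a minimal stand-in, even a text bar chart of (hypothetical) parameter importances conveys the idea of turning model output into something an end user can scan:

```python
# Illustrative only: render feature importances as a text bar chart,
# sorted from most to least influential. The parameter names and
# importance values below are made up.

def bar_chart(importances, width=20):
    top = max(importances.values())
    lines = []
    for name, value in sorted(importances.items(), key=lambda kv: -kv[1]):
        bar = "#" * round(width * value / top)
        lines.append(f"{name:<12}{bar} {value:.2f}")
    return "\n".join(lines)

chart = bar_chart({"pressure": 0.8, "temp": 0.4, "flow": 0.1})
```

A production component would use a proper charting library, but the contract is the same: ranked, labeled importances in, human-readable output out.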
On a closing note, by the end of this project we had helped our client understand what caused the excessive power consumption at the facility, as well as how to eliminate that extra power usage.
Ildar Abdrashitov, Business Intelligence Analyst