De-mystifying the Terminology of Digital Transformation

Are you confused by all the buzzwords, trends, and opportunities around digital transformation? This article breaks down the key terms and explains what you should do about them.

If you’re an engineer, operator, adviser, or executive working in heavy industry, odds are high you’ve heard more and more about trends, buzzwords, and ideas suggesting software and data science will radically transform your business. However, the confusion between long-term shifts in technology and short-term hype around jargon can hinder experimentation and adoption of valuable methods and concepts. What’s real? What’s not? What does it mean for you?


  • Industrial Internet of Things (IIoT) refers to the use of sensor and cloud technologies to enhance manufacturing and industrial processes.
  • Big Data refers to the explosion of data that can be collected, stored, and processed, both in terms of volume (terabytes or petabytes rather than megabytes or gigabytes) and frequency of sampling (often at second or sub-second intervals rather than daily or weekly reporting).
  • Data Science is an emerging field that applies mathematics, statistics, computer science, and related techniques to develop new insights from data sets, often of the "big data" variety.
  • Artificial Intelligence (AI) is a collection of techniques enabling computers to make decisions independent of ongoing human inputs.
  • Machine Learning is a subset of AI involving predictive algorithms that are "tuned" using data directly, rather than (only) human expertise. Machine learning techniques are a key tool for data scientists and comprise the primary types of algorithms used in AI, especially as applied to industrial systems.
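To make the "tuned using data directly" idea concrete, here is a minimal sketch of a predictive model fit to historical sensor readings. The scenario, sensor names, and numbers are all invented for illustration; real industrial models would use richer data and more sophisticated algorithms, but the principle is the same: the data, not a hand-written rule, determines the model's parameters.

```python
# Hypothetical example: predict a machine's bearing temperature from its
# rotation speed by fitting a line to recorded data, instead of asking an
# expert to write the rule by hand. All numbers are made up.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Historical readings: (rotation speed in RPM, bearing temperature in °C)
speeds = [1000, 1200, 1400, 1600, 1800]
temps = [40.0, 44.0, 48.0, 52.0, 56.0]

a, b = fit_line(speeds, temps)

def predict_temp(speed):
    """Estimate bearing temperature at a given speed using the fitted line."""
    return a * speed + b

print(round(predict_temp(1500), 1))  # estimate for a speed not in the data
```

The "learning" is entirely in `fit_line`: feed it different historical data and you get a different model, with no code changes required.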

Certain types of machine learning approaches, such as "deep learning," "reinforcement learning," "neural networks," "cognitive networks," or other specific techniques may come up when discussing certain IIoT applications. For the purposes of most asset operators, engineers, and executives, we can lump these together in the bucket of "things you don't really need to care about right now." All these methodologies are a means to an end: helping the business make better decisions about complex physical systems.


It’s for real.

The underlying technology trends driving cloud computing, cheaper sensors, and cheaper communications have continued for many years. They are expected to continue into the foreseeable future, according to industry analysts. Most major industrial companies are actively seeking to harness these technologies to improve their businesses – and so are mid-sized and smaller companies. Examples of IIoT applications in action include real-time alerts and notifications based on sensor threshold values, machine learning analytics related to sensor anomalies, and – perhaps the most desired application – forecasts and predictions about the future state of a system or collection of systems.
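The first application mentioned above, real-time alerts based on sensor threshold values, is simple enough to sketch in a few lines. The sensor names and limits below are invented for illustration; a production system would add data ingestion, alert routing, and handling of noisy or missing readings.

```python
# Hypothetical sketch of a threshold-based IIoT alert check.
# Sensor names and limit values are made up for illustration.

THRESHOLDS = {
    "pump_vibration_mm_s": 7.1,  # invented vibration limit
    "tank_pressure_bar": 4.5,    # invented pressure limit
}

def check_reading(sensor, value):
    """Return an alert message if the reading exceeds its threshold, else None."""
    limit = THRESHOLDS.get(sensor)
    if limit is not None and value > limit:
        return f"ALERT: {sensor} = {value} exceeds limit {limit}"
    return None

print(check_reading("pump_vibration_mm_s", 8.3))  # over the limit -> alert
print(check_reading("tank_pressure_bar", 3.9))    # within limits  -> None
```

Threshold alerts are the entry point; the anomaly detection and forecasting applications mentioned above replace the fixed limits with models learned from historical data.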


If you work in a heavy industry company, you probably work in a world of precision engineering, established decision protocols, and deeply ingrained operating practices. There’s a status quo approach to making decisions about most issues that could be critical for your business.

IIoT applications only matter if they provide faster decision inputs, better decision inputs, or ideally, better decision inputs faster. These inputs still have to integrate with a new or existing decision process to have any impact on the business.

If you are thinking about these issues, here’s our advice:

  1. Identify the potential value you want to capture through an IIoT application. How will you specifically reduce opex, improve throughput, improve yield, etc.? Why is this particular problem the right place for you to start? It should be because the bottom-line impact could be material to the business, not because the business happens to have lots of available data.
  2. Identify what you would do differently. It’s highly likely that your business already pays close attention to the most obvious drivers of profitability in any given system or process. Why do you think an IIoT approach might be better? What actions will you or your team take with this new insight? Typically, the benefits come from being able to make a decision sooner, or from new insights that improve the decision process itself.
  3. Get the right people in the room to confirm feasibility of any proposed application. Subject matter expertise, solution architecture, data science, and business process expertise should have eyes on the same problem, at the same time, in order to simultaneously assess the potential value, confirm the technical feasibility, and understand the potential hurdles in disrupting the status quo.
  4. Start fast. The wonderful thing about declining technology costs is the relatively low barrier to getting started. Sometimes the best way to learn how your business should adopt IIoT is by experimenting, often with a tightly scoped, rapidly deployed prototype, then breaking it, improving it, and finally implementing it at scale. With today’s technology, this can be done in a matter of weeks or months rather than years. However, it still takes commitment to develop the program and find areas where value and technical feasibility align in a way that is meaningful for your business.