In our last post we discussed how ISA-95 is a missing link in the many conversations about Smart Manufacturing. The additional technologies and solutions from Smart Manufacturing provide more data. However, various circumstances surrounding data and data architectures in companies make it difficult to glean significant value from the data for operational intelligence and improved decision making.
Enter Data Driven decision making and ISA-95. ISA-95 provides a set of standard terminology, information and operations models, and data formats and exchange architectures to define how data can be structured, processed, formatted, exchanged and shared, and thereby increase the value of the data to the organization.
Let’s take this conversation another level deeper into how ISA-95 increases the value of a Data Driven practice.
Gaps in Workflows
Let’s look at a typical manufacturing plant. The current state might look pretty good: you might have implemented Lean Six Sigma and started down the path of IIoT (Industrial Internet of Things) by pulling production, process, or maintenance data from some machines and processes. You’ve solved some low-hanging-fruit problems as a result of the new information available from the IIoT solution. Now production has increased, some of that pesky unplanned downtime is gone, and rework is down. Nice job!
However, there might be other challenges which you’ve been aware of, including:
- The plant is not turning over enough inventory
- There is too much WIP
- The plant is creating and shipping the wrong products because modified paper drawings aren’t making it to the plant floor in time
- New priorities came in from a customer which changed what order to work on, but the operators had already completed the original order before the change was communicated via paper to the plant floor
- New schedules were published but raw materials weren’t ordered in time
- The Lean process is still very manual and data intensive.
- The company has multiple, heterogeneous paper-based systems, legacy systems, vendor-based systems, and proprietary systems which make it difficult to get the big picture of what’s going on.
Honestly, it’s a wonder you’re able to get anything done.
When you step back and think about how things are produced in your company you can see the workflows of materials through the plant with the value-add processes performed on them. Overlaid on those workflows is data that flows along with the materials. Some of this data is in a digital format. As an example, there is a process for receiving an order with the customer’s drawings, acquiring those drawings, and creating, updating, and maintaining the company’s version of the drawings by the engineering team for production. Let’s assume this whole process is digital with Inventor, Vault, etc.
However, there are places in the workflows where that digital data will move along and then run into a wall and stop…a wall of paper and pen, or spreadsheets. Maybe for the engineering drawings example above the process for acquiring through maintaining the drawings might be digital and smooth. However, once the drawings are done, they need to be printed, put onto a clipboard, and hand-delivered to the plant floor.
That might be ok for the start of an order, where the drawings arrive before setups and production on the order start. What happens, though, when the customer requests a change after the drawings are already on the plant floor? Let’s say the person responsible for delivering updated documents is out sick and no one realizes the updated drawings haven’t been sent to the operators. Worse yet, the machines have been set up for production and production has already started. It’s possible some or all of the parts for the order might be shipped to the customer before anyone realizes those parts are incorrect. The paper wall, where the drawings need to be printed and hand-delivered by humans, is a very error-prone process which can have very costly consequences.
This is a gap between operations…a gap in the flow of valuable information to make the whole process more efficient and flexible. There is value in addressing these gaps. The value is in making the flow of data smoother and more structured. This enables the data to be shared with the rest of the organization as needed. These solutions are very valuable and worth implementing.
Options to Solve
These and many other challenges can be solved. There are often two approaches to solving these challenges. The first is to create point-to-point solutions. A point-to-point solution is where an IT person will create or find and install a small, inexpensive tool which provides, in this example, maybe a tablet with the capability of viewing the latest drawings. That’s great because this specific issue has been solved in the short term. However, point-to-point solutions are not a good idea.
Think of the challenges that already exist in IT systems. There are:
- Legacy systems, vendor-based systems, proprietary systems
- Likely no information architecture or application framework for the whole company
- Silos of data throughout the company
- Duplicate data in those silos such as customer data in accounting, in sales, in shipping
- Multiple versions of that data which don’t match
- Systems connected to one another without the benefit of an overarching information architecture
- Custom integrations and custom schemas often created by software developers who didn’t document the system and are no longer available.
This makes for a very fragile, brittle system. One change in one system can have a domino impact on many other systems.
Let’s say you then want to change one of the software applications, switch out your CMMS, or upgrade your ERP. Once you do, this house of cards begins to teeter…you must change multiple other systems, and so on. All in all, it can be a very expensive, resource-intensive mess to maintain.
It can also be the price of success. This kind of system got you where you’re at. However, it won’t get you to the bigger and better manufacturer you want to become because you’ll constantly be fighting this beast.
Another option is to build those connections between systems in such a way that the data exchange happens within a well-defined, industry-standard architecture that is built to how manufacturing companies are structured and operated.
Think of it this way…the point-to-point mess is a cluttered garage, while the standards-based architecture is a well-organized one. Which garage do you have now? In which garage is it easier to find that 3/16” wrench, or the coping saw, or the zip ties to tie up those Christmas lights? Which garage would you rather have?
What we really should be trying to create is a well-organized system which exchanges data between systems with structured data formats, and significantly increases the value of information people consume on the other end. This is a system that efficiently exchanges data from the business unit such as engineering drawings and production schedules to the plant floor as soon as they’re updated and approved in the business unit. Production, process, and maintenance data from the plant floor is available in real-time to the business unit for planning and resource allocation to handle various challenges and improve operations.
This kind of capability is possible only by applying some industry standards which organize the data within the context of the actual operations in place, and with an open architecture for communicating the data.
Think of that organization of the company like a 3-layer cake. I’m partial to 3-layer carrot cakes with the delicious cream cheese icing and black walnuts. The top layer of the cake is the business unit where the strategic planning, forecasting, scheduling, and logistics work happens. The bottom layer of the cake is the plant floor where operators, maintenance, supervisors, plant managers and others produce the products. The middle layer is where the communication of data should happen between the two other layers. That communication of data should also contain the context of the operations to increase the value of the data exchanged.
However, this middle layer typically consists mostly of paper and spreadsheets which are used as data collection tools and repositories. Communicating data with these tools is very slow, inadequate (not enough data), often inaccurate (because it was collected by humans), and unstructured (which decreases the value of the data). It is nearly impossible to have real-time intelligence on a company’s operations when much of the data is on paper and spreadsheets.
How it Works: ISA-95
The ISA-95 standard is the right tool for designing and implementing the communication of data in that middle layer of the cake. Let’s review how this works.
The ISA-95 standard provides a single definition of terminology, operational models, and information models about those operations, along with ideas around how that data should be exchanged. These definitions become the foundations for how manufacturers can organize and exchange data within their organizations in a way such that the data provides the highest value possible.
As a bit of background, ISA is the International Society of Automation. They are a nonprofit industrial organization that puts out the good word on how to improve manufacturing companies with standards and best practices around machines and automation of those machines. The ISA-95 standard has been around for over 20 years and is still being revised and updated today.
Let’s go back to the situation above where communication of data between the business unit and plant floor is often based on paper and spreadsheets. This middle layer should provide real-time information with context on the operational situation to everyone who needs it. There are some requirements to enable the real-time exchange of more valuable information:
- Technologies to digitize the paper and spreadsheet data
- Industry standard models for structuring the data and adding context to the data, and
- A design and architecture for the automated exchange of the data with multiple, heterogeneous systems in the manufacturing company.
The first concept, technologies for digitizing analog data, can be addressed by various off-the-shelf and custom software products which are planned and implemented in line with the company’s goals, strategies, and roadmap moving towards operational excellence. The second and third concepts are addressed by the ISA-95 standard.
Let’s dig one level deeper and briefly describe how this standard works.
The ISA-95 standard includes a system hierarchy which describes the various systems used in manufacturing and how they’re related to one another. This concept makes clear the scope of the various systems, as well as what they’d communicate, and how, across the organization.
The system hierarchy is as follows:
- Level 1 – Instrument/device
- Level 2 – Intelligent device, process control, and supervisory control and data acquisition (SCADA) functions
- Level 3 – Operations management functions
- Level 4 – Enterprise planning and logistics functions
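The hierarchy above can be sketched in a few lines of code. This is a minimal, illustrative sketch: the level names are paraphrased from the list above, and the mapping of example systems to levels is an assumption for illustration, not something prescribed by the standard.

```python
from enum import IntEnum

class Isa95Level(IntEnum):
    """Simplified ISA-95 system hierarchy levels, paraphrased from the list above."""
    INSTRUMENT_DEVICE = 1       # instruments and devices
    PROCESS_CONTROL = 2         # intelligent devices, process control, SCADA functions
    OPERATIONS_MANAGEMENT = 3   # operations management functions
    ENTERPRISE_PLANNING = 4     # enterprise planning and logistics functions

# Illustrative (not normative) mapping of common plant systems to levels
example_systems = {
    "flow transmitter": Isa95Level.INSTRUMENT_DEVICE,
    "line PLC/SCADA":   Isa95Level.PROCESS_CONTROL,
    "MES":              Isa95Level.OPERATIONS_MANAGEMENT,
    "ERP":              Isa95Level.ENTERPRISE_PLANNING,
}

# The "middle layer of the cake" is the exchange between levels 3 and 4
middle_layer = (example_systems["MES"], example_systems["ERP"])
```

The point of writing it down this way is that every system in the plant lands at exactly one level, which makes the scope of each system, and what it should be exchanging with its neighbors, explicit.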
The idea of the 3-layer cake above is an oversimplification of this system hierarchy. Even so, it is a good tool for describing the basic structure of a manufacturing enterprise with all of its multiple plant systems and networks. A diagram of the system hierarchy can be seen in the inset of the 3-layer cake.
Additionally, there are multiple pieces to the standard, some of which are:
- Information model
- Operations model
- Events, functions, attributes
- Data exchange architecture based on publish/subscribe
- Data schemas
Within the operations model there are workflows with attributes, functions, and events. These concepts are used in conjunction with one another to describe a particular workflow. One or more workflows together make up the model for the operation of a specific manufacturing process.
Data is communicated to the communication bus when certain events occur in the workflow. The operations events can be normal actions in production, quality, maintenance, or inventory movements. There are also events for adverse conditions, and that data is sent onto the communication bus as well.
This data sent as a result of the events has a certain structure and set of data points. The structure and collection of data points should contextualize all data associated with the operations event into a single message.
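A single contextualized event message might look something like the sketch below. The field names here are hypothetical, chosen for illustration; a real implementation would follow the standard’s information models and schemas.

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class OperationsEvent:
    """A hypothetical contextualized operations-event message.

    Field names are illustrative only; the point is that the event and
    all of its operational context travel together in one message.
    """
    event_type: str     # e.g. "production.lot_completed" (hypothetical topic name)
    timestamp: str      # ISO-8601 event time
    equipment_id: str   # which machine or work center raised the event
    order_id: str       # the production order context
    material_lot: str   # the material/lot context
    quantity: float
    unit: str
    extra: dict = field(default_factory=dict)  # quality results, operator, etc.

    def to_message(self) -> str:
        """Serialize the whole event, context included, as one JSON message."""
        return json.dumps(asdict(self))

event = OperationsEvent(
    event_type="production.lot_completed",
    timestamp="2024-05-01T14:32:00+00:00",
    equipment_id="press-07",
    order_id="WO-1042",
    material_lot="LOT-2291",
    quantity=500,
    unit="pieces",
)
message = event.to_message()
```

Because the order, lot, equipment, and quantity ride along with the event itself, a subscriber never has to go hunting through other systems to figure out what the event was about.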
These real-time messages originating from the event-driven architecture are then published onto the messaging bus, which uses a publish-subscribe architecture. Other heterogeneous systems subscribe to one or more message types to obtain the information necessary for that system.
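The publish-subscribe exchange can be sketched with a toy in-memory bus. This is an assumption-laden miniature: in practice the bus would be a real message broker (MQTT, AMQP, etc.), and the topic names are hypothetical. But it shows the key property, that the publisher and its subscribers never reference each other directly.

```python
from collections import defaultdict
from typing import Callable

class MessageBus:
    """Toy publish/subscribe bus: systems publish by topic, and any
    subscriber to that topic receives the message, with no coupling
    between publisher and subscribers."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(message)

bus = MessageBus()
received = []

# Two heterogeneous "systems" subscribe to the events they care about
bus.subscribe("production.lot_completed", lambda m: received.append(("ERP", m)))
bus.subscribe("production.lot_completed", lambda m: received.append(("quality", m)))

# The plant-floor system publishes once; every subscriber receives it
bus.publish("production.lot_completed", {"order_id": "WO-1042", "quantity": 500})
```

Swapping out the ERP here means unsubscribing one handler and subscribing another; nothing on the plant floor changes, which is exactly the insulation the standard is after.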
We should also mention that the resulting message communicated over the event-based architecture is “transport agnostic” and will work with any existing and new IIoT protocol technologies.
With all of that defined, let’s bring this home so we can clearly understand why all of this is important.
This whole system is defined and documented. Because everything is so well defined and documented, and because of the nature of the communication and data architectures, the system becomes highly scalable, flexible, and reusable.
Additionally, this thoughtful, well-documented, well-designed system becomes an insulating or abstraction layer between the various systems that are exchanging data. In such a system, changing one application won’t break its data sharing with many others.
With such a system more valuable information can be shared in real-time with the right people. That sharing of data enables better operations visibility, intelligence, decision making, and ultimately better actions (i.e., Data Driven).
This more valuable information with better decisions and right actions leads to a more efficient, flexible, and competitive manufacturing company.
Another way to summarize these concepts is that when data architectures become simpler, in a manner of speaking, they then become cheaper in the long run…and cheaper is better.