No single product or software package automatically qualifies a production system as modern, at least not yet. Rather, it is a collection of technologies used together and deployed in a cohesive way that allows the user to realize a current-generation (modern) control system. We should also consider that by the time a solution is designed and implemented, it may already be a revision or two old; the bleeding edge of technology moves faster than most companies can keep pace with. Below are a few of the key constituents of current technologies and how to leverage them to drive production optimization.
Automated production control and visualization
This article is not specifically about programmable logic controllers (PLCs) and human machine interface (HMI) systems, but they deserve an initial mention as they are at the front lines of how operators interface with machines and how the machines work together through input/output (I/O) and network interfaces.
The PLC is programmed to make decisions from its known machine state based on feedback from instruments, automatically taking action to drive actuating valves, speed conveyors up or down and turn pumps and fans on or off. This allows the production process to unfold in a repeatable, intended sequence.
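The read-evaluate-write cycle described above can be sketched in a few lines of general-purpose code. This is a conceptual illustration only, not PLC ladder logic or a real controller program; the tag names, interlock and setpoints are hypothetical.

```python
# Conceptual sketch of one PLC scan: read inputs, evaluate logic, write
# outputs. A real PLC repeats this cycle continuously, typically every
# few milliseconds. All tag names and thresholds here are illustrative.

def scan(inputs):
    """One pass of the read-evaluate-write cycle."""
    outputs = {}
    # Interlock: only run the pump if the start command is present
    # and the tank level feedback permits it.
    outputs["pump_run"] = inputs["start_cmd"] and inputs["tank_level_pct"] > 10.0
    # Modulate conveyor speed based on downstream readiness.
    outputs["conveyor_speed_pct"] = 80.0 if inputs["downstream_ready"] else 0.0
    # Pass the recipe's valve position through to the actuator command.
    outputs["valve_open_cmd"] = inputs["recipe_valve_pct"]
    return outputs

state = scan({"start_cmd": True, "tank_level_pct": 55.0,
              "downstream_ready": True, "recipe_valve_pct": 42.5})
```

The key idea is that outputs are recomputed from scratch on every scan, which is what makes the behavior repeatable for a given machine state.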
Automatic control is not in itself modern; it has been done for more than a half century, but the technology behind the products continues to keep pace with current generation digital technologies. Visualization is also not a new concept; it is as mature as the idea of automated production control (APC), but it too continues to realize advancements in capability with the forward progress of what could be categorized as user interfaces (things like HMIs and graphical user interfaces [GUIs]).
Most of us cannot visualize the hundreds or thousands of lines of code being executed by a PLC in a production environment to control the overall process status, including health, production rates, alarms and more. Nor do we typically have an easy way to command the control system to operate differently based on changing production needs. An operator interface system (OIS) provides this real-time feedback with rich diagnostics and allows the user to clear a fault, bypass a production step, change over a production line to run a different product or issue countless other commands. Leveraging modern visualization systems that employ identity authentication, time-stamped operator logs, production recipe structures and systemwide coordinated events, alarms and diagnostics has proven beneficial in reducing downtime due to production changeovers, shift or operator changes or system faults. The International Society of Automation (ISA) has developed standards that can, in principle, apply to any production environment to ensure optimal design of the HMI.
Control/information network architecture
Traditional control systems were built with an electrical infrastructure that carried power from a sensor or instrument to a PLC I/O card. The I/O system and PLC would measure the circuit to determine if it was open or closed, the voltage level present or the amplitude of current flowing through the circuit. It would then translate this measurement into digital status values of on/off, yes/no and floating-point analog values to be used in machine/process logic.
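The translation from an electrical measurement to a usable process value is typically a simple linear scaling. The sketch below shows the common 4-20 mA current-loop case; the engineering ranges and threshold are illustrative.

```python
def scale_analog(milliamps, lo_eng, hi_eng, lo_ma=4.0, hi_ma=20.0):
    """Convert a 4-20 mA loop current to engineering units by linear
    interpolation. A reading below 4 mA often indicates a broken wire."""
    if not lo_ma <= milliamps <= hi_ma:
        raise ValueError(f"signal {milliamps} mA out of range; check wiring")
    return lo_eng + (milliamps - lo_ma) / (hi_ma - lo_ma) * (hi_eng - lo_eng)

# A discrete (on/off) input is just a threshold on the measured voltage.
def discrete_input(volts, threshold=12.0):
    return volts >= threshold

# 12 mA is exactly mid-span, so a 0-500 L/min transmitter reads 250.
flow = scale_analog(12.0, 0.0, 500.0)
```

One practical advantage of the 4-20 mA convention is the "live zero": because a healthy signal never drops below 4 mA, a 0 mA reading can be distinguished from a legitimate low measurement.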
Today, nearly all industries have adopted the use of digital communication networks to share data that is structured in onboard microprocessors of sensors and instruments with the PLC or controller. Manufacturing control and information networks are the well-accepted, present-day method of connecting hardware for the purpose of commanding equipment and obtaining data/information from this equipment. Network technology has opened the gates to a vast sea of data that can be made available to the user to make better decisions within a given manufacturing process.
Unfortunately, this does not come without the potential risk of malicious activity. A knowledgeable person could infiltrate the network topology, obtain sensitive information about the functionality of the overall control system and even intentionally cause problems within that system. The advancement of networked production systems has created demand for technological advancements in physical and virtual security measures, giving rise to entire fields of study, including industrial network security and industrial cybersecurity.
Data collection and information
The availability of information and how it is used may be one of the most impactful aspects of how world-class manufacturers get ahead and stay ahead of competition. “Data rich and information poor” is a phrase known by anyone who has been mind-boggled by a gigantic array of DINTs, INTs, bits and bytes shapeshifting as the control system chugs along, executing the designer’s program. Data/information records used to be captured with chart recorders that traced out ink lines on scrolls of paper representing values such as fluid flow and kilowatts of power consumed. Production line workers would walk around with clipboards and physically write down production throughput values for a shift and the status of equipment for the purpose of communicating metrics to company stakeholders.
Today these functions can be effectively performed with software tools that time-stamp huge amounts of data in a repository over long durations. This can lead to insightful, customized dashboards that depict things like machine performance, staff productivity, uptime, downtime, system anomalies, events, scrap rates and more — basically anything that exists as a datapoint in the networked control system. Depending on the size and scale of a given production facility or enterprise, it would take dozens or hundreds of full-time employees across all shifts to capture and record the amount of data that current generation historian tools do. The cost of implementing such a tool becomes acceptable to manufacturers that are now realizing the need for better information and insight into their operations.
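The core function such software performs is straightforward: stamp each value with the time it arrived and make the history queryable. The sketch below is a deliberately minimal in-memory stand-in, not any vendor's historian; real products add compression, archiving, redundancy and far more efficient storage.

```python
from datetime import datetime, timezone

class Historian:
    """Minimal in-memory sketch of a process historian: time-stamp each
    value on arrival and query a tag's records back by time range."""

    def __init__(self):
        self.records = []  # list of (timestamp, tag, value) tuples

    def log(self, tag, value, ts=None):
        # Stamp with the current UTC time unless a timestamp is supplied.
        ts = ts or datetime.now(timezone.utc)
        self.records.append((ts, tag, value))

    def query(self, tag, start, end):
        """Return (timestamp, value) pairs for one tag within [start, end]."""
        return [(ts, v) for ts, t, v in self.records
                if t == tag and start <= ts <= end]
```

Even this toy version shows why the approach scales where clipboards cannot: every datapoint on the network can be captured at machine speed, around the clock, with consistent timestamps.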
Choosing the appropriate data points to historize and determining how to combine data points to represent new, contextualized information can be challenging and still requires a deep familiarity with the process, with insight from a broad cross-functional team. Once the historian model/structure is established, the value quickly proves out when done correctly, enabling the use of this information in dashboards, reports, analytics and feedback to advanced modelling tools for the purpose of automated optimization.
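"Combining data points into contextualized information" often means nothing more exotic than deriving a ratio or rate from two raw tags. The example below, computing energy consumed per unit produced, is a hypothetical illustration of such a derived datapoint; the tag names and figures are not from any real plant.

```python
def specific_energy(kwh_consumed, units_produced):
    """Combine two raw historized tags (energy meter total, production
    count) into one contextualized KPI: kWh per unit produced."""
    if units_produced == 0:
        return None  # avoid dividing by zero during idle periods
    return kwh_consumed / units_produced

# Example: 450 kWh consumed over a shift that produced 900 units.
shift_kpi = specific_energy(450.0, 900)
```

Neither raw tag is very actionable on its own; the derived value is the one a team can trend over time and compare across lines or shifts.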
Dashboarding and reporting
The same cross-functional team that conceived the structure of the production environment historian can and should be leveraged to then boil down the plethora of datapoints and information to determine what is most useful in the form of visualized dashboards and production reports. The flexibility and expandability of modern visualization software allows users to tailor the dashboards and reports for meaningful representation of the data available in a role-based format. For example, an operations manager may value details around staff productivity and totalized production throughput, while a maintenance manager may need a more focused view of the events/alarms that led to a downtime event. This role-specific portal into the information that exists within most control systems allows individuals to make highly educated decisions to improve and optimize the manufacturing process at hand. Often, influence from company leaders and regulatory entities will require specific information such as production key performance indicators (KPIs) and quality metrics, as well as tracking and tracing of products, raw materials and resources consumed throughout the process.
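The role-based tailoring described above amounts to mapping each role to the subset of metrics it cares about. A minimal sketch, with hypothetical role names and metric tags of my own choosing:

```python
# Hypothetical mapping from a user's role to the metrics their
# dashboard should surface, echoing the examples in the text.
ROLE_VIEWS = {
    "operations_manager": ["throughput_total", "staff_productivity"],
    "maintenance_manager": ["downtime_events", "active_alarms"],
}

def dashboard_for(role, all_metrics):
    """Filter the full metric set down to what one role needs to see."""
    return {k: all_metrics[k]
            for k in ROLE_VIEWS.get(role, []) if k in all_metrics}

metrics = {"throughput_total": 1200, "staff_productivity": 0.87,
           "downtime_events": 3, "active_alarms": 1}
maint_view = dashboard_for("maintenance_manager", metrics)
```

Commercial dashboarding tools implement this with authentication and configurable views rather than a hard-coded dictionary, but the underlying idea, filter the same data differently per audience, is the same.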
Analytics, AI and machine learning
The demand for data scientists in industrial manufacturing markets was insignificant at the time that automated production began to take hold in the 1970s and 1980s. Today it is not uncommon, and the capability of software analytics tools continues to grow. Manufacturing data analytics generally encompasses measuring and comparing historized data with the goal of understanding and defining normal behavior, patterns and outliers (anomalies). If there are 10 variables to study, a well-versed person with access to the necessary software tools can comprehend them and prescribe steps such as mitigating future downtime occurrences, improving product quality, reducing overall energy consumption or addressing a variety of other optimization factors. When there are hundreds or thousands of variables to analyze for the purpose of optimization (as is often the case in manufacturing operations), the use of artificial intelligence (AI) and machine learning (ML) algorithms may be a better fit financially for an organization. AI and ML are a modern addition to the realm of automated control systems, and companies are now leveraging these software tools to drive changes and take action in production systems based on the results of automated computer analysis, pattern recognition and self-optimization algorithms.
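The "defining normal behavior and flagging outliers" task can be illustrated with the simplest statistical tool available, a z-score test. This is a toy stand-in for what analytics and ML products automate across thousands of variables, not a description of any particular product's algorithm; the threshold of three standard deviations is a common rule of thumb, not a universal setting.

```python
from statistics import mean, stdev

def anomalies(samples, z_threshold=3.0):
    """Flag historized values whose z-score (distance from the mean in
    standard deviations) exceeds the threshold."""
    mu, sigma = mean(samples), stdev(samples)
    if sigma == 0:
        return []  # perfectly constant signal has no outliers
    return [(i, x) for i, x in enumerate(samples)
            if abs(x - mu) / sigma > z_threshold]

# Twenty normal readings followed by one spike: only the spike is flagged.
flagged = anomalies([10.0] * 20 + [100.0])
```

Doing this by hand for 10 variables is tractable; doing it continuously for thousands, while also accounting for correlations between them, is where ML-based tools earn their keep.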
Simulation and modelling
The adoption of simulation and modelling of production systems has grown significantly in the last decade. Manufacturers use these tools at the front end of the design of a new product, during the industrialization of the product and all the way through the lifecycle of the product to map out hypotheticals and provide insight to risks that otherwise may not have been considered.
The idea of a simulation or an active software model within the industrial manufacturing space is essentially a way in which a function, machine, process or environment is represented in an "ideal state." A simulation is often offline and used for research and development. An active software model may be embedded in a machine learning algorithm and used to drive decision-making processes that bring the real-world, physical process in line with the ideal model. Optimization of a given process is far more efficient, with fewer risks and less financial impact, when first performed in a simulated environment that maintains high fidelity with the physical environment.
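As a concrete, deliberately tiny example of trying a change offline first, the sketch below steps a first-order tank-level model forward in time to see where a proposed inflow setpoint would settle before anyone commands the physical process. The model form and coefficients are illustrative and not fitted to any real plant.

```python
def simulate_tank(level0, inflow, outflow_coeff, dt=1.0, steps=60):
    """Discrete-time sketch of a first-order tank model: level changes
    by (inflow - outflow) each step, with outflow proportional to level.
    Returns the simulated level trajectory."""
    level, trace = level0, []
    for _ in range(steps):
        level += dt * (inflow - outflow_coeff * level)
        trace.append(level)
    return trace

# Try a proposed inflow of 5.0 units/s offline: the model predicts the
# level settles near inflow / outflow_coeff = 5.0 / 0.1 = 50.
trajectory = simulate_tank(level0=0.0, inflow=5.0, outflow_coeff=0.1,
                           dt=1.0, steps=200)
```

The value of the approach is exactly what the text describes: a bad setpoint costs nothing in simulation, whereas discovering it on the physical process could mean scrap, downtime or a safety event. Keeping the model's coefficients updated from historized plant data is what "maintaining high fidelity" means in practice.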
Leveraging these capabilities in real time with advanced software models and model predictive control (MPC) has helped companies streamline process/production optimization. The information infrastructure — starting with control system hardware such as sensors and instrumentation, to PLCs that facilitate the actions of controlled devices and feedback from them, to contextualized information within a historian — is critical to the usefulness of these advanced technologies.
While no single technology or product makes a modern control system, accurate, meaningful information is a key constituent. The PLC commands devices and equipment to run in a programmatic, coordinated way while taking feedback from instruments and sensors. As it does so, it is churning out huge amounts of data which, when used properly, can represent highly meaningful information that can be time-stamped and stored, visualized, analyzed and used in a simulated model to improve fidelity. This information is the closed-loop feedback to an automated self-optimizing solution, or to humans enabling them to act in an educated way to drive production optimization improvements over time.
Kevin Senn is a solutions architect at Faith Technologies. He is responsible for creating, developing and maintaining customer relationships in the industrial manufacturing markets served with a focus on automation solutions. Kevin joined Faith Technologies in 2021 after spending 15 years in the industrial manufacturing automation markets working for Rockwell Automation and Magnetek with primary focus on power electronics technologies and electric motor control. He received his mechanical engineering degree from the University of Wisconsin-Milwaukee.