
Leveraging algorithms and past performance data, IIoT and analytics help determine maintenance schedules and increase equipment uptime.
Historically, process manufacturers lacked sufficient data to predict critical equipment failures and instead relied on reactive maintenance to get plants back online quickly following failure. Prolonged periods of downtime piled up costs, exacerbated by extensive manual troubleshooting and root cause analysis.
Today, a significant increase in digitalization and Industrial Internet of Things (IIoT) implementations is expanding manufacturers’ access to equipment data, shifting the challenge from data availability to generating insights and converting those insights into action. Process manufacturing companies are using advanced analytics solutions to gain insights from their data to predict equipment issues and inform optimized maintenance activities, leading to proactive maintenance programs, higher equipment reliability, and reduced maintenance and lost production costs.
Maintenance strategies have evolved as more data is measured, stored, and made available than ever before. This wealth of accessible data lets maintenance teams predict failures, calculate trigger points for condition-based maintenance, and share these insights with the personnel on the frontlines who are scheduling and executing the activities.
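One way to calculate a trigger point for condition-based maintenance is to extrapolate a slowly trending reading toward its alarm limit. The sketch below is a minimal illustration of that idea, assuming a hypothetical vibration signal with an alarm threshold in mm/s; the function name, samples, and threshold are invented for this example, not drawn from any specific product.

```python
# Minimal sketch: forecast when a trending reading crosses an alarm
# threshold, using an ordinary least-squares line fit over (day, value)
# samples. All names and numbers are illustrative assumptions.

def days_until_threshold(readings, threshold):
    """Estimate the day a slowly trending reading crosses `threshold`,
    from a simple linear fit over (day, value) pairs."""
    n = len(readings)
    xs = [d for d, _ in readings]
    ys = [v for _, v in readings]
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in readings) / \
            sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean
    if slope <= 0:
        return None  # no upward trend, so no trigger forecast
    return (threshold - intercept) / slope

# Weekly vibration averages (day index, mm/s) trending upward
samples = [(0, 2.0), (7, 2.3), (14, 2.7), (21, 3.0), (28, 3.4)]
print(round(days_until_threshold(samples, 4.5), 1))  # → 50.4
```

In practice the fit would run continuously against live historian data, letting planners schedule the intervention weeks ahead of the limit instead of reacting to an alarm.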
As data-driven strategies increasingly replace time-based maintenance, manufacturers are cutting operational expenditures previously dedicated to maintaining equipment on an arbitrary preventive maintenance (PM) schedule. This shift comes mere decades after time-based PM replaced run-to-failure strategies, which forced companies to shoulder many costly unplanned outages.
With digitalization and IIoT implementations, manufacturers now have access to the data necessary for optimal equipment maintenance and reliability improvements, but the challenge has shifted to creating the right environment for analytics where contextual data can be viewed alongside process sensor data, and where time-series-specific calculations can be easily applied by process subject matter experts (SMEs). Advanced analytics software applications address this and other issues, empowering maintenance and reliability teams to uncover insights from many sources of information, informing actions based on predictive and prescriptive analytics.
“Analytics” is a broad-brush term used to describe any process that uses math to turn data into actionable information. It provides insights into consumer behaviors, marketing effectiveness, supply chain agility, financial performance, and other business functions. “Big data” analytics are necessary to deal with data of large volume, high velocity, and wide variety, and few data sources rival the volume, velocity, and variety of the data collected by sensors in process manufacturing.
A typical process plant stores time-series data from sensors measuring temperature, pressure, level, flow, vibration, and much more. A single refinery, for example, can possess hundreds of thousands of sensors with samples—timestamp and value pairs—recorded on intervals of hours, minutes, seconds, or even fractions of seconds. When dealing with large multinational companies, the number of sensors enterprise-wide can quickly approach a ten-digit figure. Performing analytics efficiently across these vast volumes of data quickly becomes paramount to unlocking the value hidden within.
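The scale is easier to appreciate with a quick back-of-envelope calculation. The tag count and sample rate below are illustrative round numbers chosen for this sketch, not figures from any particular plant.

```python
# Back-of-envelope arithmetic on sensor data volume. The tag count and
# sample interval are illustrative assumptions, not measured figures.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60        # 31,536,000

tags_per_refinery = 200_000                  # "hundreds of thousands of sensors"
sample_interval_s = 60                       # one sample per minute per tag

samples_per_year = tags_per_refinery * SECONDS_PER_YEAR // sample_interval_s
print(f"{samples_per_year:,}")               # → 105,120,000,000
```

At one-minute resolution, a single refinery of this size would accumulate over 100 billion timestamp-value pairs per year; second-level sampling multiplies that by sixty.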
But using advanced analytics applications to create meaningful insights from oceans of data has prerequisites. Big data is inherently complex, and it must be thoroughly understood and cleansed before it can be used in modeling and multivariate calculations. And of course, the adage “garbage in, garbage out” also applies, so process manufacturers must ensure the integrity of their data collection and storage systems before venturing anywhere near advanced analytics.
Once process manufacturers begin shopping around, they will notice nearly every software product, platform, and cloud service on the market claims to perform some sort of data analytics, with the type of analytics performed differing based on each tool’s intended functionality. The qualifier “advanced” typically refers to the use of statistics and machine learning innovations in analytics to assess and improve insights. “Augmented” analytics tap into the same innovation themes, while putting the analytics in the context of user business intelligence applications and other frequently used tools.
Under the umbrella of advanced analytics, there exists a hierarchy, beginning with retrospective functions—including “descriptive” summary statistics and “diagnostic” root cause investigations—and building up to forward-looking flavors like “predictive,” which tells users when to act, and “prescriptive,” which instructs them what to do (Figure 1). These various types of increasingly complex—and useful—analytics work together, with the former two informing the latter two.
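The relationship between the layers can be shown in miniature: descriptive statistics establish a baseline for normal behavior, and a simple predictive check flags when recent readings depart from it. The signal, thresholds, and three-sigma rule below are illustrative assumptions for this sketch, not any vendor's method.

```python
# Toy illustration of the analytics hierarchy on one temperature signal:
# descriptive statistics summarize normal operation, then a simplified
# predictive check alerts when the latest reading leaves the expected
# band. The data and three-sigma limit are illustrative assumptions.
import statistics

baseline = [71.8, 72.1, 71.9, 72.0, 72.2, 71.7, 72.0]   # normal operation
recent   = [72.4, 72.9, 73.5, 74.2]                      # drifting upward

# Descriptive: summarize the baseline
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

# Predictive (simplified): alert when the latest reading exceeds a
# three-sigma band, suggesting maintenance before a hard failure
upper_limit = mean + 3 * stdev
alert = recent[-1] > upper_limit
print(f"limit={upper_limit:.2f}, latest={recent[-1]}, alert={alert}")
```

Diagnostic analytics would then ask why the drift occurred, and prescriptive analytics would recommend the specific corrective action, completing the hierarchy in Figure 1.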