Data readiness is key to escaping the advanced analytics trap
The technology trap is a familiar concept, even if not by name.
It describes what happens when the advanced capabilities enabled by a breakthrough technology meet the limits of adoption. Breakthroughs like the printing press or the personal computer enter the real world and suddenly face the challenges of widespread adoption and use.
Think literacy, or computer literacy, and in the case of analytics, data literacy. What is possible is limited by the readiness of people and systems to put these methods and technologies to regular use.
That does not mean industry is unprepared for developments like advanced analytics, but it explains part of the initial challenge with their adoption. Futurist Roy Amara attributed the slow emergence of new technologies ahead of widespread adoption partly to our lopsided expectations: “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.”
Across asset-intensive industries, including heavy processing environments, a similar development is playing out: a technology trap keeps innovation from securing a foothold in day-to-day operations. Investment vetting, procurement and implementation, not to mention worker skilling and change management, slow the use of new technologies. Internal technical resources may be mismatched with what adoption requires. New technology may lack support for industry-specific protocols or compliance with regulatory requirements. And skepticism about the utility of the technology may linger.
In the past year, the financial pressure of the COVID-19 pandemic-induced downturn has left many businesses grappling with fewer resources to fund digital transformation initiatives. At the same time, the pandemic has shown organizations how mission-critical it is to adopt digital solutions. That recognition has left many with an acute need for industrial analytics.
Even as companies work to make operational applications like AI/ML, digital twins and operational orchestration part of their operations, they are finding that the analytics trap runs through the organization. When business units and personnel have varying degrees of data access and readiness, organizational priorities around digital transformation come to reflect the demands of individual business units or personnel rather than the enterprise as a whole.
Misaligned priorities are a leading reason why digital transformation projects fail. And for organizations pursuing digital transformation, these false starts are costly and difficult to bring back onto the original timetable. Research from Harvard Business Review shows that $1.3 trillion was spent on digital transformation initiatives in 2018 alone, with an estimated $900 billion wasted by companies that reported they did not meet their goals.
When projects hit the mark, however, companies can sustain operational excellence with forward-thinking initiatives. In an analysis across heavy processing industries, McKinsey found that plants and companies with low digital maturity improved their EBITDA by 3-5 percentage points on average through digital transformation initiatives. Even organizations experienced with digitization initiatives stood to gain from maintenance strategies that improved their preventive maintenance (PM) programs: about 1-3%, according to the same study.
Successful digital transformation also makes early movers more competitive: according to Boston Consulting Group, they outpace the earnings of later adopters by 80%.
In each case, the key was baseline data readiness across the enterprise, which made initiatives in advanced analytics possible. In place of literacy, companies needed a shared language of data readiness, including data integrity and portability.
A key part of that data readiness, particularly for industrial analytics driven by operational technology (OT) systems, is the retention of original context. On-premises systems, largely as a result of licensing or network constraints, often cannot maintain data in context, which restricts its usefulness. Preserving that context for future use keeps operational applications cost-effective and multi-purpose, with no need for repeated pre-modeling. As often happens today, OT data loses its context the moment it is consumed for a single purpose like enterprise reporting.
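To make the idea concrete, here is a minimal sketch of the difference between a bare process value and one that retains its operational context. The SensorReading class and its field names are hypothetical, not any vendor's schema; they simply illustrate the metadata that is typically lost when OT data is flattened for a single report.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorReading:
    """A process value that keeps its operational context.

    All field names here are hypothetical, for illustration only.
    """
    value: float        # the raw measurement
    unit: str           # engineering unit, e.g. "degC"
    timestamp: datetime # when the value was recorded
    tag: str            # source tag in the control system
    asset_path: str     # position in the plant's asset hierarchy
    quality: str = "good"  # data-quality flag from the OT source

# Context-rich record: reusable for analytics, digital twins, reporting.
reading = SensorReading(
    value=87.4,
    unit="degC",
    timestamp=datetime(2021, 3, 1, 14, 5, tzinfo=timezone.utc),
    tag="TI-4012.PV",
    asset_path="/site-a/unit-2/heat-exchanger-07",
)

# Context-stripped record: all a single-purpose report often keeps.
flattened = reading.value  # 87.4 -- but 87.4 of what, where, and when?
```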
Making that data legible throughout the organization is a core part of enterprise data management, shortening the deployment of operational applications from years or months to weeks. Many processors, especially heavy users of data from smart sensors and automation systems, can bring data from distributed control, monitoring and collection systems into the cloud for enterprise use. An enterprise strategy for data acquisition, ingestion and access sets the foundation for the digital maturity of every area of the business.
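As a sketch of what such a strategy implies at the ingestion step, the example below attaches asset context to raw OT points before they land in a cloud store. It is illustrative only: the ingest_with_context function and the shapes of raw_points and asset_model are assumptions, standing in for whatever historian interface, asset model and cloud service an organization actually uses.

```python
from __future__ import annotations
from typing import Iterable

def ingest_with_context(
    raw_points: Iterable[dict],
    asset_model: dict[str, dict],
) -> list[dict]:
    """Attach asset context to raw OT points before cloud landing.

    `raw_points` are dicts like {"tag": ..., "value": ..., "ts": ...},
    as a historian or DCS gateway might emit them; `asset_model` maps
    control-system tags to contextual metadata (unit, asset path).
    Both shapes are assumptions for illustration.
    """
    contextualized = []
    for point in raw_points:
        context = asset_model.get(point["tag"], {})
        contextualized.append({
            **point,                                  # keep the raw measurement
            "unit": context.get("unit"),              # engineering unit
            "asset_path": context.get("asset_path"),  # plant hierarchy
            "source": "dcs-gateway",                  # provenance for future users
        })
    return contextualized

# Example: one batch from a (hypothetical) gateway, contextualized once.
asset_model = {"TI-4012.PV": {"unit": "degC",
                              "asset_path": "/site-a/unit-2/hx-07"}}
batch = [{"tag": "TI-4012.PV", "value": 87.4, "ts": "2021-03-01T14:05Z"}]
print(ingest_with_context(batch, asset_model))
```

The point of the pattern is that contextualization happens once, at ingestion, so every downstream consumer, from enterprise reporting to AI/ML and digital twins, inherits the same context instead of rebuilding it.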
With visibility into plant-level conditions in the right context, business departments across an organization can manage with the operational excellence required to meet bottom-line objectives, sustainability goals and regulatory requirements, all from a single repository of contextualized data. Rising O&M costs, resource shortages and aging infrastructure are challenging operators to do more with less. Industrial analytics, powered by a cloud-native data ingestion and access strategy, offers a path toward digital transformation that addresses these challenges, both within scope and on time.
About the Author: Dr. Dave Shook is the chief data officer at Uptake. He has 30 years of industrial experience in automation and deriving value from the analysis of operational data. Prior to joining Uptake, Dave was the co-founder and CEO of ShookIOT and before that, served as the CTO of Matrikon, where he was instrumental in the development of its products and solutions. Dave holds a doctoral degree in chemical engineering from the University of Alberta and is a recipient of the DG Fisher Award for Process Systems Engineering.