Process industries were among the earliest adopters of robotic process automation (RPA), a commitment that, according to several recent studies, is unlikely to change anytime soon.
A recent survey by Forrester Consulting, for example, shows that two-thirds of companies currently using RPA plan to increase their automation spend by at least 5% over the next 12 months. Gartner, meanwhile, predicts RPA spending in all industries will top $1.5 billion in 2021.
But despite widespread deployment by processing companies, more than half of all users are failing to realize the lower costs, higher productivity and improved customer (and employee) experiences RPA promises to deliver.
This failure becomes even more concerning in light of the technologies looming on the horizon that represent the next wave of digital transformation. Hyperautomation, for example, has been dubbed the future of RPA. By enabling processors to harness RPA, artificial intelligence (AI) and machine learning to process massive volumes of data in an end-to-end automation toolchain, hyperautomation has shifted from an option to consider to “a condition of survival,” according to Gartner.
When properly deployed, hyperautomation can boost efficiency and productivity on a larger scale than ever before possible, allowing companies to automate more complex and complete business processes, rather than only parts of them, and to apply automation at speed. Speed, however, is hardly the only benefit. Gartner predicts organizations will be able to lower operational costs by as much as 30% as early as 2024 by combining hyperautomation with redesigned and optimized operational processes.
Like hyperautomation, artificial intelligence represents another inevitable step in processing’s digital evolution. Gartner predicts that by 2025, AI will be the top category driving infrastructure decisions, and that half of all enterprises will have devised AI orchestration platforms, up from fewer than 10% last year.
With AI seemingly destined to become the next big thing for processors, companies need to recognize that AI complements RPA, and that together they form a more robust and comprehensive platform for automation: intelligent automation. By blending RPA’s rules-based automation with AI’s cognitive capacity and the trial-and-error learning power of machine learning, intelligent automation will enable end-to-end automation of complete business processes and orchestration of work across teams composed of both humans and bots.
Yet another near-term technological advance, low-code/no-code platforms, has the potential to become a massive disrupter in the RPA space. These platforms, which allow for rapid software development, will empower business users to be more involved in the development of automations, accelerating time-to-value and decreasing reliance on IT.
While all of these technologies have the potential to deliver even greater productivity gains and cost savings than RPA alone, none of them can deliver maximum return on investment until processors get past the constant break-fix cycles that currently plague their automations, increase RPA uptime and ultimately realize the business value RPA promised to produce.
To that end, processors that spend too much of their time patching RPA breaks need to go back to the decisions made before RPA was deployed and determine whether flaws in their automation design are inhibiting their ability to maximize returns. This typically means clearly defining the criteria for selecting which processes to automate and then evaluating whether the processes originally chosen actually meet those criteria. In other words, automating poor processes leads to poor automations that regularly fail, eating away at expected business value.
To select the right processes to automate, it is essential for the business and IT sides of a company to share ownership of RPA. IT understands the limitations of RPA and is well-positioned to ensure that strong business processes are selected for automation. Business, on the other hand, understands the goals the organization wants to achieve and can apply its knowledge of the processes currently in use. Together, they can examine any inefficiencies or gaps and identify the RPA candidates best able to deliver resilient, high-quality automations that are not subject to constant breaks.
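To make this concrete, below is a minimal sketch, in Python, of the kind of weighted scoring model a joint business/IT team might use to rank candidates. The criteria, weights and 0.6 threshold are illustrative assumptions, not a published standard; a real center of excellence would define its own.

```python
# Minimal sketch of a weighted scoring model for RPA candidate selection.
# Criteria, weights and the 0.6 threshold are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class CandidateProcess:
    name: str
    rule_based: float      # 0-1: how fully the process follows explicit rules
    stability: float       # 0-1: how rarely the process/UI changes
    volume: float          # 0-1: normalized transaction volume
    exception_rate: float  # 0-1: share of cases needing human judgment

WEIGHTS = {"rule_based": 0.35, "stability": 0.30,
           "volume": 0.20, "exception_rate": 0.15}

def automation_score(p: CandidateProcess) -> float:
    """Higher scores indicate stronger, more resilient RPA candidates."""
    return (WEIGHTS["rule_based"] * p.rule_based
            + WEIGHTS["stability"] * p.stability
            + WEIGHTS["volume"] * p.volume
            + WEIGHTS["exception_rate"] * (1.0 - p.exception_rate))

candidates = [
    CandidateProcess("invoice matching", 0.9, 0.8, 0.7, 0.1),
    CandidateProcess("vendor onboarding", 0.5, 0.4, 0.3, 0.6),
]

# Only processes clearing the (assumed) threshold go forward to design;
# the rest are re-engineered first, since automating a poor process
# just produces a poor automation.
for p in sorted(candidates, key=automation_score, reverse=True):
    verdict = "automate" if automation_score(p) >= 0.6 else "re-engineer first"
    print(f"{p.name}: {automation_score(p):.2f} -> {verdict}")
```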
It is equally important for processors to establish their own RPA center of excellence, a dedicated, cross-functional team whose primary responsibility is to define a strong governance model for the entire organization, identifying and deploying automated processes enterprise-wide.
All too often, RPA failure comes down to a company’s inability to consolidate automation tools and harmonize automation initiatives. This inevitably leads to islands of automation, in which each line of business within an organization creates its own independent automations. This siloed approach is typically characterized by a lack of knowledge sharing, divergent automation design practices, repeated mistakes and even siloed RPA tools, all of which increase overall corporate costs and reduce the quality delivered by the automations employed.
With an RPA center of excellence, processing companies are in a much better position to avoid such a siloed approach, because the center defines the processes for identifying, assessing, validating and prioritizing RPA opportunities for the entire enterprise. It also ensures continuous improvement through a lessons-learned methodology, so that costly mistakes are not repeated.
The center of excellence will be responsible for generating the templates and guidelines needed to standardize how automation work will be designed, communicated and developed following best practices. It will also ensure that these documents are packaged and communicated in a way that prioritizes precise development by offering accessible, explicit guidance that fosters collaboration among all stakeholders. Far too many companies still rely on paper-based mechanisms (such as process design or solution design documents) to provide process guidance, which invariably leads to missed requirements, fragile automations and bad deployments.
Similarly, comprehensive dependency mapping in automation design allows processors to better manage change. Changes to the interfaces with which an automation interacts are the most common reason bots break, and many of these breaks can be traced back to a failure to connect automated processes with their dependencies, such as the legacy systems and applications they touch.
Through robust dependency mapping at the beginning of a deployment, processors will be in a much better position to manage changes that impact automated processes, particularly regulatory policies and constraints, which are subject to more frequent updates and revisions. Meticulous dependency mapping also allows companies to shift from a reactive change management strategy to a proactive one that addresses dependency changes while minimizing RPA downtime.
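As a rough illustration of the idea, the sketch below records each bot’s upstream dependencies and inverts the map so that change-impact analysis becomes a single lookup. All bot, system and policy names are hypothetical.

```python
# Minimal sketch of a dependency map for automations; names are hypothetical.
# Recording which systems, screens and policies each bot touches lets a team
# answer "what breaks if X changes?" before the change ships.
from collections import defaultdict

# bot -> set of upstream dependencies (applications, interfaces, regulations)
DEPENDENCIES: dict[str, set[str]] = {
    "invoice_bot":    {"SAP GUI v7.60", "vendor_portal_login_page", "tax_policy"},
    "onboarding_bot": {"Workday API v39", "vendor_portal_login_page"},
    "reporting_bot":  {"SAP GUI v7.60", "data_warehouse_schema"},
}

# Invert the map so impact analysis is a single lookup.
impacted_by: dict[str, set[str]] = defaultdict(set)
for bot, deps in DEPENDENCIES.items():
    for dep in deps:
        impacted_by[dep].add(bot)

def change_impact(changed: str) -> set[str]:
    """Return every bot that must be reviewed when `changed` is updated."""
    return impacted_by.get(changed, set())

# A planned portal redesign flags both bots that use its login page,
# turning a surprise break into a scheduled review.
print(change_impact("vendor_portal_login_page"))
# {'invoice_bot', 'onboarding_bot'} (set order may vary)
```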
By taking a hard look at these hidden risks that may be undermining RPA uptime, processing companies can significantly improve bot availability. As processors look to the future, and to the technologies already beginning to be adopted industry-wide, increased uptime will spell the difference between maximizing business value by finally scaling automations enterprise-wide and becoming bogged down in constant break-fix cycles.
Put another way, average automation uptime currently stands at 92%. That means a single bot is unavailable for approximately 29 days each year because it has broken or thrown errors and cannot execute: 29 days of investigation, reconfiguration, testing and redeployment for every break that occurs.
In business-value terms, 92% automation uptime means roughly $20,000 worth of business value is lost while that single bot is unavailable. For one bot, that may not be a tremendous hit to the bottom line. For a more realistic digital workforce of 100 bots, though, 92% automation uptime means $2 million in lost business value annually.
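The arithmetic behind those figures is worth making explicit. The roughly $250,000 of annual business value per bot is not stated above; it is the assumption implied by the numbers themselves ($20,000 divided by 8% downtime):

```python
# Reproducing the uptime arithmetic above. The $250,000 annual value per bot
# is an assumption implied by the stated figures ($20,000 / 0.08), not a
# number given in the article.
UPTIME = 0.92
DAYS_PER_YEAR = 365
VALUE_PER_BOT = 250_000  # assumed annual business value per bot (USD)

downtime_days = (1 - UPTIME) * DAYS_PER_YEAR  # ~29.2 days per year
lost_per_bot = (1 - UPTIME) * VALUE_PER_BOT   # $20,000 per bot
lost_fleet = 100 * lost_per_bot               # $2,000,000 for 100 bots

print(f"downtime: {downtime_days:.0f} days/year")
print(f"lost value per bot: ${lost_per_bot:,.0f}")
print(f"lost value, 100-bot workforce: ${lost_fleet:,.0f}")
```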
The numbers don’t lie. Given the increased dependency of processing companies on RPA and the new technologies now coming online, processors need to act now to ensure their automations are maximizing ROI today and ready for hyperautomation, AI and the other advances that will enhance RPA in the future.
Charles Sword is the Chief Revenue Officer at Blueprint Software Systems and is responsible for all aspects of market development for Blueprint’s Enterprise Automation Suite, a powerful digital process design and management solution that enables enterprise organizations to identify, design and manage high-value automations with speed and precision in order to scale the scope and impact of their RPA initiatives. Charles is a recognized expert in emerging technologies with over 20 years of experience delivering high impact, strategic solutions for Global 2000 organizations and is passionate about delivering technology that helps teams to rapidly optimize, automate and digitally transform their organizations. For more information, call 1-647-288-0700 or visit https://www.blueprintsys.com/