By Andrew Van Nest, Blumberg Capital

Robotic Process Automation (RPA) has reached buzzword status in the pantheon of AI, ML, big data, “the uber of…” and the like, and rightfully so. A simple Google search for “RPA” surfaces Gartner’s report projecting RPA’s growth from 2018 to 2019 and estimating that the average large enterprise will spend $10-20M on process automation in 2020. RPA has the potential to become a truly disruptive and influential technology, as it promises to automate the mundane, repetitive tasks that have historically been done manually. The opportunity to reduce operational costs, increase productivity and free employees to focus on creative, high-value work is highly attractive.

However, as RPA gains popularity, one issue remains an ongoing problem: bots “break.” Once RPA initiatives are implemented, companies must face the reality that their process automations can be as fragile as glass. Many currently deployed bots rely on “screen scraping” or image-recognition methods to automate information-gathering tasks. Large enterprises’ IT and organizational structures are highly complex, and business processes are constantly in flux. As a result, even a small change to the UI, a newly connected API or a data transposition can interrupt a bot’s functionality. This breakdown in automation causes downtime and lost business value, and may require additional technical resources to repair.
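
To make the fragility concrete, here is a minimal, hypothetical sketch in Python. The “UI” is a mock list of widgets with positions and stable ids; the widget names, layout and helper functions are illustrative assumptions, not a real RPA API. It contrasts coordinate-based lookup (the screen-scraping style) with attribute-based lookup after a small UI change.

```python
# Hypothetical sketch: why screen-scraping bots break when the UI shifts.
# Widget names, coordinates and helpers are illustrative assumptions.

UI_V1 = [
    {"id": "username", "label": "User name", "x": 100, "y": 40},
    {"id": "submit",   "label": "Submit",    "x": 100, "y": 120},
]

# The same screen after a minor redesign: a banner was added and the
# submit button moved down 30 pixels.
UI_V2 = [
    {"id": "username", "label": "User name", "x": 100, "y": 40},
    {"id": "banner",   "label": "New offer!", "x": 100, "y": 120},
    {"id": "submit",   "label": "Submit",    "x": 100, "y": 150},
]

def click_by_position(ui, x, y):
    """Fragile: emulates coordinate-based screen scraping."""
    for widget in ui:
        if widget["x"] == x and widget["y"] == y:
            return widget["id"]
    return None  # the bot "breaks": nothing at the recorded coordinates

def click_by_selector(ui, widget_id):
    """More resilient: targets a stable attribute instead of pixels."""
    for widget in ui:
        if widget["id"] == widget_id:
            return widget["id"]
    return None

# The recorded coordinates still work on v1...
assert click_by_position(UI_V1, 100, 120) == "submit"
# ...but on v2 they now hit the new banner instead of the button,
assert click_by_position(UI_V2, 100, 120) == "banner"
# while the selector-based lookup survives the redesign.
assert click_by_selector(UI_V2, "submit") == "submit"
```

The same failure mode applies to image recognition: the bot keys on surface appearance rather than a stable identifier, so cosmetic changes silently redirect or break it.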

After understanding the limitations of RPA, we wanted to look at how the issue of fragility is being solved. Two core strategies rose to the top: 1) Process Discovery and 2) Open-Source Tools for Cloud-Orchestration.

Process Discovery

The degree of fragility depends on the complexity of the process the bot is meant to automate, which can make deploying RPA a headache. All too often, companies rely on a number of different stakeholders, including consultants, business analysts and/or developers, to map out an in-depth process for automation. And in some cases, too many cooks spoil the bot broth: this stakeholder complexity can result in time-consuming delays when problems need to be solved.

We believe a focus on two initial questions will prove a more effective path: 1) What processes should you automate? and 2) In what circumstances are bots most likely to break in the automation process?

Companies can answer these questions by implementing process discovery. Based on event logs, the process discovery methodology creates new process models where previous models either do not exist or no longer meet the enterprise’s needs. Using “parameterization” bots and “process mining” to monitor and map the automation from inception will help answer these questions. “Parameterization” bots are simple bots that capture ongoing changes in data sources (e.g., object types and organizational elements); deployed alongside RPA bots, they help the bots keep up with the application’s current state, however complex. “Process mining” covers a variety of techniques (e.g., the alpha algorithm, fuzzy miner and genetic miner) that support the analysis of business processes.
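
The core idea behind these mining techniques can be sketched in a few lines of Python. This is a simplified illustration, not one of the named algorithms: it builds a directly-follows graph from an event log, the basic structure that miners such as the alpha algorithm start from. The log format and activity names are assumptions for the example.

```python
# Minimal process-mining sketch: discover a directly-follows graph
# from an event log. Log format and activity names are illustrative.

from collections import defaultdict

# Event log: one (case_id, activity) row per event, in time order.
event_log = [
    ("invoice-1", "receive"), ("invoice-1", "approve"), ("invoice-1", "pay"),
    ("invoice-2", "receive"), ("invoice-2", "approve"), ("invoice-2", "pay"),
    ("invoice-3", "receive"), ("invoice-3", "reject"),
]

def directly_follows(log):
    """Count how often activity b directly follows activity a within a case."""
    traces = defaultdict(list)
    for case_id, activity in log:
        traces[case_id].append(activity)
    edges = defaultdict(int)
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):
            edges[(a, b)] += 1
    return dict(edges)

edges = directly_follows(event_log)
# The dominant path receive -> approve -> pay emerges, along with the
# receive -> reject variant that a process-discovery tool would flag.
assert edges[("receive", "approve")] == 2
assert edges[("approve", "pay")] == 2
assert edges[("receive", "reject")] == 1
```

Run continuously against fresh logs, a map like this is what lets a system auto-flag process variations before they break a deployed bot.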

Outlining these techniques upfront will enable the system to auto-flag process variations and provide fact-based evidence of changes in the automation process. The combination of these two mechanisms keeps the process automation flexible and resilient. It enables companies to implement and deploy more rapidly, reliably and with a plan to fix problems in the system, ultimately increasing uptime. Companies including FortressIQ, Mimica.ai and Celonis are working to solve this problem. 

Open-Source Tools for Cloud-Orchestration 

An open-source RPA platform is naturally collaborative because the formats for software bots are not proprietary to any single vendor. In addition, open-source frameworks turn bots into commodities. The open-source model has no licensing fees, so companies spend less on software. This enables organizations to combine software bots across different areas, such as workflow automations or ML applications, without multiple vendors or licenses.

There are a number of open-source projects to choose from: Wechaty (for WeChat only), Automagica and Robot Framework. Each has advantages and limitations, but with 4,400 stars and 104 contributors on GitHub, plus 300+ Python packages, Robot Framework is considered the most universal and one of the more robust solutions. It supports both tests and process automations, backed by an engaged community of developers who continuously maintain the extension libraries. That community makes Robot Framework bots more resilient and less fragile.
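
Part of what makes Robot Framework extensible is its keyword-driven style: automation steps are human-readable keyword names resolved against libraries of functions. The sketch below is a pure-Python illustration of that pattern, not Robot Framework’s actual API; the keyword names and task are invented for the example.

```python
# Hedged sketch of the keyword-driven style that Robot Framework
# popularized. Pure-Python illustration, not Robot Framework's API.

KEYWORDS = {}

def keyword(func):
    """Register a function under a human-readable keyword name."""
    KEYWORDS[func.__name__.replace("_", " ").title()] = func
    return func

@keyword
def open_browser(url):
    return f"opened {url}"

@keyword
def input_text(field, value):
    return f"typed {value!r} into {field}"

def run_task(steps):
    """Execute a task written as (keyword, args) rows, much like a
    .robot file lists keywords with their arguments."""
    return [KEYWORDS[name](*args) for name, args in steps]

task = [
    ("Open Browser", ("https://example.com",)),
    ("Input Text", ("username", "alice")),
]
results = run_task(task)
assert results == ["opened https://example.com",
                   "typed 'alice' into username"]
```

Because the task references keywords rather than hard-coded implementation details, swapping or patching a library behind a keyword fixes every bot that uses it, which is one reason a maintained extension ecosystem translates into less fragile automations.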

One example is Robocorp, which is building leading-edge tools for open-source RPA developers based on Robot Framework. Bots will become increasingly commoditized, and orchestration technology will define the utility and resilience of RPA.

Cloud-native orchestration will be the determining factor in the future of reliable RPA. We spoke with Antti Karjalainen, CEO of Robocorp, who said, “Self-hosting automation platforms are not feasible for the majority of enterprises that will benefit from the technology. Cloud orchestration will also add new functionality to robots and allow them to integrate into business operations ever more seamlessly.”

In addition to the projects mentioned above, there are a few other open-source RPA competitors, such as OpenRPA (a UiPath clone with a support-based business model), TagUI and Robin (a .NET-based RPA language). The winner of the orchestration race will dominate the RPA market, much as Kubernetes beat Docker Swarm in the container-orchestration battle.

As RPA adoption continues, we expect to see a maturation of the market, including improvement of current products, new product introductions and increasing customer adoption and satisfaction. Leveraging open-source frameworks, cloud orchestration and process discovery are all key. This new generation of RPA companies has the power and flexibility to reduce software bot fragility, lowering the barriers to entry for current and potential customers to operate more resilient networks and applications, increase uptime, reduce professional services and staffing, and cut costs.