
Intelligent automation is a big buzzword of late, usually used to mean the augmentation of RPA processes with machine learning, for example by plugging in a predictive model. At SKALER, we tend to use the term more broadly to cover intelligently designed solutions. RPA should be one tool in the box for facilitating automation, but not the be-all and end-all.  

Due to its versatility, it is very easy for RPA to become a Maslow's hammer. This can be seen in two different ways. Firstly, use cases that would be better off implemented with other, domain-specific technologies get approved as RPA projects. In other cases, the techniques used to achieve a particular business outcome are suboptimal from a technological perspective. Both scenarios result in the expansion of an organization's RPA codebase, which is undesirable if you think of RPA as the legacy IT of tomorrow. 

One might think that only organizations still in their RPA honeymoon period are susceptible to these kinds of missteps. However, in the absence of a holistic strategy concerning data and automation, even long-time practitioners of RPA commit the same crimes (strategy work is hard, and making robots is easy, right?). 

Many articles have been written about what to consider when selecting processes for automation. For example, the processes should be rule-based and involve sufficiently large transaction volumes. Even after finding the right candidate, the choices you make in the design phase can make or break the project. In this article I will go over a couple of common RPA "traps". 

The spreadsheet robot trap 

Teams in all organizations today are encouraged to leverage their data for "insights". At the same time, they may have acquainted themselves with process automation and software robots.  

If an organization's analytics practices have not yet fully matured, it may not be clear whom to turn to when a need for a new report arises. Consequently, what should be analytics exercises accidentally become RPA projects via the path of least resistance. The same can be seen in organizations where the data and RPA teams exist in their own silos, i.e., there is no meaningful interaction or alignment of objectives between the two competences. 

While it is common for automated processes to have some sort of reporting component, you should think twice before implementing robots whose sole purpose is to generate reports. There are a few drawbacks to RPA-driven data analytics: 

Lack of standardization – RPA developers are used to making (spreadsheet) reports, and their Excel-friendly tools certainly allow it. However, after implementing five robots, each churning out such reports, you are likely to have five report formats that don't look anything alike.  

Change friction – If reporting work is given to robots, even the slightest change to a report needs to pass through an RPA developer's desk. On the other hand, business intelligence tools such as Power BI, Tableau, Excel's Power Query(1) etc. are very approachable, allowing users to edit the dashboards and reports themselves without having to learn programming. This way, the users can reach their goals much faster and without unnecessary back-and-forth with the developer. 

Bureaucracy overload – If you have been doing RPA for a while, you probably have a process and controls in place regarding delivery and change management (as you should). But if some of your robots are just making non-critical reports, while others maintain master data in your financial system, does it really make sense to subject them to the same level of scrutiny? 

All SKALER solutions are built on top of a database where data about transactions are persisted. This allows us to skip implementing a custom reporting phase separately as part of each robot. Instead, we can expose the process data using the data visualization tool of your choice, in real time. 
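To illustrate the pattern, here is a minimal Python sketch of a robot persisting transaction outcomes to a database instead of assembling a spreadsheet report. The table name and columns are hypothetical, and SQLite stands in for whatever shared database a production setup would actually use:

```python
import sqlite3
from datetime import datetime, timezone

# Hypothetical schema: one row per transaction the robot handles.
conn = sqlite3.connect("process_data.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS transactions (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        process_name TEXT NOT NULL,
        reference    TEXT NOT NULL,
        status       TEXT NOT NULL,   -- e.g. 'completed', 'failed', 'skipped'
        detail       TEXT,
        finished_at  TEXT NOT NULL
    )
""")

def record_transaction(process_name: str, reference: str, status: str, detail: str = "") -> None:
    """Persist the outcome of one robot transaction for downstream reporting."""
    conn.execute(
        "INSERT INTO transactions (process_name, reference, status, detail, finished_at) "
        "VALUES (?, ?, ?, ?, ?)",
        (process_name, reference, status, detail, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()

# Called from the robot's main loop instead of writing rows into a report file.
record_transaction("invoice-handling", "INV-2023-001", "completed")
```

With the outcomes stored centrally, a Power BI or Tableau dashboard can query the table directly, and no individual robot needs to produce its own report format.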

 


The email robot trap 

Very often an RPA process starts with ingesting new transactions for processing. Email is a very popular interface for achieving this, due to its familiarity. In the most desperate cases one might end up surface-automating the Outlook client (yikes!). If possible, using email for handing over and managing robot work should be avoided. Just because human work is driven by email doesn't mean the same should apply to robots. Instead, consider leveraging actual work management systems, such as ServiceNow, if available. 
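As a sketch of what this can look like, the snippet below polls work items through the ServiceNow Table API with Python's requests library. The instance URL, credentials, table (sc_task) and query are placeholders; the details depend entirely on how work is modelled in your environment:

```python
import requests

# Hypothetical instance and credentials; the table and encoded query depend on
# how work items are modelled in your ServiceNow environment.
INSTANCE = "https://example.service-now.com"
AUTH = ("robot_user", "********")

def fetch_open_work_items(limit: int = 50) -> list[dict]:
    """Pull unassigned work items through the ServiceNow Table API instead of a mailbox."""
    response = requests.get(
        f"{INSTANCE}/api/now/table/sc_task",
        params={"sysparm_query": "active=true^assigned_toISEMPTY", "sysparm_limit": limit},
        auth=AUTH,
        headers={"Accept": "application/json"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["result"]

for item in fetch_open_work_items():
    print(item["number"], item["short_description"])
```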

Perhaps the biggest drawbacks of using email concern security. Even if your email service is appropriately hardened, validating the authenticity of incoming emails is tricky. Someone with knowledge of your processes can potentially generate illegitimate transactions that the robot goes on to process. Using email also means that copies of your process data become archived on your email server whether you intended that or not, and removal of messages (both inbound and outbound) needs to be handled somehow. Relying on email also has implications for availability; any outage in the email service will cascade into a robot outage. 

Email is also commonly used to distribute a robot's outputs (such as the aforementioned static report files), even in cases where an actual document management system (DMS) such as SharePoint is in use. A DMS solves many of the problems associated with email. For example, documents can be set up for automatic archival or removal based on their age, and you can manage access to historical outputs retrospectively with identity and access management tools. 
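For illustration, here is a hedged sketch of pushing a robot's output to SharePoint through the Microsoft Graph API instead of emailing it. The site id, target folder and access token are placeholders, and token acquisition (e.g. with MSAL client credentials) is omitted for brevity:

```python
import requests

# Hypothetical identifiers; resolving the site id and acquiring the token
# are assumed to happen elsewhere in the robot.
GRAPH = "https://graph.microsoft.com/v1.0"
SITE_ID = "<site-id>"
ACCESS_TOKEN = "<access-token>"

def upload_report(local_path: str, target_name: str) -> dict:
    """Store a robot's output in SharePoint instead of attaching it to an email."""
    with open(local_path, "rb") as file:
        response = requests.put(
            f"{GRAPH}/sites/{SITE_ID}/drive/root:/Reports/{target_name}:/content",
            headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
            data=file,
            timeout=60,
        )
    response.raise_for_status()
    return response.json()  # Drive item metadata, including a shareable webUrl

item = upload_report("weekly_summary.xlsx", "weekly_summary.xlsx")
print(item["webUrl"])
```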


The unnecessary-UI-automation trap 

UI automation means interacting with a system by simulating mouse clicks and keyboard inputs. Sometimes the target elements need to be identified by comparing against reference images or with OCR. These techniques work well for automating legacy systems, but they should only be considered a last resort, i.e., when the target system doesn't offer any other alternatives. Anyone who has worked with them knows how easily they are broken by the slightest change in the user interface. 

This point is worth repeating: do NOT use UI/surface automation on modern systems that provide APIs for fetching and updating data programmatically. Salesforce is a great example of such a system, as it is purposefully designed to allow power users to edit the UI as a self-service. In such environments, surface automations are a disaster waiting to happen. Yet you commonly see these systems being targeted by robots, even though they expose excellent APIs that you can expect to remain stable for a long time. 
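To make the alternative concrete, here is a minimal sketch of reading Salesforce records through its REST API rather than driving the UI. The org URL, API version and access token are placeholders; in practice the token would come from an OAuth flow such as JWT bearer:

```python
import requests

# Hypothetical org URL and token.
INSTANCE_URL = "https://example.my.salesforce.com"
ACCESS_TOKEN = "<access-token>"

def query_open_cases() -> list[dict]:
    """Read Salesforce records through the REST API instead of clicking through list views."""
    response = requests.get(
        f"{INSTANCE_URL}/services/data/v58.0/query",
        params={"q": "SELECT Id, CaseNumber, Subject FROM Case WHERE IsClosed = false"},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["records"]
```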

Web applications with modern architectures often allow fetching data from the backend programmatically, even if a publicly facing API has not been exposed. An expert RPA developer can harness this to their advantage. Besides added robustness, interacting with the API layer directly allows the robot to work faster: it can skip looking for UI controls, clicking and typing, and the browser can skip re-rendering the page. Commercial RPA tools focus on interacting with the UI layer and hence don't cater well to these kinds of advanced automation techniques. 
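A hedged sketch of the idea: reuse an authenticated session to call the same JSON endpoint the page itself loads (typically spotted in the browser's developer tools). The endpoint, cookie name and response fields here are hypothetical:

```python
import requests

# Hypothetical internal endpoint; the session cookie is reused from an
# authenticated login step performed earlier in the robot.
session = requests.Session()
session.cookies.set("sessionid", "<value captured after logging in>")

response = session.get(
    "https://app.example.com/api/orders",
    params={"status": "pending", "page": 1},
    headers={"Accept": "application/json"},
    timeout=30,
)
response.raise_for_status()

# The same data the UI would render as a table, without locating a single UI element.
for order in response.json()["items"]:
    print(order["id"], order["customer"], order["total"])
```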

You can do better

The aim of this article was to highlight the decision-making involved in implementing what we at SKALER consider to be intelligent automation. Oftentimes it is these seemingly inconsequential decisions that determine the success of your automations down the road. 

Refactoring the robots later is costly and takes time away from building new things. That is why it is vital to carefully evaluate your options, beginning from the design phase. This sometimes involves challenging those commissioning the work, which can be easier with the backing of a partner that has learned things the hard way. SKALER can support you at all the different phases involved in implementing automations, including candidate evaluation and solution design. 

About the writer:

Otto Ahonen

A consultant who thinks in code. A recent convert to the open source RPA movement.

https://www.linkedin.com/in/ottoa/

(1) Power Query is a component of Microsoft Excel and worthy of a special mention. Unlike classic Excel, it lets you define and run a set of repeatable procedures to arrive at your final report table. It's my go-to tool for automating small to medium scale analytics workflows. The report that your RPA developer says will take several days can probably be done in an afternoon with Power Query. And best of all, the solutions can be maintained by any reasonably advanced Excel user, who in most organizations are far more abundant than coders. 
