Like many new technologies, robotic process automation (RPA) disrupts traditional solutions and introduces new risks. Bots behave differently from traditional software and services because they use the same interfaces as a human user, yet they are inherently not human.
A bot’s access needs to be managed, and there are several factors to consider. Bots introduce new identity and access management (IAM) risks and challenges and cannot be treated like human identities. They add a layer of complexity because they abstract a human’s aggregate access: one person may own or control many individual bots. In addition, how a bot’s credentials are managed, and who can modify or execute bots within the RPA platform, must be controlled. For these reasons, bots cannot simply follow the traditional controls used for human identities. Here are a few examples of identity-related considerations to account for when implementing RPA bots:
| Component | Considerations |
| --- | --- |
| Identity lifecycle | • How is the bot’s identity being managed in the Identity Governance and Administration (IGA) product (e.g. SailPoint, Omada, Saviynt)? • Is the bot’s access to accounts and entitlements tracked and managed over time for traceability and audit purposes? • Is the bot’s access disabled when it is no longer required (e.g. when the bot is decommissioned)? • Is there a well-defined naming convention that clearly identifies bots and differentiates them from user and service accounts? (See the sketch after this table.) |
| Bot ownership | • Who is responsible for requesting and approving access for the bot? • How does ownership transfer as the bot’s tasks change? |
| Recertification | • Who is accountable for recertifying the bot’s access? |
| Privileged Account Management (PAM) | • Is the bot properly interacting with the PAM product when performing privileged activities? • How are the bot’s credentials stored, and are they susceptible to theft? • Are the bot’s activities and sessions logged and monitored? |
| Segregation of Duties (SoD) | • Does the introduction of bots pose any new SoD conflicts? |
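To make the identity-lifecycle considerations above concrete, here is a minimal sketch, assuming a hypothetical export of bot accounts from an IGA tool, of how an organization might check that bot accounts follow a naming convention and that decommissioned bots have had their access disabled. The field names and the "BOT-" prefix are illustrative assumptions, not any vendor’s schema.

```python
import re

# Illustrative assumptions: a "BOT-" prefix naming convention and a simple
# dict-based export of bot accounts from the IGA tool. Field names are hypothetical.
BOT_NAME_PATTERN = re.compile(r"^BOT-[A-Z0-9]+-\d{3}$")  # e.g. BOT-FINANCE-001

bot_accounts = [
    {"name": "BOT-FINANCE-001", "decommissioned": False, "enabled": True},
    {"name": "invoice_bot",     "decommissioned": False, "enabled": True},  # violates naming convention
    {"name": "BOT-HR-007",      "decommissioned": True,  "enabled": True},  # should have been disabled
]

def lifecycle_findings(accounts):
    """Flag bot accounts that break the naming convention or remain enabled after decommissioning."""
    findings = []
    for acct in accounts:
        if not BOT_NAME_PATTERN.match(acct["name"]):
            findings.append((acct["name"], "does not follow the bot naming convention"))
        if acct["decommissioned"] and acct["enabled"]:
            findings.append((acct["name"], "decommissioned but access is still enabled"))
    return findings

if __name__ == "__main__":
    for name, issue in lifecycle_findings(bot_accounts):
        print(f"{name}: {issue}")
```

In practice such checks would run against the IGA tool’s own inventory rather than a hard-coded list, but the governance questions they answer are the same ones listed in the table.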
Segregation of Duties (SoD) is often overlooked in RPA implementations. As organizations spin up bots to automate day-to-day processes, they assign business owners to manage those bots. The access requested for a bot can conflict with the business owner’s existing access, creating SoD violations without the organization’s awareness. Consider the following SoD use cases:
- Human to Bot SoD – A human’s direct access to a system conflicts with the access of a bot that the human owns.
- Bot to Bot SoD – A bot’s access conflicts with another bot’s access, and both bots are owned by the same human.
Most of today’s IGA tools are not equipped to identify the SoD violations that can arise when a bot is assigned to a business owner. Organizations implementing RPA need to revisit their access request workflows to detect and prevent these violations.
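As a rough illustration of the two use cases above, the sketch below assumes a hypothetical export of entitlements per identity, a bot-ownership mapping, and SoD rules expressed as pairs of conflicting entitlements. It flags human-to-bot conflicts (an owner’s direct access conflicts with a bot they own) and bot-to-bot conflicts (two bots with the same owner hold conflicting access). The data structures and names are illustrative, not taken from any IGA product.

```python
from itertools import combinations

# Illustrative data: entitlements per identity, bot ownership, and SoD rules
# expressed as pairs of conflicting entitlements. All names are hypothetical.
entitlements = {
    "alice":      {"ap.create_vendor"},
    "BOT-AP-001": {"ap.approve_payment"},
    "BOT-AP-002": {"ap.release_payment"},
}
bot_owners = {"BOT-AP-001": "alice", "BOT-AP-002": "alice"}
sod_rules = [
    frozenset({"ap.create_vendor", "ap.approve_payment"}),
    frozenset({"ap.approve_payment", "ap.release_payment"}),
]

def violated_rules(combined_access):
    """Return every SoD rule whose conflicting entitlements all appear in the combined access."""
    return [rule for rule in sod_rules if rule <= combined_access]

def human_to_bot_conflicts():
    """Flag cases where an owner's direct access conflicts with the access of a bot they own."""
    findings = []
    for bot, owner in bot_owners.items():
        for rule in violated_rules(entitlements[owner] | entitlements[bot]):
            findings.append((owner, bot, sorted(rule)))
    return findings

def bot_to_bot_conflicts():
    """Flag cases where two bots with the same owner hold conflicting access between them."""
    findings = []
    for bot_a, bot_b in combinations(bot_owners, 2):
        if bot_owners[bot_a] == bot_owners[bot_b]:
            for rule in violated_rules(entitlements[bot_a] | entitlements[bot_b]):
                findings.append((bot_a, bot_b, sorted(rule)))
    return findings

if __name__ == "__main__":
    print("Human-to-bot conflicts:", human_to_bot_conflicts())
    print("Bot-to-bot conflicts:", bot_to_bot_conflicts())
```

The key design point is that conflict checks must evaluate the owner and their bots as one aggregate set of access, which is exactly the step most access request workflows skip today.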
Privileged Account Management (PAM) is a major consideration when automating processes with bots. A bot must be granted specific access to perform the duties it is programmed to run, which often means privileged access. When access is requested for a bot, the bot is assigned an account to reach the necessary systems or assets, but how is that account managed? If the account grants privileged access to a system or asset, it should be managed in a PAM tool, requiring the bot to check out the credentials at runtime. The organization should also consider the humans responsible for managing the RPA platform: the platform itself is a sensitive asset, so the credentials used to access it should be managed in a PAM tool as well. Using PAM in both scenarios gives the organization greater control over bots.
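The check-out pattern can be illustrated with a minimal sketch, assuming a hypothetical PAM vault REST endpoint; the URL, parameters, and response fields below are placeholders, not CyberArk’s or any other vendor’s actual API. The point is that the bot fetches the credential at runtime instead of storing it in its own configuration.

```python
import requests

# Hypothetical PAM vault endpoint and parameters -- placeholders only,
# not any vendor's actual API. The bot retrieves the credential at runtime
# rather than keeping it in a config file or hard-coding it in the workflow.
VAULT_URL = "https://pam.example.com/api/credentials/checkout"

def checkout_credential(app_id: str, account_name: str) -> str:
    """Check out a managed credential from the PAM tool for the duration of the bot's run."""
    resp = requests.get(
        VAULT_URL,
        params={"appId": app_id, "account": account_name},
        timeout=10,
        # Assumes the bot authenticates itself to the vault with a client certificate.
        cert=("/etc/rpa/bot-client.crt", "/etc/rpa/bot-client.key"),
    )
    resp.raise_for_status()
    return resp.json()["password"]  # rotated by the PAM tool after each check-out

# Example usage: the bot checks out the ERP service account just before logging in,
# so no password is ever stored in the bot's workflow definition.
# erp_password = checkout_credential("rpa-invoice-bot", "svc-erp-posting")
```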
Vendor Marketplace
The IAM vendor marketplace is evolving and looking for ways to address identity-related security concerns. IAM vendors are adapting to the issues RPA raises and are continually advancing their tools. Here are two examples of vendors addressing IAM concerns around RPA:
- CyberArk has partnered with UiPath and Blue Prism to securely store and use the privileged credentials (e.g. user IDs, passwords, and/or keys) that bots rely on in an organization’s environment. CyberArk has developed connectors for these RPA solutions that integrate its product into the automation workflow, so the privileged credentials assigned to bots are protected within a secured PAM solution.
- The SailPoint 7.3 update introduces bots as an official identity type. This allows an organization to build workflows and logic specific to bot identities: as processes are automated, bot identities can be created within the IGA tool and assigned access and owners. SailPoint 7.3 also includes a feature to create policies (e.g. an entitlement SoD policy) using this identity type, which helps improve governance over bot identities and prevents bots from accumulating too much access. The release also supports access requests and access certifications for bot identities.
The marketplace has become aware of the challenges associated with the rapid adoption of RPA and continues to mature the tools needed to secure bots and their access. Organizations should engage their IAM vendors to discuss their RPA-related concerns.
What’s Next?
The potential benefits of RPA are substantial: it helps organizations save time and money while building efficiencies in performance and cost management over the long term. But moving to RPA without a solid plan for managing bots could become a serious misstep that ultimately costs the organization more than the efficiencies it gains.