How to Empower the “Sec” in DevSecOps

In this blog, I’m going to talk about Privileged Access Management (PAM) for security in a modern DevOps world. A very frustrating topic for sure, but please hang in there. There’s new light at the end of the DevSecOps tunnel!

The Sec (Security) in DevSecOps is typically forced into the mix by corporate policy or industry regulations. One of its pillars, Privileged Access Management (PAM), aims in part to simplify and centralize credential management (often referred to as Application-to-Application Password Management, or AAPM). Most DevOps teams, however, consider PAM a blot on the landscape because it gets in the way.

Traditional PAM tends to be complicated to deploy and manage, non-intuitive for modern workflows, and heavy on manual care and feeding. As such, it doesn’t fit a DevOps model that strives for speed and agility through automation. And although ignoring corporate security controls and practices is a policy violation, frustrated DevOps teams often bypass PAM anyway, increasing risk and wasting your PAM investment dollars.

But that’s changing.

I’m going to talk about modern AAPM and four ways PAM can improve security while making life easier for DevOps, so they’re more likely to embrace it and adhere to policy. Each successive approach adds incrementally more benefits, so you can pick the one that best suits your current maturity and risk tolerance.

1. Standard AAPM Approach Leveraging Vaulted, Static IDs/Passwords

Applications and services rely on IDs and passwords to authenticate to other workloads. These passwords are vaulted, and the developer makes a REST-based API call to check out the password.

All this requires setup, however. DevOps must vault all these passwords, give each workload a service account to log into the vault, configure roles and rights in the vault to constrain which passwords each workload can check out, and provide developers with code (e.g., an OAuth2 client) to access the vault.
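To make this concrete, here’s a minimal sketch of how a workload might check out a vaulted password over REST. The endpoint paths, field names, and environment variables are illustrative assumptions, not Centrify’s actual API.

```python
# Hypothetical sketch: a workload authenticates to a PAM vault with its
# service account (OAuth2 client credentials) and checks out a password.
# Endpoint paths and field names are assumptions for illustration only.
import os
import requests

VAULT_URL = os.environ["VAULT_URL"]             # e.g. https://vault.example.com
CLIENT_ID = os.environ["VAULT_CLIENT_ID"]       # the workload's service account
CLIENT_SECRET = os.environ["VAULT_CLIENT_SECRET"]

# 1. Authenticate to the vault and obtain an access token.
token_resp = requests.post(
    f"{VAULT_URL}/oauth2/token",
    data={"grant_type": "client_credentials",
          "client_id": CLIENT_ID,
          "client_secret": CLIENT_SECRET},
    timeout=10,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# 2. Check out the vaulted password for a named account.
checkout_resp = requests.post(
    f"{VAULT_URL}/api/accounts/checkout",
    headers={"Authorization": f"Bearer {access_token}"},
    json={"account": "db-service-account"},
    timeout=10,
)
checkout_resp.raise_for_status()
password = checkout_resp.json()["password"]     # use it, then let the vault rotate it
```

Notice that the workload still needs its own client ID and secret just to reach the vault, which is exactly the service-account sprawl called out in the cons below.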

Pro: Reduced risk by taking credentials out of code, where attackers can trawl for them.
Con: A lot of manual vault-related configuration, increasing operational overhead and impacting speed and agility. Multiply that by the number of workloads and it can stretch into the thousands.
Pro: Better intrinsic security around IDs/passwords, since they’re secured and centrally managed in the vault.
Con: Each new workload needs a service account to authenticate to the vault. Configuring hundreds or thousands of additional service accounts increases your attack surface.
Pro: Instead of passwords remaining static, the vault can rotate them routinely to increase entropy, making them less predictable and harder to exploit.
Con: Static passwords are not temporary. Even though the vault can rotate them, the rotation schedule tends to be infrequent (e.g., every 24 hours at best), giving an attacker a big window of opportunity to exploit.
Pro: Static passwords embedded in code can cause the application to fail if the original password changes. Checking out a vaulted, synchronized password via an API call helps ensure the retrieved password is always current.
Con: If an IT admin or attacker changes a password outside the vault’s control, workloads that rely upon the vault as the source of truth will fail, especially if the vault’s password rotation schedule is not sufficiently granular.¹
Con: Static passwords can be shared amongst developers and between workloads, increasing risk and impeding visibility into dependencies.

¹ If you use Centrify Client-Based Password Reconciliation, you can avoid this downside. Check out a recent blog post, here.

2. Standard AAPM Approach with SSH Keys

SSH Key Management

Many organizations recognize the risks associated with the widespread use of shared privileged accounts for authentication, both by human administrators and by workloads. In response, they’ve elected to use Secure Shell (SSH) keys instead.

With Centrify Privileged Access Service 20.4, you have the option of leveraging SSH keys, a matched pair of cryptographic keys that users or applications can use for authentication without any password being sent over the wire.

The PAM approach here is very similar to Option 1. Instead of checking out a vaulted password, the workload would check out a vaulted SSH key. The same pros and cons apply (repeated below, adapted for SSH keys), with the addition of a few SSH-specific ones:

Pro: Reduced risk by taking credentials out of code, where attackers can trawl for them.
Con: A lot of manual vault-related configuration, increasing operational overhead and impacting speed and agility. Multiply that by the number of workloads and it can stretch into the thousands.
Pro: Better intrinsic security around SSH keys, since they’re secured and centrally managed in the vault.
Con: Each new workload needs a service account to authenticate to the vault. Configuring hundreds or thousands of additional service accounts increases your attack surface.
Pro: Instead of SSH keys remaining static, the vault can rotate them routinely to increase entropy, making them less predictable and harder to exploit.
Con: Static SSH keys are not temporary. Even though the vault can rotate them, the rotation schedule tends to be infrequent (e.g., every 24 hours at best), giving an attacker a big window of opportunity to exploit.
Pro: Static SSH keys embedded in code can cause the application to fail if the original key changes and the public key is updated on the target. Checking out a vaulted, synchronized SSH key via an API call helps ensure the retrieved key is always current.
Con: If an IT admin or attacker changes an SSH key outside the vault’s control, workloads that rely upon the vault as the source of truth will fail, especially if the vault’s key rotation schedule is not sufficiently granular.¹
Pro: SSH keys are more challenging to hack, and the private key is never sent over the wire, so they’re more secure than passwords.
Con: Static SSH keys can be shared amongst developers and between workloads, increasing risk and impeding visibility into dependencies.
Pro: Instead of SSH keys remaining static, the vault can rotate them routinely to reduce the risk of compromise.
Con: SSH keys can be harder than passwords to keep track of, and they lack visibility into how they’re being used, by whom, or for what.
Pro: Static SSH keys embedded in code can break things if the key changes. Checking out a vaulted SSH key via an API call helps ensure the retrieved key is always current.
Con: Allowing developers to create keys without any central management constitutes a shadow-IT problem. When a developer leaves the company, what happens to their keys? Lost SSH keys can impact system/service availability.
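As a rough illustration of this SSH-key variant, here’s a minimal sketch of a workload checking out a vaulted private key and using it for key-based SSH authentication. The vault endpoint, field names, and the use of the paramiko library are assumptions for illustration, not a prescribed Centrify integration.

```python
# Hypothetical sketch: check out a vaulted SSH private key and use it for
# key-based authentication. Endpoint paths and field names are assumptions.
import io
import os
import requests
import paramiko

VAULT_URL = os.environ["VAULT_URL"]
ACCESS_TOKEN = os.environ["VAULT_ACCESS_TOKEN"]  # obtained as in the Option 1 sketch

# 1. Check out the vaulted private key (kept in memory, never written to disk).
resp = requests.post(
    f"{VAULT_URL}/api/sshkeys/checkout",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"key_name": "build-server-key"},
    timeout=10,
)
resp.raise_for_status()
private_key_pem = resp.json()["private_key"]

# 2. Authenticate to the target host with the key instead of a password.
pkey = paramiko.RSAKey.from_private_key(io.StringIO(private_key_pem))
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # demo only; verify host keys in production
client.connect("target.example.com", username="svc-build", pkey=pkey)
stdin, stdout, stderr = client.exec_command("uptime")
print(stdout.read().decode())
client.close()
```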

3. Standard AAPM Approach with Ephemeral Tokens

This model dramatically improves on Options 1 and 2. Ephemeral tokens are a modern form of secure access: temporary, time-bound, and automatically expiring. They’re also created automatically by the vault, on demand, freeing DevOps from installation, configuration, and rotation issues while contributing to a best-practice “Just-In-Time” access control model.

Developer code can request a token from the vault (e.g., OAuth2 or OpenID Connect) and use that to authenticate to another workload. Once that session is complete, the token disappears.
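As a minimal sketch (with illustrative endpoint paths, scope names, and field names rather than Centrify’s actual API), a workload might request a short-lived token and present it as a bearer credential to another workload. Unlike Options 1 and 2, the token itself is the credential, and it expires on its own, so there’s nothing left to rotate or revoke afterwards.

```python
# Hypothetical sketch: request a short-lived OAuth2 token from the vault and
# use it to call another workload. Paths, scope names, and field names are
# assumptions for illustration only.
import os
import requests

VAULT_URL = os.environ["VAULT_URL"]
CLIENT_ID = os.environ["VAULT_CLIENT_ID"]
CLIENT_SECRET = os.environ["VAULT_CLIENT_SECRET"]

# 1. OAuth2 client-credentials grant; the token carries its own expiry (e.g., minutes).
resp = requests.post(
    f"{VAULT_URL}/oauth2/token",
    data={"grant_type": "client_credentials",
          "client_id": CLIENT_ID,
          "client_secret": CLIENT_SECRET,
          "scope": "orders:read"},
    timeout=10,
)
resp.raise_for_status()
token = resp.json()  # e.g. {"access_token": "...", "expires_in": 300, ...}

# 2. Authenticate to the downstream workload with the ephemeral bearer token.
api_resp = requests.get(
    "https://orders.internal.example.com/api/orders",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    timeout=10,
)
api_resp.raise_for_status()
# Once the token expires, nothing static remains to steal, share, or rotate.
```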

Most of the same pros and cons from Option 1 apply, but ephemeral tokens bring additional benefits.

Pro: Reduced risk by taking credentials out of code, where attackers can trawl for them.
Con: A lot of manual vault-related configuration, increasing operational overhead and impacting speed and agility. Multiply that by the number of workloads and it can stretch into the thousands.
Pro: The vault generates the token uniquely for each workload, just-in-time, reducing manual overhead.
Con: Each new workload still needs a service account to authenticate to the vault. Configuring hundreds or thousands of additional service accounts increases your attack surface.
Pro: Tokens are short-lived, existing only for the duration of access, and thus more secure since there’s nothing static or physical to share or lose. Also, should an attacker compromise a server, there’s no static credential to steal.

4. A Modern AAPM Approach with Centrify Delegated Machine Credentials and Ephemeral Tokens

Centrify Delegated Machine Credentials

Finally, this newest model dramatically improves on all the previous ones, reducing PAM’s manual work and operational overhead to near zero. When deploying new VMs, cloud-based instances, or containers, DevOps can fully automate obtaining, installing, and configuring the Centrify PAM software and enrolling the machines in the Centrify Platform.

More specifically, for workload automation, the PAM mechanisms that enable new workloads to authenticate to the vault and consume vault services (such as retrieving a secret or obtaining an OAuth2 token) are also automated. During machine deployment, a Centrify Client is downloaded from a Centrify repository, installed, and used to enroll the machine in the Centrify Platform with one or more Centrify roles to constrain access. This results in a trust relationship and a unique service account for the machine (aka a machine identity) along with a credential allowing the machine to authenticate to the Centrify vault and consume those pre-authorized services.
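For the deployment step, a team might bake enrollment into its provisioning scripts. The sketch below assumes a cloud-init or user-data style hook invoking the Centrify client’s enrollment command with an enrollment code; the exact command name and flags may differ, so treat them as placeholders and verify against the product documentation.

```python
# Hypothetical provisioning hook: enroll a freshly booted machine in the
# Centrify Platform using an enrollment code. The command name and flags are
# illustrative; verify them against the Centrify client documentation.
import os
import subprocess

TENANT_URL = os.environ["CENTRIFY_TENANT_URL"]             # e.g. https://tenant.my.centrify.net
ENROLLMENT_CODE = os.environ["CENTRIFY_ENROLLMENT_CODE"]   # injected securely by the pipeline

subprocess.run(
    [
        "cenroll",                       # Centrify client enrollment command (assumed name)
        "--tenant", TENANT_URL,
        "--code", ENROLLMENT_CODE,       # avoids embedding a static ID/password
        "--features", "all",             # illustrative flag; scope to what you need
    ],
    check=True,
)
# From here on, the machine has its own identity and credential, and local
# workloads can use the Delegated Machine Credential flow described next.
```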

Next is the unique part. The machine can then delegate its credential for use by local workloads, obviating the need for DevOps to create and manage individual service accounts for each one. The workload uses the Centrify Delegated Machine Credential to authenticate to the vault and, in turn, receives an OAuth2 bearer token it can use to consume vault services. Since not all workloads require the same vault access, the OAuth2 token can be scoped per-workload, to constrain access to specific vault APIs.

Once authenticated, workloads can (based on permissions) access vaulted objects such as account passwords, SSH keys, and secrets, or request additional ephemeral tokens, such as OAuth2, for authentication to other systems and workloads.
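Here’s a rough sketch of that runtime flow from a workload’s point of view. The local helper endpoint, scope name, and API paths are hypothetical placeholders; the real Centrify client exposes its own commands and APIs for exchanging the delegated machine credential for a scoped token.

```python
# Hypothetical sketch: a workload exchanges the machine's delegated credential
# for a scoped OAuth2 token, then retrieves a vaulted secret. The local helper
# endpoint, scope name, and API paths are placeholders, not Centrify's actual API.
import requests

VAULT_URL = "https://tenant.example-pam.example.com"        # placeholder tenant URL

# 1. Ask the locally enrolled client for a token scoped to this workload.
#    The machine, not the workload, holds the credential; the workload only
#    ever sees a short-lived, narrowly scoped bearer token.
token_resp = requests.post(
    "http://127.0.0.1:8099/delegated-credential/token",     # hypothetical local helper
    json={"scope": "secrets:read:payments-app"},
    timeout=5,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

# 2. Use the scoped token to retrieve a secret that the machine's role pre-authorizes.
secret_resp = requests.get(
    f"{VAULT_URL}/api/secrets/payments-app/db-password",    # hypothetical path
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=10,
)
secret_resp.raise_for_status()
db_password = secret_resp.json()["value"]
# No per-workload service account was created or stored anywhere in this flow.
```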

With a machine-to-vault trust relationship, a single service account per machine, roles, and OAuth scopes constraining vault access per workload, all with 100% automation, the benefits from a security and operations perspective are significant:

Full automation enabling PAM to become a first-class citizen of the DevOps pipeline – finally.

One service account (automatically provisioned during enrollment) to authenticate to the vault, avoiding per-workload service account sprawl, reducing overhead and the attack surface.

Ephemeral OAuth2 tokens avoid the risk inherent with static credentials.

Centrally control and manage system and workload access to vault objects and API services. This can be set up in advance (pre-authorized for automated system enrollment).

Machine access to vault objects is controlled via roles during enrollment, and OAuth scopes control which APIs each workload can use. This can be pre-configured once, in advance.

Centrify “Enrollment Codes” provide a secure way for machines to enroll in the Centrify Platform (avoiding the need to use a static ID/password) and automatically add the machines to a role.

As you can see, PAM and AAPM have progressed to better support cloud migration and general DevOps use cases. I hope this gives you some useful food for thought as you (re-)evaluate PAM-related requirements yourself. Please click this link for a short demo of Centrify Delegated Machine Credentials.

If you need more information, please visit our website or reach out to a local partner or Centrify account manager, who will be happy to answer any questions.

This post was first published on the Blog | Centrify website by TonyGoulding. You can view it by clicking here