Research — 24 Jan, 2022

Data security, 'shifted left'

Introduction

Enterprises are encountering a proliferation of applications that store, process and transmit a wide variety of sensitive data. To keep this data safe, developers must "shift left" — a term derived from moving a task to the left on the project timeline — to address security earlier in the development process, building in data security that enables confidentiality and integrity. In this report, we discuss some of the context, current challenges and principles guiding developer experience when implementing data security.


Data security has historically lagged application development. Frequently bolted on to environments after applications have been developed, traditional data security solutions can be rigid and bring high opportunity costs for the sake of compliance or better risk management. DevOps and agile now push developers toward continuous delivery while adding greater security challenges. The way forward must easily enable developers to secure both their data and their application infrastructure, with the dual benefits of getting to market safer and faster.

As data itself is increasingly dynamic in enterprise applications, more intrinsic data security will have to follow. When there are new purposes for data, data security must be built in. Controls can no longer be static; long gone are the days in data security of "set it and forget it." Data security vendors should be partnering with their customers to deliver all these values and continuously evolve their understanding of deployment and developer experience — data security cannot afford to become technical debt. With developer experience, or DX, principles such as simplicity, standardization and speed, data security can be one of the first development considerations, rather than one of the last.


Context

One of the first major catalysts for data security was the creation of the Payment Card Industry Data Security Standard. It made data security a "must have" rather than a "nice to have." When introduced in the mid-2000s, PCI DSS was the first compliance mandate with prescriptive technical controls like antivirus, patching, monitoring and encryption. It was also the first to be enforced, and enforcement came from payment card networks like Visa Inc. PCI DSS was intended to be a data security standard with industry (rather than government) oversight, so organizations scrambled to put data security into place that would satisfy the standard's requirements and allow them to continue processing payments. Vendors that provided database and web application monitoring (such as Imperva), data loss prevention, and encryption key management were key beneficiaries and grew substantially thanks to the stimulus provided by PCI.

Security vendors and practitioners took the lessons learned from the maturation of PCI assessment and remediation and applied them to other classes of data. Initiatives such as the Health Insurance Portability and Accountability Act and the Health Information Trust Alliance added further scope for specific types of sensitive information commonly seen in healthcare, such as a provider's national provider identifier — a standard for uniquely identifying individual covered healthcare providers — and personally identifiable information.

In these initial compliance cases, targets were fixed and urgent. Relational databases and their payment applications often ran in fixed, on-premises environments, and many approaches to data security were "bolted on" after the application was developed. Many data security products were specific to individual applications, so enterprises had to piece together multiple data security tools. More recently, the broader trends for applications to be more modular via cloud, virtualization and agile development have increased the need for data security to be directly built in by application developers themselves. According to 451 Research's Voice of the Enterprise data, security tool deployment for application teams rose from 29% of respondents to 48% between 2015 and 2020.

Current challenges

Bolted-on data security faces significant challenges, such as scale, risk and application economics. Applications have moved away from monolithic architectures where a single data security solution could be placed in line to help secure an entire stack. More modern architectures are dynamically scalable; data security solutions must therefore scale with the application's needs. As some applications have moved to the cloud or to cloud-native architectures, the previous generation of security tooling may not work in these environments, or in those of a given cloud service provider. As compute, storage and network have been virtualized and distributed across a variety of more recent cloud-native techniques, existing or legacy data security controls may not have been designed to support these newer operating environments.

Many past compliance efforts produced documents or reports that summarized only a snapshot of the application for the annual audit and remediation, not necessarily how controls and procedures change over time. Practically speaking, it can be difficult to scale and alter controls that are only documented from a point-in-time perspective. Data security controls that had rigid allow and deny lists and a "set it and forget it" approach may have won a tactical data security battle but are ill equipped to scale and proactively manage risk in highly dynamic modern environments.

Many data security controls were not designed for the cloud or initially licensed via common cloud consumption models. Data security needs to be as economically flexible as other application resources. Data security deployment must be agile without sacrificing security or rapid innovation.

Historically, data security has also not been portable; controls have typically been written or designed for a specific platform. For example, data security controls designed for relational databases might not work for Hadoop or MongoDB; encryption for Salesforce might not work for Microsoft 365. This has been one of the biggest limitations of traditional data security. Shifting data security left and building it in earlier in the process should, in theory, enable a broader range of downstream applications and platforms to implement data security controls.

The shift to DevOps has developers taking on additional tasks in disciplines they may not have expertise in implementing, such as threat modeling, secure coding and secure code reviews. Developers cannot be expected to be data security experts — nor should they be. With these new responsibilities, data security tools must provide a good DX to be implemented securely and well. In other words, data security "shifted left" has to make it seamless for developers to integrate data security into their builds with limited friction.

For security vendors, positive DX has thus far had a positive impact on valuations. Okta Inc.'s $5.6 billion purchase of application programming interface-first identity and access management vendor Auth0 Inc. and HashiCorp Inc.'s projected IPO valuation of $13 billion in December are two examples. Below, we break down the principles and evolution of data encryption DX: simplicity, standardization and speed.

Simplicity

For developers, encryption has historically been complex, with many design choices to make. If their objective is to create an application's primary functionality, the requirements of encryption may leave them well out of their depth. When developers do get around to encryption, the choices include algorithm, mode of operation, key length, initialization vector and padding. Additionally, managing encryption keys adds choices for key generation, storage, revocation and rotation. Developers then have further choices to make for authentication, authorization, and message integrity for recipients and senders.

To describe this complexity, consider the steps Alice must implement to encrypt a message for Bob. She must both encrypt the message and ensure that only Bob can access its contents. Bob must be able to verify that Alice, and only Alice, sent the message. Each step listed here is carried out with an established cryptographic library, such as the Java Crypto API or OpenSSL's libcrypto, and a code sketch of the full flow follows the list:

* Alice creates an AES encryption key for her message.

* Alice encrypts the message with the AES key.

* Alice's encrypted message is hashed with the SHA-256 algorithm.

* Alice reads in her private RSA key.

* Alice creates a signature with the message hash and her RSA private key. The signature authenticates Alice.

* Alice reads in Bob's public RSA key, typically available in verifiable format via Bob's digital certificate.

* Alice encrypts the encrypted message, the shared AES key, the signature and hash value with Bob's public key. This assures that only Bob can decrypt this content with his corresponding private key, maintaining confidentiality between the two of them.

* Alice sends the message.
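
To make the friction concrete, the sketch below walks through roughly the same flow in Python with the widely used "cryptography" package. It is a minimal illustration rather than production guidance: the message text, key sizes and padding choices are our assumptions, it follows the common hybrid convention of wrapping only the AES key (not the full encrypted message) with Bob's RSA public key, and both RSA key pairs are generated inline where a real application would load them from certificates or a key store.

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Long-term RSA key pairs, generated inline for illustration only.
alice_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
bob_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
bob_public = bob_private.public_key()

message = b"Meet at noon"  # illustrative plaintext

# Steps 1-2: Alice creates a fresh AES-256 key and encrypts the message.
aes_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, message, None)

# Steps 3-5: Alice signs the ciphertext with her RSA private key; the
# library hashes it with SHA-256 internally. The signature authenticates Alice.
signature = alice_private.sign(
    ciphertext,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# Steps 6-7: Alice wraps the AES key with Bob's RSA public key so that only
# Bob's corresponding private key can recover it.
wrapped_key = bob_public.encrypt(
    aes_key,
    padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)

# Step 8: Alice sends (nonce, ciphertext, wrapped_key, signature) to Bob.
# Bob unwraps the key, verifies the signature, then decrypts the message.
recovered_key = bob_private.decrypt(
    wrapped_key,
    padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)
alice_private.public_key().verify(
    signature, ciphertext,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
plaintext = AESGCM(recovered_key).decrypt(nonce, ciphertext, None)
assert plaintext == message
```

Every argument here (padding scheme, salt length, nonce size, key length) is a decision point where a wrong choice can silently weaken the scheme, which is precisely the DX burden the newer libraries aim to remove.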

Given this complex DX, the most recent 2021 OWASP Top 10 still lists cryptographic failures as its No. 2 problem category. The cryptographic failures are almost always in the individual implementations — attackers get around encryption by exploiting encryption that is poorly or improperly implemented. Newer developer libraries and protocols like NaCl (Salt), Google's Tink, Signal and ACME are offering high-level interfaces that simplify the DX. In the case of NaCl, it consolidates these eight steps into a single step.
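
As a rough illustration of that consolidation, and not a one-to-one translation of the steps above, a sketch using the PyNaCl binding to NaCl lets Alice authenticate and encrypt in a single call, with the key agreement, symmetric cipher, nonce handling and integrity protection all chosen by the library:

```python
from nacl.public import Box, PrivateKey

# Key pairs generated inline for illustration.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob and authenticates herself in one call.
sealed = Box(alice_key, bob_key.public_key).encrypt(b"Meet at noon")

# Bob decrypts and simultaneously verifies that the message came from Alice.
assert Box(bob_key, alice_key.public_key).decrypt(sealed) == b"Meet at noon"
```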

Vendors are beginning to simplify the DX, and one such vendor is Evervault. Front-end developers who collect sensitive data, such as a name or email address, simply tell Evervault which fields are to be encrypted. Evervault uses a relaying mechanism and domain name system changes to redirect those data elements to Evervault for encryption. Developers do not need to make any choices about authenticated data encryption or key management. Moreover, they need to make negligible changes to their code to leverage the service. Much as services like Twilio simplify and abstract communication and telephony, vendors such as Evervault aim to do the same for data security.

Standardization

Given its historical complexity, enterprise data encryption has been anything but standard. Software-based key and secrets management systems like HashiCorp Vault and Amazon.com Inc.'s AWS KMS readily centralize keys, certificates and secrets for the application, its identities and infrastructure. Vault features 100-plus integrations. AWS KMS provides encryption key management for more than 90 AWS offerings, including EC2, DynamoDB, S3 and RDS.

When developers in an organization can standardize and centralize their secrets management, secure code review scope can be reduced. Developer-friendly standardized solutions can be immutable, versioned and audited. Entire workflows can be automated. Developers spend less time reconstructing security controls and more time building products.
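
As a minimal sketch of what that centralization can look like in code, the example below uses AWS KMS through the boto3 SDK for envelope encryption. The key alias is a placeholder assumption that would have to exist in the account, and IAM policy, error handling and audit configuration are omitted.

```python
import os

import boto3
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

kms = boto3.client("kms")

# Ask KMS for a fresh data key under a centrally managed master key.
# "alias/app-data-key" is a placeholder; the alias must exist in the account.
resp = kms.generate_data_key(KeyId="alias/app-data-key", KeySpec="AES_256")
plaintext_key = resp["Plaintext"]        # used locally, never persisted
encrypted_key = resp["CiphertextBlob"]   # stored alongside the ciphertext

nonce = os.urandom(12)
ciphertext = AESGCM(plaintext_key).encrypt(nonce, b"customer record", None)

# Later, any authorized service unwraps the data key via KMS, so key policy,
# rotation and the audit trail all live in one place.
recovered_key = kms.decrypt(CiphertextBlob=encrypted_key)["Plaintext"]
assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == b"customer record"
```

The same pattern applies to software-based systems like HashiCorp Vault, where wrapping and unwrapping keys become API calls against a centrally managed and audited service rather than logic each team reinvents.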

Speed

Time-to-market and personnel constraints are arguably the developer's greatest challenges. While compliance, risk and regulatory requirements drive many data security initiatives, it is developers who will be the ultimate implementers, given the overall time-to-market challenges. Data security providers must therefore strategically understand these imperatives if they are to see significant developer adoption. No longer are data security solutions separate from other go-to-market activities. For these data security vendors, the developer experience is both the effective go-to-market strategy and the product.

Good DX incorporates documentation, community and functionality so that developers can learn to build quickly. Some FOSS and SaaS vendors, including startups and major cloud providers, have made major steps to onboard developers. For community and documentation, Slack or Discord forums, full API documentation, tutorials, and a jargon-free presence on Quora, Twitter, YouTube or Reddit may be advisable. Other developer-friendly touches might include a cookbook of clear functional examples to facilitate rapid understanding and application. SaaS offerings can even offer troubleshooting and support for any prototypes or trials that developers have started. Developers often depend on the community when seeking out preferred solutions that incorporate good documentation and functionality, highlighting the value of engagement in constantly educating and enabling developers to build better, more securely.

The death of "set it and forget it"

While the focus here has been on data security, specifically encryption and related disciplines of encryption key, secrets, certificates and signature management, there are still other data security tools undergoing improvements in DX. Advances in the understanding of the data lifecycle, from the origin of data to its destruction/revocation, may affect the way encryption controls are applied. Other newer use cases around privacy-enhancing technologies may also alter DX.

As data itself is increasingly dynamic in enterprise applications, more intrinsic data security will have to follow. When there are new purposes for data, data security must be built in. Controls can no longer be static; long gone are the days in data security of "set it and forget it." Data security vendors should be partnering with their customers to deliver all these values and continuously evolve their understanding of deployment and developer experience, as data security cannot afford to become technical debt. With greater simplicity, standardization and speed, data security can be one of the first development considerations, rather than one of the last.

This article was published by S&P Global Market Intelligence and not by S&P Global Ratings, which is a separately managed division of S&P Global.
