Managing Encryption for Data Centers Is Hard. And It Will Get Harder

Give up on trying to do it all yourself and leave it to the experts.

Encryption is a core tenet of cybersecurity. Attackers can't steal data that's encrypted. No matter what Hollywood says, there's no way for a hacker to get through one layer of good encryption, much less "several layers of encryption."

Related: Why Google Cloud Turned to AMD to Solve for Runtime Encryption

But encryption comes with a lot of challenges.

In a survey last fall by the Cyber Security Competency Group, 66 percent of respondents said that the management of encryption keys was either a "big" or "medium" challenge for their companies.

Related: Quantum Computing Doesn't Threaten Good Encryption – Yet

Managing keys across multiple clouds was an even bigger challenge.

In a similar study by Ponemon Institute and Encryption Consulting last year, 60 percent of respondents said that key management was "very painful."

The top reason for the pain? Knowing who's in charge of all the keys. Other pain points include lack of skilled personnel and isolated or fragmented key management systems.

Meanwhile, encryption is evolving, and keeping on top of all the encryption algorithms is a challenge. Encryption involves some heavy-duty math. It's easy to make a mistake.

Within the next decade, respondents expect to see mainstream enterprise adoption of new approaches like multi-party computation, homomorphic encryption, and quantum algorithms.

As with any security technology, encryption is a constant game of cat and mouse. Attackers try to find vulnerabilities in algorithms. To keep up, defenders improve the algorithms themselves, strengthen how they are implemented, or increase the length of encryption keys.

That means any long-term encryption strategy has to allow for the possibility of upgrading either the algorithms or the keys.

Consider, for example, the servers managing internet communications. To encrypt a message, both the sender and the receiver have to agree on the encryption method and key length they're using, said Mike Sprunger, senior manager of cloud and network security at Insight.

"When those servers are deployed, they have a list of algorithms ranked from most desired to least desired and they will negotiate to find the highest-level match," he told DCK.

Unfortunately, those lists can get out of date, he said. "Often, when servers are deployed, they're never touched again."
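
To make that concrete, here is one way to check what a server actually negotiates today, using Python's standard ssl module. (The host name below is a placeholder, not one from the article.)

```python
import socket
import ssl

# A client context with the library's modern defaults. Its ranked cipher
# list plays the "most desired to least desired" role Sprunger describes;
# the handshake settles on the highest-level mutual match.
context = ssl.create_default_context()

host = "example.com"  # placeholder host
with socket.create_connection((host, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        # cipher() reports the suite both sides agreed on:
        # (suite name, protocol version, secret bits)
        print(tls.version())  # e.g. 'TLSv1.3'
        print(tls.cipher())   # e.g. ('TLS_AES_256_GCM_SHA384', 'TLSv1.3', 256)
```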

The one bright side here is that online communications are ephemeral. Keys are created, used, and immediately discarded.

When it comes to long-term storage, however, those keys sometimes have to be good for years. Some companies have business or regulatory requirements to keep data for a decade or longer.

If the encryption becomes outdated or the keys themselves are compromised, data centers have to decrypt all their old data and re-encrypt it with new, better encryption.

"A good practice is to rotate keys regularly," said Sprunger.

This can easily become an administrative nightmare if a data center operator is doing it alone.

"The good vendors have a mechanism for going through and recycling and replacing keys," he said.

If anything goes wrong and keys are lost, so is the data.

Encryption also plays a role in generating the certificates used to digitally sign and authenticate systems, users, and applications. If those certificates expire, get lost, or get compromised, companies could lose access to their applications, or attackers could gain access.

"Most organizations do not do a good job managing that," said Sprunger. "And if they don't manage the certificates properly, they run the risk of shutting down their organization. I recommend that if they're not good at managing certificates, they go to a third-party provider."

If encryption is handled by hardware, upgrading can be a particular challenge for data centers that have opted to buy and maintain their own equipment.

Hardware acceleration can result in both speed and security improvements, but the hard-coded algorithms can also become old and obsolete.

"Now I've got to go back in and replace equipment to get a different algorithm or a bigger key size," said Sprunger.

On the other hand, if a particular system has embedded hardware-based encryption, like an encrypted drive, then when the devices are replaced, the new ones will automatically come with the newer, better encryption.

"That will be a fairly painless upgrade," said Tom Coughlin, IEEE fellow and president of Coughlin Associates.

With software-based encryption that encompasses multiple systems, upgrades can be a bigger challenge.

"There may be issues, depending upon how many of these exist and how much they depend upon each other," he said.

When choosing encryption vendors, data centers should look for those that are FIPS 140-2 compliant, said Insight's Sprunger.

Getting this certification is difficult and expensive, involving third-party security reviews, but it is a federal mandate for government contracts.

"Having been a director of technical engineering for a company that built encryption appliances, it's an arduous process," he told DCK. "But table stakes."

Any vendor should be able to respond right away to questions about compliance, he said.

There are many vendors and organizations working on new encryption technologies and creating the standards required to ensure that everyone is moving in the same direction. Data center managers looking to buy equipment that will set them up for the future, particularly the quantum future, will have to wait for both the technologies and the standards to emerge.

The picture is a little clearer, for now at least, when it comes to symmetric encryption. That's when the same key is used to both lock and unlock the data, such as when a company stores backups.

To keep data secure for a year or two, current 128-bit encryption is enough, said Simon Johnson, senior principal engineer at Intel.

"If you're looking to keep secrets beyond 15 to 20 years, then folks are starting to recommend at least 256 bits," he told DCK. That'll keep us secure even when the first wave of quantum computers gets here.

Fortunately, today's chips can support that level of encryption, Johnson said. "The AES (advanced encryption standard) operations are there for doing that. It's just a matter of changing your software to go those lengths."
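
As a sketch of how small that software change can be, here is AES-256-GCM via Python's cryptography package (the library choice is an assumption, not Intel's recommendation); where the processor exposes AES instructions, the library uses them transparently.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# A 256-bit key, per the recommendation for secrets that must survive
# 15 to 20 years; switching from 128-bit is just bit_length=128 -> 256.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

nonce = os.urandom(12)  # standard GCM nonce size; never reuse per key
ciphertext = aesgcm.encrypt(nonce, b"long-lived archive data", None)
assert aesgcm.decrypt(nonce, ciphertext, None) == b"long-lived archive data"
```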

Asymmetric encryption, where one key is used to encrypt a message and a different key is used to decrypt it, is a bit more challenging. This is the type of encryption used for communications; it's also known as public key cryptography and underpins public key infrastructure.
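
As an illustration of that split (again using the cryptography package as an assumed library), here is an RSA key pair where anyone holding the public key can encrypt but only the private key holder can decrypt:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Generate a key pair; the public half can be shared freely.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
public_key = private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Anyone can encrypt with the public key...
ciphertext = public_key.encrypt(b"session key material", oaep)

# ...but only the private-key holder can decrypt.
assert private_key.decrypt(ciphertext, oaep) == b"session key material"
```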

The next stage of evolution of this type of encryption is still up in the air, he said.

"We're still waiting for NIST (National Institute of Standards and Technology) and the academic world to really focus on providing mechanisms that will do asymmetric encryption in the post-quantum world," he said. "We're waiting for standards. Not just Intel the world is waiting for standards. There's no post-quantum standards in that space."

But creating new encryption algorithms, testing them, developing standards, getting industry buy-in, and then deploying them will take years. And that's if the new algorithm fits into the protocols that are in place today.

"But who knows what these new algorithms will look like," he said.

For example, moving to isogeny-based elliptic curve algorithms, one of the early favorites for quantum-proof encryption, would be a ten-year horizon, he said.

He suggests that data center managers looking ahead should, first of all, move to 256-bit encryption to protect storage.

And for asymmetric encryption used in communications, larger key sizes should provide adequate security for the intermediate future, he said.

"So, five to eight years," he said. "Though nobody knows when this mysterious quantum computer is going to appear."
