What is cryptography? How algorithms keep information secret and … – CSO

Cryptography definition

Cryptography is the art of keeping information secure by transforming it into a form that unintended recipients cannot understand. In cryptography, an original human-readable message, referred to as plaintext, is changed by means of an algorithm, or series of mathematical operations, into something that to an uninformed observer would look like gibberish; this gibberish is called ciphertext.

Cryptographic systems require some method for the intended recipient to be able to make use of the encrypted message, usually, though not always, by transforming the ciphertext back into plaintext.

Before we move into the meat of this article, let's define a couple of terms related to cryptography. The syllable crypt may make you think of tombs, but it comes from a Greek word that means "hidden" or "secret." Cryptography literally means "secret writing." Cryptology, meanwhile, means something like "knowledge of secrecy"; if cryptography is the practice of writing secret messages, then cryptology is the theory, although the two words are often used interchangeably. Encryption ("making secret") is what we call the process of turning plaintext into ciphertext. Encryption is an important part of cryptography, but it doesn't encompass the entire science. Its opposite is decryption.

One important aspect of the encryption process is that it almost always involves both an algorithm and a key. A key is just another piece of information, almost always a number, that specifies how the algorithm is applied to the plaintext in order to encrypt it. In a secure cryptographic system, even if you know the method by which some message is encrypted, it should be difficult or impossible to decrypt without that key. Keep algorithms and keys in your mind, because they'll be important as we move on.

This is all very abstract, and a good way to understand the specifics of what we're talking about is to look at one of the earliest known forms of cryptography. It's known as the Caesar cipher, because Julius Caesar used it for his confidential correspondence; as his biographer Suetonius described it, "if he had anything confidential to say, he wrote it in cipher, that is, by so changing the order of the letters of the alphabet ... If anyone wishes to decipher these, and get at their meaning, he must substitute the fourth letter of the alphabet, namely D, for A, and so with the others."

Suetonius's description can be broken down into the two cryptographic elements we've discussed, the algorithm and the key. The algorithm here is simple: each letter is replaced by another letter from later in the alphabet. The key is how many letters later in the alphabet you need to go to create your ciphertext. It's three in the version of the cipher Suetonius describes, but obviously other variations are possible: with a key of four, A would become E, for instance.
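The algorithm-plus-key structure can be made concrete in a few lines of Python (a sketch added for illustration, not part of Suetonius's account): the function body is the algorithm, and the shift amount is the key.

```python
def caesar(text: str, key: int) -> str:
    """Shift each letter `key` positions forward in the alphabet (the algorithm);
    `key` is the secret both parties must agree on in advance."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + key) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return ''.join(result)

ciphertext = caesar("ATTACK AT DAWN", 3)   # -> "DWWDFN DW GDZQ"
plaintext = caesar(ciphertext, -3)         # decrypting is the same algorithm with the key negated
```

Note that decryption reuses the same function with the key reversed, which is exactly what makes this (as discussed below) a symmetric scheme.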

A few things should be clear from this example. Encryption like this offers a fairly simple way to secretly send any message you like. Contrast that with a system of code phrases where, say, "Let's order pizza" means "I'm going to invade Gaul." To translate that sort of code, people at both ends of the communication chain would need a book of code phrases, and you'd have no way to encode new phrases you hadn't thought of in advance. With the Caesar cipher, you can encrypt any message you can think of. The tricky part is that everyone communicating needs to know the algorithm and the key in advance, though it's much easier to safely pass on and keep that information than it would be with a complex code book.

The Caesar cipher is what's known as a substitution cipher, because each letter is substituted with another one; other variations on this, then, would substitute letter blocks or whole words. For most of history, cryptography consisted of various substitution ciphers deployed to keep government and military communications secure. Medieval Arab mathematicians pushed the science forward, particularly the art of decryption; once researchers realized that certain letters in a given language are more common than others, it became easier to recognize patterns, for instance. But most pre-modern encryption is incredibly simple by modern standards, for the obvious reason that, before the advent of computers, it was difficult to perform mathematical transformations quickly enough to make encryption or decryption worthwhile.

In fact, the development of computers and advances in cryptography went hand in hand. Charles Babbage, whose idea for the Difference Engine presaged modern computers, was also interested in cryptography. During World War II, the Germans used the electromechanical Enigma machine to encrypt messages, and, famously, Alan Turing led a team in Britain that developed a similar machine to break the code, in the process laying some of the groundwork for the first modern computers. Cryptography got radically more complex as computers became available, but it remained the province of spies and generals for several more decades.

Before we move on here to modern cryptography, let's pause to discuss two important principles that underlie it. The first is what's come to be known as Kerckhoffs's principle, named after the 19th century Dutch cryptographer Auguste Kerckhoffs. Remember, as we said, any cryptographic system involves both an algorithm and a key. Kerckhoffs believed that "a cryptographic system should be secure even if everything about the system, except the key, is public knowledge."

Now, these were the days when cryptography had almost entirely military applications. The idea here is that, while it would be nice to keep your cryptographic system a secret, your opponent will almost certainly eventually figure it out. Claude Shannon, a World War II cryptographer who would go on to be a pioneer in information theory, put it more succinctly: "The enemy knows the system." What Kerckhoffs and Shannon are getting at is that you want to design an algorithm that doesn't need to be a secret in order to successfully conceal information.

That said, in today's world, the public nature of cryptographic algorithms is seen as something good in and of itself, rather than an unavoidable evil. Standard cryptographic algorithms have been widely studied and stress-tested, and trying to come up with your own private algorithms is doomed to failure, as security through obscurity usually is.

What you do need to keep secret is your cryptographic key. We'll get to the mathematics of how that works in a moment, but for now, we'll touch on another cryptographic principle that makes that math possible: a reliance on one-way functions, mathematical operations that are very difficult to reverse. The classic example of a one-way function is the multiplication of two very large prime numbers together. While that calculation is simple to do, if you only had the end result, it would be very difficult, verging on impossible, to figure out the original two prime numbers. The question of whether any function can truly be one-way is debated by mathematicians, but many are irreversible in practice at the limits of our current computing power, so we'll leave that question aside as we move on.
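The asymmetry of the classic one-way function is easy to see in a toy Python sketch (with illustrative primes far smaller than anything used in practice): multiplying is a single operation, while recovering the factors by trial division takes work that grows rapidly with the size of the number.

```python
def factor_semiprime(n):
    # Recover p and q from n = p * q by trial division: the "hard" direction.
    # Instant for the toy numbers below; utterly infeasible for the
    # 1000+ bit products used in real cryptography.
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    raise ValueError("n is prime")

p, q = 104_729, 1_299_709          # two (smallish) prime numbers
n = p * q                          # the "easy" direction: one multiplication
assert factor_semiprime(n) == (p, q)   # still quick here, but the search space
                                       # explodes as the primes get longer
```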

It was the formation of the first computer networks that started civilians thinking about the importance of cryptography. Computers were talking to each other over the open network, not just via direct connections to one another; that sort of networking was transformative in many great ways, but also made it trivially easy to snoop on data traveling across the network. And with financial services being an early use case for computer communication, it was necessary to find a way to keep information secret.

IBM led the way in the late 1960s with an encryption method known as "Lucifer", which was eventually codified by the US National Bureau of Standards as the first Data Encryption Standard (DES). As the internet began to grow in importance, more and better encryption was needed, and today a significant portion of data flying around the world is encrypted using varying techniques that we'll discuss in more detail in a moment.

We've already discussed some of the specific applications of cryptography, from keeping military secrets to transmitting financial data safely across the internet. In the bigger picture, though, there are some broad cybersecurity goals that we use cryptography to help us achieve, as cybersecurity consultant Gary Kessler explains. Using cryptographic techniques, security pros can:

1. Keep the contents of data confidential
2. Authenticate the identity of a message's sender and receiver
3. Ensure the integrity of the data, showing that it hasn't been altered
4. Demonstrate that the supposed sender really sent the message, a principle known as non-repudiation

You may recognize some of these principles from variations of the CIA triad. The first of these uses is the obvious one: you can keep data secret by encrypting it. The others take a bit of explanation, which we'll get into as we describe the different types of cryptography.

There are numerous cryptographic algorithms in use, but in general they can be broken into three categories: symmetric cryptography, asymmetric cryptography, and hash functions. Each has its own role to play within the cryptographic landscape.

Symmetric cryptography. The Caesar cipher we discussed above is a great example of symmetric cryptography. In the example we used, if encrypted messages were being exchanged between Caesar and one of his centurions, both parties would have to know the key, in this case, how many letters forward or backwards in the alphabet you need to move to transform plaintext to ciphertext or vice versa. That's what makes it symmetrical. But the key needs to stay a secret between the two of them, which is why this is sometimes also called secret key cryptography. You couldn't send the key along with the message, for instance, because if both fell into enemy hands the message would be easy for them to decipher, defeating the whole purpose of encrypting it in the first place. Caesar and his centurion would presumably have to discuss the key when they saw each other in person, though obviously this is less than ideal when wars are being fought over long distances.

Symmetric cryptography is widely used to keep data confidential. It can be very useful for keeping a local hard drive private, for instance; since the same user is generally encrypting and decrypting the protected data, sharing the secret key is not an issue. Symmetric cryptography can also be used to keep messages transmitted across the internet confidential; however, to successfully make this happen, you need to deploy our next form of cryptography in tandem with it.
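As a toy illustration of the symmetric idea (a deliberately insecure XOR construction invented for this sketch, not a real cipher; production systems use vetted algorithms such as AES), note that one and the same shared key performs both encryption and decryption:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Expand the shared secret into `length` bytes by hashing key + counter.
    # Toy construction for illustration only.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Symmetric by construction: XOR-ing with the same keystream
    # both encrypts and decrypts.
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

shared_key = b"agreed upon in advance"  # must stay secret between the parties
ciphertext = xor_cipher(shared_key, b"the payroll files")
assert xor_cipher(shared_key, ciphertext) == b"the payroll files"
```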

Asymmetric cryptography. Caesar may have been able to confer with his centurions in person, but you don't want to go into your bank and talk to the teller just to learn what the private key is for encrypting your electronic communication with the bank; that would defeat the purpose of online banking. In general, in order to function securely, the internet needs a way for communicating parties to establish a secure communications channel while only talking to each other across an inherently insecure network. The way this works is via asymmetric cryptography, which is sometimes called public key cryptography.

In asymmetric cryptography, each participant has two keys. One is public and is sent to anyone the party wishes to communicate with. That's the key used to encrypt messages. But the other key is private, shared with nobody, and it's necessary to decrypt those messages. To use a metaphor: think of the public key as opening a slot on a mailbox just wide enough to drop a letter in. You give that key to anyone who you think might send you a letter so they can open the slot and deliver the envelope. The private key is what you use to open the mailbox so you can get the letters out.

The mathematics of how you can use one key to encrypt a message and another to decrypt it is where the idea of one-way functions we discussed above comes into play: the two keys should be related to each other mathematically such that it's easy to derive the public key from the private key but not vice versa. For instance, the private key might be those two very large prime numbers, which you'd multiply together to get the public key. The Infosec Institute has a deep dive if you're interested.
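A minimal "textbook RSA" sketch in Python shows the shape of that relationship (tiny primes and no padding, so this is illustrative only, never secure):

```python
# Textbook RSA with toy numbers: the private side is the pair of primes,
# and publishing their product (plus an exponent) is safe because
# factoring it back is the hard direction of the one-way function.
p, q = 61, 53                      # private primes (tiny, for illustration)
n = p * q                          # 3233: the public modulus
phi = (p - 1) * (q - 1)            # 3120: computable only if you know p and q
e = 17                             # public exponent; (e, n) is the public key
d = pow(e, -1, phi)                # private exponent, derived from p and q

message = 42
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
assert recovered == message
```

With primes this small, anyone could factor 3233 and reconstruct d; the scheme only becomes meaningful when p and q are hundreds of digits long.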

The computations needed for asymmetric cryptography are much more complex and resource intensive than those behind symmetric infrastructure. Fortunately, you don't need to use it to protect every message you send online. Instead, what usually happens is that one party will use asymmetric cryptography to encrypt a message containing yet another cryptographic key. This key, having been safely transmitted across the insecure internet, will then become the secret key that encodes a much longer communications session encrypted via symmetric encryption.
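That hybrid flow can be sketched end to end with toy, insecure stand-ins (textbook RSA with tiny primes for the asymmetric step, and a hash-based XOR cipher for the symmetric step, both invented here purely to show the handshake):

```python
import hashlib
import secrets

# Receiver's asymmetric key pair (textbook RSA, toy numbers).
p, q = 61, 53
n, e = p * q, 17                          # public key, published openly
d = pow(e, -1, (p - 1) * (q - 1))         # private key, kept by the receiver

# Step 1 (asymmetric, expensive, used once): the sender picks a fresh
# session key and transmits it encrypted under the receiver's public key.
session_key = secrets.randbelow(n - 2) + 2
in_transit = pow(session_key, e, n)
received_key = pow(in_transit, d, n)      # only the receiver can recover it
assert received_key == session_key

# Step 2 (symmetric, cheap, used for the bulk of the session):
def xor_encrypt(key: int, data: bytes) -> bytes:
    ks = hashlib.sha256(str(key).encode()).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, ks))

ct = xor_encrypt(session_key, b"a much longer communications session")
assert xor_encrypt(received_key, ct) == b"a much longer communications session"
```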


What is Cryptography? Definition from SearchSecurity

What is cryptography?

Cryptography is a method of protecting information and communications through the use of codes, so that only those for whom the information is intended can read and process it.

In computer science, cryptography refers to secure information and communication techniques derived from mathematical concepts and a set of rule-based calculations called algorithms, to transform messages in ways that are hard to decipher. These deterministic algorithms are used for cryptographic key generation, digital signing and verification to protect data privacy, web browsing on the internet, and confidential communications such as credit card transactions and email.

Cryptography is closely related to the disciplines of cryptology and cryptanalysis. It includes techniques such as microdots, merging words with images and other ways to hide information in storage or transit. However, in today's computer-centric world, cryptography is most often associated with scrambling plaintext (ordinary text, sometimes referred to as cleartext) into ciphertext (a process called encryption), then back again (known as decryption). Individuals who practice this field are known as cryptographers.

Modern cryptography concerns itself with the following four objectives:

1. Confidentiality: the information cannot be understood by anyone for whom it was unintended
2. Integrity: the information cannot be altered in storage or in transit without the alteration being detected
3. Non-repudiation: the creator/sender of the information cannot deny at a later stage their intentions in creating or transmitting it
4. Authentication: the sender and receiver can confirm each other's identity and the origin/destination of the information

Procedures and protocols that meet some or all of the above criteria are known as cryptosystems. Cryptosystems are often thought to refer only to mathematical procedures and computer programs; however, they also include the regulation of human behavior, such as choosing hard-to-guess passwords, logging off unused systems and not discussing sensitive procedures with outsiders.

Cryptosystems use a set of procedures known as cryptographic algorithms, or ciphers, to encrypt and decrypt messages to secure communications among computer systems, devices and applications.

A cipher suite uses one algorithm for encryption, another algorithm for message authentication and another for key exchange. This process, embedded in protocols and written in software that runs on operating systems (OSes) and networked computer systems, involves:

1. public and private key generation for data encryption/decryption
2. digital signing and verification for message authentication
3. key exchange

Single-key or symmetric-key encryption algorithms encrypt data in fixed-length blocks of bits (a block cipher) using a secret key that the creator/sender uses to encipher data (encryption) and the receiver uses to decipher it. One example of symmetric-key cryptography is the Advanced Encryption Standard (AES). AES is a specification established in November 2001 by the National Institute of Standards and Technology (NIST) as a Federal Information Processing Standard (FIPS 197) to protect sensitive information. The standard is mandated by the U.S. government and widely used in the private sector.

In June 2003, AES was approved by the U.S. government for classified information. It is a royalty-free specification implemented in software and hardware worldwide. AES is the successor to the Data Encryption Standard (DES) and Triple DES (3DES). It uses longer key lengths -- 128-bit, 192-bit, 256-bit -- to resist brute force and other attacks.

Public-key or asymmetric-key encryption algorithms use a pair of keys, a public key associated with the creator/sender for encrypting messages and a private key that only the originator knows (unless it is exposed or they decide to share it) for decrypting that information.

Examples of public-key cryptography include:

1. RSA, used widely on the internet
2. Elliptic Curve Digital Signature Algorithm (ECDSA), used by Bitcoin
3. Digital Signature Algorithm (DSA), adopted as a Federal Information Processing Standard for digital signatures by NIST in FIPS 186-4

To maintain data integrity in cryptography, hash functions, which return a deterministic output from an input value, are used to map data to a fixed data size. Types of cryptographic hash functions include SHA-1 (Secure Hash Algorithm 1), SHA-2 and SHA-3.
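The two properties that matter here, determinism and fixed output size, are easy to check directly with Python's hashlib (SHA-256 shown as one example):

```python
import hashlib

d1 = hashlib.sha256(b"attack at dawn").hexdigest()
d2 = hashlib.sha256(b"attack at dawn").hexdigest()
d3 = hashlib.sha256(b"attack at dusk").hexdigest()

assert d1 == d2                          # deterministic: same input, same digest
assert d1 != d3                          # a one-word change alters the digest
assert len(bytes.fromhex(d1)) == 32      # output size is fixed at 256 bits
assert len(hashlib.sha256(b"x" * 10**6).digest()) == 32   # regardless of input size
```

It is this fixed-size, change-sensitive digest that lets a recipient detect whether data was altered in storage or transit.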

Attackers can bypass cryptography, hack into computers that are responsible for data encryption and decryption, and exploit weak implementations, such as the use of default keys. However, cryptography makes it harder for attackers to access messages and data protected by encryption algorithms.

Growing concerns about the processing power of quantum computing to break current cryptography encryption standards led NIST to put out a call for papers among the mathematical and science community in 2016 for new public key cryptography standards.

Unlike today's computer systems, quantum computing uses quantum bits (qubits), which can represent 0s and 1s simultaneously and therefore evaluate many possibilities at once. While a large-scale quantum computer may not be built in the next decade, the existing infrastructure requires standardization of publicly known and understood algorithms that offer a secure approach, according to NIST. The deadline for submissions was in November 2017; analysis of the proposals is expected to take three to five years.

The word "cryptography" is derived from the Greek kryptos, meaning hidden.

The prefix "crypt-" means "hidden" or "vault," and the suffix "-graphy" stands for "writing."

The origin of cryptography is usually dated from about 2000 B.C., with the Egyptian practice of hieroglyphics. These consisted of complex pictograms, the full meaning of which was only known to an elite few.

The first known use of a modern cipher was by Julius Caesar (100 B.C. to 44 B.C.), who did not trust his messengers when communicating with his governors and officers. For this reason, he created a system in which each character in his messages was replaced by a character three positions ahead of it in the Roman alphabet.

In recent times, cryptography has turned into a battleground of some of the world's best mathematicians and computer scientists. The ability to securely store and transfer sensitive information has proved a critical factor in success in war and business.

Because governments do not want certain entities in and out of their countries to have access to ways to receive and send hidden information that may be a threat to national interests, cryptography has been subject to various restrictions in many countries, ranging from limitations of the usage and export of software to the public dissemination of mathematical concepts that could be used to develop cryptosystems.

However, the internet has allowed the spread of powerful programs and, more importantly, the underlying techniques of cryptography, so that today many of the most advanced cryptosystems and ideas are now in the public domain.


System.Security.Cryptography.CryptographicException: The payload was …

When you use your .NET Core application to decrypt a string on a different machine from the one where it was encrypted, you may run into the following exception:

Exception:

System.Security.Cryptography.CryptographicException: The payload was invalid.
   at Microsoft.AspNetCore.DataProtection.Cng.CbcAuthenticatedEncryptor.DecryptImpl(Byte* pbCiphertext, UInt32 cbCiphertext, Byte* pbAdditionalAuthenticatedData, UInt32 cbAdditionalAuthenticatedData)
   at Microsoft.AspNetCore.DataProtection.Cng.Internal.CngAuthenticatedEncryptorBase.Decrypt(ArraySegment`1 ciphertext, ArraySegment`1 additionalAuthenticatedData)
   at Microsoft.AspNetCore.DataProtection.KeyManagement.KeyRingBasedDataProtector.UnprotectCore(Byte[] protectedData, Boolean allowOperationsOnRevokedKeys, UnprotectStatus& status)
   at Microsoft.AspNetCore.DataProtection.KeyManagement.KeyRingBasedDataProtector.DangerousUnprotect(Byte[] protectedData, Boolean ignoreRevocationErrors, Boolean& requiresMigration, Boolean& wasRevoked)
   at Microsoft.AspNetCore.DataProtection.KeyManagement.KeyRingBasedDataProtector.Unprotect(Byte[] protectedData)
   at Microsoft.AspNetCore.DataProtection.DataProtectionCommonExtensions.Unprotect(IDataProtector protector, String protectedData)

Two things you will need to check:

1. Is the encryption key persisted to a local path? The key needs to be persisted to a shared location that all machines can reach.

2. SetApplicationName must be used to set an explicit application name. If ApplicationName is not set, a GUID is generated at runtime, which differs between machines and leads to the error above.

Code example below:

services.AddDataProtection()
    .ProtectKeysWithCertificate(x509Cert)
    .UseCryptographicAlgorithms(
        new AuthenticatedEncryptorConfiguration()
        {
            EncryptionAlgorithm = EncryptionAlgorithm.AES_256_CBC,
            ValidationAlgorithm = ValidationAlgorithm.HMACSHA256
        })
    .PersistKeysToFileSystem(new System.IO.DirectoryInfo(Configuration.GetValue<string>("KeyLocation"))) // shared network folder for key location
    .SetApplicationName("MyApplicationName")
    .SetDefaultKeyLifetime(TimeSpan.FromDays(600));


NIST Action Will Heat Up Post-Quantum Cryptography Market: Report – TechNewsWorld

  1. NIST Action Will Heat Up Post-Quantum Cryptography Market: Report  TechNewsWorld
  2. Preparing Cryptography for the Risks of a Post-Quantum Computing World  Electronic Design
  3. Transitioning to Quantum-Safe Encryption  Security Intelligence
  4. Can the World Avoid a 'Quantum Encryption Apocalypse'?  Slashdot


Global Encryption Day: Why quantum-safe cryptography is the future of cybersecurity – World Economic Forum

  1. Global Encryption Day: Why quantum-safe cryptography is the future of cybersecurity  World Economic Forum
  2. Commonwealth Cyber Initiative researchers hone cryptographic algorithms to stand against powerful quantum threat  Virginia Tech Daily
  3. Why quantum mechanics will be key to digital security  Área corporativa Banco Santander
  4. The quantum computing threat is real. Now we need to act.  CyberScoop


Post-Quantum Cryptography: Anticipating Threats and Preparing the Future – ENISA

  1. Post-Quantum Cryptography: Anticipating Threats and Preparing the Future  ENISA
  2. Global Quantum Cryptography Market Size And Forecast | Quintessencelabs, Crypta Labs, Qasky, Qubitekk, Isara and Post-Quantum  Sioux City Catholic Globe
  3. Quantum Cryptography Professional Market Analysis, Status and Global Outlook 202  Leave The Hall
  4. Quantum Cryptography and Encryption Market Key Players, Volumes, and Investment Opportunities 2022-2028  Alpenhorn News
  5. Rising Adoption of Quantum Cryptography Market To Fuel Revenue Growth Through 2022 - 2032 : Fact.MR  Newstrail


Cracking the code of cryptography and life – The Irish Times

Think about the humble envelope. For centuries this paper enclosure has shielded important information from prying eyes that might otherwise steal a glance at an unprotected note. Also, by placing information in an envelope, the sender effectively commits to and freezes this information until it gets to the recipient, assuming it is not tampered with or altered on its journey.

This system of secrecy served us well for centuries in an analogue world, but what about the digital environment in which we now communicate, shop and bank? Enter modern cryptography, which is the subject of this year's Royal Irish Academy Hamilton Lecture by Israeli mathematician and computer scientist Prof Avi Wigderson. He will deliver his talk, "Cryptography: Secrets and Lies, Knowledge and Trust", later this month at Trinity College Dublin.

Mathematician Avi Wigderson

Cryptography is nothing new, of course; for centuries people have encoded information to scramble its contents, which can then be deciphered or unscrambled by a recipient who knows the key or rules to breaking that code. But cryptography developed a new dimension towards the end of the 20th century thanks to the marriage of computing power and complexity theory, and Wigderson has helped to shape its power.

Hard to solve, easy to verify

"Imagine a tough Sudoku puzzle or a tough mathematical problem," he says. "Most people may not be able to solve these, but if they saw the correct answer they could check and verify that it was correct. Such problems are of extreme importance in complexity theory."

Modern cryptography often makes use of such hard problems, difficult to solve but relatively easy to verify once solved, explains Wigderson, who is Herbert H Maass professor of mathematics at the School of Mathematics of the Institute for Advanced Study in Princeton, New Jersey.

"About 40 years ago, people started understanding that introducing computational complexity, namely the fact that some problems are easy and some are hard for us and for computers, could be used as a basis for cryptography," he says.

Cloaking information in such hard-to-solve problems is akin to the sender sealing the envelope in the analogue world. "Here one uses specific hard problems with extra structure, such as factoring integers into primes, which enable encoding any number by another which, like an envelope, obscures the original to anyone else, but commits the encoder to that value," he says.

In a similar way tough mathematical problems can shield information as it travels digitally, and the solutions can be rapidly verified when the information lands.

"That complexity-based approach provides a system for secrecy without the need for physical means, and it allows us to do many more things than just send secret messages around," says Wigderson. "You can use it to protect a vast array of transactions you might want to carry out in a digital world, and it led eventually to the revolution of online shopping and internet security."

Hamiltonian paths

Much like cryptography, the notion of hard-to-solve problems is not new. Irish mathematician William Rowan Hamilton, after whom the Hamilton Lecture is named, made contributions to the field in the 19th century, particularly with his exploration of Hamiltonian paths, Wigderson says.

"Think of a map with several cities on it," he says. "Can you trace an unbroken route through those cities and visit each city only once? If there are 1,000 cities on this map, then it becomes a very hard problem to solve; you seem to need to try all possible routes, an astronomical number."

But if a solution, namely a route, is provided, then you can look at the map and quickly check that no city has been visited twice in the unbroken route. This was the kind of hard-to-solve but easy-to-verify problem that Hamilton explored. Today we know it to be a prototypical example of problems of this type; it is as hard as any of them.
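The verify-quickly half of the story is easy to make concrete: checking a proposed route takes a single pass over it, even though finding one may require astronomical search. A Python sketch (a hypothetical helper written for this article, not Hamilton's own formulation):

```python
def is_hamiltonian_path(roads, cities, route):
    # Verification is one pass over the route: every city must appear
    # exactly once, and each consecutive pair must be joined by a road.
    if len(route) != len(cities) or set(route) != cities:
        return False  # a city was missed or visited twice
    return all(frozenset(pair) in roads for pair in zip(route, route[1:]))

cities = {"A", "B", "C", "D"}
roads = {frozenset(p) for p in [("A", "B"), ("B", "C"), ("C", "D"), ("A", "C")]}

assert is_hamiltonian_path(roads, cities, ["A", "B", "C", "D"])       # a valid route
assert not is_hamiltonian_path(roads, cities, ["A", "C", "B", "D"])   # B-D is not a road
assert not is_hamiltonian_path(roads, cities, ["A", "B", "C", "A"])   # A visited twice
```

Finding such a route from scratch, by contrast, has no known efficient algorithm for large maps, which is exactly the hard-to-solve, easy-to-verify gap the article describes.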

The mystery of A to B

One of the great mysteries of the field of complexity is what goes on between point A and point B, where some seemingly hard problems are solved. This, he says, has ramifications for questions far beyond cryptography, including climate, neuroscience, artificial intelligence and medicine.

"How do you explain why a particular drug works to cure a disease, or how actions in the atmosphere affect weather and climate, or how a thought is generated in the brain, or how a neural network can beat a world champion at chess?" he asks. "It may be easy to verify that these things happen, but how do they happen?"

Nor can we explain exactly how scientific luminaries such as Newton, Hamilton, Pasteur and Einstein came up with their insights and theories, he adds. "They were extremely successful in explaining things that people hundreds and thousands of years before them couldn't explain that way," Wigderson points out. "They came up with something that it seems was much easier to verify than to find."

Complexity and curiosity

He distinguishes between the question of finding an algorithm that works and the question of how it works. "I think many people would be extremely happy if some black box would solve all their problems, making them happy and healthy and living for 100 years, even if they didn't quite understand how this algorithm came up with it," he says. "I think that would be big progress."

But Wigderson, who in 2021 shared the prestigious Abel Prize with László Lovász for their "foundational contributions to theoretical computer science and discrete mathematics, and their leading role in shaping them into central fields of modern mathematics", also wants to understand the how of the complexity that underpins outcomes. And his drive is simple: he is curious. "I think it is the most natural thing in the world to want to understand how everything happens."

That curiosity continues to drive Wigderson: "I am totally fascinated by computation and what it can and cannot do. And by computation, I do not necessarily mean computers; every physical and natural process that happens, ocean waves or the weather or the growth of an embryo in the uterus, or the leaves on a plant or the formation of seashells or viruses causing disease, all these processes are computations, in that they evolve in a sequence of simple, local steps like a computer programme."

Taking a computational approach to the natural world can yield important insights, Wigderson believes. "We saw in the 1950s how Alan Turing, a mathematician famous for deciphering code, proposed a simple model that shows how patterns such as spots and stripes on animals' skin can evolve. Working with computation and complexity you can get profound insights into the natural world and many of the issues that are facing us today, and I find that fascinating."

Hamilton Lecture 2022

Prof Avi Wigderson of the Institute for Advanced Study, Princeton, will deliver the 2022 RIA Hamilton Lecture, "Cryptography: Secrets and Lies, Knowledge and Trust", on Monday, October 17th, 2022, from 6pm to 7.30pm. Tickets are free.


Dutch influence standards for post-quantum cryptography – ComputerWeekly.com

The US National Institute of Standards and Technology (NIST) has chosen the first group of encryption tools designed to withstand the attack of a future quantum computer, which could potentially crack the security used to protect privacy in the digital systems we rely on today.

Léo Ducas, senior researcher in the cryptology group at the Netherlands' Centrum Wiskunde & Informatica (CWI), the national research institute for mathematics and computer science, is involved in the two most important algorithms of the upcoming NIST portfolio: one for public key encryption and one for digital signatures.

According to Ducas, who is also a professor at the University of Leiden, these new standards are inevitable because there is nervousness about the arrival of quantum computing. "We know quantum computing will not be rife tomorrow, but this standardisation procedure and its deployment take time," he said. "Obviously there is certain sensitive information that needs to be secure and confidential, not just at present, but in the future as well. Take state secrets, for instance."

Cyber security experts have warned that hackers are stealing data now to decrypt it in the future, when quantum computing could render modern encryption methods obsolete. A report published by NIST in April 2016 cited experts who acknowledged the possibility of quantum technology rendering the commonly used RSA algorithm insecure by 2030. "We need to be ready for that," said Ducas. "This means we have to anticipate now."

The announcement of the chosen tools follows a six-year effort managed by NIST, which started in 2016 with a call for the world's cryptographers to devise and then vet encryption methods that could resist an attack from a future quantum computer. A total of 23 signature schemes and 59 encryption schemes were submitted, of which 69 were deemed complete and proper. The NIST competition consists of four rounds, during which some schemes are discarded and others studied more closely.

In July this year, NIST announced the first group of winners from its competition, which included Crystals-Kyber and Crystals-Dilithium, both developed by an international collaboration in which CWI participated. Other team members are ENS Lyon, Radboud University, Ruhr University Bochum, University of Waterloo, IBM, NXP, ARM, SRI International, Florida Atlantic University and Tsinghua University.

"It was a rather big team, but that was the key aspect," said Ducas. "It consisted of both industrial and academic people, and all their knowledge was necessary to develop the algorithms we have. Take NXP, for example: they build chips and already embed cryptology in those chips. We needed their knowledge for the design, because it is essential that what we develop not only fits into devices like smartphones and laptops, but also in other places where chips are being used, like the automotive industry. Fitting cryptology in can be a big challenge."

Apart from the two algorithms in which CWI was involved, two further signature algorithms were selected by NIST: Falcon and Sphincs+. Sphincs+ was also partially conceived in the Netherlands, led by Andreas Hülsing of TU Eindhoven.

Ducas added: "The selection of our schemes as a standard means that they will be deployed globally, protecting the privacy of billions of users. Fundamental research rarely gets such a direct and broad impact. The credit should go to the whole cryptographic research community. The schemes we proposed are merely the crystallisation of decades of scientific effort."

The algorithms developed by the international team are based on lattices, one of Ducas' specialities. Both were designed together and share more than just the same mathematical platform, he said. "We tried to make them look alike, so they will be easy to implement together." The Falcon algorithm designed for signatures also uses a lattice platform.
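As a rough illustration (a toy, not the actual Kyber, Dilithium or Falcon construction), lattice-based schemes build on problems such as Learning With Errors (LWE): given a random matrix A and noisy products b = A·s + e (mod q), recovering the secret s is believed to be hard even for quantum computers. A minimal sketch:

```python
# Toy LWE instance (illustration only; real schemes use structured module
# lattices, far larger parameters, and carefully sampled noise).
import random

q, n, m = 97, 4, 8          # modulus, secret length, number of samples
random.seed(1)

s = [random.randrange(q) for _ in range(n)]                      # secret vector
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]  # public matrix
e = [random.choice([-1, 0, 1]) for _ in range(m)]                # small noise

# Public LWE samples: b = A*s + e (mod q). Without the noise e this would be
# easy linear algebra; the noise is what makes recovering s hard.
b = [(sum(a * x for a, x in zip(row, s)) + err) % q
     for row, err in zip(A, e)]

print(b)
```

Publishing (A, b) while keeping s secret is the starting point from which lattice-based encryption and signatures are built.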

But that is where the similarity ends, said Ducas: "This algorithm has different advantages and drawbacks."

One of his biggest concerns is that this algorithm computes with floating-point numbers, as opposed to integers. "Computers are obviously equipped to do this, but it is a real challenge for cryptology," said Ducas. "Rounding can differ from computer to computer, so it poses challenges for implementation." But because of its shorter keys, Falcon was also selected for the NIST portfolio.
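The concern Ducas raises can be seen in miniature (a generic illustration, not Falcon's actual arithmetic): floating-point addition is not associative, so the same sum can round differently depending on evaluation order, whereas integer arithmetic is exact and identical on every machine.

```python
# Floating-point results depend on evaluation order; integers do not.
a = (0.1 + 0.2) + 0.3
b = 0.1 + (0.2 + 0.3)
print(a == b)          # the two orders round differently

# The same sums over scaled integers agree exactly.
ai = (1 + 2) + 3
bi = 1 + (2 + 3)
print(ai == bi)
```

For a cryptographic algorithm, where both sides must derive bit-identical values, any such platform-dependent rounding has to be controlled very carefully in the implementation.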

Now that the four algorithms have been selected, they need to be written down as proper standards. "This is obviously where NIST comes in, whereas we are mainly academics and technicians," said Ducas. "NIST will draft the ultimate text for the standard, but it will be in coordination with us."

NIST hopes to publish the standardisation documents by 2024 but, according to Wikipedia, may speed up the process if there are major breakthroughs in quantum computing.

After the release of the standards, the industry needs to be pushed to put them to use, said Ducas. "I have a suspicion that most companies will want to be post-quantum resistant, so I think these standards will be easier to push than, for example, the hash function update from SHA-1 to SHA-2," he said. "Moreover, I think IBM and NXP will incorporate their own designs within their own products."

"Eventually, NIST is pushing the core of the new standard, the mathematical knowledge, but on top of that there are a lot of things involved, like protocols, documentation, and so on. It might even evolve into an ISO standard, who knows, but NIST is leading the crowd."

So, will the new standards ensure we will be safe from quantum computers' ability to crack RSA encryption? "This is related to the P versus NP problem," said Ducas. "The best guarantee we can have is the years of documented failures. That is the case with existing cryptology, and it is still the case with post-quantum cryptology."

"There is reasonable confidence to deploy, but no absolute mathematical guarantee. This is why we often say that cryptographers seldom sleep at night."

Read more from the original source:
Dutch influence standards for post-quantum cryptography - ComputerWeekly.com

Castle Shield Holdings, LLC Updates the Post-Quantum Cryptography (PQC) Algorithms for Its Data-in-Motion Aeolus VPN Solution – Business Wire

SCOTTSVILLE, Va.--(BUSINESS WIRE)--Castle Shield Holdings, LLC, a leader in Zero Trust and cybersecurity solutions, today announced that its Aeolus VPN solution now supports additional post-quantum cryptography (PQC) algorithms selected by the National Institute of Standards and Technology (NIST).

Last year, prior to the conclusion of the third round of the NIST PQC Standardization, we announced the successful integration of the Saber algorithm into Aeolus VPN. Reference our October 11, 2021, press release.

NIST has now selected two primary PQC algorithms for most use cases: CRYSTALS-KYBER and CRYSTALS-Dilithium. In addition, the signature schemes FALCON and SPHINCS+ were selected for standardization as well. Kyber and Dilithium were both chosen for their strong security and excellent performance, and NIST expects them to work well in most applications. Therefore, we have integrated the Kyber (i.e., Kyber1024) and Dilithium (i.e., Dilithium5) algorithms into Aeolus VPN as well.
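Kyber is a key encapsulation mechanism (KEM): one side encapsulates a shared secret under the recipient's public key, and the recipient decapsulates it. The sketch below shows only the three-function interface shape a product would wire in; it is a hash-based toy with hypothetical names and is NOT secure. A real deployment would use an actual CRYSTALS-Kyber implementation.

```python
# Toy KEM showing only the keygen/encaps/decaps interface shape.
# NOT secure: the "ciphertext" here is just an ephemeral value in the clear.
import hashlib
import secrets

def keygen():
    sk = secrets.token_bytes(32)                 # private key
    pk = hashlib.sha256(b"pk" + sk).digest()     # toy "public key" derived from sk
    return pk, sk

def encaps(pk):
    eph = secrets.token_bytes(32)                # ephemeral randomness
    ct = eph                                     # toy "ciphertext"
    shared = hashlib.sha256(pk + eph).digest()   # sender's shared secret
    return ct, shared

def decaps(sk, ct):
    pk = hashlib.sha256(b"pk" + sk).digest()     # re-derive the public key
    return hashlib.sha256(pk + ct).digest()      # recipient's shared secret

pk, sk = keygen()
ct, k1 = encaps(pk)
k2 = decaps(sk, ct)
assert k1 == k2    # both sides derive the same shared secret
```

In a VPN, the shared secret produced this way would seed the symmetric cipher that actually encrypts the tunnelled traffic, which is why a KEM and symmetric encryption appear together in the product description above.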

Aeolus VPN protects data between two or more network points. It offers a streamlined approach to privacy that results in more stability and lower latency, making it a strong addition to enterprise data-in-motion security for both classical and post-quantum computing environments. Aeolus VPN offers point-to-point asymmetric PQC and symmetric encryption for UDP and TCP on Windows, Linux, and macOS platforms.

"At Castle Shield, we are encryption agnostic. As NIST selects new PQC standards, we will add them to our suite of solutions, which further demonstrates the cryptographic agility of our products. Our primary focus is to seamlessly integrate the best encryption algorithms available into our solutions to protect our customers' data for today and tomorrow," said Dr. Milton Mattox, Chief Technology Officer at Castle Shield Holdings, LLC.

Aeolus VPN with PQC continues to be available today for testing, proofs of concept, and production installations.

About Castle Shield Holdings, LLC

Founded in 2019, Castle Shield offers a complete range of enterprise-grade cybersecurity solutions that protect enterprises and consumers against internal and external cyber threats. Our quantum-resistant solutions (Fides) stand as the last line of defense for enterprise and consumer data in the emerging quantum computing threat landscape. Legion, our Security Information and Event Management (SIEM) product portfolio, and Fides work together to strengthen your overall data security. We monitor and address threat vectors through our scalable, multi-tenant SIEM platform, protecting enterprise systems and data in an efficient, cost-effective manner. In addition, we utilize an advanced compliance platform (Senate) and expert analysis, with an in-depth understanding of dynamic compliance standards and industry best practices, to highlight cyber risk factors. Our Senate system provides comprehensive ratings for third-party vendors based on technical risk scores, compliance, and financial impact in the event of a breach. Our 360 proactive security solutions are what set Castle Shield apart, independent of your IT backbone, whether cloud, hybrid, or premises-based. For further information, please go to http://www.castle-shield.com

Read more from the original source:
Castle Shield Holdings, LLC Updates the Post-Quantum Cryptography (PQC) Algorithms for Its Data-in-Motion Aeolus VPN Solution - Business Wire

Yale increases investment in blockchain research – Yale Daily News

Yale, which was ranked 34th in CoinDesk's 2022 Best Universities for Blockchain, has invested significantly in the rapidly growing field.

Alex Ye 12:08 am, Oct 12, 2022

Staff Reporter

Zoe Berg, Senior Photographer

This time last year, Yale was unranked in CoinDesk's Best Universities for Blockchain. A year later, the University places 34th overall, on par with Harvard and other major universities around the world.

The report's results recognize Yale's recent significant investments in blockchain research, including the hiring of four new blockchain experts to the Computer Science faculty: Ben Fisch, Charalampos Papamanthou, Katerina Sotiraki and Fan Zhang, one of whom is leading a project that has received a $5.75 million grant for blockchain development.

"In the last few years, blockchain, as an interdisciplinary field, has spurred a huge amount of development in distributed systems and cryptography and their intersection," said Fisch. "This is also why it's such a fascinating academic topic, because it ties together so many different fields, not only from computer science, but also from economics, law and policy. Yale has a very unique combination of strengths in all these different areas, especially at present."

In August, Yale blockchain researchers accepted a $5.75 million grant from the Algorand Foundation, a not-for-profit organization focused on the development of blockchain technology.

The grant will support PAVE: A Center for Privacy, Accountability, Verification and Economics of Blockchain Systems, which will be led by Papamanthou. PAVE will bring together a cross-disciplinary team of experts from four institutions: Yale, Columbia University, the City College of New York and the Swiss Federal Institute of Technology Lausanne, with Yale as the lead institution, to advance research on blockchain systems.

Apart from the technical agenda, PAVE will also host hackathons, symposiums and blockchain summer schools.

The expansion of blockchain research at Yale coincides with the rise of the blockchain technology market. The value of blockchain technology in the banking, financial services and insurance sector market is expected to grow by $4.02 billion between 2021 and 2026, according to Technavio. The Technavio study found that easier access to technology and disintermediation of banking services will create more growth opportunities within the industry.

Papamanthou believes the hirings acknowledge the importance of blockchain, and that the University has more generally acknowledged the interdisciplinary nature of the blockchain space. He emphasized that the University provides opportunities to explore the blockchain industry, such as interdisciplinary majors like Computer Science and Economics.

Papamanthou spotlighted the newly established Roberts Innovation Fund created by the School of Engineering and Applied Sciences, which assists blockchain projects that could be commercialized through funding and mentoring.

An increasing number of students are interested in the field of blockchain, according to Mariam Alaverdian '23, president of the Yale Blockchain Club.

Alaverdian explained that because of the many applications of blockchain technology, from personal identity security to healthcare to money transfers, the emergence of blockchain into our lives is inevitable. She added that the Yale Blockchain Club has seen interested students come from a variety of backgrounds, with some having no prior exposure and others who already have startups in the space.

"The Yale Blockchain Club started last spring and we received a lot of attention from Yale undergraduate and graduate students," Alaverdian wrote in an email to the News. "We had 600 people sign up for our mailing list within a couple of weeks; there is definitely a high demand from Yale students for educational materials and guidance."

As the blockchain industry has continued to grow, Yale has been a fierce advocate for blockchain research and development, Papamanthou noted.

Papamanthou explained that because Yale's faculty is now made up of leaders in the fields of distributed computing and cryptography, the potential for blockchain innovation at Yale could be unprecedented.

"It's amazing that Yale has hired two phenomenal professors, Ben Fisch and Fan Zhang, whose research focuses on aspects of blockchains," said Roshan Palakkal '25, a student in Frontiers of Blockchain Research, a course taught by Fisch. "Yale CS typically isn't known to be the best, but I think the new classes and faculty have positioned it to become one of the best universities for blockchain, with lots of potential for interdisciplinary collaboration in areas like economics, global affairs, and public policy."

Papamanthou added that students who are interested in blockchain have access to a variety of courses across the Computer Science and Economics Departments, as well as at the Yale School of Management and Yale Law School.

According to Fisch, from a computer science perspective, Yale is educationally competitive with any other university in the field of blockchain.

"I will be offering a course in the spring that is comparable to the blockchain course that's offered by Stanford," Fisch said. "And the research seminar that I'm teaching now is uncommon at other universities, as it really goes in depth at a graduate level into all the most recent research topics that are being worked on currently."

The Yale Computer Science Department is located at 51 Prospect St.

Alex Ye covers faculty and academics. He previously covered the endowment, finance and donations. Originally from Cincinnati, Ohio, he is a sophomore in Timothy Dwight majoring in applied mathematics.

Here is the original post:
Yale increases investment in blockchain research - Yale Daily News