Cryptography Definition – Tech Terms

Cryptography is the science of protecting information by transforming it into a secure format. This process, called encryption, has been used for centuries to prevent handwritten messages from being read by unintended recipients. Today, cryptography is used to protect digital data. It is a division of computer science that focuses on transforming data into formats that cannot be recognized by unauthorized users.

An example of basic cryptography is an encrypted message in which letters are replaced with other characters. To decode the encrypted contents, you would need a grid or table that defines how the letters are substituted. For example, a translation grid that maps each letter and symbol to a digit could be used to decode "1234125678906" as "techterms.com".
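The grid image from the original page is not reproduced here, so the short Python sketch below reconstructs the mapping directly from the example: each digit's value is inferred from the character it lines up with in "techterms.com" (note that two digits may decode to the same letter).

DECODE_TABLE = {
    "1": "t", "2": "e", "3": "c", "4": "h", "5": "r",
    "6": "m", "7": "s", "8": ".", "9": "c", "0": "o",
}

def decode(ciphertext):
    # Replace each digit with the character the table assigns to it.
    return "".join(DECODE_TABLE[digit] for digit in ciphertext)

print(decode("1234125678906"))  # prints: techterms.com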

Such a translation table is also called a cipher. Ciphers can be simple translation codes, such as the example above, or complex algorithms. While simple codes sufficed for encoding handwritten notes, computers can easily break, or figure out, these types of codes. Because computers can perform billions of calculations per second, they can crack even fairly complex classical ciphers in a matter of seconds. Therefore, modern cryptography involves developing encryption methods that are difficult for even supercomputers to break.
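To put that speed claim in perspective, the back-of-the-envelope Python sketch below estimates how long an exhaustive key search would take at an assumed rate of one billion guesses per second (the rate is illustrative, not a measured benchmark):

GUESSES_PER_SECOND = 1e9            # assumed rate, for illustration only
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for bits in (40, 56, 128, 256):
    keyspace = 2 ** bits            # number of possible keys
    years = keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{bits}-bit key: about {years:.2e} years to try every key")

At that rate a 56-bit key falls in roughly two years, while a 128-bit key would take on the order of 10^22 years, which is why modern encryption methods use keys of 128 bits or more.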

Updated: July 15, 2015

