Raising cryptography’s standards

PUBLIC RELEASE DATE:

31-Oct-2014

Contact: Abby Abazorius, abbya@mit.edu, 617-253-2709, Massachusetts Institute of Technology, @MITnews

Most modern cryptographic schemes rely on computational complexity for their security. In principle, they can be cracked, but that would take a prohibitively long time, even with enormous computational resources.

There is, however, another notion of security, called information-theoretic security, which means that even an adversary with unbounded computational power could extract no useful information from an encrypted message. Cryptographic schemes that promise information-theoretic security have been devised, but they're far too complicated to be practical.
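The textbook example of an information-theoretically secure scheme is the one-time pad, which XORs the message with a truly random key of the same length. The short sketch below (an illustration of the general concept, not taken from the MIT/Maynooth papers) enumerates every possible message and key for a 3-bit one-time pad and shows that every message is equally consistent with a given ciphertext, so even an unbounded adversary learns nothing.

```python
# Illustration of information-theoretic security via the one-time pad.
# (Not from the papers described in this article.)
import itertools
from collections import Counter

def otp_encrypt(message_bits, key_bits):
    """XOR each message bit with the corresponding key bit."""
    return tuple(m ^ k for m, k in zip(message_bits, key_bits))

# Enumerate every 3-bit message and every 3-bit key (all keys equally likely).
n = 3
messages = list(itertools.product([0, 1], repeat=n))
keys = list(itertools.product([0, 1], repeat=n))

# For one fixed ciphertext, count which messages could have produced it.
target_ciphertext = (1, 0, 1)
consistent = Counter()
for m in messages:
    for k in keys:
        if otp_encrypt(m, k) == target_ciphertext:
            consistent[m] += 1

# Every message is consistent with the ciphertext under exactly one key,
# so observing the ciphertext does not reduce the adversary's uncertainty.
print(consistent)
```

The impracticality comes from the key: it must be as long as the message, perfectly random, and never reused, which is why such schemes are rarely deployed.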

In a series of papers presented at the Allerton Conference on Communication, Control, and Computing, researchers at MIT and Maynooth University in Ireland have shown that existing, practical cryptographic schemes come with their own information-theoretic guarantees: Some of the data they encode can't be extracted, even by a computationally unbounded adversary.

The researchers show how to calculate the minimum security guarantees for any given encryption scheme, which could enable information managers to make more informed decisions about how to protect data.
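To give a flavor of what such a guarantee can look like, the hypothetical toy calculation below (a sketch of the general idea, not the researchers' actual framework or metric) takes a small cipher whose key is shorter than the message and measures, in bits of mutual information, how much the ciphertext reveals about each individual message bit. Bits with zero mutual information remain perfectly hidden even from a computationally unbounded adversary, even though the scheme as a whole leaks something.

```python
# Hypothetical toy cipher: a 2-bit key reused across a 3-bit message.
# We measure leakage about individual message bits as mutual information.
import itertools
import math
from collections import defaultdict

def toy_encrypt(m, k):
    """c[i] = m[i] XOR k[i % 2]; the short key is reused."""
    return tuple(m[i] ^ k[i % 2] for i in range(len(m)))

messages = list(itertools.product([0, 1], repeat=3))
keys = list(itertools.product([0, 1], repeat=2))

def mutual_information(select):
    """I(X; C) in bits, where X = select(message), with uniform message and key."""
    joint, p_x, p_c = defaultdict(float), defaultdict(float), defaultdict(float)
    p = 1.0 / (len(messages) * len(keys))
    for m in messages:
        for k in keys:
            x, c = select(m), toy_encrypt(m, k)
            joint[(x, c)] += p
            p_x[x] += p
            p_c[c] += p
    return sum(pxc * math.log2(pxc / (p_x[x] * p_c[c]))
               for (x, c), pxc in joint.items())

# Each individual bit is perfectly hidden (0 bits leaked) ...
for i in range(3):
    print(f"I(M_{i}; C) = {mutual_information(lambda m, i=i: m[i]):.3f} bits")
# ... yet the pair (M_0, M_2) leaks a full bit, because both are masked by k[0].
print(f"I(M_0,M_2; C) = {mutual_information(lambda m: (m[0], m[2])):.3f} bits")
```

Running the sketch prints 0.000 bits for each individual message bit but 1.000 bit for the pair, illustrating how a scheme that is not secure overall can still carry a provable guarantee about which pieces of the data stay hidden.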

"By investigating these limits and characterizing them, you can gain quite a bit of insight about the performance of these schemes and how you can leverage tools from other fields, like coding theory and so forth, for designing and understanding security systems," says Flavio du Pin Calmon, a graduate student in electrical engineering and computer science and first author on all three Allerton papers. His advisor, Muriel Mdard, the Cecil E. Green Professor of Electrical Engineering and Computer Science, is also on all three papers; they're joined by colleagues including Ken Duffy of Maynooth and Mayank Varia of MIT's Lincoln Laboratory.

The researchers' mathematical framework also applies to the problem of data privacy, or how much information can be gleaned from aggregated and supposedly "anonymized" data about Internet users' online histories. If, for instance, Netflix releases data about users' movie preferences, is it also inadvertently releasing data about their political preferences? Calmon and his colleagues' technique could help data managers either modify aggregated data or structure its presentation in a way that minimizes the risk of privacy compromises.
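The same information-theoretic yardstick can quantify that kind of privacy risk. The sketch below uses made-up numbers (hypothetical, not Netflix data or the researchers' method) to compute how many bits of information a released attribute, a viewer's favorite genre, carries about a sensitive one, their political preference; a data manager could use such a measure to decide how much to coarsen or perturb a release.

```python
# Hypothetical privacy-leakage calculation with assumed (made-up) probabilities.
import math

# Assumed joint distribution P(genre, politics); probabilities sum to 1.
joint = {
    ("documentary", "party_A"): 0.20, ("documentary", "party_B"): 0.05,
    ("action",      "party_A"): 0.10, ("action",      "party_B"): 0.25,
    ("comedy",      "party_A"): 0.20, ("comedy",      "party_B"): 0.20,
}

# Marginal distributions of the released and the sensitive attribute.
p_genre, p_party = {}, {}
for (g, p), prob in joint.items():
    p_genre[g] = p_genre.get(g, 0.0) + prob
    p_party[p] = p_party.get(p, 0.0) + prob

# Mutual information I(genre; politics) in bits: the leakage of the release.
leak = sum(prob * math.log2(prob / (p_genre[g] * p_party[p]))
           for (g, p), prob in joint.items())
print(f"Information the released genre carries about politics: {leak:.3f} bits")
```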

