How to keep your email private with PGP encryption on your Mac

In our last episode of Private I, I explained the basics of public-key (PK) cryptography, a way to scramble messages so that only someone possessing a particular key can decrypt them, without that key ever having to be publicly disclosed or shared. It's an effective system with no known theoretical exploits, and currently deployed implementations are considered robust.

And to recap: The clever bit with the public-key approach is that you have two complementary keys, one public and one private. The public key can be freely distributed. Anything someone else encrypts with the public key can only be decrypted with access to the corresponding private key. And a private key can be used to sign a string of text or a document to prove mathematically that only the private key's possessor could have signed it.
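
As a minimal sketch of those two operations, here is rough Python using the third-party cryptography package (chosen here only for illustration; the article doesn't prescribe any library). PGP/GPG layers key management, message formats, and hybrid encryption on top of primitives like these.

```python
# Sketch of public-key encryption and signing with the Python "cryptography"
# package (pip install cryptography). Illustrative only.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Generate a key pair: the public half can be shared freely.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"Meet me at noon."

# Anyone with the public key can encrypt; only the private key decrypts.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = public_key.encrypt(message, oaep)
assert private_key.decrypt(ciphertext, oaep) == message

# Only the private key can sign; anyone with the public key can verify.
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = private_key.sign(message, pss, hashes.SHA256())

# Raises InvalidSignature if the message or signature was tampered with.
public_key.verify(signature, message, pss, hashes.SHA256())
```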

But there are two missing pieces that would let users of Macs, iOS devices, and other platforms take advantage of PK. The first is pragmatic: Senders and recipients need compatible software tools or plugins, preferably integrated into apps so that little effort is required. The second is existential: Without pre-arrangement, such as meeting in person or a phone call, how do you know that what purports to be someone's public key is actually that person's key?

The easiest way to solve both problems is to use an end-to-end proprietary ecosystem, but that gets us back, more or less, to iMessage or something similar. Silent Circle has one of the best options that embeds public-key cryptography, if you can convince all the people with whom you need to communicate to opt in. It starts at $10 per month for unlimited text, calls, video chat, and file transfers among its users. The service's messaging and calling options received scores of 7 out of 7 in the Electronic Frontier Foundation's secure messaging scorecard.

But most of us don't live in a walled garden, and one of the company's founders, Phil Zimmermann, was responsible, nearly 25 years ago, for turning public-key cryptography into what he called PGP, for Pretty Good Privacy. (How PGP works is described in Part 1.)

When composing a message in Mail to a recipient whose key is in your local GPG Keychain, you can click the lock icon to encrypt the message when it is sent.

PGP is available for the Mac via GPGTools, a version of the free software GPG (GNU Privacy Guard). It lets you build a directory of other people's public keys, while also letting you carry out encryption, decryption, signing, and verifying. (PGP is a trademark, and GPG was coined to get around it, but you'll often see PGP used generically to refer to this method of using public keys.)

The EFF has very nice step-by-step instructions for installing GPGTools so that it can be used directly with either Apple Mail or Mozilla Thunderbird for email; the tools are also available via the application Services menu wherever you can manipulate or select text. GPGTools is currently free, but its developers plan to charge a very modest fee for the email plug-in at some point to help support development costs.

The sent message appears in the Sent mailbox as encrypted, and has to be decrypted to be viewed, as in this window.

The EFF instructions walk you through creating your own public/private key pair in GPG Keychain. To use GPGTools with email, your key needs to carry the same email address as the return address from which you want to send encrypted messages. Once you have a key, you can upload it to a keyserver by selecting your key and choosing Key > Send Public Key to Keyserver. This makes your key searchable by your name and email address in a PGP directory. A key also has an associated fingerprint, a cryptographic transformation of the public key that's far shorter, which I'll get to in a moment.
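
GPGTools also installs a gpg command-line tool, so the same operations the EFF guide performs through GPG Keychain and Mail can be scripted. Here is a rough sketch driving that tool from Python; the email addresses, key ID, keyserver, and file names are placeholders, not values from the article.

```python
# Sketch: driving the gpg command line from Python. All names below are
# placeholders; substitute your own addresses, key ID, and files.
import subprocess

def gpg(*args):
    subprocess.run(["gpg", *args], check=True)

# Show the fingerprint of a key -- the short string you compare out-of-band
# (in person, over the phone) to verify that a key really belongs to its owner.
gpg("--fingerprint", "you@example.com")

# Publish your public key to a keyserver so others can find it.
gpg("--keyserver", "hkps://keys.openpgp.org", "--send-keys", "YOUR_KEY_ID")

# Encrypt a file to a recipient whose public key is in your keyring,
# and sign it so they can verify it came from you.
gpg("--armor", "--sign", "--encrypt",
    "--recipient", "friend@example.com", "notes.txt")

# Decrypt something sent to you (gpg locates the matching private key).
gpg("--output", "notes.txt", "--decrypt", "notes.txt.asc")
```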

Original post:
How to keep your email private with PGP encryption on your Mac

Light, meet matter: Single-photon quantum memory in diamond optical phonons at room temperature

by Stuart Mason Dambrot

Experimental concept, energy level diagram, and setup. (a) The memory protocol. A horizontally (H) polarized single photon (green, 723 nm) is written into the quantum memory with a vertically (V) polarized write pulse (red, 800 nm). After a delay, an H-polarized read pulse recalls a V-polarized photon. (b) Energy levels in the memory. The ground state |0⟩ and the storage state |1⟩ correspond to the crystal ground state and an optical phonon, respectively. The signal photon and the read-write pulses are in two-photon resonance with the optical phonon (40 THz) and are far detuned from the conduction band |2⟩. (c) The experimental setup. The laser output is split to pump the photon source and to produce the orthogonally polarized read and write beams. The photons are produced in pairs, with one (signal) at 723 nm and the other (herald) at 895 nm. The signal photon is stored in, and recalled from, the quantum memory. The herald and signal photons are detected using APDs, and correlations between them are measured using a coincidence logic unit. Credit: D. G. England, K. A. G. Fisher, J.-P. W. MacLean, P. J. Bustard, R. Lausten, K. J. Resch, and B. J. Sussman, Storage and Retrieval of THz-Bandwidth Single Photons Using a Room-Temperature Diamond Quantum Memory, Phys. Rev. Lett. 114, 053602 (2015).

(Phys.org) Photonic quantum technologies, including cryptography, enhanced measurement, and information processing, face a conundrum: They require single photons, but these are difficult to create, manipulate, and measure. At the same time, quantum memories enable these technologies by acting as a photonic buffer. Therefore, an ideal part of the solution would be a single-photon on-demand read/write quantum memory. To date, however, development of a practical single-photon quantum memory has been stymied by (1) the need for high efficiency, (2) noise introduced by the read/write lasers that contaminates the quantum state, and (3) decoherence of the information stored in the memory.

Recently, scientists at the National Research Council of Canada in Ottawa and the Institute for Quantum Computing at the University of Waterloo demonstrated storage and retrieval of terahertz-bandwidth single photons via a quantum memory in the optical phonon modes of a room-temperature bulk diamond. The researchers report that the quantum memory is low noise, high speed, and broadly tunable, and therefore promises to be a versatile light-matter interface for local quantum processing applications. Moreover, unlike existing approaches, the novel device does not require cooling or optical preparation before storage, and is a few millimeters in size. The scientists conclude that diamond is a robust, convenient, and high-speed system extremely well-suited to evaluating operational memory parameters, studying the effects of noise, and developing quantum protocols.

Prof. Benjamin J. Sussman discussed the paper that he, Prof. Kevin Resch, Dr. Duncan G. England, and their colleagues published in Physical Review Letters. "The possibility of using single photons in quantum technologies offers a host of new opportunities in measurement and communications," Sussman tells Phys.org. "However, it's challenging to do so because the light we typically use (that is, from the sun, light bulbs, or lasers) contains tremendous numbers of photons." Therefore, much of the technology for manipulating and measuring light (including naturally evolved light-detecting biological organs, such as our eyes) has been designed to deal with larger numbers of photons; in addition, background noise from even the faintest light source can mask these single photons.

"Creating a single photon is also a formidable problem," Sussman continues, adding that to generate single photons the scientists employ a low probability stochastic quantum optics process called spontaneous parametric down-conversion (SPDC). The method of generation is very effective, but the challenge is that being a probabilistic process a photon is generated not on demand, but unpredictably. "We have to wait for success and then perform an experiment, which means most of the time the experiment fails," Sussman explains. "However, quantum memories are very interesting because they act as photon buffers, and can convert a probabilistic process into a deterministic one. This effectively turns a repeat-until-success single-photon source into an on-demand source."

Sussman notes that the most difficult technical obstacle was verifying the non-classical photon statistics of the memory output. To determine whether single photons were actually retrieved from the quantum memory, the scientists performed a so-called g(2) measurement (the degree of coherence between two fields), in which the output photon was coupled into a 50:50 beam splitter with detectors placed at both output ports. "Because single photons are indivisible, one would never expect to measure coincident detection in both arms, and this is what we were able to confirm. Nevertheless, experiments aren't perfect, and when the single photon is even slightly contaminated by background noise, we very occasionally make a coincidence measurement. As a result, measuring enough of these coincidences to collect significant statistics required over 150 hours of continuous data acquisition." He adds that graduate students Kent Fisher and JP MacLean worked tirelessly to perform the experiment.
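
As a sketch of how such a measurement reduces to a number: with a herald detector and the two beam-splitter outputs, the heralded g2(0) is commonly estimated as N_h * C_h12 / (C_h1 * C_h2), where N_h is the number of heralds, C_h1 and C_h2 are two-fold coincidences between the herald and each output arm, and C_h12 is the three-fold coincidence; a value well below 1 indicates non-classical, single-photon-like statistics. The counts below are invented for illustration, not data from the experiment.

```python
# Sketch: estimating the heralded second-order correlation g2(0) from raw
# counts in a Hanbury Brown-Twiss setup behind a 50:50 splitter.
# The counts are made-up illustrative numbers, not data from the experiment.

def heralded_g2(n_heralds, c_h1, c_h2, c_h12):
    """g2(0) = N_h * C_h12 / (C_h1 * C_h2).

    n_heralds : herald detector counts
    c_h1, c_h2: herald AND arm-1 / arm-2 two-fold coincidences
    c_h12     : triple coincidences (herald AND both arms)
    """
    return n_heralds * c_h12 / (c_h1 * c_h2)

# An ideal single photon never fires both arms at once, so c_h12 -> 0.
g2 = heralded_g2(n_heralds=5_000_000, c_h1=40_000, c_h2=38_000, c_h12=45)
print(f"g2(0) ~ {g2:.3f}  (< 1 is non-classical; << 1 is single-photon-like)")
```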

"A quantum memory is a conversion between quantum states of light and matter," Sussman tells Phys.org. "However, decoherence is constantly destroying the crucial quantum nature of the matter system, and thus the advantages of quantum technologies. Typically the narrow linewidths of the quantum levels involved limit the bandwidth of such memories to the gigahertz range or below. Our challenge was therefore to work with very short pulses of light to beat decoherence that is, to perform our operations before the system decays. Again, ultrafast Spontaneous Parametric Down-conversion is the most popular source of high purity single photons but with femtosecond oscillators it produces THz-bandwidth photons that can't fully be utilized in lower bandwidth systems. We were able to bridge this three orders of magnitude gap between light and matter by building an ultrafast capable quantum memory."

Since all quantum systems suffer from decoherence effects when they interact with an external environment, isolating the quantum system from its environment is a universal problem in quantum technology. "The key insight behind our experiment was that ultrafast lasers can avoid decoherence. Rather than try to isolate our memory from the environment, we address it on timescales that are fast compared to decoherence by using ultrafast laser pulses of ~200 femtoseconds duration."
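
As a rough illustration of that timescale argument, modelling decoherence as exponential decay exp(-t/T) shows how little is lost during a single ~200 fs operation. The coherence time used below is an assumed, order-of-magnitude placeholder, not a value quoted in the article.

```python
# Back-of-the-envelope: coherence lost during one ultrafast operation,
# modelling decoherence as exp(-t / t_coh). t_coh is an assumed placeholder
# of a few picoseconds, not a figure from the paper.
import math

pulse_duration = 200e-15   # ~200 fs read/write pulse (from the article)
t_coh = 5e-12              # ASSUMED coherence time, order-of-magnitude only

surviving = math.exp(-pulse_duration / t_coh)
print(f"coherence remaining after one {pulse_duration * 1e15:.0f} fs "
      f"operation: {surviving:.3f}")
# With these assumed numbers the operation is ~25x faster than the decay,
# so decoherence during a single read or write is only a few-percent effect.
```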

Sussman notes that ultrafast lasers were developed to study picosecond and femtosecond dynamics in molecular and bulk phonon vibrations. "It's therefore not surprising that we'd employ these vibrational or similar systems as substrates to operate at ultrafast speeds for quantum processing, and Dr. England was able to leverage his expertise in these two areas to bridge the National Research Council and Institute for Quantum Computing teams and make the project a success."

The paper states that because the quantum memory is low noise, high speed, and broadly tunable, it promises to be a versatile light-matter interface for local quantum processing applications. Sussman explains that the interface between light and matter is an important frontier for quantum information science, in that it combines the advantages of photonic qubits (which move fast and have long decoherence times) with those of matter qubits (which are stationary and interact strongly). "The diamond memory is an important innovation because it provides a robust and convenient platform on which to investigate this interface," which he attributes to its key advantages.

More here:
Light, meet matter: Single-photon quantum memory in diamond optical phonons at room temperature

Oscar Nominees Put Science in the Spotlight

The Oscar buzz is at a high hum for Sunday's Academy Awards ceremony, and this year some of that buzz is helping to make scientific subjects, ranging from World War II cryptography to wormholes and the "Theory of Everything," anything but ho-hum.

The nominees include:

When you add in less serious fare, such as "Guardians of the Galaxy" (two Oscar nods) and "Big Hero 6" (which is up for the animated-feature award), that equals enough science fiction and science fact to merit an Academy Awards category of its own.

Does it matter that the historical truth in the sci-biopics, and the scientific principles behind "Interstellar," get a little stretched during the Hollywoodification process? Not necessarily, says Seth Shostak, an astronomer at the SETI Institute who has consulted on movies ranging from "Contact" to the Keanu Reeves remake of "The Day the Earth Stood Still."

"If you had told me 20 years ago that computer scientists and cosmologists would be the heroes of a Hollywood film, I would have felt like someone had tasered me. I wouldn't have believed it," he told NBC News. "Filmmakers aren't trying to teach anybody computer science or cosmology, nor would they be very good at that. They're just trying to portray the fact that science is actually interesting and important, and what could be better than that?"

If the scientific angle is plausible, and the story grabs the viewer, the fact that a movie motivates some folks to dive into down-to-earth science is a valuable bonus.

"Many scientists go into the field of science, particularly in astronomy ... because they saw some movie when they were a kid," Shostak said last November when "Interstellar" came out. "Movies have a big effect on young people in terms of shaping their interest."

With that in mind, here are some pointers to the science underlying the tales of Turing, Hawking and the wormhole trekkers of "Interstellar":

"The Imitation Game" focuses on the British effort to crack the secret codes that were used by the Germans to communicate via radio codes that were created with the help of a typewriter-like device known as the Enigma machine. Turing masterminded the creation of a primitive computer to run through all the possible permutations, but it turns out that even math whizzes and their machines needed a little help from the human factor.

During a Google Hangout about Hollywood science, Columbia neuroscientist Sean Escola said the same situation holds true for today's code-breakers, who rely on phishing and other real-life stratagems as well as brute-force computing.

Original post:
Oscar Nominees Put Science in the Spotlight

True Random Organic Cryptography Master Key REpurposed Einstein physics principle Bent Reality – Video



By: Glacier SpaMud

Originally posted here:
True Random Organic Cryptography Master Key REpurposed Einstein physics principle Bent Reality - Video

TrueCrypt audit back on track after silence and uncertainty

Phase two of the project will begin shortly and will be done by a professional team of consultants

An effort to search for cryptographic flaws in TrueCrypt, a popular disk encryption program, will resume even though the software was abandoned by its creators almost a year ago.

For years TrueCrypt has been the go-to open-source tool for people looking to encrypt files on their computers, especially since it's one of the few solutions to allow encrypting the OS volume.

In October 2013, cryptography professor Matthew Green and security researcher Kenneth White launched a project to perform a professional security audit of TrueCrypt. This was partly prompted by the leaks from former U.S. National Security Agency contractor Edward Snowden that suggested the NSA was engaged in efforts to undermine encryption.

Green and White's Open Crypto Audit Project started accepting donations and contracted iSEC Partners, a subsidiary of information assurance company NCC Group, to probe critical parts of the TrueCrypt code for software vulnerabilities. The firm found some issues, but nothing critical that could be described as a backdoor. Their report, published in April 2014, covered the first phase of the audit.

Phase two was supposed to involve a formal review of the program's encryption functions, with the goal of uncovering any potential errors in the cryptographic implementations -- but then the unexpected happened.

In May 2014, the developers of TrueCrypt, who had remained anonymous over the years for privacy reasons, abruptly announced that they were discontinuing the project and advised users to switch to alternatives.

"This threw our plans for a loop," Green said in a blog post Tuesday. "We had been planning a crowdsourced audit to be run by Thomas Ptacek and some others. However in the wake of TC pulling the plug, there were questions: Was this a good use of folks' time and resources? What about applying those resources to the new 'Truecrypt forks' that have sprung up (or are being developed?)"

Now, almost a year later, the project is back on track. Ptacek, a cryptography expert and founder of Matasano Security, will no longer lead the cryptanalysis and the effort will no longer be crowdsourced. Instead, phase two of the audit will be handled by Cryptography Services, a team of consultants from iSEC Partners, Matasano, Intrepidus Group, and NCC Group.

The cost of professional crypto audits is usually very high, exceeding the US$70,000 the Open Crypto Audit Project raised through crowdfunding. To keep the price down, the project had to be flexible with its time frame and work around Cryptography Services' other engagements.

Originally posted here:
TrueCrypt audit back on track after silence and uncertainty

Prevoty releases cryptography service to make encryption easier for developers at no cost

Los Angeles, CA (PRWEB) February 18, 2015

Prevoty, Inc., a new security software company providing runtime application security technology, today announced availability of the Prevoty Cryptography Service (PCS), a new free service providing sophisticated encryption, decryption and hashing for developers.

Implementing and using cryptographic functions within a single software development project, let alone a business application, is a complicated and error-prone process. A developer ultimately has to weigh the merits of various algorithms, select a particular algorithm, ensure its implementation is verifiably correct and pass the correct arguments for execution.

This, combined with the realization that the average developer is unlikely to be aware of exactly how cryptographic functions actually work, has resulted in secure data not being properly encrypted or data breaches exposing plain-text passwords. For larger organizations, having different developers make these decisions independently compounds that risk.
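
To make the "weigh the merits, pick an algorithm, pass the correct arguments" point concrete, here is what a developer typically has to get right just to hash a password using Python's standard library. This is a generic illustration of the burden such a service aims to hide; it is not the Prevoty API.

```python
# Decisions the developer must make just to hash a password correctly:
# which KDF (PBKDF2-HMAC here), which hash (SHA-256), salt length and
# uniqueness, iteration count, and constant-time comparison on verify.
# Generic illustration only -- not the Prevoty Cryptography Service API.
import hashlib
import hmac
import os

def hash_password(password, salt=None, iterations=600_000):
    salt = salt if salt is not None else os.urandom(16)  # unique per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                                 salt, iterations)
    return digest, salt, iterations

def verify_password(password, digest, salt, iterations):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                                    salt, iterations)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

digest, salt, iters = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", digest, salt, iters)
assert not verify_password("wrong guess", digest, salt, iters)
```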

"Few will disagree that encrypting sensitive data is really important," said Jamil Farshchi, Chief Information Security Officer at Time Warner, Inc. "But a key challenge to realizing the value of encryption is standardizing the implementation and use of cryptographic functions across the entire suite of business applications."

PCS provides a cloud interface for developers to achieve industry-standard security. Applications enabled with the service have the ability to simply encrypt, decrypt, hash, and generate keys and random numbers in a manner that is both secure and verifiable.

"PCS enables robust cryptography to be used across enterprises without developers needing to become security experts," said Kunal Anand, co-founder and CTO of Prevoty. "Developers can trust that the supported cryptographic functions within PCS are always maintained and updated with the latest security guidance."

Applications developed in C#, Go, Java, node.js, PHP, Python and Ruby can take advantage of this service by including the appropriate Prevoty SDK and invoking the desired functions.

To reduce complexity, PCS has pre-built aliases for developers to accurately hash passwords and encrypt content without even having to specify an encryption key, cipherkey or initialization vector.

More information on PCS is available at https://www.prevoty.com/developer and developers can gain access to the Prevoty Cryptography Service for free at http://info.prevoty.com/free-developer-services.

More here:
Prevoty releases cryptography service to make encryption easier for developers at no cost