The Encrypting File System – technet.microsoft.com

By Roberta Bragg

An Overview of the Encrypting File System
What EFS Is
Basic How-tos
Planning for and Recovering Encrypted Files: Recovery Policy
How EFS Works
Key Differences Between EFS on Windows 2000, Windows XP, and Windows Server 2003
Misuse and Abuse of EFS and How to Avoid Data Loss or Exposure
Remote Storage of Encrypted Files Using SMB File Shares and WebDAV
Best Practices for SOHO and Small Businesses
Enterprise How-tos
Troubleshooting
Radical EFS: Using EFS to Encrypt Databases and Using EFS with Other Microsoft Products
Disaster Recovery
Overviews and Larger Articles
Summary

The Encrypting File System (EFS) is a component of the NTFS file system on Windows 2000, Windows XP Professional, and Windows Server 2003. (Windows XP Home doesn't include EFS.) EFS enables transparent encryption and decryption of files by using advanced, standard cryptographic algorithms. Any individual or program that doesn't possess the appropriate cryptographic key cannot read the encrypted data. Encrypted files can be protected even from those who gain physical possession of the computer that the files reside on. Even persons who are authorized to access the computer and its file system cannot view the data. While other defensive strategies should be used, and encryption isn't the correct countermeasure for every threat, encryption is a powerful addition to any defensive strategy. EFS is the built-in file encryption tool for Windows file systems.

However, every defensive weapon, if used incorrectly, carries the potential for harm. EFS must be understood, implemented appropriately, and managed effectively to ensure that your experience, the experience of those to whom you provide support, and the data you wish to protect aren't harmed. This document will

Provide an overview and pointers to resources on EFS.

Point to implementation strategies and best practices.

Name the dangers and counsel mitigation and prevention from harm.

Many online and published resources on EFS exist. The major sources of information are the Microsoft resource kits, product documentation, white papers, and Knowledge Base articles. This paper provides a brief overview of major EFS issues. Wherever possible, it doesn't rework existing documentation; rather, it provides links to the best resources. In short, it maps the list of desired knowledge and instruction to the actual documents where they can be found. In addition, the paper catalogs the key elements of large documents so that you'll be able to find the information you need without having to work your way through hundreds of pages of information each time you have a new question.

The paper discusses the following key EFS knowledge areas:

What EFS is

Basic how-tos, such as how to encrypt and decrypt files, recover encrypted files, archive keys, manage certificates, and back up files, and how to disable EFS

How EFS works and EFS architecture and algorithms

Key differences between EFS on Windows 2000, Windows XP, and Windows Server 2003

Misuse and abuse of EFS and how to avoid data loss or exposure

Remote storage of encrypted files using SMB file shares and WebDAV

Best practices for SOHO and small businesses

Enterprise how-tos: how to implement data recovery strategies with PKI and how to implement key recovery with PKI

Troubleshooting

Radical EFS: using EFS to encrypt databases and using EFS with other Microsoft products

Disaster recovery

Where to download EFS-specific tools

Using EFS requires only a few simple bits of knowledge. However, using EFS without knowledge of best practices and without understanding recovery processes can give you a mistaken sense of security, as your files might not be encrypted when you think they are, or you might enable unauthorized access by having a weak password or having made the password available to others. It might also result in a loss of data, if proper recovery steps aren't taken. Therefore, before using EFS you should read the information links in the section "Misuse and Abuse of EFS and How to Avoid Data Loss or Exposure." The knowledge in this section warns you where lack of proper recovery operations or misunderstanding can cause your data to be unnecessarily exposed. To implement a secure and recoverable EFS policy, you should have a more comprehensive understanding of EFS.

You can use EFS to encrypt files stored in the file system of Windows 2000, Windows XP Professional, and Windows Server 2003 computers. EFS isn't designed to protect data while it's transferred from one system to another. EFS uses symmetric (one key is used to encrypt the files) and asymmetric (two keys are used to protect the encryption key) cryptography. An excellent primer on cryptography is available in the Windows 2000 Resource Kit as is an introduction to Certificate Services. Understanding both of these topics will assist you in understanding EFS.

A solid overview of EFS and a comprehensive collection of information on EFS in Windows 2000 are published in the Distributed Systems Guide of the Windows 2000 Server Resource Kit. This information, most of which resides in Chapter 15 of that guide, is published online at http://www.microsoft.com/technet/prodtechnol/windows2000serv/reskit/default.mspx. (On this site's page, use the TOC to go to the Distributed Systems Guide, Distributed Security, Encrypting File System.)

There are differences between EFS in Windows 2000, Windows XP Professional, and Windows Server 2003. The Windows XP Professional Resource Kit explains the differences between Windows 2000 and Windows XP Professional's implementation of EFS, and the document "Encrypting File System in Windows XP and Windows Server 2003" (http://www.microsoft.com/technet/prodtechnol/winxppro/deploy/cryptfs.mspx) details Windows XP and Windows Server 2003 modifications. The section below, "Key Differences between EFS on Windows 2000, Windows XP, and Windows Server 2003," summarizes these differences.

The following are important basic facts about EFS:

EFS encryption doesn't occur at the application level but rather at the file-system level; therefore, the encryption and decryption process is transparent to the user and to the application. If a folder is marked for encryption, every file created in or moved to the folder will be encrypted. Applications don't have to understand EFS or manage EFS-encrypted files any differently than unencrypted files. If a user attempts to open a file and possesses the key to do so, the file opens without additional effort on the user's part. If the user doesn't possess the key, they receive an "Access denied" error message.

File encryption uses a symmetric key, which is then itself encrypted with the public key of a public key encryption pair. The related private key must be available in order for the file to be decrypted. This key pair is bound to a user identity and made available to the user who has possession of the user ID and password. If the private key is damaged or missing, even the user that encrypted the file cannot decrypt it. If a recovery agent exists, then the file may be recoverable. If key archival has been implemented, then the key may be recovered, and the file decrypted. If not, the file may be lost. EFS is an excellent file encryption system; there is no "back door." (A minimal sketch of this symmetric-plus-asymmetric pattern appears after this list.)

File encryption keys can be archived (e.g. exported to a floppy disk) and kept in a safe place to ensure recovery should keys become damaged.

EFS keys are protected by the user's password. Any user who can obtain the user ID and password can log on as that user and decrypt that user's files. Therefore, a strong password policy as well as strong user education must be a component of each organization's security practices to ensure the protection of EFS-encrypted files.

EFS-encrypted files don't remain encrypted during transport if saved to or opened from a folder on a remote server. The file is decrypted, traverses the network in plaintext, and, if saved to a folder on the local drive that's marked for encryption, is encrypted locally. EFS-encrypted files can remain encrypted while traversing the network if they're being saved to a Web folder using WebDAV. This method of remote storage isn't available for Windows 2000.

EFS uses FIPS 140-evaluated Microsoft Cryptographic Service Providers (CSPs, components which contain encryption algorithms for Microsoft products).
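
To make the hybrid pattern in the list above concrete, the following sketch shows a random symmetric key encrypting file contents while an RSA public key wraps that symmetric key, roughly mirroring how EFS stores the FEK in a Data Decryption Field. It is an illustration only, written in Python with the third-party cryptography package; it is not EFS's actual on-disk format, algorithms, or API.

    # Minimal sketch of the hybrid pattern described above: a random symmetric
    # key (the "FEK") encrypts the file contents, and the user's RSA public key
    # wraps that FEK. Uses the third-party "cryptography" package; this is NOT
    # EFS's actual on-disk format or API, only an illustration of the idea.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # User's key pair (EFS ties these to the user's EFS certificate and profile).
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    # Encrypt the file with a random symmetric key.
    fek = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(fek).encrypt(nonce, b"sensitive file contents", None)

    # Wrap the FEK with the public key (analogous to a Data Decryption Field).
    oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    wrapped_fek = public_key.encrypt(fek, oaep)

    # Only the holder of the matching private key can unwrap the FEK and decrypt.
    recovered_fek = private_key.decrypt(wrapped_fek, oaep)
    assert AESGCM(recovered_fek).decrypt(nonce, ciphertext, None) == b"sensitive file contents"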

EFS functionality is straightforward, and you can find step-by-step instructions in many documents online. Links to specific articles for each possible EFS function, as well as some documents which summarize multiple functionality, follow. If the document is a Knowledge Base article, the Knowledge Base number appears in parentheses after the article title.

Encrypting and Decrypting

The process of encrypting and decrypting files is very straightforward, but it's important to decide what to encrypt and to note differences in EFS based on the operating system.

Sharing Encrypted Files

The GUI for sharing encrypted files is available only in Windows XP and Windows Server 2003.

A recovery policy can be an organization's security policy instituted to plan for proper recovery of encrypted files. It's also the policy enforced by Local Security Policy Public Key Policy or Group Policy Public Key Policy. In the latter, the recovery policy specifies how encrypted files may be recovered should the user private key be damaged or lost and the encrypted file unharmed. Recovery certificate(s) are specified in the policy. Recovery can be either data recovery (Windows 2000, Windows XP Professional, and Windows Server 2003) or key recovery (Windows Server 2003 with Certificate Services). Windows 2000 EFS requires the presence of a recovery agent (no recovery agent, no file encryption), but Windows XP and Windows Server 2003 don't. By default, Windows 2000 and Windows Server 2003 have default recovery agents assigned. Windows XP Professional doesn't.

The data recovery process is simple. The user account bound to the recovery agent certificate is used to decrypt the file. The file should then be delivered in a secure manner to the file owner, who may then encrypt the file. Recovery via automatically archived keys is available only with Windows Server 2003 Certificate Services. Additional configuration beyond the installation of Certificate Services is required. In either case, it's most important that a written policy and procedures for recovery are in place. These procedures, if well written and if followed, can ensure that recovery keys and agents are available for use and that recovery is securely carried out. Keep in mind that there are two definitions for "recovery policy." The first definition refers to a written recovery policy and procedures that describe the who, what, where, and when of recovery, as well as what steps should be taken to ensure recovery components are available. The second definition, which is often referred to in the documents below, is the Public Key Policy that's part of the Local Security Policy on stand-alone systems, or Group Policy in a domain. It can specify which certificates are used for recovery, as well as other aspects of Public Key Policies in the domain. You can find more information in the following documents:

Disabling or Preventing Encryption

You may decide that you don't wish users to have the ability to encrypt files. By default, they do. You may decide that specific folders shouldn't contain encrypted files. You may also decide to disable EFS until you can implement a sound EFS policy and train users in proper procedures. There are different ways of disabling EFS depending on the operating system and the desired effect:

System folders cannot be marked for encryption. EFS keys aren't available during the boot process; thus, if system files were encrypted, the system couldn't boot. To prevent other folders from being marked for encryption, you can mark them as system folders. If this isn't possible, then a method to prevent encryption within a folder is defined in "Encrypting File System."

NT 4.0 doesn't have the ability to use EFS. If you need to disable EFS for Windows 2000 computers joined to a Windows NT 4.0 domain, see "Need to Turn Off EFS on a Windows 2000-Based Computer in Windows NT 4.0-Based Domain" (288579). The registry key mentioned can also be used to disable EFS in Windows XP Professional and Windows Server 2003 (a sketch of setting this value programmatically appears at the end of this list).

Disabling EFS for Windows XP Professional can also be done by clearing the checkbox for the property page of the Local Security Policy Public Key Policy. EFS can be disabled on Windows XP and Windows Server 2003 computers joined to a Windows Server 2003 domain by clearing the checkbox for the property pages of the domain or organizational unit (OU) Group Policy Public Key Policy.

"HOW TO: Disable/Enable EFS on a Stand-Alone Windows 2000-Based Computer" (243035) details how to save the recovery agent's certificate and keys when disabling EFS so that you can enable EFS at a future date.

"HOW TO: Disable EFS for All Computers in a Windows 2000-Based Domain" (222022) provides the best instruction set and clearly defines the difference between deleted domain policy (an OU-based policy or Local Security Policy can exist) versus Initialize Empty Policy (no Windows 2000 EFS encryption is possible throughout the domain).

Special Operations

Let enough people look at anything, and you'll find there are questions that are just not answered by existing documentation or options. A number of these questions, along with third-party considerations and post-introduction issues, can be resolved by reviewing the following articles.

Specifications for the use of a third-party Certification Authority (CA) can be found at "Third-Party Certification Authority Support for Encrypting File System" (273856). If you wish to use third-party CA certificates for EFS, you should also investigate certificate revocation processing. Windows 2000 EFS certificates aren't checked for revocation. Windows XP and Windows Server 2003 EFS certificates are checked for revocation in some cases, and third-party certificates may be rejected. Information about certificate revocation handling in EFS can be found in the white paper "Encrypting File System in Windows XP and Windows Server 2003".

When an existing plaintext file is marked for encryption, it's first copied to a temporary file. When the process is complete, the temporary file is marked for deletion, which means portions of the original file may remain on the disk and could potentially be accessible via a disk editor. These bits of data, referred to as data shreds or remanence, may be permanently removed by using a revised version of the cipher.exe tool. The tool is part of Service Pack 3 (SP3) for Windows 2000 and is included in Windows Server 2003. Instructions for using the tool, along with the location of a downloadable version, can be found in "HOW TO: Use Cipher.exe to Overwrite Deleted Data in Windows" (315672) and in "Cipher.exe Security Tool for the Encrypting File System" (298009). (A short scripted example of the free-space wipe appears at the end of this section.)

How to make encrypted files display in green in Windows Explorer is explained in "HOW TO: Identify Encrypted Files in Windows XP" (320166).

"How to Enable the Encryption Command on the Shortcut Menu" (241121) provides a registry key to modify for this purpose.

You may wish to protect printer spool files or hard copies of encrypted files while they're printing. Encryption is transparent to the printing process. If you have the right (possess the key) to decrypt the file and a method exists for printing files, the file will print. However, two issues should concern you. First, if the file is sensitive enough to encrypt, how will you protect the printed copy? Second, the spool file resides in the system32\Spool\Printers folder. How can you protect it while it's there? You could encrypt that folder, but that would slow printing enormously. The Windows 2000 Resource Kit proposes a separate printer for printing these files and describes how best to secure that printer in the Distributed Systems Guide, Distributed Security, Encrypting File System, Printing EFS Files section.
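
The remanence cleanup mentioned earlier in this section is usually run by hand, but it can also be scripted. The sketch below simply invokes the cipher.exe free-space wipe documented in KB 315672; the folder name is a hypothetical placeholder, and the wipe applies to unused space on the volume containing that folder, not to existing files.

    # Sketch: run the cipher.exe free-space wipe described in KB 315672.
    # /W scrubs unused space on the volume that holds the named folder;
    # it does not modify existing files. The folder below is hypothetical.
    import subprocess

    target = r"C:\SensitiveData"
    subprocess.run(["cipher.exe", f"/W:{target}"], check=True)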

To understand EFS, and therefore anticipate problems, envision potential attacks, and troubleshoot and protect EFS-encrypted files, you should understand the architecture of EFS and the basic encryption, decryption, and recovery algorithms. Much of this information is in the Windows 2000 Resource Kit Distributed Systems Guide, the Windows XP Professional Resource Kit, and the white paper, "Encrypting File System in Windows XP and Windows Server 2003." Many of the algorithms are also described in product documentation. The examples that follow are from the Windows XP Professional Resource Kit:

A straightforward discussion of the components of EFS, including the EFS service, EFS driver, and the File System Run Time Library, is found in "Components of EFS," a subsection of Chapter 17, "Encrypting File System" in the Windows XP Professional Resource Kit.

A description of the encryption, decryption, and recovery algorithms EFS uses is in the Resource Kit section "How Files Are Encrypted." This section includes a discussion of the file encryption keys (FEKs) and file Data Recovery Fields and Data Decryption Fields used to hold FEKs encrypted by user and recovery agent public keys.

"Working with Encryption" includes how-to steps that define the effect of decisions made about changing the encryption properties of folders. The table defines what happens for each file (present, added later, or copied to the folder) for the choice "This folder only" or the option "This folder, subfolders and files."

"Remote EFS Operations on File Shares and Web Folders" defines what happens to encrypted files and how to enable remote storage.

EFS was introduced in Windows 2000. However, there are differences when compared with Windows XP Professional EFS and Windows Server 2003 EFS, including the following:

You can authorize additional users to access encrypted files (see the section "Sharing Encrypted Files", above). In Windows 2000, you can implement a programmatic solution for the sharing of encrypted files; however, no interface is available. Windows XP and Windows Server 2003 have this interface.

Offline files can be encrypted. See "HOW TO: Encrypt Offline Files to Secure Data in Windows XP."

Data recovery agents are recommended but optional. XP doesn't automatically include a default recovery agent. XP will take advantage of an existing Windows 2000 domain-level recovery agent if one is present, but the lack of a domain recovery agent won't prevent encryption of files on an XP system. A self-signed recovery agent certificate can be requested by using the cipher /R:filename command, where filename is the name that will be used to create a *.cer file to hold the certificate and a *.pfx file to hold the certificate and private key.

The Triple DES (3DES) encryption algorithm can be used to replace the default DESX algorithm, and beginning with Windows XP SP1, the Advanced Encryption Standard (AES) becomes the default encryption algorithm for EFS.

For Windows XP and Windows Server 2003 local accounts, a password reset disk can be used to safely reset a user's password. (Domain passwords cannot be reset using the disk.) If an administrator uses the "reset password" option from the user's account in the Computer Management console users container, EFS files won't be accessible. If users change the password back to the previous password, they can regain access to encrypted files. To create a password reset disk and for instructions about how to use a password reset disk, see product documentation and/or the article "HOW TO: Create and Use a Password Reset Disk for a Computer That Is Not a Domain Member in Windows XP" (305478).

Encrypted files can be stored in Web folders. The Windows XP Professional Resource Kit section "Remote EFS Operations in a Web Folder Environment" explains how.

Windows Server 2003 incorporates the changes introduced in Windows XP Professional and adds the following:

A default domain Public Key recovery policy is created, and a recovery agent certificate is issued to the Administrator account.

Certificate Services include the ability for customization of certificate templates and key archival. With appropriate configuration, archival of user EFS keys can be instituted and recovery of EFS-encrypted files can be accomplished by recovering the user's encryption keys instead of decrypting via a file recovery agent. A walk-through providing a step-by-step configuration of Certificate Services for key archival is available in "Certificate Services Example Implementation: Key Archival and Recovery."

Windows Server 2003 enables users to back up their EFS key(s) directly from the command line and from the details property page by clicking a "Backup Keys" button.

Unauthorized persons may attempt to obtain the information encrypted by EFS. Sensitive data may also be inadvertently exposed. Two possible causes of data loss or exposure are misuse (improper use of EFS) or abuse (attacks mounted against EFS-encrypted files or systems where EFS-encrypted files exist).

Inadvertent Problems Due to Misuse

Several issues can cause problems when using EFS. First, when improperly used, sensitive files may be inadvertently exposed. In many cases this is due to improper or weak security policies and a failure to understand EFS. The problem is made all the worse because users think their data is secure and thus may not follow usual precautionary methods. This can occur in several scenarios:

If, for example, users copy encrypted files to FAT volumes, the files will be decrypted and thus no longer protected. Because the user has the right to decrypt files that they encrypted, the file is decrypted and stored in plaintext on the FAT volume. Windows 2000 gives no warning when this happens, but Windows XP and Windows Server 2003 do provide a warning.

If users provide others with their passwords, these people can log on using these credentials and decrypt the user's encrypted files. (Once a user has successfully logged on, they can decrypt any files the user account has the right to decrypt.)

If the recovery agent's private key isn't archived and removed from the recovery agent profile, any user who knows the recovery agent credentials can log on and transparently decrypt any encrypted files.

By far, the most frequent problem with EFS occurs when EFS encryption keys and/or recovery keys aren't archived. If keys aren't backed up, they cannot be replaced when lost. If keys cannot be used or replaced, data can be lost. If Windows is reinstalled (perhaps as the result of a disk crash), the keys are destroyed. If a user's profile is damaged, then keys are destroyed. In these, or in any other cases in which keys are damaged or lost and backup keys are unavailable, then encrypted files cannot be decrypted. The encryption keys are bound to the user account, and a new iteration of the operating system means new user accounts. A new user profile means new user keys. If keys are archived, or exported, they can be imported to a new account. If a recovery agent for the files exists, then that account can be used to recover the files. However, in many cases in which keys are destroyed, both user and recovery keys are absent and there is no backup, resulting in lost data.

Additionally, many other smaller things may render encrypted files unusable or expose some sensitive data, such as the following:

Finally, keeping data secure takes more than simply encrypting files. A systems-wide approach to security is necessary. You can find several articles that address best practices for systems security on the TechNet Best Practices page at http://www.microsoft.com/technet/archive/security/bestprac/bpent/sec2/secentbb.mspx. The articles include

Attacks and Countermeasures: Additional Protection Mechanisms for Encrypted Files

Any user of encrypted files should recognize potential weaknesses and avenues of attack. Just as it's not enough to lock the front door of a house without considering back doors and windows as avenues for a burglar, encrypting files alone isn't enough to ensure confidentiality.

Use defense in depth and use file permissions. The use of EFS doesn't obviate the need to use file permissions to limit access to files. File permissions should be used in addition to EFS. If users have obtained encryption keys, they can import them to their account and decrypt files. However, if the user accounts are denied access to the file, the users will be foiled in their attempts to gain this sensitive information.

Use file permissions to deny delete. Encrypted files can be deleted. If attackers cannot decrypt the file, they may choose to simply delete it. While they don't have the sensitive information, you don't have your file.

Protect user credentials. If an attacker can discover the identity and password of a user who can decrypt a file, the attacker can log on as that user and view the files. Protecting these credentials is paramount. A strong password policy, user training on devising strong passwords, and best practices on protecting these credentials will assist in preventing this type of attack. An excellent best practices approach to password policy can be found in the Windows Server 2003 product documentation. If account passwords are compromised, anyone can log on using the user ID and password. Once users have successfully logged on, they can decrypt any files the user account has the right to decrypt. The best defense is a strong password policy, user education, and the use of sound security practices.

Protect recovery agent credentials. Similarly, if an attacker can log on as a recovery agent, and the recovery agent private key hasn't been removed, the attacker can read the files. Best practices dictate the removal of the recovery agent keys, the restriction of this account's usage to recovery work only, and the careful protection of credentials, among other recovery policies. The sections about recovery and best practices detail these steps.

Seek out and manage areas where plaintext copies of the encrypted files or parts of the encrypted files may exist. If attackers have possession of, or access to, the computer on which encrypted files reside, they may be able to recover sensitive data from these areas, including the following:

Data shreds (remanence) that exist after encrypting a previously unencrypted file (see the "Special Operations" section of this paper for information about using cipher.exe to remove them)

The paging file (see "Increasing Security for Open Encrypted Files," an article in the Windows XP Professional Resource Kit, for instructions and additional information about how to clear the paging file on shutdown; a sketch of the related registry setting appears after this list)

Hibernation files (see "Increasing Security for Open Encrypted Files" at http://technet.microsoft.com/library/bb457116.aspx)

Temporary files (to resolve this issue, determine where applications store temporary files and encrypt those folders as well)

Printer spool files (see the "Special Operations" section)

Provide additional protection by using the System Key. Using Syskey provides additional protection for password values and values protected in the Local Security Authority (LSA) Secrets (such as the master key used to protect user's cryptographic keys). Read the article "Using the System Key" in the Windows 2000 Resource Kit's Encrypting File System chapter. A discussion of the use of Syskey, and possible attacks against a Syskey-protected Windows 2000 computer and countermeasures, can be found in the article "Analysis of Alleged Vulnerability in Windows 2000 Syskey and the Encrypting File System."
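
One of the countermeasures referenced in the list above, clearing the paging file at shutdown, is exposed through Local Security Policy and through a registry value. The following Python sketch sets that value with the standard winreg module; run it elevated, note that the change takes effect only after a reboot, and treat the exact value name as belonging to the Resource Kit article cited above.

    # Hedged sketch: enable clearing of the virtual memory pagefile at shutdown
    # by setting ClearPageFileAtShutdown = 1. Run elevated; effective after reboot.
    # The same setting is exposed through Local Security Policy.
    import winreg

    MM_KEY = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

    with winreg.OpenKeyEx(winreg.HKEY_LOCAL_MACHINE, MM_KEY, 0,
                          winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "ClearPageFileAtShutdown", 0, winreg.REG_DWORD, 1)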

If your policy is to require that data is stored on file servers, not on desktop systems, you will need to choose a strategy for doing so. Two possibilities exist: either storage in normal shared folders on file servers or the use of Web folders. Both methods require configuration, and you should understand their benefits and risks.

If encrypted files are going to be stored on a remote server, the server must be configured to do so, and an alternative method, such as IP Security (IPSec) or Secure Sockets Layer (SSL), should be used to protect the files during transport. Instructions for configuring the server are discussed in "Recovery of Encrypted Files on a Server" (283223) and "HOW TO: Encrypt Files and Folders on a Remote Windows 2000 Server" (320044). However, the latter doesn't mention a critical step, which is that the remote server must be trusted for delegation in Active Directory. Quite a number of articles can be found, in fact, that leave out this step. If the server isn't trusted for delegation in Active Directory, and a user attempts to save the file to the remote server, an "Access Denied" error message will be the result.

If you need to store encrypted files on a remote server in plaintext (local copies are kept encrypted), you can. The server must, however, be configured to make this happen. You should also realize that once the server is so configured, no encrypted files can be stored on it. See the article "HOW TO: Prevent Files from Being Encrypted When Copied to a Server" (302093).

You can store encrypted files in Web folders when using Windows XP or Windows Server 2003. The Windows XP Professional Resource Kit section "Remote EFS Operations in a Web Folder Environment" explains how.

If your Web applications need to require authentication to access EFS files stored in a Web folder, the code for using a Web folder to store EFS files and require authentication to access them is detailed in "HOW TO: Use Encrypting File System (EFS) with Internet Information Services" (243756).

Once you know the facts about EFS and have decided how you are going to use it, you should use these documents as a checklist to determine that you have designed the best solution.

By default, EFS certificates are self-signed; that is, users don't need to obtain certificates from a CA. When a user first encrypts a file, EFS looks for the existence of an EFS certificate. If one isn't found, it looks for the existence of a Microsoft Enterprise CA in the domain. If a CA is found, a certificate is requested from the CA; if it isn't, a self-signed certificate is created and used. However, more granular control of EFS, including EFS certificates and EFS recovery, can be established if a CA is present. You can use Windows 2000 or Windows Server 2003 Certificate Services. The following articles explain how.
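
The selection order just described can be summarized as a small decision routine. The sketch below only mirrors that order of preference; the three helper functions are hypothetical placeholders standing in for Windows behavior, not real APIs.

    # Sketch of the EFS certificate-selection order described above.
    # The helper functions are hypothetical placeholders, not Windows APIs.
    def existing_efs_certificate(user):
        return None  # placeholder: look in the user's personal certificate store

    def certificate_from_enterprise_ca(user):
        return None  # placeholder: enroll with a Microsoft Enterprise CA, if one exists

    def self_signed_certificate(user):
        return f"self-signed EFS certificate for {user}"  # placeholder

    def select_efs_certificate(user):
        cert = existing_efs_certificate(user)
        if cert is None:
            cert = certificate_from_enterprise_ca(user)  # used only if a CA is found in the domain
        if cert is None:
            cert = self_signed_certificate(user)
        return cert

    print(select_efs_certificate("alice"))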

Troubleshooting EFS is easier if you understand how EFS works. There are also well known causes for many of the common problems that arise. Here are a few common problems and their solutions:

You changed your user ID and password and can no longer decrypt your files. There are two possible approaches to this problem, depending on what you did. First, if the user account was simply renamed and the password reset, the problem may be that you're using XP and this response is expected. When an administrator resets an XP user's account password, the account's association with the EFS certificate and keys is removed. Changing the password to the previous password can reestablish your ability to decrypt your files. For more information, see "User Cannot Gain Access to EFS Encrypted Files After Password Change or When Using a Roaming Profile" (331333), which explains how XP Professional encrypted files cannot be decrypted, even by the original account, if an administrator has changed the password. Second, if you truly have a completely different account (your account was damaged or accidentally deleted), then you must either import your keys (if you've exported them) or ask an administrator to use recovery agent keys (if implemented) to recover the files. Restoring keys is detailed in "HOW TO: Restore an Encrypting File System Private Key for Encrypted Data Recovery in Windows 2000" (242296). How to use a recovery agent to recover files is covered in "Five-Minute Security Advisor - Recovering Encrypted Data Using EFS."


FBI can't break the encryption on Texas shooter's smartphone


The Federal Bureau of Investigation has not been able to break the encryption on the phone owned by a gunman who killed 26 people in a Texas church on Sunday.

"We are unable to get into that phone," FBI Special Agent Christopher Combs said in a press conference yesterday (see video).

Combs declined to say what kind of phone was used by gunman Devin Kelley, who killed himself after the mass shooting. "I'm not going to describe what phone it is because I don't want to tell every bad guy out there what phone to buy, to harass our efforts on trying to find justice here," Combs said.

The phone is an iPhone, The Washington Post reported today:

After the FBI said it was dealing with a phone it couldn't open, Apple reached out to the bureau to learn if the phone was an iPhone and if the FBI was seeking assistance. Late Tuesday an FBI official responded, saying it was an iPhone but the agency was not asking anything of the company at this point. That's because experts at the FBI's lab in Quantico, Va., are trying to determine if there are other methods to access the phone's data, such as through cloud storage backups or linked laptops, these people said.

The US government has been calling on phone makers to weaken their devices' security, but companies have refused to do so. Last year, Apple refused to help the government unlock and decrypt the San Bernardino gunman's iPhone, but the FBI ended up paying hackers for a vulnerability that it used to access data on the device.

Deliberately weakening the security of consumer devices would help criminals target innocent people who rely on encryption to ensure their digital safety, Apple and others have said.

"With the advance of the technology in the phones and the encryptions, law enforcement, whether it's at the state, local, or the federal level, is increasingly not able to get into these phones," Combs said yesterday.

Combs said he has no idea how long it will take before the FBI can break the encryption. "I can assure you we are working very hard to get into the phone, and that will continue until we find an answer," he said. The FBI is also examining "other digital media" related to the gunman, he said.

There are currently "thousands of seized devices sit[ting] in storage, impervious to search warrants," Deputy Attorney General Rod Rosenstein said last month.


DOJ: Strong encryption that we don't have access to is …

US Deputy Attorney General Rod Rosenstein delivers remarks at the 65th Annual Attorney General's Awards Ceremony at the Daughters of the American Revolution Constitution Hall October 25, 2017 in Washington, DC.

Just two days after the FBI said it could not get into the Sutherland Springs shooter's seized iPhone, Politico Pro published a lengthy interview with a top Department of Justice official who has become the "government's unexpected encryption warrior."

According to the interview, which was summarized and published in transcript form on Thursday for subscribers of the website, Deputy Attorney General Rod Rosenstein indicated that the showdown between the DOJ and Silicon Valley is quietly intensifying.

"We have an ongoing dialogue with a lot of tech companies in a variety of different areas," he told Politico Pro. "There's some areas where they are cooperative with us. But on this particular issue of encryption, the tech companies are moving in the opposite direction. They're moving in favor of more and more warrant-proof encryption."

While the battle against encryption has been going on within federal law enforcement circles since at least the early 1990s, Rosenstein has been the most outspoken DOJ official on this issue in recent months.

The DOJ's number two has given multiple public speeches in which he has called for "responsible encryption." The interview with Politico Pro represents the clearest articulation of the DOJ's position on this issue, and it suggests that a redux of the 2016 FBI v. Apple showdown is inevitable in the near future.

"I want our prosecutors to know that, if there's a case where they believe they have an appropriate need for information and there is a legal avenue to get it, they should not be reluctant to pursue it," Rosenstein said. "I wouldn't say we're searching for a case. I'd say were receptive, if a case arises, that we would litigate."

What Rosenstein didn't note, however, is that the DOJ and its related agencies, including the FBI, are not taking encryption lying down.

The FBI maintains an office, known as the National Domestic Communications Assistance Center (NDCAC), which actively provides technical assistance to local law enforcement in high-profile cases.

In its most recently published minutes from May 2017, the NDCAC said that one of its goals is to make such commercial tools, like Cellebrite's services, "more widely available" to state and local law enforcement. Earlier this year, the NDCAC provided money to Miami authorities to pay Cellebrite to successfully get into a seized iPhone in a local sextortion case.

In the interview, Rosenstein also said he "favors strong encryption."

"I favor strong encryption, because the stronger the encryption, the more secure data is against criminals who are trying to commit fraud," he explained. "And I'm in favor of that, because that means less business for us prosecuting cases of people who have stolen data and hacked into computer networks and done all sorts of damage. So I'm in favor of strong encryption."

"This is, obviously, a related issue, but it's distinct, which is, what about cases where people are using electronic media to commit crimes? Having access to those devices is going to be critical to have evidence that we can present in court to prove the crime. I understand why some people merge the issues. I understand that they're related. But I think logically, we have to look at these differently. People want to secure their houses, but they still need to get in and out. Same issue here."

He later added that the "absolutist position" that strong encryption should be, by definition, unbreakable is "unreasonable."

"And I think it's necessary to weigh law enforcement equities in appropriate cases against the interest in security," he said.

The DOJ's position runs counter to the consensus of information security experts, who say that it is impossible to build the strongest encryption system possible that would also allow the government access under certain conditions.

"Of course, criminals and terrorists have used, are using, and will use encryption to hide their planning from the authorities, just as they will use many aspects of society's capabilities and infrastructure: cars, restaurants, telecommunications," Bruce Schneier, a well-known cryptographer, wrote last year.

"In general, we recognize that such things can be used by both honest and dishonest people. Society thrives nonetheless because the honest so outnumber the dishonest. Compare this with the tactic of secretly poisoning all the food at a restaurant. Yes, we might get lucky and poison a terrorist before he strikes, but we'll harm all the innocent customers in the process. Weakening encryption for everyone is harmful in exactly the same way."

Rosenstein closed his interview by noting that he understands re-engineering encryption to accommodate government may make it weaker.

"And I think that's a legitimate issue that we can debatehow much risk are we willing to take in return for the reward?" he said.

"My point is simply that I think somebody needs to consider what's on the other side of the balance. There is a cost to having impregnable security, and we've talked about some of the aspects of that. The cost is that criminals are going to be able to get away with stuff, and that's going to prevent us in law enforcement from holding them accountable."


Trump's DOJ tries to rebrand weakened encryption as …

A high-ranking Department of Justice official took aim at encryption of consumer products today, saying that encryption creates "law-free zones" and should be scaled back by Apple and other tech companies. Instead of encryption that can't be broken, tech companies should implement "responsible encryption" that allows law enforcement to access data, he said.

"Warrant-proof encryption defeats the constitutional balance by elevating privacy above public safety," Deputy Attorney General Rod Rosenstein said in a speech at the US Naval Academy today (transcript). "Encrypted communications that cannot be intercepted and locked devices that cannot be opened are law-free zones that permit criminals and terrorists to operate without detection by police and without accountability by judges and juries."

Rosenstein was nominated by President Donald Trump to be the DOJ's second-highest-ranking official, after Attorney General Jeff Sessions. He was confirmed by the Senate in April.

Rosenstein's speech makes several references to Apple, continuing a battle over encryption between Apple and the US government that goes back to the Obama administration. Last year, Apple refused to help the government unlock and decrypt the San Bernardino gunman's iPhone, but the FBI ended up paying hackers for a vulnerability that it used to access data on the device.

"Fortunately, the government was able to access data on that iPhone without Apple's assistance," Rosenstein said. "But the problem persists. Today, thousands of seized devices sit in storage, impervious to search warrants."

"If companies are permitted to create law-free zones for their customers, citizens should understand the consequences," he also said. "When police cannot access evidence, crime cannot be solved. Criminals cannot be stopped and punished."

We asked Apple for a response to Rosenstein's speech and will update this story if we get one.

Separately, state lawmakers in New York and California have proposed legislation to prohibit the sale of smartphones with unbreakable encryption.

Despite his goal of giving law enforcement access to encrypted data on consumer products, Rosenstein acknowledged the importance of encryption to the security of computer users. He said that "encryption is a foundational element of data security and authentication," that "it is essential to the growth and flourishing of the digital economy," and that "we in law enforcement have no desire to undermine it."

But Rosenstein complained that "mass-market products and services incorporating warrant-proof encryption are now the norm," that instant-messaging service encryption cannot be broken by police, and that smartphone makers have "engineer[ed] away" the ability to give police access to data.

Apple CEO Tim Cook has argued in the past that the intentional inclusion of vulnerabilities in consumer products wouldn't just help law enforcement solve crimes; it would also help criminals hack everyday people who rely on encryption to ensure their digital safety.

Rosenstein claimed that this problem can be solved with "responsible encryption." He said:

Responsible encryption is achievable. Responsible encryption can involve effective, secure encryption that allows access only with judicial authorization. Such encryption already exists. Examples include the central management of security keys and operating system updates; the scanning of content, like your e-mails, for advertising purposes; the simulcast of messages to multiple destinations at once; and key recovery when a user forgets the password to decrypt a laptop.

No one calls any of those functions a "back door." In fact, those capabilities are marketed and sought out by many users.

It's not clear exactly how Rosenstein would implement his desired responsible encryption.

Rosenstein's"key recovery when a user forgets the password to decrypt a laptop" reference seems to refer to Apple and Microsoft providing the ability to store recovery keys in the cloud. But users who encrypt Mac or Windows laptops aren't required to do thisthey can store the keys locally only if they prefer. To guarantee law enforcement access in this scenario, people who encrypt laptops would have to be forced to store their keys in the cloud. Alternatively, Apple and Microsoft would have to change the way their disk encryption systems work, overriding the consumer's preference to have an encrypted system that cannot be accessed by anyone else.

Rosenstein gave some further insight into how "responsible encryption" might work in this section of his speech:

We know from experience that the largest companies have the resources to do what is necessary to promote cybersecurity while protecting public safety. A major hardware provider, for example, reportedly maintains private keys that it can use to sign software updates for each of its devices. That would present a huge potential security problem, if those keys were to leak. But they do not leak, because the company knows how to protect what is important. Companies can protect their ability to respond to lawful court orders with equal diligence.

Of course, there are many examples of companies leaking sensitive data due to errors or serious vulnerabilities. The knowledge that errors will happen at some point explains why technology companies take so many precautions to protect customer data. Maintaining a special system that lets third parties access data that would otherwise only be accessible by its owner increases the risk that sensitive data will get into the wrong hands.

Rosenstein claimed that "responsible encryption can protect privacy and promote security without forfeiting access for legitimate law enforcement needs supported by judicial approval." But he doubts that tech companies will do so unless forced to:

Technology companies almost certainly will not develop responsible encryption if left to their own devices. Competition will fuel a mindset that leads them to produce products that are more and more impregnable. That will give criminals and terrorists more opportunities to cause harm with impunity.

"Allow me to conclude with this thought," Rosenstein said just before wrapping up his speech. "There is no constitutional right to sell warrant-proof encryption. If our society chooses to let businesses sell technologies that shield evidence even from court orders, it should be a fully-informed decision."


The myth of responsible encryption: Experts say it can’t work

Governments want tech companies to create a master key that only law enforcement can use. Security experts say it's a fantasy.

Governments want to have their cake and eat it too.

Many support a concept called responsible encryption, which, the idea goes, would provide complete privacy and security for people while also letting law enforcement see encrypted messages to better protect you.

Sounds fantastic, right? Unfortunately, security specialists say it's a paradox.

Yet the concept continues to rear its head. The most recent responsible-encryption advocate is US Deputy Attorney General Rod Rosenstein. During a speech to the US Naval Academy on Tuesday, Rosenstein called out tech companies for refusing to help with uncovering private messages.

"Responsible encryption can protect privacy and promote security without forfeiting access for legitimate law enforcement needs supported by judicial approval," he said, according to a transcript.

Rosenstein isn't alone. Officials in Australia and the UK have also called for responsible encryption, despite the fact that both governments have suffered major breaches that shatter the concept.

Responsible encryption, according to the lawmakers who demand it, would require companies to create a secret key, or back door, that would make it possible to read coded data. Only the government could access the key, so that with the proper warrant or court order, law enforcement could read through messages. The key would be kept secret -- unless hackers stole it in a breach.

Companies like Apple, WhatsApp and Signal provide end-to-end encryption, meaning people can chat privately, with their messages hidden even from the companies themselves. Such encryption means that only you and the person to whom you sent your messages can read them, since no one else has a key to unlock the code.

End-to-end encryption provides security and privacy for people who want to make sure no one's spying on their messages -- a desire some would call modest in an age of mass surveillance. Governments around the world have a problem with that though.

Rosenstein instead sees a future where companies keep their data encrypted, unless the government needs data to investigate a crime or a potential terrorist attack. It's the same rallying cry UK Prime Minister Theresa May made after a June 4 terrorist attack that took place on the London Bridge. May blamed encryption for providing a safe space for extremists.

Rosenstein uses password recovery and email scanning as examples of responsible encryption. But neither of those involve end-to-end encryption. He references an unnamed "major hardware provider," which "maintains private keys it can use to sign software updates for each of its devices." And then he touches on a major problem with responsible encryption: Creating a back door for police also means creating an opening for hackers.

"That would present a huge potential security problem, if those keys were to leak," Rosenstein said. "But they do not leak, because the company knows how to protect what is important."

Except these important files have leaked on multiple occasions, including from the US government itself.

Adobe accidentally released its private key on its security blog in September. In 2011, RSA's SecurID authentication tokens were stolen. The notorious malware Stuxnet used stolen encryption keys to install itself. The US National Security Agency has fallen victim to multiple breaches, from Russian spies stealing its secrets to the Shadow Brokers hacker group selling the agency's tools.

"When the companies have the keys, they can be stolen," said security researcher Jake Williams, founder of cybersecurity provider Rendition Infosec. "Law enforcement calls [end-to-end encryption] 'warrant proof crypto,' but many companies will tell you they're not trying to dodge a warrant, they're just doing what's right for security."

It's why Apple refused to create a back door for the FBI in 2016, when the agency wanted to crack into an iPhone belonging to one of the shooters in the San Bernardino terror attack. Apple CEO Tim Cook said last year that the back door is "the equivalent of cancer," arguing that the master key could be stolen and abused by hackers, like it had been in previous cases.

It's unclear why Rosenstein seems to think encryption keys can't be stolen. The Justice Department confirmed Rosenstein's comments and declined to elaborate.

The call for encryption loopholes has alarmed the security community, which says it's experiencing deja vu.

"I think it's extremely concerning that the man responsible for prosecuting crimes on the federal level would expect the invasion of everyone's privacy simply to make law enforcement's job easier," said Mike Spicer, an expert and founder of the security company Initec.

The myth resurfaces nearly every year, said Eva Galperin, the cybersecurity director at the Electronic Frontier Foundation, a digital-rights group. Every time, the EFF slams the demand, saying it's a "zombie argument."

"Calling it responsible encryption is hypocritical," Galperin said. "Building insecurity in your encryption is irresponsible."

Visit link:
The myth of responsible encryption: Experts say it can't work

Encryption Definition – Tech Terms

Encryption is the process of converting data to an unrecognizable or "encrypted" form. It is commonly used to protect sensitive information so that only authorized parties can view it. This includes files and storage devices, as well as data transferred over wireless networks and the Internet.

You can encrypt a file, folder, or an entire volume using a file encryption utility such as GnuPG or AxCrypt. Some file compression programs like Stuffit Deluxe and 7-Zip can also encrypt files. Even common programs like Adobe Acrobat and Intuit TurboTax allow you to save password-protected files, which are saved in an encrypted format.

Encryption is also used to secure data sent over wireless networks and the Internet. For example, many Wi-Fi networks are secured using WEP or the much stronger WPA encryption. You must enter a password (and sometimes a username) to connect to a secure Wi-Fi network, but once you are connected, all the data sent between your device and the wireless router will be encrypted.

Many websites and other online services encrypt data transmissions using SSL. Any website that begins with "https://," for example, uses the HTTPS protocol, which encrypts all data sent between the web server and your browser. SFTP, which is a secure version of FTP, encrypts all data transfers.

There are many different types of encryption algorithms, but some of the most common ones include AES (Advanced Encryption Standard), DES (Data Encryption Standard), Blowfish, RSA, and DSA (Digital Signature Algorithm). While most encryption methods are sufficient for securing your personal data, if security is extremely important, it is best to use a modern algorithm like AES with 256-bit encryption.
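
As a concrete illustration of that last recommendation, here is a minimal example of authenticated AES encryption with a 256-bit key, written in Python with the third-party cryptography package (one of many libraries that could be used):

    # Minimal AES-256-GCM example using the third-party "cryptography" package.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)  # 256-bit key, as recommended above
    nonce = os.urandom(12)                     # never reuse a nonce with the same key

    ciphertext = AESGCM(key).encrypt(nonce, b"my sensitive data", None)
    assert AESGCM(key).decrypt(nonce, ciphertext, None) == b"my sensitive data"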

Updated: November 11, 2014

This page contains a technical definition of Encryption. It explains in computing terminology what Encryption means and is one of many technical terms in the TechTerms dictionary.



Hacker Lexicon: What Is End-to-End Encryption? | WIRED

TL;DR:

End-to-end encryption is a system of communication where the only people who can read the messages are the people communicating. No eavesdropper can access the cryptographic keys needed to decrypt the conversation, not even a company that runs the messaging service.

Plenty of companies brag that their communications app is encrypted. But that marketing claim demands a follow-up question: Who has the key? In many cases, the company itself holds the cryptographic key data that lets it decrypt your messages, and so, therefore, does any hacker who compromises the company or government official standing over its shoulder.

But increasingly, privacy-conscious communications tools are rolling out a feature known as "end-to-end encryption." That end-to-end promise means that messages are encrypted in a way that allows only the unique recipient of a message to decrypt it, and not anyone in between. In other words, only the endpoint computers hold the cryptographic keys, and the company's server acts as an illiterate messenger, passing along messages that it can't itself decipher.

That notion of the decryption key never leaving the user's device might seem like a paradox. If the company's server can never see the key, then how does it get onto the device when the user installs the app in the first place?

The answer is possible because of another crypto trick known as public-key encryption. In public key crypto systems, a program on your computer mathematically generates a pair of keys. One, called the private key or secret key, is used for decrypting messages sent to you and never leaves your device. The other, called the public key, is used for encrypting messages that are sent to you, and it's designed so that only the corresponding private key can decrypt those messages. That key can be shared with anyone who wants to encrypt a message to you. Think of the system like a lockbox on your doorstep for the UPS delivery man: anyone with your public key can put something in the box and lock it, but only you have the private key to unlock it.
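
The lockbox analogy maps directly onto a public-key encryption API. A minimal sketch in Python with the third-party cryptography package follows; real end-to-end messengers layer ephemeral keys, ratcheting, and authentication on top of this basic primitive.

    # Sketch of the "lockbox" idea: anyone with the public key can seal a
    # message, only the private-key holder can open it. Illustration only.
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)  # never leaves the device
    public_key = private_key.public_key()                                         # shared freely

    oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    sealed = public_key.encrypt(b"meet at noon", oaep)           # sender locks the box
    assert private_key.decrypt(sealed, oaep) == b"meet at noon"  # only the recipient can unlock it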

The first free, widely used end-to-end encrypted messaging software was PGP, or Pretty Good Privacy, a program coded by Phil Zimmermann and released in 1991. But it's taken decades for that complete encryption tunnel to reach the masses. Programs like the "Off The Record" plugin for Jabber instant-messaging applications and TextSecure for text messaging have made end-to-end encryption far easier to use. Apple uses a form of end-to-end encryption in its iMessage app. (Though some security researchers have pointed to flaws in its implementation that might allow its messages to be decrypted.) Google is experimenting with an end-to-end encryption email plugin for Chrome. And just last week smartphone messaging app Whatsapp integrated TextSecure into its Android software, turning on end-to-end encryption for hundreds of millions of users.

Even end-to-end encryption isn't necessarily impervious to snooping. Rather than try to actually break the encryption, for instance, an eavesdropper may try to impersonate a message recipient so that messages are encrypted to their public key instead of the one the sender intended. After decrypting the message, the snoop can then encrypt it to the recipient's actual public key and send it on again to avoid detection; this is what's known as a man-in-the-middle attack. To combat that tactic, some end-to-end encryption programs generate unique one-time strings of characters based on the two users' public keys. The two people communicating read out that passphrase to each other before starting their conversation. If the characters match, they can be reassured there's no man in the middle.
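
The verification step can be pictured as both parties computing the same short string from the two public keys and reading it aloud. The sketch below is a simplified, hypothetical illustration (the function name and key bytes are invented here); real tools, such as Signal's safety numbers, use their own more careful formats.

    # Minimal sketch: derive a short, comparable fingerprint from two public keys.
    import hashlib

    def session_fingerprint(pubkey_a: bytes, pubkey_b: bytes) -> str:
        # Hash both keys in a fixed order so each side gets the same result.
        digest = hashlib.sha256(b"".join(sorted([pubkey_a, pubkey_b]))).hexdigest()
        # Return a short prefix, grouped so it is easy to read aloud.
        return " ".join(digest[i:i + 4] for i in range(0, 20, 4))

    # Both users run this locally and compare the result out loud; a man in the
    # middle who substituted his own key would produce a different string.
    print(session_fingerprint(b"alice-public-key-bytes", b"bob-public-key-bytes"))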

Of course, there are still two vulnerable points left in even perfect end-to-end encryption systems: the ends. Each user's computer can still be hacked to steal his or her cryptographic key or simply to read the recipient's decrypted messages. Even the most perfectly encrypted communication pipe is only as secure as the mailbox on the other end.

Hacker Lexicon is WIRED's explainer series that seeks to de-mystify the jargon of information security, surveillance and privacy.

Originally posted here:
Hacker Lexicon: What Is End-to-End Encryption? | WIRED

Distrustful U.S. allies force spy agency to back down in …

SAN FRANCISCO (Reuters) - An international group of cryptography experts has forced the U.S. National Security Agency to back down over two data encryption techniques it wanted set as global industry standards, reflecting deep mistrust among close U.S. allies.

In interviews and emails seen by Reuters, academic and industry experts from countries including Germany, Japan and Israel worried that the U.S. electronic spy agency was pushing the new techniques not because they were good encryption tools, but because it knew how to break them.

The NSA has now agreed to drop all but the most powerful versions of the techniques - those least likely to be vulnerable to hacks - to address the concerns.

The dispute, which has played out in a series of closed-door meetings around the world over the past three years and has not been previously reported, turns on whether the International Organization for Standardization (ISO) should approve two NSA data encryption techniques, known as Simon and Speck.

The U.S. delegation to the ISO on encryption issues includes a handful of NSA officials, though it is controlled by an American standards body, the American National Standards Institute (ANSI).

The presence of the NSA officials, and former NSA contractor Edward Snowden's revelations about the agency's penetration of global electronic systems, have made a number of delegates suspicious of the U.S. delegation's motives, according to interviews with a dozen current and former delegates.

A number of them voiced their distrust in emails to one another, seen by Reuters, and in written comments that are part of the process. The suspicions stem largely from internal NSA documents disclosed by Snowden that showed the agency had previously plotted to manipulate standards and promote technology it could penetrate. Budget documents, for example, sought funding to insert vulnerabilities into commercial encryption systems.

More than a dozen of the experts involved in the approval process for Simon and Speck feared that if the NSA was able to crack the encryption techniques, it would gain a back door into coded transmissions, according to the interviews and emails and other documents seen by Reuters.

"I don't trust the designers," Israeli delegate Orr Dunkelman, a computer science professor at the University of Haifa, told Reuters, citing Snowden's papers. "There are quite a lot of people in NSA who think their job is to subvert standards. My job is to secure standards."

The NSA, which does not confirm the authenticity of any Snowden documents, told Reuters it developed the new encryption tools to protect sensitive U.S. government computer and communications equipment without requiring a lot of computer processing power.

NSA officials said via email they want commercial technology companies that sell to the government to use the techniques, and that is more likely to happen when they have been designated a global standard by the ISO.

Asked if it could beat Simon and Speck encryption, the NSA officials said: "We firmly believe they are secure."

ISO, an independent organization with delegations from 162 member countries, sets standards on everything from medical packaging to road signs. Its working groups can spend years picking best practices and technologies for an ISO seal of approval.

As the fight over Simon and Speck played out, the ISO twice voted to delay the multi-stage process of approving them.

In oral and written comments, opponents cited the lack of peer-reviewed publication by the creators, the absence of industry adoption or a clear need for the new ciphers, and the partial success of academics in showing their weaknesses.

Some ISO delegates said much of their skepticism stemmed from the 2000s, when NSA experts invented a component for encryption called Dual Elliptic Curve and got it adopted as a global standard.

ISO's approval of Dual EC was considered a success inside the agency, according to documents passed by Snowden to the founders of the online news site The Intercept, which made them available to Reuters. The documents said the agency guided the Dual EC proposal through four ISO meetings until it emerged as a standard.

In 2007, mathematicians in private industry showed that Dual EC could hide a back door, theoretically enabling the NSA to eavesdrop without detection. After the Snowden leaks, Reuters reported that the U.S. government had paid security company RSA $10 million to include Dual EC in a software development kit that was used by programmers around the world.

The ISO and other standards groups subsequently retracted their endorsements of Dual EC. The NSA declined to discuss it.

In the case of Simon and Speck, the NSA says the formulas are needed for defensive purposes. But the official who led the now-disbanded NSA division responsible for defense, known as the Information Assurance Directorate, said his unit did not develop Simon and Speck.

"There are probably some legitimate questions around whether these ciphers are actually needed," said Curtis Dukes, who retired earlier this year. Similar encryption techniques already exist, and the need for new ones is theoretical, he said.

ANSI, the body that leads the U.S. delegation to the ISO, said it had simply forwarded the NSA proposals to the organization and had not endorsed them.

When the United States first introduced Simon and Speck as a proposed ISO standard in 2014, experts from several countries expressed reservations, said Shinichiro Matsuo, the head of the Japanese encryption delegation.

Some delegates had no objection. Chris Mitchell, a member of the British delegation, said he supported Simon and Speck, noting that no one has succeeded in breaking the algorithms. He acknowledged, though, that after the Dual EC revelations, "trust, particularly for U.S. government participants in standardization, is now non-existent."

At a meeting in Jaipur, India, in October 2015, NSA officials in the American delegation pushed back against critics, questioning their expertise, witnesses said.

A German delegate at the Jaipur talks, Christian Wenzel-Benner, subsequently sent an email seeking support from dozens of cryptographers. He wrote that all seven German experts were very concerned about Simon and Speck.

"How can we expect companies and citizens to use security algorithms from ISO standards if those algorithms come from a source that has compromised security-related ISO standards just a few years ago?" Wenzel-Benner asked.

Such views helped delay Simon and Speck again, delegates said. But the Americans kept pushing, and at an October 2016 meeting in Abu Dhabi, a majority of individual delegates approved the techniques, moving them up to a country-by-country vote.

There, the proposal fell one vote short of the required two-thirds majority.

Finally, at a March 2017 meeting in Hamilton, New Zealand, the Americans distributed a 22-page explanation of the ciphers' design and a summary of attempts to break them - the sort of paper that formed part of what delegates had been seeking since 2014.

Simon and Speck, aimed respectively at hardware and software, each have robust versions and more lightweight variants. The Americans agreed in Hamilton to compromise and dropped the most lightweight versions.

Opponents saw that as a major if partial victory, and it paved the way to compromise. In another nation-by-nation poll last month, the sturdiest versions advanced to the final stage of the approval process, again by a single vote, with Japan, Germany and Israel remaining opposed. A final vote takes place in February.

Reporting by Joseph Menn; Editing by Jonathan Weber and Ross Colvin

Go here to see the original:
Distrustful U.S. allies force spy agency to back down in ...

How to encrypt (almost) anything | PCWorld

It's all too easy to neglect data security, especially for a small business. While bigger organizations have IT departments, service contracts, and enterprise hardware, smaller companies frequently rely on consumer software, which lacks the same sort of always-on security functionality.

But that doesn't mean that your data is unimportant, or that it has to be at risk.

Encryption is a great way to keep valuable data safe, whether you're transmitting it over the Internet, backing it up on a server, or just carrying it through airport security on your laptop. Encrypting your data makes it completely unreadable to anyone but you or its intended recipient. Best of all, much of the software used in offices and on personal computers already has encryption functionality built in. You just need to know where to find it. In this article, I'll show you where and how.

Any discussion about encryption needs to start with a different topic: password strength. Most forms of encryption require you to set a password, which allows you to encrypt the file and to decrypt it later on when you want to view it again. If you use a weak password, a hacker can break the encryption and access the file, defeating the purpose of encryption.

A strong password should be at least 10 characters, though 12 is better. It should include a mix of uppercase and lowercase letters, as well as numbers and symbols. If you find letters-only easier to remember, such a password can still be secure if it's significantly longer; think 20 characters or more.

If you're unsure about whether your password is good enough, run it through Microsoft's free password checker. Never use a password rated less than "Strong."
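
If you would rather have a machine pick the password, the short Python sketch below, using only the standard library, produces a 20-character random password along the lines described above; the length and character set are choices, not requirements.

    # Minimal sketch: generate a strong random password with the standard library.
    import secrets
    import string

    alphabet = string.ascii_letters + string.digits + string.punctuation
    password = "".join(secrets.choice(alphabet) for _ in range(20))  # 20 chars, mixed classes
    print(password)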

You probably already have a login password for Windows on your PC, but that won't actually protect your data if somebody steals your computer or hard drive: the thief can simply plug your drive into another PC and access the data directly. If you have lots of sensitive information on your computer, you want to employ full-disk encryption, which protects all your data even if your hardware falls into the wrong hands.

Microsoft's BitLocker software makes setting up full-disk encryption in Windows incredibly easy, as long as your computer meets the following two criteria:

1. You have the Ultimate or Enterprise version of Windows 7 or Vista, or the Pro or Enterprise version of Windows 8.

2. Your computer has a TPM (Trusted Platform Module) chip.

The easiest way to see if your computer has a TPM chip is simply to attempt to enable BitLocker. Windows will let you know if you don't have one.

To enable BitLocker, go to Control Panel > System and Security > BitLocker Drive Encryption, or do a search for BitLocker in Windows 8. In the BitLocker menu, click Turn on BitLocker next to the drive(s) you wish to encrypt. It's as easy as that.

If your PC doesn't meet the requirements for BitLocker, you can still use TrueCrypt or DiskCryptor for free full-disk encryption.

For full-disk encryption of thumb drives and USB hard drives, you can use BitLocker To Go, which is designed for removable media. You still need a Pro or Enterprise version of Windows, but you don't need a TPM to use BitLocker To Go.

All you have to do is plug in the device you want to encrypt, and then once again go to the BitLocker menu. At the bottom of the menu, you'll see the BitLocker To Go section, where you can click Turn on BitLocker next to the device.

Sometimes you want to encrypt your outgoing and incoming Internet traffic. If you're on an unsecured Wi-Fi network (at an airport, for instance), a hacker can intercept the data traveling to and from your laptop, which might contain sensitive information. To make that data useless to eavesdroppers, you can encrypt it using a VPN.

A virtual private network creates a secure tunnel to a trusted third-party server. Data sent through this tunnel (either to or from your computer) is encrypted, so it's safe even if intercepted. You can find Web-based VPNs that charge a small monthly fee but provide very easy access, or you can set up your own personal or business VPN.

The process of selecting or setting up a VPN is a little too long to describe here, so see our article on VPN for beginners and experts alike.

If you or other people in your organization use Dropbox or SugarSync, you'll be glad to know that those popular cloud storage services already encrypt your data, protecting it in transit and while it sits on their servers. Unfortunately, those same services also hold the decryption keys, which means that they can decrypt your files if, for instance, law enforcement directs them to do so.

If you have any really sensitive files in your cloud storage, use a second layer of encryption to keep them safe from prying eyes. The most straightforward way to do this is to use TrueCrypt to create an encrypted volume inside of your Dropbox. (For a complete guide to encrypting anything with TrueCrypt, see the end of this article.)

If you want to be able to access the data from other computers, consider putting a portable version of TrueCrypt in your Dropbox, as well. To do so, run the TrueCrypt installer; during the installation, choose the Extract option, and choose to put the extracted files in your Dropbox or other cloud storage.
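
If a TrueCrypt volume feels heavyweight, the same "second layer" idea can be approximated by encrypting individual files before they land in your synced folder. The sketch below uses Fernet from the Python cryptography package rather than TrueCrypt; the file names and the "Dropbox" folder path are placeholders for illustration only.

    # Minimal sketch: encrypt a file client-side before it syncs to cloud storage.
    from pathlib import Path
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()      # store this key somewhere outside the synced folder
    fernet = Fernet(key)

    cloud_folder = Path("Dropbox")   # placeholder for wherever your synced folder lives
    cloud_folder.mkdir(exist_ok=True)

    plaintext = Path("tax-return.pdf").read_bytes()   # placeholder local file
    (cloud_folder / "tax-return.pdf.enc").write_bytes(fernet.encrypt(plaintext))

    # Later, on any machine that has the key, recover the original bytes:
    restored = fernet.decrypt((cloud_folder / "tax-return.pdf.enc").read_bytes())

Because only the ciphertext ever reaches the provider, the service (or anyone who compels it) cannot read the file without your key.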


Excerpt from:
How to encrypt (almost) anything | PCWorld

Global Email Encryption Market – Size, Trend, Share …

1. Executive summary
   1.1. Key findings
   1.2. Market attractiveness and trend analysis
   1.3. Competitive landscape and recent industry development analysis
2. Introduction
   2.1. Report description
   2.2. Scope and definitions
   2.3. Research methodology
3. Market landscape
   3.1. Growth drivers (3.1.1. Impact analysis)
   3.2. Restraints and challenges (3.2.1. Impact analysis)
   3.3. Porter's analysis (3.3.1. Bargaining power of buyers; 3.3.2. Bargaining power of suppliers; 3.3.3. Threat of substitutes; 3.3.4. Industry rivalry; 3.3.5. Threat of new entrants)
   3.4. Global email encryption market shares analysis, 2014-2025 (3.4.1. shares by deployment, 2014-2025; 3.4.2. shares by end user, 2014-2025; 3.4.3. shares by geography, 2014-2025)
4. Global email encryption market by deployment
   4.1. On-premise (4.1.1. Historical market size by region, 2014-2016; 4.1.2. Market forecast by region, 2017-2025)
   4.2. Cloud (4.2.1. Historical market size by region, 2014-2016; 4.2.2. Market forecast by region, 2017-2025)
5. Global email encryption market by end user
   5.1. BFSI (5.1.1. Historical market size by region, 2014-2016; 5.1.2. Market forecast by region, 2017-2025)
   5.2. Healthcare (5.2.1. Historical market size by region, 2014-2016; 5.2.2. Market forecast by region, 2017-2025)
   5.3. Government (5.3.1. Historical market size by region, 2014-2016; 5.3.2. Market forecast by region, 2017-2025)
   5.4. Retail (5.4.1. Historical market size by region, 2014-2016; 5.4.2. Market forecast by region, 2017-2025)
   5.5. IT & telecom (5.5.1. Historical market size by region, 2014-2016; 5.5.2. Market forecast by region, 2017-2025)
   5.6. Education (5.6.1. Historical market size by region, 2014-2016; 5.6.2. Market forecast by region, 2017-2025)
   5.7. Manufacturing (5.7.1. Historical market size by region, 2014-2016; 5.7.2. Market forecast by region, 2017-2025)
   5.8. Others (5.8.1. Historical market size by region, 2014-2016; 5.8.2. Market forecast by region, 2017-2025)
6. Global email encryption market by geography
   6.1. North America
      6.1.1. U.S. (6.1.1.1. Historical market size, 2014-2016; 6.1.1.2. Market forecast, 2017-2025)
      6.1.2. Canada (6.1.2.1. Historical market size, 2014-2016; 6.1.2.2. Market forecast, 2017-2025)
      6.1.3. Mexico (6.1.3.1. Historical market size, 2014-2016; 6.1.3.2. Market forecast, 2017-2025)
   6.2. Europe
      6.2.1. UK (6.2.1.1. Historical market size, 2014-2016; 6.2.1.2. Market forecast, 2017-2025)
      6.2.2. Germany (6.2.2.1. Historical market size, 2014-2016; 6.2.2.2. Market forecast, 2017-2025)
      6.2.3. France (6.2.3.1. Historical market size, 2014-2016; 6.2.3.2. Market forecast, 2017-2025)
      6.2.4. Spain (6.2.4.1. Historical market size, 2014-2016; 6.2.4.2. Market forecast, 2017-2025)
      6.2.5. Italy (6.2.5.1. Historical market size, 2014-2016; 6.2.5.2. Market forecast, 2017-2025)
      6.2.6. Rest of Europe (6.2.6.1. Historical market size, 2014-2016; 6.2.6.2. Market forecast, 2017-2025)
   6.3. Asia-Pacific
      6.3.1. China (6.3.1.1. Historical market size, 2014-2016; 6.3.1.2. Market forecast, 2017-2025)
      6.3.2. Japan (6.3.2.1. Historical market size, 2014-2016; 6.3.2.2. Market forecast, 2017-2025)
      6.3.3. India (6.3.3.1. Historical market size, 2014-2016; 6.3.3.2. Market forecast, 2017-2025)
      6.3.4. Australia (6.3.4.1. Historical market size, 2014-2016; 6.3.4.2. Market forecast, 2017-2025)
      6.3.5. South Korea (6.3.5.1. Historical market size, 2014-2016; 6.3.5.2. Market forecast, 2017-2025)
      6.3.6. Rest of Asia-Pacific (6.3.6.1. Historical market size, 2014-2016; 6.3.6.2. Market forecast, 2017-2025)
   6.4. LAMEA
      6.4.1. Brazil (6.4.1.1. Historical market size, 2014-2016; 6.4.1.2. Market forecast, 2017-2025)
      6.4.2. Saudi Arabia (6.4.2.1. Historical market size, 2014-2016; 6.4.2.2. Market forecast, 2017-2025)
      6.4.3. South Africa (6.4.3.1. Historical market size, 2014-2016; 6.4.3.2. Market forecast, 2017-2025)
      6.4.4. Rest of LAMEA (6.4.4.1. Historical market size, 2014-2016; 6.4.4.2. Market forecast, 2017-2025)
7. Company profiles
   7.1. HP, Inc. (7.1.1. Overview; 7.1.2. Financials and business segments; 7.1.3. Recent developments)
   7.2. Symantec (7.2.1. Overview; 7.2.2. Financials and business segments; 7.2.3. Recent developments)
   7.3. McAfee (7.3.1. Overview; 7.3.2. Financials and business segments; 7.3.3. Recent developments)
   7.4. Sophos (7.4.1. Overview; 7.4.2. Financials and business segments; 7.4.3. Recent developments)
   7.5. TrendMicro (7.5.1. Overview; 7.5.2. Financials and business segments; 7.5.3. Recent developments)
   7.6. Cisco (7.6.1. Overview; 7.6.2. Financials and business segments; 7.6.3. Recent developments)
   7.7. Proofpoint (7.7.1. Overview; 7.7.2. Financials and business segments; 7.7.3. Recent developments)
   7.8. Entrust (7.8.1. Overview; 7.8.2. Financials and business segments; 7.8.3. Recent developments)
   7.9. Zix Corporation (7.9.1. Overview; 7.9.2. Financials and business segments; 7.9.3. Recent developments)
   7.10. Microsoft Corporation (7.10.1. Overview; 7.10.2. Financials and business segments; 7.10.3. Recent developments)

List of Tables

Table 1. Email Encryption Market Share, by Deployment, 2014-2025
Table 2. Email Encryption Market Share, by End-User, 2014-2025
Table 3. Email Encryption Market Share, by Region, 2014-2025
Table 4. Email Encryption Market Value for On-Premise, by Region, 2017-2025, $million
Table 5. Email Encryption Market Value for Cloud, by Region, 2017-2025, $million
Table 6. Email Encryption Market Value for BFSI, by Region, 2017-2025, $million
Table 7. Email Encryption Market Value for Healthcare, by Region, 2017-2025, $million
Table 8. Email Encryption Market Value for Government, by Region, 2017-2025, $million
Table 9. Email Encryption Market Value for Retail, by Region, 2017-2025, $million
Table 10. Email Encryption Market Value for IT & Telecom, by Region, 2017-2025, $million
Table 11. Email Encryption Market Value for Education, by Region, 2017-2025, $million
Table 12. Email Encryption Market Value for Manufacturing, by Region, 2017-2025, $million
Table 13. Email Encryption Market Value for Others, by Region, 2017-2025, $million
Table 14. Email Encryption Market Value for North America, by Country, 2014-2025, $million
Table 15. Email Encryption Market Value for North America, by Deployment, 2014-2025, $million
Table 16. Email Encryption Market Value for North America, by End-User, 2014-2025, $million
Table 17. Email Encryption Market Value for Europe, by Country, 2014-2025, $million
Table 18. Email Encryption Market Value for Europe, by Deployment, 2014-2025, $million
Table 19. Email Encryption Market Value for Europe, by End-User, 2014-2025, $million
Table 20. Email Encryption Market Value for Asia-Pacific, by Country, 2014-2025, $million
Table 21. Email Encryption Market Value for Asia-Pacific, by Deployment, 2014-2025, $million
Table 22. Email Encryption Market Value for Asia-Pacific, by End-User, 2014-2025, $million
Table 23. Email Encryption Market Value for LAMEA, by Country, 2014-2025, $million
Table 24. Email Encryption Market Value for LAMEA, by Deployment, 2014-2025, $million
Table 25. Email Encryption Market Value for LAMEA, by End-User, 2014-2025, $million
Table 26. HP, Inc. - Company Snapshot
Table 27. Symantec - Company Snapshot
Table 28. McAfee - Company Snapshot
Table 29. Sophos - Company Snapshot
Table 30. TrendMicro - Company Snapshot
Table 31. Cisco - Company Snapshot
Table 32. Proofpoint - Company Snapshot
Table 33. Entrust - Company Snapshot
Table 34. Zix Corporation - Company Snapshot
Table 35. Microsoft Corporation - Company Snapshot

List of Figures

Figure 1. Email Encryption: On-Premise Market Value, 2014-2016, $million
Figure 2. Email Encryption: Cloud Market Value, 2014-2016, $million
Figure 3. Email Encryption: BFSI Market Value, 2014-2016, $million
Figure 4. Email Encryption: Healthcare Market Value, 2014-2016, $million
Figure 5. Email Encryption: Government Market Value, 2014-2016, $million
Figure 6. Email Encryption: Retail Market Value, 2014-2016, $million
Figure 7. Email Encryption: IT & Telecom Market Value, 2014-2016, $million
Figure 8. Email Encryption: Education Market Value, 2014-2016, $million
Figure 9. Email Encryption: Manufacturing Market Value, 2014-2016, $million
Figure 10. Email Encryption: Others Market Value, 2014-2016, $million
Figure 11. Email Encryption: U.S. Market Value, 2014-2016, $million
Figure 12. Email Encryption: U.S. Market Value, 2017-2025, $million
Figure 13. Email Encryption: Canada Market Value, 2014-2016, $million
Figure 14. Email Encryption: Canada Market Value, 2017-2025, $million
Figure 15. Email Encryption: Mexico Market Value, 2014-2016, $million
Figure 16. Email Encryption: Mexico Market Value, 2017-2025, $million
Figure 17. Email Encryption: UK Market Value, 2014-2016, $million
Figure 18. Email Encryption: UK Market Value, 2017-2025, $million
Figure 19. Email Encryption: Germany Market Value, 2014-2016, $million
Figure 20. Email Encryption: Germany Market Value, 2017-2025, $million
Figure 21. Email Encryption: France Market Value, 2014-2016, $million
Figure 22. Email Encryption: France Market Value, 2017-2025, $million
Figure 23. Email Encryption: Spain Market Value, 2014-2016, $million
Figure 24. Email Encryption: Spain Market Value, 2017-2025, $million
Figure 25. Email Encryption: Italy Market Value, 2014-2016, $million
Figure 26. Email Encryption: Italy Market Value, 2017-2025, $million
Figure 27. Email Encryption: Rest of Europe Market Value, 2014-2016, $million
Figure 28. Email Encryption: Rest of Europe Market Value, 2017-2025, $million
Figure 29. Email Encryption: China Market Value, 2014-2016, $million
Figure 30. Email Encryption: China Market Value, 2017-2025, $million
Figure 31. Email Encryption: Japan Market Value, 2014-2016, $million
Figure 32. Email Encryption: Japan Market Value, 2017-2025, $million
Figure 33. Email Encryption: India Market Value, 2014-2016, $million
Figure 34. Email Encryption: India Market Value, 2017-2025, $million
Figure 35. Email Encryption: Australia Market Value, 2014-2016, $million
Figure 36. Email Encryption: Australia Market Value, 2017-2025, $million
Figure 37. Email Encryption: South Korea Market Value, 2014-2016, $million
Figure 38. Email Encryption: South Korea Market Value, 2017-2025, $million
Figure 39. Email Encryption: Rest of Asia-Pacific Market Value, 2014-2016, $million
Figure 40. Email Encryption: Rest of Asia-Pacific Market Value, 2017-2025, $million
Figure 41. Email Encryption: Brazil Market Value, 2014-2016, $million
Figure 42. Email Encryption: Brazil Market Value, 2017-2025, $million
Figure 43. Email Encryption: Saudi Arabia Market Value, 2014-2016, $million
Figure 44. Email Encryption: Saudi Arabia Market Value, 2017-2025, $million
Figure 45. Email Encryption: South Africa Market Value, 2014-2016, $million
Figure 46. Email Encryption: South Africa Market Value, 2017-2025, $million
Figure 47. Email Encryption: Rest of LAMEA Market Value, 2014-2016, $million
Figure 48. Email Encryption: Rest of LAMEA Market Value, 2017-2025, $million

Read the original post:
Global Email Encryption Market - Size, Trend, Share ...