SSD Survey Highlights Misconceptions About Encryption & Performance

COLORADO SPRINGS, Colo. -- A survey by the Storage Networking Industry Association (SNIA) launched last fall has revealed some interesting user perceptions about solid state drives (SSDs), including users' endurance expectations and their lack of interest in using built-in encryption features.

Paul Wassenberg, chair of SNIA's Solid State Storage Initiative (SSSI), said the results of the survey will be used to guide the group's education activities around the capabilities and features of SSDs. The call for input began last fall, and initial results, based on roughly 75% of the eventual participant total, were presented at the Storage Visions Conference earlier this year.

The survey grouped respondents into four market segments: mobile, desktop, server, and storage subsystem. Within each segment, SSD uses were broken down by application as well as by interface. Overall, the highest use of SSDs was in storage subsystems, at approximately 33%, with servers at roughly 27% and mobile at around 21%. Desktop use of SSDs was about 8%. The majority, approximately 65%, were using the 2.5-inch form factor, 19% were using PCIe cards, and less than 5% were using mSATA. Capacity-wise, about 33% of respondents were using SSDs greater than 500 GB, followed closely by about 31% using between 301 and 500 GB.

The SSSI survey focused on five key attributes of SSDs -- performance, power, endurance, data integrity, and data encryption. While the ratings varied depending on the segment and uses, across all segments performance was fairly important, with IOPS and latency favored over throughput. Power was fairly important, but power management received only middling ratings.

Wassenberg said endurance was the most important attribute of all for users, who consistently ranked it above everything else. Data integrity and encryption were rated as fairly important, but the latter less so than anticipated. Wassenberg said this was notable, since comments from the survey revealed the outdated idea that encryption reduces performance. That isn't true, he said, because recent generations of self-encrypting drives (SEDs) do not measurably impact SSD performance.

Key management is also a concern in larger systems with multiple drives, the survey found. Wassenberg said mobile devices, such as notebook PCs, are particularly vulnerable to theft, and encryption would prevent the data from being accessed. Many SSDs shipping today have data protection and encryption features built in, but OEMs often do not switch those capabilities on.

Samsung, for example, recently added new security features to its 840 EVO SSD, a self-encrypting drive, making it compatible with the professional security software used by enterprise organizations. In addition, third-party vendors such as WinMagic and Wave Systems offer tools that make SEDs easier for IT departments to deploy and manage without degrading SSD performance or complicating the user experience.

Wassenberg said educating users on encryption technologies for SSDs and their benefits will be a focus for the SSSI going forward. Another area of education will be performance, he said, and the importance of preconditioning drives so that users have realistic expectations of how a drive performs over time. An SSD's performance is highest fresh out of the box, but it drops after sustained writes and then settles into a more realistic indication of how the drive will likely perform over time.

The SSSI offers test specifications and software that let users run workload tests and follow an industry-standard methodology for preconditioning and steady-state determination on SSDs.
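The idea behind steady-state determination can be illustrated in code. The sketch below is a simplified, hypothetical version of PTS-style criteria -- it assumes a drive is at steady state when, over a measurement window of recent results, every data point stays within 20% of the window average and the best-fit trend line's total excursion stays within 10% of it; the function name and sample IOPS figures are illustrative, not from the specification.

```python
def is_steady_state(window, data_tol=0.20, slope_tol=0.10):
    """Check a measurement window (e.g., 5 rounds of IOPS results)
    against simplified PTS-style steady-state criteria."""
    avg = sum(window) / len(window)
    # Data excursion: every point must lie within data_tol of the average.
    if max(abs(x - avg) for x in window) > data_tol * avg:
        return False
    # Slope excursion: least-squares trend across the window must stay flat.
    n = len(window)
    x_mean = (n - 1) / 2
    slope = sum((x - x_mean) * (y - avg) for x, y in enumerate(window)) / \
            sum((x - x_mean) ** 2 for x in range(n))
    return abs(slope * (n - 1)) <= slope_tol * avg

# Fresh-out-of-box numbers still trending down: not yet steady.
print(is_steady_state([90000, 70000, 55000, 47000, 43000]))  # False
# Post-preconditioning numbers hovering around 40K IOPS: steady.
print(is_steady_state([40500, 39800, 40200, 39900, 40100]))  # True
```

In practice this check runs repeatedly while a preconditioning workload hammers the drive; measurement only begins once the window passes.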

For now, the SSSI survey is going to be kept open for an indefinite period to gather more data, and users are welcome to participate in a dedicated LinkedIn group.


5 Ways to Get Open Source Software Support

One great irony of proprietary software is that you pay to have less freedom and flexibility than you would get if you downloaded free open source software.

That's particularly true when you consider support. If you buy a commercial software package, you're usually able to get different levels of support from the software vendor. This may be included in the license fee, or you may have to pay extra for it.

In almost all circumstances, though, you're restricted to whatever the vendor offers. If you don't like what's offered, that's just too bad.

Free Software, Free Market Dynamic

The situation is quite different with open source software, as the source code is freely available for anyone to examine and modify. Support may not be available from a vendor in the way that it is with proprietary software - although vendors such as Red Hat do provide support as part of their subscription offering - but that certainly doesn't mean it isn't available at all.

Far from it. "The way to think about it is that support is unbundled (from the software) but widely available," says Simon Phipps, president of the Open Source Initiative and founder of open source management consultancy Meshed Insights.

If you're an Oracle customer, for example, you're effectively locked in to Oracle support. If you use Apache software, on the other hand, a number of support suppliers compete on quality and price.


It's hard, then, to avoid the fact that commercial software companies that restrict access to their source code have a monopoly on the provision of support. With open source software, the polar opposite is true. "With open source, there's a free market dynamic to support," Phipps says, "and prices are controlled by the market."

This is a theme taken up by Simon Bowring, a director at open source support provider Transitiv Technologies. "We have customers who were previously using proprietary software and they were locked in. If they needed new features they had no option but to wait for the vendor to write them," he says. "With open source software, we can write code for our customers very quickly, and contribute it back to the community, if the customer agrees."


Heartbleed: the beginning of the end for open source?

OpenSSL is an open source project, meaning its original source code is freely available for developers to use and modify. This brings plenty of benefits, chiefly a wider pool of talent creating and enhancing code that is available for free, but also drawbacks: while many may be involved in developing the code, very few scrutinise it for flaws.

There was a common consensus that, because the OpenSSL code had been reviewed so many times, it must be secure. In reality, however, it was during one of these review cycles that the Heartbleed bug was introduced.

This is not unique to open source code; the same could have occurred in a commercial development environment, as even the best developers cannot spot every issue lying in their code.

However, the inherent problem with open source projects is that there are thousands of passionate developers but a real lack of passionate testers. As the American writer Kurt Vonnegut put it: "Another flaw in the human character is that everybody wants to build and nobody wants to do maintenance."

So how do we prevent this in the future? The answer is not necessarily to stop using open source code, but to recognise that measuring the quality of a program's code is as important as developing the program itself.

Open source software is often regarded as more secure than closed source because, in theory, the more people who contribute to and edit the software, the higher its quality. In reality, the security of open source projects will come not just from a wealth of contributors, but from an unbiased way to measure the quality of the code being used across so many of our critical applications.

Some programs are so critical to the world that their quality and security is paramount, and more needs to be done to ensure that they not only function correctly, but the code they are based on is well written and free of flaws.

Google, Facebook and Amazon all rely on open source projects, like OpenSSL, for their success, and need to take responsibility for ensuring that any code they use is checked and measured. Those who benefit most from the gift of the web should also serve as its guardians, making sure it can be used safely for mutual benefit.

Damien Choizit is solutions engineer at software analysis and measurement company CAST.


Straight from the source: Dell CTO details cloud roadmap | #RHSummit

The open source revolution is spilling over to the cloud as more and more incumbent data center vendors rally behind OpenStack in response to Amazon's growing enterprise gains. Sam Greenblatt, vice president of technology and architecture and CTO of Dell's core Enterprise Solutions Group, returned to theCUBE at the recently concluded Red Hat Summit to give us an update on his company's role in driving this industry-wide shift.

The conference saw the announcement of several milestones in a landmark partnership between Red Hat and Dell, originally signed last December, centering on the joint development of hybrid cloud products based on open source software. These so-called co-engineered solutions serve different purposes but utilize the same underlying technologies, combining Dell hardware with the Linux distributor's flagship platform, its OpenShift platform-as-a-service stack, OpenStack, and the Docker container engine. The goal of the collaboration is to abstract away infrastructure and allow customers to focus entirely on application logic, an objective that Dell has also been pursuing independently.

Greenblatt tells theCUBE hosts John Furrier and Stu Miniman that his company is currently working to integrate the Swift object store and Cinder block storage components of OpenStack throughout its EqualLogic and Compellent portfolios, as well as to support the Ceph unified storage backend. The effort aims to enable the high level of scalability required by Dell's biggest customers, he says, a large number of whom are in the financial services sector.

On the network side, the company sells top-of-rack switches loaded with a software-defined networking platform from an emerging startup called Cumulus, and counts itself as a bronze sponsor of the OpenDaylight project, a collaborative effort led by The Linux Foundation to develop a set of common standards for SDN. Dell also participates in Juniper's OpenContrail initiative, Greenblatt points out, although to a lesser extent.

Lastly, the vendor is developing tools that simplify the management of OpenStack environments. Its most important contribution is arguably the Crowbar deployment and operations tool, but it is far from the only one. Dell has released several reference architectures and a rules engine for the project, Greenblatt details, and has also published a number of enhancements to the Nova compute component and the complementary open source Puppet Razor hardware provisioning tool.


While Dell is taking the open source road to the software-defined data center, some of its rivals are choosing to go it alone. One of the biggest threats facing the company is EMC's ViPR, a storage abstraction platform that Greenblatt sees as that vendor's attempt to unify its six primary product lines under a single management layer. It has potential, he admits, but he insists that his company does it better.

"We believe that how you should do it is the way we're doing it with EqualLogic and Compellent," the executive elaborates. "We're merging the stacks into what we call next-gen, and we're gonna keep alive both products but working on a single stack. ViPR is an abstraction layer above it, and what we find with abstraction layers is, when you're dealing with storage, you gotta build a software hypervisor that's able to work with the hardware much more efficiently."

Although certainly a critical component, storage abstraction is but one aspect of the software-defined vision. Greenblatt believes that Linux containers, which provide a lightweight alternative to traditional virtualization, are emerging as an equally important piece of the puzzle.

"We think that containers are going to be the next form of virtualization. Will it replace it? Absolutely not. You're not gonna see SAP virtualize into a container. But what you will see is applications that you want to put into a PC, into a mobile device, into the Internet of Things," he says. Bringing the technology into the enterprise mainstream is one of the primary goals of Dell's partnership with Red Hat.


Open source pitfalls — and how to avoid them

April 21, 2014, 12:26 PM -- It's hard to imagine a company these days that isn't using open source software somewhere, whether it's Linux running a company's print and web servers, the Firefox browser on user desktops, or the Android operating system on mobile devices.

In fact, there are now more than a million different open source projects, according to Black Duck Software, a maker of open source management tools and owner of the Ohloh open source software directory. And open source continues to grow: according to an SAP research report, the number of open source projects roughly doubles every 14 months.

But not all open source projects are created equal. According to Ohloh, of the 100,375 projects for which activity information is available, around 80 percent were listed as having low or very low activity, or as being completely inactive.



Crypto()Currency – CryptoCurrency.org

AltCoins

What is Litecoin?

Litecoin is a peer-to-peer Internet currency that enables instant payments to anyone in the world. It differs from its parent Bitcoin in that it can be efficiently mined with consumer-grade hardware. Litecoin provides faster confirmations (targeted at one every 2.5 minutes on average) and uses memory-hard, scrypt-based mining to target the CPUs and GPUs most people already have. The Litecoin network is scheduled to produce four times as many currency units as Bitcoin.

One of the aims of Litecoin was to provide a mining algorithm that could run at the same time, on the same hardware, used to mine bitcoins. Even with the rise of specialized ASICs for Bitcoin, Litecoin continues to satisfy this goal. It is unlikely that FPGA or ASIC mining will take over Litecoin until the currency is widely used.
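The "memory-hard" hashing mentioned above can be demonstrated with Python's standard hashlib, which exposes scrypt directly. The sketch below uses the parameters commonly cited for Litecoin's proof of work (N=1024, r=1, p=1, 32-byte output); the all-zero placeholder header is hypothetical, and real mining would hash an actual 80-byte block header and compare the digest against the network's difficulty target.

```python
import hashlib

# Litecoin's proof of work applies scrypt with N=1024, r=1, p=1 to the
# 80-byte block header, which also serves as the salt, producing a
# 32-byte digest. The header below is a placeholder, not real block data.
header = b"\x00" * 80

digest = hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)
print(digest.hex())  # deterministic 64-character hex string
```

The N=1024, r=1 setting needs roughly 128 KB of working memory per hash, which is what kept early Litecoin mining practical on ordinary CPUs and GPUs while resisting the first generation of Bitcoin ASICs.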

Namecoin is a peer-to-peer, generic name/value datastore based on Bitcoin technology (a decentralized cryptocurrency). It allows you to register names and attach data to them without a central authority.

There are plenty of possible use cases. Read more about Namecoin.


Devcoin is the coin where 90% of the generation goes to open source developers and 10% to the miners.


Cryptocurrency | Ground Zero with Clyde Lewis

There has been an argument made that we have had plenty of distractions keeping us from paying attention to the economy. From the situation in Europe to the missing plane in East Asia, a lot of people are watching the crisis of the week, knowing full well that when all of the dust clears and the wreckage is found and put back together, we will still have to deal with a devolving economy and with what it may become in the near future.

Arguably, there seems to be a positive outcome of the shaky economy and that is the birth of cryptocurrency.

Bitcoin at the moment has captured the imagination of banks and investors. Bitcoin has the world's largest virtual currency market capitalization, at over $8 billion. Apart from bitcoin, there are at least 100 other cryptocurrencies, ranging from Ripple ($1.4 billion) and Litecoin ($453 million) at the high end to Germany's Deutsche eMark ($106,000) and Grumpycoin ($88,000) at the low end. Even criminals have begun to diversify into homemade cryptocurrencies.

There is also a cryptocurrency called Mazacoin that was created by Payu Harris in hopes that the Lakota can use it in order to have greater independence.

The question is: does the boom in cryptocurrency indicate that people want to become independent of the almighty dollar?

Many people agree that, despite the successes and failures we are seeing with cryptocurrency, the dollar will eventually go digital.

InformationWeek reports that, according to former Central Intelligence Agency CTO Gus Hunt, the dollar could well become a cryptocurrency in the future. "Government's going to learn from Bitcoin, and all the official government currencies are going to become crypto currencies themselves," he said during a recent panel discussion in San Francisco hosted by information security firm eSentire, on whose board of advisers he sits.

Others are saying that bitcoin will do for digital currency what Napster did for downloading music.

It is creating the format for which all digital dollars will be distributed.


Captain America and NSA spying

What is patriotism? Is it doing what the government says, or is it doing what you believe is true to the Constitution and American values? "Captain America: The Winter Soldier," currently the No. 1 movie in the country, comes down on the latter side, wrapping its message in a red, white, and blue action-packed candy shell.

Unfrozen WWII super-soldier Steve Rogers, a.k.a. Captain America, works for SHIELD, which is basically the CIA plus Navy SEAL Team Six plus the NSA, times a thousand. At first, the movie seems to be only a commentary on targeted killing. And it is that, featuring a set of enormous SHIELD military drones called "helicarriers," which prompt an exchange between Cap and SHIELD boss Nick Fury.

"We're gonna neutralize a lot of threats before they even happen."

"I thought the punishment usually came after the crime," Cap replies, recalling the Obama administration's elastic definition of the word "imminent" in its legal justification for putting people on the real-world kill list.

"SHIELD takes the world as it is, not as we'd like it to be," Fury says, echoing Dick Cheney's defense of over-the-line counterterrorism tactics.

"This isn't freedom. This is fear," Cap declares.

What later becomes apparent is that the movie is also about dragnet surveillance, revealed in the way that targets are selected for death under the secret helicarrier program, Project Insight. The methodology is explained in the confession of one of the bad guys:

"The 21st century is a digital book. Your bank records, medical histories, voting patterns, emails, phone calls, your damn SAT scores! [The] algorithm evaluates people's past to predict their future. Then the Insight helicarriers scratch people off the list a few million at a time."

It turns out that SHIELD has been infiltrated by a group called Hydra, which was founded by Nazis during World War II. "Hydra was founded on the belief that humanity could not be trusted with its own freedom," explains a Hydra leader to Captain America in the classic movie-villain move of explaining everything to the hero because the hero is doomed (but then of course the hero ends up somehow narrowly escaping certain death).

"What we did not realize is that if you try to take that freedom, they resist. The war taught us much. Humanity needed to surrender its freedom willingly."
