5 Ways to Get Open Source Software Support

One great irony of proprietary software is that you pay to have less freedom and flexibility than you would get if you downloaded free open source software.

That's particularly true when you consider support. If you buy a commercial software package, you're usually able to get different levels of support from the software vendor. This may be included in the license fee, or you may have to pay extra for it.

In almost all circumstances, though, you're restricted to whatever the vendor offers. If you don't like what's offered, that's just too bad.

Free Software, Free Market Dynamic

The situation is quite different with open source software, as the source code is freely available for anyone to examine and modify. Support may not be available from a vendor in the way that it is with proprietary software - although vendors such as Red Hat do provide support as part of their subscription offering - but that certainly doesn't mean it isn't available at all.

Far from it. "The way to think about it is that support is unbundled (from the software) but widely available," says Simon Phipps, president of the Open Source Initiative and founder of open source management consultancy Meshed Insights.

If you're an Oracle customer, for example, you're effectively locked in to Oracle support. If you use Apache software, on the other hand, a number of support suppliers compete on quality and price.


It's hard, then, to avoid the fact that commercial software companies that restrict access to their source code have a monopoly on the provision of support. With open source software the polar opposite is true. "With open source, there's a free market dynamic to support," Phipps says, "and prices are controlled by the market."

This is a theme taken up by Simon Bowring, a director at open source support provider Transitiv Technologies. "We have customers who were previously using proprietary software and they were locked in. If they needed new features they had no option but to wait for the vendor to write them," he says. "With open source software, we can write code for our customers very quickly, and contribute it back to the community, if the customer agrees."

Read the rest here:
5 Ways to Get Open Source Software Support

Heartbleed: the beginning of the end for open source?

OpenSSL is an open source project, meaning its original source code is freely available for developers to use and modify. This brings plenty of benefits: a wider pool of talent creating and enhancing code that is available for free. But it also has negatives: while many might be involved in the development of the code, very few are scrutinising it for flaws.

There was common consensus that, because the OpenSSL code had been reviewed so many times, it must be secure. In reality, however, it was during one of these review cycles that the Heartbleed bug was introduced.

This is not unique to open source code, the same could have occurred in a commercial development environment, as even the best developers cannot spot all the issues that lie in their code.

However, the inherent problem with open source projects is that there are thousands of passionate developers but a real lack of passionate testers. As American writer Kurt Vonnegut put it: "Another flaw in the human character is that everybody wants to build and nobody wants to do maintenance."

So how do we prevent this in the future? The answer is not necessarily to stop using open source code, but instead to realise that measuring the code quality of a program is as important as the development of the program itself.

The received wisdom is that open source software is more secure than closed source because, in theory, the more people who contribute to and edit the software, the higher the quality. In reality, the security of open source projects will come not just from a wealth of contributors, but from offering an unbiased way to measure the quality of the code being used across so many of our critical applications.

Some programs are so critical to the world that their quality and security is paramount, and more needs to be done to ensure that they not only function correctly, but the code they are based on is well written and free of flaws.

Google, Facebook and Amazon all rely on open source projects, like OpenSSL, for their success and need to take responsibility to ensure that any code they use is checked and measured. Those who benefit most from the gift of the web should also serve as guardians, making sure it can be used safely for mutual benefit.

Damien Choizit is solutions engineer at software analysis and measurement company CAST.

Continued here:
Heartbleed: the beginning of the end for open source?

Straight from the source: Dell CTO details cloud roadmap | #RHSummit

The open source revolution is spilling over to the cloud as more and more incumbent data center vendors rally behind OpenStack in response to Amazon's growing enterprise gains. Sam Greenblatt, the vice president of technology and architecture and CTO for Dell's core Enterprise Solutions Group, returned to theCUBE at the recently concluded Red Hat Summit to give us an update on his company's role in driving this industry-wide shift.

The conference saw the announcement of several milestones in a landmark partnership between Red Hat and Dell, originally signed last December, centering on the joint development of hybrid cloud products based on open source software. These so-called co-engineered solutions serve different purposes but utilize the same underlying technologies, combining Dell hardware with the Linux distributor's flagship platform, its OpenShift platform-as-a-service stack, OpenStack and the Docker container engine. The goal of the collaboration is to abstract away infrastructure and allow customers to focus entirely on application logic, an objective that Dell has also been pursuing independently.

Greenblatt tells theCUBE hosts John Furrier and Stu Miniman that his company is currently working to provide integration for the Swift object and Cinder block store components of OpenStack throughout its EqualLogic and Compellent portfolios, as well as support for the Ceph unified storage backend. The effort aims to enable the high level of scalability required by Dell's biggest customers, he says, a large number of whom are in the financial services sector.

On the network side, the company sells top-of-rack switches loaded with a software-defined networking platform from an emerging startup called Cumulus and counts itself as a bronze sponsor of the OpenDaylight project, a collaborative effort led by The Linux Foundation to develop a set of common standards for SDN. Dell also participates in Juniper's OpenContrail initiative, Greenblatt points out, although to a lesser extent.

Lastly, the vendor is developing tools that simplify the management of OpenStack environments. Its most important contribution is arguably the Crowbar deployment and operations tool, but it's far from being the only one. Dell has released several reference architectures and a rules engine for the project, Greenblatt details, and also published a number of enhancements to the Nova compute component and the complementary open source Puppet Razor hardware provisioning tool.


While Dell is taking the open source road to the software-defined data center, some of its rivals are choosing to go it alone. One of the biggest threats facing the company is EMC's ViPR, a storage abstraction platform that Greenblatt sees as the vendor's attempt to unify its six primary product lines under a single management layer. It has potential, he admits, but insists that his company does it better.

"We believe that how you should do it is the way we're doing it with EqualLogic and Compellent," the executive elaborates. "We're merging the stacks into what we call next-gen, and we're gonna keep alive both products but working on a single stack. ViPR is an abstraction layer above it and what we find with abstraction layers is, when you're dealing with storage, you gotta build a software hypervisor that's able to work with the hardware much more efficiently."

Although certainly a critical component, storage abstraction is but one aspect of the software-defined vision. Greenblatt believes that Linux containers, which provide a lightweight alternative to traditional virtualization, are emerging as an equally important piece of the puzzle.

"We think that containers are going to be the next form of virtualization. Will it replace it? Absolutely not. You're not gonna see SAP virtualize into a container. But what you will see is applications that you want to put into a PC, into a mobile device, into the Internet of Things," he says. Bringing the technology into the enterprise mainstream is one of the primary goals of Dell's partnership with Red Hat.

Original post:
Straight from the source: Dell CTO details cloud roadmap | #RHSummit

Open source pitfalls — and how to avoid them

April 21, 2014, 12:26 PM

It's hard to imagine a company these days that isn't using open source software somewhere, whether it's Linux running a company's print and web servers, the Firefox browser on user desktops, or the Android operating system on mobile devices.

In fact, there are now more than a million different open source projects, according to Black Duck Software, a maker of open source management tools and owner of the Ohloh open source software directory. And open source continues to grow. According to an SAP research report, the number of open source projects roughly doubles every 14 months.

But not all open source projects are created equal. According to Ohloh, for the 100,375 projects for which activity information is available, around 80 percent were listed as having low activity, very low activity or were completely inactive.


Read the original:
Open source pitfalls -- and how to avoid them

Coverity finds open source software quality better than proprietary code

Summary: Coverity, a company specializing in software quality and security testing solutions, finds that open source programs tend to have fewer errors than proprietary programs.

The irony isn't lost on me: Coverity, a company specializing in software quality and security testing solutions, has found, in the aftermath of the open-source OpenSSL Heartbleed programming fiasco, that open source software has fewer defects in its code than proprietary programs. Nevertheless, the numbers don't lie: the 2013 Coverity Scan Open Source Report (PDF Link) found that open source had fewer errors per thousand lines of code (KLoC) than proprietary software.

The Coverity Scan service, which the study was based on, was started with the US Department of Homeland Security in 2006. The project was designed to give hard answers to questions about open source software quality and security.

For this latest Coverity Scan Report, the company analyzed code from more than 750 open source C/C++ projects as well as an anonymous sample of enterprise projects. In addition, the report highlights analysis results from several popular, open source Java projects that have joined the Scan service since March 2013. Specifically, the company scanned the code of C/C++ programs, such as NetBSD, FreeBSD, LibreOffice, and Linux, and Java projects such as Apache Hadoop, HBase, and Cassandra.

The 2013 report's key findings included:

Zack Samocha, senior director of products for Coverity, said in a statement, "Our objective with the Coverity Scan service is to help the open source community create high-quality software. Based on the results of this report, as well as the increasing popularity of the service, open source software projects that leverage development testing continue to increase the quality of their software, such that they have raised the bar for the entire industry."

Coverity also announced that it has opened up access to the Coverity Scan service, allowing anyone interested in open source software to view the progress of participating projects. Individuals can now become Project Observers, which enables them to track the state of relevant open source projects in the Scan service and view high-level data including the count of outstanding defects, fixed defects, and defect density.

"We've seen an exponential increase in the number of people who have asked to join the Coverity Scan service, simply to monitor the defects being found and fixed. In many cases, these people work for large enterprise organizations that utilize open source software within their commercial projects," concluded Samocha. "By opening up the Scan service to these individuals, we are now enabling a new level of visibility into the code quality of the open-source projects, which they are including in their software supply chain."

Related Stories:

Go here to read the rest:
Coverity finds open source software quality better than proprietary code

Plant Breeders Release First ‘Open Source Seeds’


Backers of the new Open Source Seed Initiative will pass out 29 new varieties of fourteen different crops, including broccoli, carrots and kale on Thursday.

A group of scientists and food activists is launching a campaign Thursday to change the rules that govern seeds. They're releasing 29 new varieties of crops under a new "open source pledge" that's intended to safeguard the ability of farmers, gardeners, and plant breeders to share those seeds freely.

It's inspired by the example of open source software, which is freely available for anyone to use, but cannot legally be converted into anyone's proprietary product.

At an event on the campus of the University of Wisconsin-Madison, backers of the new Open Source Seed Initiative will pass out 29 new varieties of fourteen different crops, including carrots, kale, broccoli and quinoa. Anyone receiving the seeds must pledge not to restrict their use by means of patents, licenses or any other kind of intellectual property. In fact, any future plant that's derived from these open source seeds also has to remain freely available.

Irwin Goldman, a vegetable breeder at the University of Wisconsin-Madison, helped organize the campaign. It's an attempt to restore the practice of open sharing that was the rule among plant breeders when he entered the profession more than 20 years ago.

"If other breeders asked for our materials, we would send them a packet of seed, and they would do the same for us," he says. "That was a wonderful way to work, and that way of working is no longer with us."

These days, seeds are intellectual property. Some are patented as inventions. You need permission from the patent holder to use them, and you're not supposed to harvest seeds for replanting the next year.

Even university breeders operate under these rules. When Goldman creates a new variety of onions, carrots or table beets, a technology-transfer arm of the university licenses it to seed companies.

This brings in money that helps pay for Goldman's work, but he still doesn't like the consequences of restricting access to plant genes, what he calls germplasm. "If we don't share germplasm and freely exchange it, then we will limit our ability to improve the crop," he says.

View post:
Plant Breeders Release First 'Open Source Seeds'

Open Source Software Is the Worst Kind Except for All of the Others

Heartbleed, for anyone who doesn't read the papers, is a serious bug in the popular OpenSSL security library. Its effects are particularly bad because OpenSSL is so popular, used to implement the secure bit of https: on many of the most popular web servers, such as Apache, nginx, and lighttpd.

A few people have suggested that the problem is that OpenSSL is open source, and code this important should be left to trained professionals. They're wrong. The problem is that writing and testing cryptographic software is really, really hard.

Writing and testing any sort of security software is hard, because the goals are more or less the opposite of normal software. For normal software, the main goal is to do the right thing with correct input. If it's a word processor or spreadsheet, you want it to compute and display the right results with reasonable input, but you don't much care what happens with unreasonable input. If you tell a spreadsheet to open a file full of garbage, and you get a strange screen display or the program crashes, that is at worst mildly annoying.

With security software, though, the entire value is to make sure that it rejects every incorrect input, and doesn't erroneously reveal secure material. This distinction is not one that is well understood in the computer industry. I can recall far too many reviews of desktop file encryption programs where the reviewer went on at great length about the speed of encryption and decryption, the ease of use of the various screen displays, but never bothered to check that it rejected attempts to decode data with the wrong password. Since the number of possible invalid inputs is stupendously greater than the number of valid inputs, ensuring that security software does what it is supposed to do presents a severe debugging and testing problem.

Public key (PK) cryptography, the core functions for which everyone uses OpenSSL, is doubly difficult because the programming is really tricky. All PK algorithms depend on mathematical operations which are relatively easy to do, but very difficult to reverse. A well-known example is multiplying two prime numbers to get their product vs. finding the two primes if you only know the product. All current algorithms involve arithmetic on very large numbers, much larger than any computer can handle without using special arithmetic libraries.

Even the relatively easy PK operations are still pretty slow, so the practical way to use PK is to use conventional shared key cryptography to protect the web page or mail message or whatever, and only use the PK for one end to tell the other the shared key to use. Cryptography has advanced a lot in the decade that OpenSSL has been in use, with both new PK cryptographic algorithms and shared key algorithms, so there are now about two dozen combinations of initial PK and session shared key schemes that OpenSSL has to support.

People have added extra complication to try to make things faster; one trick is to note that if you fetch a secure web page from a server, you'll probably fetch other stuff from the same server, so with the agreement of both ends they can leave the session open after the page is complete, and reuse the same session and same shared key on subsequent requests. The Heartbleed bug is in "heartbeat" code that periodically checks to see if the other end of an open session is still there. (One way to fix Heartbleed is just to turn off the session saving feature, in which case everything will still work, just slower since there will be more sessions to create.)

As if all this weren't complex enough, public keys by themselves are just large pseudo-random numbers, which have to be securely associated with domain names for web servers, or e-mail addresses for S/MIME mail, and there's a whole Public Key Infrastructure (PKI) in which well known entities can assert that a particular key belongs to a particular name using digital signatures, essentially PK encryption run backwards. The package of key, name, and a bunch of other stuff is known as a certificate, which is encoded using a system called ASN.1. The encodings and options in ASN.1 are so complicated that it is notoriously difficult to implement correctly, and has led to multiple security issues just from encoding errors.

Any software package that does what OpenSSL does has to do all the stuff I described above. OpenSSL has a few issues of its own. One is its history; it evolved from an earlier package in the 1990s called SSLeay which was apparently originally an experimental implementation of the large number arithmetic needed for PK cryptography. SSLeay turned into OpenSSL in 1998, so there is now close to 20 years of evolutionary cruft in the half million lines of OpenSSL code. It is written in the C programming language, which remains the lingua franca of the software community, in that no matter what language your application is written in, there's always a way for that language to connect to and use C libraries.

C grew up in the 1970s on small computers, notably 16-bit PDP-11s, where every bit of code and data space was precious, so it doesn't have much in the way of defensive programming features to detect and prevent buffer overruns and other bugs. Modern C applications can use libraries that provide much of this defensive programming, but rewriting old C code to be defensive is tedious, and doing so without introducing new bugs is hard, so people rarely do.

Follow this link:
Open Source Software Is the Worst Kind Except for All of the Others

Heartbleed: Open source’s worst hour

Summary: People assumed that open source software is somehow magical, that it's immune to ordinary programming mistakes and security blunders. It's not.

Heartbleed was open source software's biggest failure to date. A simple OpenSSL programming mistake opened a security hole in a program that affected hundreds of millions of websites, and God alone knows how many users, who relied upon it for their fundamental security.

We know what happened. A programming blunder enabled attackers to pull down 64k chunks of "secure" server memory. Of course, a hacker would then have to sift through this captured memory for social security numbers, credit-card numbers, and names, but that's trivial.

We know how it happened. German programmer Dr. Robin Seggelmann added a new "feature" and forgot to validate a variable containing a length. The code reviewer, Dr. Stephen Henson, "apparently also didn't notice the missing validation," said Seggelmann, "so the error made its way from the development branch into the released version." And then, for about two years, the defective code would be used, at one time or another, by almost every Internet user in the world.

Sorry, there was no grand National Security Agency (NSA) plan to spy on the world. It was just a trivial mistake with enormous potential consequences.

So why did this happen? Simple: everyone makes mistakes. Estimates of the number of errors per thousand lines of code (KLOC) range from 15 to 50, down to around three if the code is rigorously checked and tested. OpenSSL has approximately 300,000 lines of code. Think about it: even at the low, rigorously tested rate, that is on the order of a thousand latent errors.

Still, open source programming methodology is supposed to catch this kind of thing. By bringing many eyeballs to programs, a fundamental open source principle, it's believed more errors will be caught. It didn't work here.

This mistake, while not quite as much a beginner's blunder as Apple's GOTO fiasco, was the kind of simple-minded mistake that any developer might make if tired, and that anyone who knows their way around the language should have spotted.

So why didn't they? Was it because OpenSSL is underfunded and doesn't have enough programmers?

Was it because, as Poul-Henning Kamp, a major FreeBSD and security developer, put it, "OpenSSL sucks. The code is a mess, the documentation is misleading, and the defaults are deceptive. Plus it's 300,000 lines of code that suffer from just about every software engineering ailment you can imagine."

Link:
Heartbleed: Open source's worst hour