The Open Source Initiative | Open Source Initiative

Open source software is software that can be freely used, changed, and shared (in modified or unmodified form) by anyone. Open source software is made by many people, and distributed under licenses that comply with the Open Source Definition.

The Open Source Initiative (OSI) is a global non-profit that supports and promotes the open source movement. Among other things, we maintain the Open Source Definition, and a list of licenses that comply with that definition. See our about and history pages for more.

OSI is a member-based organization: join and support us!

February 2, 2014: January '14 Newsletter The OSI published its latest newsletter with updates on Affiliate activities, member profiles, and news and events. See it now...

Nov 21, 2013: Hello OSI Community Members As our new General Manager, I'd like to take a few moments to introduce myself, and also provide a bit more information about both the new GM role itself, as well as some of the new programs identified by the OSI Board of Directors to extend and enable our mission. Read more...

Oct 22, 2013: OSI Names New General Manager: Newly Appointed General Manager Patrick Masson Joins OSI from University of Massachusetts. Read more...

July 24, 2013: Election Result: Individual Members elect Richard Fontana as new OSI Director. Read more...

June 14, 2013: Board News: OSI opens recruitment for General Manager. Read more...

May 11, 2013: Board Report: The OSI Board has made plans for the election of the first Individual Member Director, as well as to hire OSI's first General Manager. Read more...

May 2, 2013: Affiliates Select New OSI Director Read more...


7 reasons not to use open source software

Paul Rubens | Feb. 12, 2014

Businesses of all sizes embrace open source software and the benefits it can bring. Sometimes, though, choosing proprietary software makes better business sense. Here are seven scenarios when it pays to pay for your software.

Talk to an open source evangelist and chances are he or she will tell you that software developed using the open source model is the only way to go.

The benefits of open source software are many, varied and, by now, well-known. It's free to use. You can customise it as much as you want. Having many sets of eyes on the source code means security problems can be spotted quickly. Anyone can fix bugs; you're not reliant on a vendor. You're not locked in to proprietary standards. Finally, you're not left with an orphaned product if the vendor goes out of business or simply decides that the product is no longer profitable.

However, the open source evangelist probably won't tell you that, despite all these very real benefits, there are times when using closed-source, proprietary software actually makes far more business sense.

Here are some of the circumstances when old-fashioned proprietary products are a better business choice than open source software.

1. When It's Easier for Unskilled Users

Linux has made a huge impact on the server market, but the same can't be said for the desktop market, and for good reason. Despite making strides in the last several years, it's still tricky for the uninitiated to use, and the user interfaces of the various distributions remain far inferior to those of Windows or Mac OS X.

While Linux very well may be technically superior to these proprietary operating systems, its weaknesses mean that most users will find it more difficult and less appealing to work with. That means lower productivity, which will likely cost far more than purchasing a proprietary operating system with which your staff is familiar.

2. When It's the De Facto Standard

Most knowledge workers are familiar with, and use, Microsoft Word and Excel. Even though there are some excellent open source alternatives to Office, such as LibreOffice and Apache OpenOffice, they aren't identical in functionality, user interface, performance, plugins, or APIs for integration with third-party products. They are probably close enough as much as 90 percent of the time, but on rare occasions there's a risk that these differences will cause problems, especially when exchanging documents with suppliers or customers.

It also makes sense to use proprietary software in specialist fields where vendors are likely to have gone into universities and trained students on their software. "The software may not necessarily be better, but it may be selected by a university before an open source solution gets a big enough community around it," says Chris Mattman, an Apache Software Foundation member and a senior computer scientist at the NASA Jet Propulsion Laboratory.


OpenDaylight Summit: SDN Needs Open Source and Open Standards

The OpenDaylight open source Software Defined Networking project kicked off its first-ever OpenDaylight Summit today, highlighted by the Hydrogen SDN platform release.

The event also served as a proof point for the power of open source and why it is a model appropriate not just for operating system software like Linux, but also for networking. Jim Zemlin, executive director of the Linux Foundation, delivered one of the day's keynotes, starting off by telling the audience that technically he was Linux creator Linus Torvalds's boss.

That, however, doesn't mean that he directs Torvalds's activities. Zemlin said that his five-year-old daughter and Torvalds are very much alike.

"Both are adorable and both are geniuses and neither listen to me at all," Zemlin said.

On a more serious note, Zemlin noted that from his earliest days of involvement with Linux, he was told and reminded by industry experts that Linux would fail. Time has proven those experts wrong. The modern world literally runs on Linux.

"A world without open source would be a pretty grim world," Zemlin said. "85 percent of the world's stock exchanges would shut down, you wouldn't have any friends - Facebook runs on Linux - and you'd have to go to the bookstore to buy books, since Amazon runs on Linux."

Zemlin added that during the recent Consumer Electronics Show (CES), he was hard-pressed to find any technology that wasn't running on Linux and open source.

The same types of criticism that were leveled against open source and Linux in general were also leveled against OpenDaylight when the project first started. Zemlin argued that the Hydrogen release and the incredible amount of participation in the release prove the naysayers wrong. Over 1 million lines of code developed by over 150 developers landed in the Hydrogen release.

"We're on the right side of history in terms of what people want and creating a better way to innovate and a better, faster, cheaper way to create products," Zemlin stressed. "The future is open."



BLOG: Why open source will rule the data centre

Michael Bushong | Feb. 6, 2014

According to Michael Bushong of networking startup Plexxi, three commonly occurring conditions ensure that open source software will steadily widen its data centre footprint.

Rare is the infrastructure that lacks open source software, and as we watch such new technologies as SDN move into our data centers, open source seems increasingly likely to penetrate every corner: not just servers and applications, but networking, storage, and more.

Michael Bushong, vice president of marketing at SDN vendor Plexxi, sees the dominance of open source as inevitable. In this week's New Tech Forum, he walks us through his reasoning that open source will spread throughout IT. (Paul Venezia)

Open source as the future of IT

Open source is playing an increasingly vital role in IT infrastructure. The current, dominant position of open source in server-side computing is well understood, and networking is now edging its way toward open source with the OpenDaylight movement. But is open source a natural evolutionary path for all IT disciplines, or do certain characteristics make some areas more attractive for open source than others?

When we think about networking as an industry, for example, we tend to compare its progress to the evolutionary track taken by the compute world. The assumption is that the networking industry will unfold in much the same way that the server industry did, marching past similar milestones. But this view of the world assumes that evolution follows a two-dimensional track, and industries are either parked somewhere along the continuum or they're moving toward a predetermined end.

But what if evolution doesn't follow some set schedule, or even a singular path? If we assume that technological evolution is not predetermined, then what conditions drive an industry toward open source?

To address these questions, let's start by examining the three major drivers for broad open source adoption:

Single platform

When lots of applications run on a single platform, that platform is primed for open source. For most platform plays, value and differentiation do not lie in the platform itself, but in what runs on top of it. It makes sense that, to the extent possible, vendors developing on a platform should leverage a common body of work. Re-creating foundational elements that aren't unique is duplicative work that ultimately costs the end user. Additionally, a common platform helps ensure that all applications on top of it can run in what ends up looking like a fairly ubiquitous execution environment. This is largely what drove the migration of compute toward Linux.

Contrary to popular belief, a platform that's open source and ubiquitous can also be lucrative. Companies like Red Hat have been successful at leveraging a broad installed base to generate solid revenue streams. That uniformity of the platform Red Hat supports helps ensure that its customer base is as large as possible. Even small deviations in the underlying platform would fracture Red Hat's customer base into smaller sets.


The open source countdown has begun

'By freeing themselves from the shackles of proprietary IT systems, companies can gain a further competitive edge'

On Wednesday, Cabinet Office minister Francis Maude outlined his plans to start shifting away from proprietary Microsoft productivity applications in order to adopt more open source technologies.

The move could save the public sector millions of pounds annually; it would also see him, and government, break away from what he refers to as the vendor oligopoly currently dominating IT.

Since 2010, over £200 million has been spent by the government on Microsoft Office alone. This is a startling figure when one considers that there are open source software packages capable of delivering almost exactly the same functionality for little to no cost.

In a time of austerity, when we have all been asked to shoulder some of the burden, it seems almost absurd that the government would incur such an expense when a viable alternative is available at practically zero cost.

The arguments for Microsoft Office and against the open source alternatives, LibreOffice and OpenOffice, are well rehearsed. Microsoft Office is, after all, a very slick piece of software with a huge range of features.

But the truth of the matter is that only a very small fraction of users utilise more than the most basic of these features. The advanced features are the reserve of a handful of power users who need them for a very specific set of applications, many of which are now also offered by the open source packages.

In the past, it has been argued that LibreOffice and OpenOffice are buggy and do not offer a user experience comparable to their Microsoft Office competitor. Today, that is not the case. The open source options available in the market today can meet user needs just as well as proprietary software, if not better.



The Pentagon’s Mad Science Is Going Open Source

National security is often synonymous with secrecy. But when it comes to software development, the U.S. defense and intelligence establishment can be surprisingly open.

This week, the Defense Advanced Research Projects Agency (DARPA), the research arm of the U.S. Defense Department, published a list of all the open source computer science projects it has funded, including links to source code and academic papers that detail the code's underlying concepts.

Anyone is free not only to peruse the source code and add to it, but actually to use it to build their own software, and that includes foreign governments. The belief is that because anyone can contribute to these projects, the quality of the code will only improve, making the software more useful to everyone. It's an approach that has paid off in spades among web companies from Google and Facebook to Twitter and Square, and the government has now realized that it too can benefit from the open source ethos.

DARPA is known for some pretty whacked-out projects. Mind-controlled exoskeletons. Space colonization. Turning pets into intelligence assets. That sort of thing. But it does have a more sober side. The agency funded the creation of the network that eventually became the internet, for example. And, more recently, it funded work on Mesos, the open source platform used by Twitter to scale applications across thousands of servers. It's more of the latter that shows up on DARPA's new site.

The site is focused on computer science research, so projects that fall outside that discipline, such as the OpenBCI brain scanner and the open source amphibious tank, won't be found on the list. But there are still quite a few important projects, including Mesos, the in-memory data processing system Apache Spark, and the Julia programming language for mathematicians and scientists.

Most of these DARPA-backed projects are on GitHub, the popular code hosting and collaboration service that has come to symbolize the type of non-hierarchical collaboration celebrated by open source enthusiasts and tech culture in general. The site makes it easy for anyone to examine source code, suggest changes, and discuss decisions. Mirroring the way it treats software, the company itself operates with no job titles, no middle management, and only a thin layer of top-level management, preferring instead a flat, or holacratic, structure.

That sort of non-hierarchical thinking may seem at odds with military culture, but in reality, many of these ideas were pioneered by military researchers. Today, we often trace the origins of open source software to work done by industrial research labs like Bell Labs and Xerox PARC. But in his book From Counterculture to Cyberculture, Fred Turner argues that open source's roots stretch back even further, to the World War II-era defense research laboratories that created technologies such as radar, the atomic bomb, submarines, aircraft, and, yes, digital computers. "The laboratories within which the research and development took place witnessed a flourishing of nonhierarchical, interdisciplinary collaboration," Turner writes.

He points to the MIT Radiation Laboratory, formed by the National Defense Research Committee (a predecessor of sorts to DARPA), as a model example. "It brought together scientists and mathematicians from MIT and elsewhere, engineers and designers from industry, and many different military and government planners," Turner says. "Formerly specialized scientists were urged to become generalists in their research, able not only to theorize but also to design and build new technologies."

Today, we're more familiar with the NSA's cloak-and-dagger approach to research, but the collaborative approach of the WWII-era military-industrial-academic complex has never really gone away. The Army recently partnered with Local Motors to crowdsource new military vehicle designs. The CIA created In-Q-Tel, a venture capital firm that funds tech startups, including open source big data companies like Cloudant and MongoDB. Even the NSA is part of the action, open sourcing its big data storage system Accumulo.

In other words, the defense industry sees what Facebook and Twitter and so many other web companies see: that innovation often comes from openness.


Blender Tutorial – 2D Animation (1) Bone Rigging, Shape Character Planes by VscorpianC – Video


Blender is open source software. This tutorial shows how to use a single image/character sheet as a background reference to shape mesh planes for 2D character parts...

By: VscorpianC


Free and open source software key for multicore hardware

Press release, New Zealand, February 3, 2014.


Free and open source software will almost undoubtedly be the way to manage hugely powerful multicore computers, says Nicolas Erdody.

The organiser of the Multicore World 2014 conference, at Auckland's AUT on 25 and 26 February, says computer engineers are beginning to get to grips with writing programs that effectively handle many cores on one chip (multicore), which dramatically increases computing processing power.

"But there are many different approaches to how to provide these instructions, and we've assembled a world-class range of speakers to outline these software advances, which so far haven't matched the massive hardware increases by computer-chip manufacturers," says Erdody.

For IT managers, CTOs and CIOs, computer engineers and developers, and anyone with even a hint of interest in where computing is heading, this conference will be invaluable.
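As a minimal illustration of the kind of multicore programming the conference addresses (this sketch is not taken from any of the speakers' work), here is how a program can spread a CPU-bound task across all available cores using Python's standard-library multiprocessing module; the `square` function is a hypothetical stand-in for a real unit of work:

```python
# Sketch: distribute a CPU-bound function across all available cores
# using a pool with one worker process per core.
from multiprocessing import Pool, cpu_count

def square(n: int) -> int:
    """A stand-in for a CPU-bound unit of work."""
    return n * n

if __name__ == "__main__":
    # The pool splits the input among the worker processes,
    # runs them in parallel, and reassembles the results in order.
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(square, range(8))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Spawning workers is the easy part; as the speakers' topics suggest, the hard problems are coordinating many cores without contention and keeping software throughput in step with the growth in core counts.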

Among the speakers is Associate Professor Manuel Chakravarty of the University of New South Wales, who will illustrate how the Accelerate open source framework delivers competitive multicore performance with a fraction of the effort of alternatives.

Pavlo Baron, lead data technologist at Germany's codecentric AG, will explain why their approach uses the Java Virtual Machine (JVM) to deal with multiple millions of events per second in a multicore environment.

New Zealand's Catalyst IT, also one of the conference's sponsors, will have its cloud engineer Ricardo Rocha describe some of the significant shifts that have occurred in data storage systems, where new interfaces aim to relax, and speed up, some of the traditional access protocols.
