Microsoft and Partners Highlight Open Source Dev Tools for Azure … – Redmondmag.com

Microsoft and some of its open source partners conducted an Azure OpenDev online presentation on Wednesday.

The presentation was yet another profession of love by Microsoft for open source software development, plus demonstrations of open source tools that can be used to tap Microsoft Azure resources. The 3.5-hour event, available on demand, was seemingly aimed at convincing developers to use technologies such as containers for their regular dev-test work.

Microsoft and Open Source

John Gossman, lead architect for Microsoft Azure, kicked off the presentation by saying that "Microsoft developers love open source" because they can debug code in a way that they could not with proprietary source code.

Currently, more than 15,000 Microsoft employees have GitHub accounts, Gossman said. Microsoft Technical Fellow Anders Hejlsberg uses GitHub to maintain the open source TypeScript language, which adds static type checking and code refactoring tools to JavaScript applications. Microsoft's Visual Studio Code, a lightweight editor and integrated development environment, supports TypeScript, as well as Go and Node. Visual Studio Code is built on Electron, the application framework that grew out of the Atom source editor, and is an open source project on GitHub, Gossman explained. He added that John Howard, a senior program manager on the Windows team, is a leading contributor to the Docker open source project, which provides container technologies for Linux and Windows.
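
A two-line example illustrates the kind of static checking TypeScript layers onto JavaScript (a minimal sketch; the function and values are invented for illustration):

    // The signature declares that `name` must be a string.
    function greet(name: string): string {
      return "Hello, " + name;
    }

    greet("Azure");  // OK
    greet(42);       // rejected at compile time: number is not assignable to string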

The Cloud Foundry Foundation was part of the presentation. Microsoft announced earlier this month that it had joined the organization. The foundation is a collaborative project of the Linux Foundation, according to Abby Kearns, executive director of the Cloud Foundry Foundation. She added that the collaboration with Microsoft offers tremendous potential for developers to use Cloud Foundry on Azure.

Joshua McKenty, head of global ecosystem engineering at Pivotal, argued during the presentation that applications should be able to run in the public cloud of the developer's choice. He said that Pivotal is working with Microsoft to get its "patterns" to work in .NET. Pivotal offers its own Cloud Foundry implementation for developers.

Open Source Announcements

The open source partners profiled during the presentation included representatives from Docker, Canonical, Pivotal, Red Hat and Chef. There were a couple of announcements made during the event.

First, Docker announced that it will bring the Docker Community Edition to the Microsoft Azure Container Service. It will be scalable and secure by default using Swarm, Docker's native clustering solution with native load balancing, according to Michael Friis, a Docker product manager. During the event, Friis demonstrated using this solution to build an application on the Azure Container Service in a couple of minutes. He noted that containers are a great way to package an application and share it with colleagues. With containers, only the operating system is virtualized; processes and the file system are sandboxed, so hardware overhead is low. Docker provides container images that developers can use to "containerize" their apps, he added. Currently, Docker offers two product editions, Community and Enterprise, both based on open standards.
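
The packaging step Friis described boils down to a short Dockerfile; here is a minimal sketch for a hypothetical Node.js app (the base image tag and file names are illustrative, not from the demo):

    # Start from an official Node.js base image.
    FROM node:6

    # Copy the app and install its dependencies inside the image.
    WORKDIR /app
    COPY package.json .
    RUN npm install
    COPY . .

    # Document the listening port and define the startup command.
    EXPOSE 3000
    CMD ["node", "server.js"]

Running docker build -t myapp . and pushing the resulting image to a registry gives colleagues the exact same runnable artifact, which is the sharing workflow the demo emphasized.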

The second announcement during the presentation concerned Chef's work with Habitat, an Apache-licensed open source project on GitHub for building, deploying and managing applications in any environment, according to Nell Shamrell-Harrington, a senior software development engineer at Chef. She is also the core maintainer of Habitat on GitHub.

Habitat packages apps and works with Docker and Kubernetes. It assumes failures will happen and is self-healing. Habitat works without containers, but it "shines when using them," Shamrell-Harrington said. She explained that containers can sometimes be obscure and painful to use, but when a container image is created with Habitat, "it's not a black box." While the Bash shell for Linux is currently used to create Habitat packages, Shamrell-Harrington announced that "soon you'll be able to use it with Windows." Habitat for Windows is still in development, but it is designed to leverage PowerShell to create packages.
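
For flavor, a Habitat package is described by a plan.sh that names the package and defines build callbacks; the sketch below uses hypothetical package and origin names and assumes Habitat's plan syntax as of 2017:

    # plan.sh -- package metadata plus build/install callbacks.
    pkg_name=sample-node-app
    pkg_origin=myorigin
    pkg_version="0.1.0"
    pkg_license=("Apache-2.0")
    pkg_deps=(core/node)

    do_build() {
      npm install
    }

    do_install() {
      # Copy the built app into the package's install prefix.
      cp -R . "${pkg_prefix}/"
    }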

Open Source Demos

Other open source technologies were demonstrated and discussed during the event.

On the Docker side, Scott Johnston, chief operating officer, claimed that the use of microservices will revolutionize application development workflows. Existing applications can be containerized, which adds security because of the isolation and improves efficiency because, he claimed, containerized apps use roughly half the resources. The Docker Enterprise Edition can be used to "modernize" existing apps by putting them in containers. Docker has partnered with Microsoft and Avanade on a proof-of-concept project to modernize applications, he added.

One of the Docker tools that can be used to more easily move traditional applications to containers is Image2Docker. It inspects a virtual machine image to determine which components can be "Dockerized," according to Friis.
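
The Windows flavor of the tool ships as a PowerShell module; as best I can reconstruct it (the cmdlet and parameter names below are from memory and should be treated as assumptions, and the VHD path is hypothetical), usage looks roughly like this:

    # Install the module, then point it at a VM disk image to generate a Dockerfile.
    Install-Module Image2Docker
    ConvertTo-Dockerfile -ImagePath C:\vms\webserver.vhd -OutputPath C:\docker\webserver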

Mark Shuttleworth, founder of Ubuntu Linux and Canonical, described using "conjure-up" with Kubernetes as a way to harness Microsoft Azure's compute capabilities. Ubuntu's conjure-up works with Juju, MAAS and LXD to package solutions for cloud infrastructures, according to Ubuntu's documentation. Shuttleworth said that conjure-up goes out to Azure, requests the resources to build virtual machines, and allows the individual components to scale.
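
The flow Shuttleworth described is a short interactive session; a sketch, assuming the snap-based install Canonical documented at the time (the target cloud, such as Azure, is then picked from conjure-up's menu):

    # Install conjure-up, then launch the guided Kubernetes deployment
    # and choose Azure as the target cloud when prompted.
    sudo snap install conjure-up --classic
    conjure-up kubernetes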

Shuttleworth also described using Helm to deploy applications on top of Kubernetes. Helm is an open source tool that came to Microsoft with its acquisition of Deis in April. It's a package manager for Kubernetes that keeps track of resources, according to Michelle Noorali, a senior software engineer for Microsoft Azure and the core maintainer of the Kubernetes Helm project. Noorali explained that a package in Helm is called a "Chart," which consists of metadata, Kubernetes resource definitions, configurations and documentation. It's a tool to ease developers into Kubernetes, which is "still really hard" to master, she said.
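
Those four pieces correspond directly to the standard layout of a Chart on disk; a minimal sketch (the chart name is hypothetical):

    mychart/
      Chart.yaml        # metadata: chart name, version, description
      values.yaml       # default configuration values
      templates/        # Kubernetes resource definitions, written as templates
        deployment.yaml
        service.yaml
      README.md         # documentation

With the Helm client of that era, helm install ./mychart renders the templates against the values and hands the resulting resources to Kubernetes.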


Open Source Valued Despite Poor Documentation and Bad Behavior – iProgrammer

Findings from an Open Source Survey designed by GitHub together with researchers from academia, industry, and the community provide interesting insights about the attitudes, experiences, and backgrounds of those who use, build, and maintain open source software. The full results are available as an open data set on GitHub.

The survey had over 50 questions and collected responses from 5,500 randomly sampled respondents sourced from over 3,800 open source repositories on GitHub.com, and over 500 responses from a non-random sample of communities that work on other platforms.

The key insights identified by GitHub include:

With regard to the final point, 72% of respondents claimed they always seek out open source options when evaluating new tools. The main reason for this was security: 86% of those surveyed considered security extremely or very important in choosing software, and 58% believed that open source software is usually better than proprietary software with respect to security.

While users also valued stability, with 88% rating it extremely or very important, only 30% thought open source software more stable than proprietary options. Similarly, while 75% of respondents valued user experience, only 36% considered open source software superior in this respect.

The most prevalent problem identified in the survey was incomplete or outdated documentation, observed by 93 percent of respondents; on the other hand, 60 percent of contributors say they rarely or never contribute to documentation. The finding that nearly a quarter of the open source community reads and writes English less than 'very well' both contributes to poor or missing documentation and underlines the need for documentation to be clearly expressed and comprehensive.

Open source brings together people from all over the world, which can lead to conflicts. While serious incidents are rare, the public nature of open source makes negative interactions highly visible.

While 18% of respondents have personally experienced a negative interaction with another user in open source, 50% have witnessed one between other people. By far the most frequently encountered bad behavior is rudeness (45% witnessed, 16% experienced), followed by name calling (20% witnessed, 5% experienced) and stereotyping (11% witnessed, 3% experienced). More serious incidents, such as sexual advances, stalking, or doxxing, are each witnessed by less than 5% of respondents and experienced by less than 2% (but cumulatively witnessed by 14% and experienced by 3%).

With regard to representation, findings reported by GitHub included:

GitHub also reported that the majority of employed respondents use and contribute to open source at work:

There is plenty more to discover from this data, which is available to download and explore.

Open Source Survey

Open Source Survey Download Data


Why Open Source will Overtake Proprietary Software by 2020 … – Computer Business Review

The future is in open source, and proprietary will have to either get on board or be left in the dust.

Is proprietary software dead? Maybe not entirely, but pretty soon, its place in the enterprise will be greatly diminished due to the rapid adoption of innovative open source alternatives. While proprietary tools often boast small, yet stable, customer bases, open source software can claim passionate, loyal followings that only keep growing.

Although open source has always had a strong support system, free code was at one point almost synonymous with pirating. In fact, back in 1976, Bill Gates wrote his Open Letter to Hobbyists, which argued that if developers get hold of software without paying for it, there's no way to encourage the next generation to create high-quality products, as they will never reap the financial benefits.

While his argument was convincing at the time, it has been largely disproven today. Open source software is growing exponentially in popularity, so much so that it is threatening to kill proprietary software by 2020, or maybe even before. So, what contributed to open source's strong hold on developers? Let's start from the beginning.

Open source had been slowly taking hold for decades, but its pivotal moment came in 1991 with the release of the Linux kernel, which made the first completely free operating system possible. Shortly thereafter, Eric S. Raymond wrote The Cathedral and the Bazaar, which proposed a coding economy based on community, sharing, and building. He argued that code open to the public would be less prone to bugs, since everyone would have the opportunity to correct flaws.

Netscape released the source code of its Netscape Communicator Internet suite in 1998, thereby establishing open source as a mainstream movement. As the years have gone on, an increasing number of high-quality, diverse open source offerings have become available, some of the most popular being the ELK Stack logging platform, the Linux operating system, and the Apache HTTP Server.

As I mentioned, open source software is backed by a strong and devoted community. But what has made this community so eager to support open source projects? Open source is built by developers, for developers. That's why people who participate in the community and use its tools feel close to those who created them, as well as to the tools themselves.

Beyond the emotional connection, the projects are free, of high quality, and can be constantly perfected due to their public nature. Because projects are community-driven, they only advance if they are highly usable. Developers are able to build on top of open source offerings, making the projects more diverse, useful, and valuable.

Furthermore, loyalty to open source is largely ingrained in developer culture. Developers may become familiar with specific tools as students, often contributing to their code. They then move on and advance in their careers, taking the tools they know with them.

In contrast, proprietary tools are expensive and lack options for customization. If a new feature is needed, developers are dependent on the company that created the tool to recognize the need and release an update. Generally speaking, such updates take quite a bit of time and lack the beloved transparency found within open source.

Overall, the open source movement entails many unique aspects. One such aspect is its particular popularity in the software field. Though a variety of great open source hardware exists, the trend has really gained momentum within the context of software.

The reason for this is software's ease of delivery. Open source software is simply downloaded, whereas hardware must be manufactured. The components needed are typically difficult to come by and expensive to ship. Despite these issues, open source hardware is increasing in popularity, developing a strong following similar to its counterpart in software.

In addition to open source's unique dominance in software, another interesting aspect of open source is the fact that commercial companies are eager to get in on the action. Some prominent examples include Google's Kubernetes and Elastic's Elasticsearch.

So, why would a company as large as Google want to jump on the open source bandwagon? The value for commercial companies is the ability to take part in educating the open source community while playing a profound role in a flourishing grassroots movement. By leaving their mark in the open source world, large companies can play an active role in the innovation taking place, earning points among a strong segment of their market.

However, what's interesting to note is that many of these companies lose prominence next to their open source offerings. While this can't be said for Google, Elastic's Elasticsearch is a prime example: Elastic, the commercial company, has little recognition beyond the main products it produces, Elasticsearch, Logstash, and Kibana (collectively deemed the ELK Stack). The same cannot be said of most other industries, in which users remember the brand as well as the product itself. This is not necessarily a bad thing. In fact, it only reconfirms that the open source market is so powerful that it is a brand in and of itself.

Proprietary tools have had a successful run, but they are no match for the vibrant culture encouraged in the open source world. The quality of the products being produced and the inclusiveness of the community make open source a force to be reckoned with in modern IT departments. In essence, the future is in open source, and proprietary will have to either get on board or be left in the dust.


How Analytics Has Changed in the Last 10 Years (and How It’s Stayed the Same) – Harvard Business Review

Executive Summary

Ten years ago, Jeanne Harris and I published the book Competing on Analytics, and we've just finished updating it for publication in September. Revising our book offered a chance to take stock of ten years of change in analytics. These changes include advances in hardware, efforts to incorporate unstructured data, an increased reliance on open source software, and the increased use of autonomous analytics, or artificial intelligence. The change in analytics technologies has been rapid and broad. There's no doubt that the current array of analytical technologies is more powerful and less expensive than the previous generation. In short, all analytical boats have risen.

Ten years ago, Jeanne Harris and I published the book Competing on Analytics, and we've just finished updating it for publication in September. One major reason for the update is that analytical technology has changed dramatically over the last decade; the sections we wrote on those topics had become woefully out of date. So revising our book offered us a chance to take stock of 10 years of change in analytics.

Of course, not everything is different. Some technologies from a decade ago are still in broad use, and I'll describe them here too. There has been even more stability in analytical leadership, change management, and culture, and in many cases those remain the toughest problems to address. But we're here to talk about technology. Here's a brief summary of what's changed in the past decade.

The last decade, of course, was the era of big data. New data sources such as online clickstreams required a variety of new hardware offerings, on premises and in the cloud, primarily involving distributed computing, which spreads analytical calculations across multiple commodity servers or specialized data appliances. Such machines often analyze data in memory, which can dramatically accelerate times-to-answer. Cloud-based analytics made it possible for organizations to acquire massive amounts of computing power for short periods at low cost. Even small businesses could get in on the act, and big companies began using these tools not just for big data but also for traditional small, structured data.

Along with the hardware advances, the need to store and process big data in new ways led to a whole constellation of open source software, such as Hadoop and scripting languages. Hadoop is used to store and do basic processing on big data, and it's typically more than an order of magnitude cheaper than a data warehouse for similar volumes of data. Today many organizations are employing Hadoop-based data lakes to store different types of data in their original formats until they need to be structured and analyzed.

Since much of big data is relatively unstructured, data scientists created ways to make it structured and ready for statistical analysis, with new (and old) scripting languages like Pig, Hive, and Python. More-specialized open source tools, such as Spark for streaming data and R for statistics, have also gained substantial popularity. The process of acquiring and using open source software is a major change in itself for established businesses.

The technologies I've mentioned for analytics thus far are primarily separate from other types of systems, but many organizations today want and need to integrate analytics with their production applications. They might draw from CRM systems to evaluate the lifetime value of a customer, for example, or optimize pricing based on supply chain data about available inventory. In order to integrate with these systems, a component-based or microservices approach to analytical technology can be very helpful. This involves small bits of code or an API call being embedded into a system to deliver a small, contained analytical result; open source software has abetted this trend.
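
A minimal sketch of what such an embedded call can look like, with a hypothetical pricing microservice (the endpoint, payload fields, and response shape are invented for illustration):

    import requests

    # Hypothetical scoring microservice: POST order context, get a price back.
    def optimized_price(sku: str, inventory: int, base_price: float) -> float:
        resp = requests.post(
            "https://analytics.internal.example/v1/price",
            json={"sku": sku, "inventory": inventory, "base_price": base_price},
            timeout=2,  # keep the production code path fast
        )
        resp.raise_for_status()
        return resp.json()["price"]

The application treats the analytic as just another small service call, which is what makes the microservices approach easy to embed in production systems.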

This embedded approach is now used to facilitate analytics at the edge, or streaming analytics. Small analytical programs running on a local microprocessor, for example, might be able to analyze data coming from drill bit sensors in an oil well drill and tell the bit whether to speed up or slow down. With internet of things data becoming popular in many industries, analyzing data near the source will become increasingly important, particularly in remote geographies where telecommunications constraints might limit centralization of data.

Another key change in the analytics technology landscape involves autonomous analytics, a form of artificial intelligence or cognitive technology. Analytics in the past were created for human decision makers, who considered the output and made the final decision. But machine learning technologies can take the next step and actually make the decision or adopt the recommended action. Most cognitive technologies are statistics-based at their core, and they can dramatically improve the productivity and effectiveness of data analysis.

Of course, as is often the case with information technology, the previous analytical technologies haven't gone away; after all, mainframes are still humming away in many companies. Firms still use statistics packages, spreadsheets, data warehouses and marts, visual analytics, and business intelligence tools. Most large organizations are beginning to explore open source software, but they still use substantial numbers of proprietary analytics tools as well.

It's often the case, for example, that it's easier to acquire specialized analytics solutions (say, for anti-money-laundering analysis in a bank) than to build your own with open source. In data storage there are similar open/proprietary combinations. Structured data in rows and columns requiring security and access controls can remain in data warehouses, while unstructured/prestructured data resides in a data lake. Of course, the open source software is free, but the people who can work with open source tools may be more expensive than those who are capable with proprietary technologies.

The change in analytics technologies has been rapid and broad. There's no doubt that the current array of analytical technologies is more powerful and less expensive than the previous generation. It enables companies to store and analyze both far more data and many different types of it. Analyses and recommendations come much faster, approaching real time in many cases. In short, all analytical boats have risen.

However, these new tools are also more complex and in many cases require higher levels of expertise to work with. As analytics has grown in importance over the last decade, the commitments that organizations must make to excel with it have also grown. Because so many companies have realized that analytics are critical to their business success, new technologies haven't necessarily made it easier to become and remain an analytical competitor. Using state-of-the-art analytical technologies is a prerequisite for success, but their widespread availability puts an increasing premium on nontechnical factors like analytical leadership, culture, and strategy.


WhiteSource Raises $10 Million to Expand Market Leadership in … – Business Wire (press release)

NEW YORK & TEL AVIV, Israel--(BUSINESS WIRE)--WhiteSource, the leader in continuous open source software security and compliance management, today announced a $10M Series B financing round led by 83North, with additional participation from Microsoft Ventures and individual investor David Strohm of Greylock Partners. The new funding will help WhiteSource expand its market leadership in open source security and compliance solutions.

Application development has undergone a revolution in recent years as organizations embrace open source software, in some cases making up to 80 percent of the code base. These practices not only reduce cost and accelerate delivery times, but also introduce management challenges and security vulnerabilities. Some of the most publicized security breaches of recent years, such as the Heartbleed bug, were introduced through the deployment of vulnerable open source components.

Founded by three serial entrepreneurs, Ron Rymon, Azi Cohen and Rami Sass, WhiteSource grew revenues 300 percent year-over-year for each of the last three years by tapping the need to better track and secure open source assets. The offering has also been recognized by Forrester, which recently ranked WhiteSource as a strong performer among Software Composition Analysis (SCA) offerings in its 2017 Wave report.

"The investment in WhiteSource follows our conviction that the rapid acceptance of open source software, which now comprises most of the new code in SMBs and enterprises, is not well managed," said Erez Ofer, Partner at 83North. "This issue is exacerbated as companies employ DevOps to reduce time-to-market, increasing the need for a comprehensive solution like WhiteSource's."

WhiteSource's namesake solution secures and manages the open source components of hundreds of enterprises and SMBs around the world. It empowers customers to fully control open source usage with real-time alerts, reports and automated enforcement of policies across the DevOps continuous process.

"Our vision from the outset was to help organizations safely, securely and continuously adopt, manage and deploy open source components," stated Rami Sass, Co-founder and CEO of WhiteSource. "We are a first mover and, through nonstop innovation, have built a solution that monitors code components in real time to become an integral part of the DevOps cycle."

"Microsoft is committed to strengthening its relationship with the open source community," said Mony Hassid, general manager and managing director, Microsoft Ventures EMEA. As a Whitesource partner, weve seen firsthand the value its open source security and compliance solutions bring to enterprises. Ourinvestment in its technology is a testimonial of our drive to make open source software practical, productive and secure.

For more information, visit http://www.whitesourcesoftware.com

About WhiteSource

WhiteSource allows engineering, security and compliance officers to effortlessly secure and manage the use of open source components in their software, allowing developers to focus on building great products. WhiteSource fully automates all open source management processes: component detection; security vulnerability alerts and fixes; license risk and compliance analysis along with policy enforcement; quality review; and new-version alerts. It offers a complete suite of control, reporting and management capabilities to help software teams manage open source truly effortlessly.

For more information about WhiteSource, visit http://www.whitesourcesoftware.com or follow us on Twitter: @whitesourcesoft.


Why the last thing open source needs is more corporate oversight … – TechRepublic

According to a new Black Duck survey, developers can't get enough of open source, ramping up open source adoption by 60% last year. Why the uptick? A whopping 84% cited superior cost savings, ease-of-access, and no vendor lock-in.

That same survey, however, would have us believe that developers live in fear of open source, shuddering at open source vulnerabilities exposing their code, open source "infecting" proprietary software, and more.

Across town, other developers have started creating new, hybrid licenses to help pay the rent for their open source efforts, even as the volume of open source code continues to grow.

In other words, something is amiss.

We've spent decades wringing our hands over the need for open source review boards to govern the intake and release of open source code, yet that code hasn't waited. And despite pleading poverty for years, the open source developer population keeps defying Malthus, cranking out code (and, apparently, getting paid for it). Can we put the fear-mongering to rest?

It's not as if the fear-mongering has worked. Quite the opposite. Open source has become so pervasive that, as Cloudera co-founder Mike Olson declared: "No dominant platform-level software infrastructure has emerged in the last ten years in closed-source, proprietary form." That's "none" as in "zero." Indeed, open source is such a staple of developer life that, he continued, "You can no longer win with a closed-source platform."

Already rampant, open source adoption has grown 60% within the 819 enterprises surveyed by Black Duck. Why? Because of "cost savings, easy access, and no vendor lock-in (84%); ability to customize code and fix defects directly (67%); better features and technical capabilities (55%); and the rate of open source evolution and innovation (55%)."

Even so, these same respondents worry about a variety of factors:

Given these concerns, it's perhaps not surprising that roughly half of those surveyed are worried about the lack of formal policies for managing open source code. So worried, in fact, that they keep adopting more and more open source software. They can't seem to download it fast enough, but they're sure worried about what might happen!

See the disconnect?

And then there's the "Brother, can you spare a dime?" nonsense. I spent most of my career trying to monetize open source software. It's hard. I tried a variety of approaches, many of them involving the GNU General Public License (GPL), essentially as a scare tactic to induce risk-averse enterprises to pay. The companies I worked for had various degrees of success with this, most of it middling.

Why? Because open source isn't a business model, as Marten Mickos has stressed. It's a fantastic way to develop software and a pretty miserable way to sell it.

This isn't new. This is common knowledge, which is why I have little patience for Sourcegraph, MariaDB, and others that have recently launched hybrid licenses in an attempt to capture the benefits of open source without actually being open source. Good luck with that. In the past I ripped into Sourcegraph's Fair Source Licensing, and a year's worth of pondering hasn't changed my opinion. Redmonk analyst Stephen O'Grady has diplomatically offered, "It's not clear...that hybrid licenses...are a worthwhile approach."

I'll go one step further: They're garbage, and decades of open source make that crystal clear.

Envoy developer Matt Klein, contemplating building a business around the software, decided not to. Among other reasons, perhaps the primary one was that the project's success largely depends on it not having a single company standing behind it. He wrote:

Get that? Open source is all about developers, and developers speak code, not corporate. This is why so many vanity foundations, set up as facades that let corporations control code while appearing not to, don't end up succeeding. To succeed, open source needs to be about code, not the whims of a corporate sugar daddy.

In short, open source continues to do amazingly well precisely because open source review boards aren't stunting its growth. It's thriving even as corporations can't figure out efficient ways to monetize it directly. That's the point. It's always been a way for developers to get stuff done with minimal corporate bureaucracy. It's time to celebrate that and not continue trying to shove it into a corporate cubicle.


General Catalyst, Founder Collective fund the creators of open source programming language – Boston Business Journal


The creators of the programming language Julia, several of whom have connections to MIT and Harvard, have raised $4.6 million from General Catalyst and Founder Collective for a startup that aims to commercialize the open source code, a type of business that is becoming more common in the Boston area.

Julia Computing builds professional software tools to make it easier for organizations, especially in the finance world, to make use of the Julia language, which is particularly good for in-demand tasks like data analytics and machine learning. Asset manager BlackRock and large British insurer Aviva are both Julia Computing customers, for example.


Alan Edelman, a math professor at MIT, helped start Julia Computing in 2015, along with a number of other computer science researchers affiliated with MIT and Harvard. The team is led by CEO Viral Shah and chief operating officer Deepak Vinchhi, who are both based in India, according to their LinkedIn profiles. The co-founders were all early creators of Julia.

"We selected General Catalyst and Founder Collective as our principal investors because of their successful track records in the technology sector and our shared commitment to open source," Shah said in a statement. "This investment helps us accelerate product development and continue delivering outstanding support to our customers and users."

Donald Fischer, the lead investor from General Catalyst, was an early employee at Red Hat Inc., a pioneer of commercializing open source software. Raleigh, N.C.-based Red Hat (NYSE: RHT) went public in 1999 and brought in more than $2 billion in revenue in fiscal 2016. The company has been growing its presence in Greater Boston in recent years and soon will be opening a 45,000 square foot office at 300 A St. in the city's Fort Point neighborhood.

Other local startups founded to commercialize open source software include Acquia, RapidMiner, Mautic and RStudio, also a General Catalyst investment. Black Duck Software in Burlington helps companies securely manage their use of various open source components, and Boston-based DataRobot uses open source algorithms to automate some data science tasks.


Univa Contributes Universal Resource Broker to the Open Source Community – Business Wire (press release)

FRANKFURT, Germany--(BUSINESS WIRE)--ISC High Performance Conference, Booth 1214: Univa, the Data Center Workload Optimization Company, today announced the contribution of its Universal Resource Broker (URB) technology to the open-source community.

The Universal Resource Broker is a software solution that allows distributed application frameworks written for Apache Mesos to run seamlessly on Univa Grid Engine. Making URB available as an open-source project opens the door to continued innovation, enabling community contributors to build adapters to additional workload managers and application frameworks. In addition to open-sourcing the project, Univa is extending URB to support Kubernetes clusters as well.

By using URB, users have the option of deploying any Mesos-compatible framework (including Spark, Hadoop, Storm, Jenkins, Marathon and Chronos) alongside any other type of workload, using Univa Grid Engine as an underlying workload management substrate for high-performance and high-throughput environments. Users can also choose to run URB on a variety of Kubernetes-based cluster solutions in cases where containerized microservices are key. Enterprises requiring more powerful scheduling features and policy-based controls will want to select the combination of Navops Command, URB and Kubernetes, with the additional option of Navops Command's Mixed Workload support for non-containerized applications run inside the Kubernetes environment.

According to Fritz Ferstl, CTO at Univa Corporation, "This is an important development for our large installed base of Grid Engine customers and for the burgeoning Kubernetes ecosystem. With the Universal Resource Broker, customers can easily deploy a single cluster supporting batch, interactive, containerized and now Mesos API-driven workloads without contention. We're continuing to deliver on our promise to help customers achieve better business results on a more cost-efficient, shared infrastructure."

URB provides Kubernetes users with a seamless way to run application frameworks written for Mesos while protecting existing investments. The technology complements Univa's Navops Command, an advanced, enterprise-proven policy management solution for Kubernetes that supports mixed workloads. With URB and Navops Command, Univa supports the widest variety of container and non-container based application workloads on a shared Kubernetes environment.

"Open source software is critical in the modern application container ecosystem and market, where availability, flexibility and integration can be enhanced by open source software components and frameworks," said Jay Lyman, principal analyst for 451 Research. "Univa's URB also supports the variety of software, including big data, continuous integration, and container management and orchestration technology, that is critical to success in enterprise IT today."

"We are pleased to contribute this key technology to the open source community," said Rob Lalonde, General Manager of Univa's Navops business unit. "This contribution demonstrates our ongoing commitment to open source software and to helping organizations get the most from their infrastructure investments. We are very excited to extend the capabilities of URB to Kubernetes."

The open-source Universal Resource Broker for Grid Engine is expected to be available in July of 2017 and will be released under an Apache open-source license. URB for Kubernetes is planned for August of 2017 availability.

For more information visit http://www.univa.com, contact Univa at sales@univa.com or stop by booth #C-1214 at ISC High Performance 2017, June 19-21 in Frankfurt, Germany.

About Univa Grid Engine

Univa Grid Engine is the leading workload management system. The solution maximizes the use of shared resources in a datacenter and applies advanced policy management tools to deliver products and results faster, more efficiently, and with lower overall costs. The product can be deployed in any technology environment, including containers: on-premises, hybrid or in the cloud. A variety of add-ons can be utilized to extend workload management capabilities and create a customized solution for any enterprise infrastructure. For more information, please visit http://www.univa.com or follow on Twitter @Grid_Engine

About Navops

Navops is a suite of products that enables enterprises to take full advantage of Kubernetes and provides the ability to quickly and efficiently run containers at scale. Navops utilizes workload placement and advanced policy management across on-premises, cloud, and hybrid infrastructures. With Navops, companies can automate microservices applications and efficiently respond to end-user demand. For more information, please visit http://www.navops.io or follow on Twitter @Navops

About Univa Corporation

Univa, the Data Center Automation Company, is the leading provider of automation and management software for computational and big data infrastructures. Our products and global enterprise support give our customers the power to manage all of their compute resources, no matter how big or where deployed. Many of the leading brands in the world depend on Univa's unsurpassed expertise, and premier services and support. Univa is headquartered in Hoffman Estates, Illinois, USA, with offices in Markham, ON, Canada, Munich and Regensburg, Germany.


Open source security challenges in cars – Information Age

Both auto OEMs and their suppliers should adopt management practices that inventory open source software; map software against known vulnerabilities and alert to new security threats; identify potential licensing and code quality risks; and maximise the benefits of open source while effectively managing risk

A revolution is underway in the automotive industry. The car is no longer simply a means of getting from here to there. Today's car reaches out for music streamed from the cloud, allows hands-free phone calls, and provides real-time traffic information and personalised roadside assistance.

Almost every modern automobile feature (speed monitoring, fuel efficiency tracking, anti-lock braking, traction and skid control) is now digitised to provide drivers with easier, safer operation and better information.

Recent innovations enable automobiles to monitor and adjust their position on the highway, alerting drivers if they are drifting out of their lane, and even automatically slowing down when they get too close to another car. And whether we're ready or not, we'll soon be sharing the roads with autonomous vehicles.

Driving the technology revolution in the automotive industry is software, and that software is built on a core of open source. Open source use is pervasive across every industry vertical, including the automotive industry.

When it comes to software, every auto manufacturer wants to spend less time on what are becoming commodities, such as the core operating system and the components connecting the various pieces together, and focus on features that will differentiate their brand. The open source model supports that objective by expediting every aspect of agile product development.

But just as lean manufacturing and ISO-9000 practices brought greater agility and quality to the automotive industry, visibility and control over open source will be essential to maintaining the security, license compliance, and code quality of automotive software applications and platforms.

When we think of building software, we imagine it being created by an internal development team. But auto manufacturers rely on hundreds of independent vendors supplying hardware and software components to Tier 1 and Tier 2 vendors, as well as directly to OEMs.

The software from each of those vendors is likely a mix of custom code written by the vendor and third-party code (both proprietary and open source). With tens of millions of lines of code executing on as many as 100 microprocessor-based electronic control units (ECUs) networked throughout the car, understanding exactly which open source components are part of the mix can be extremely difficult for the OEMs. When you add in the fact that over 3,000 open source vulnerabilities are reported every year, the security implications are clear.

Let's assume a Tier 2 vendor is using an open source component, and a vulnerability is disclosed.

First, the vendor needs to know it is using that specific open source component. Next, it needs to be monitoring sources in order to learn about the newly reported vulnerability. Then it needs to refactor and test its code to remediate the issue.
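
A sketch of what the monitoring step can look like in practice; the inventory, feed URL, and advisory JSON shape below are all hypothetical placeholders for a real source such as the NVD data feeds or a commercial service:

    import requests

    # Hypothetical inventory of open source components in a shipped ECU image.
    INVENTORY = {"openssl": "1.0.1f", "zlib": "1.2.8", "busybox": "1.24.1"}

    # Placeholder feed URL; the advisory JSON shape is invented for illustration.
    FEED_URL = "https://vulnfeed.example.com/daily.json"

    def check_inventory():
        advisories = requests.get(FEED_URL, timeout=10).json()
        for adv in advisories:
            version = INVENTORY.get(adv["component"])
            if version and version in adv["affected_versions"]:
                print("ALERT: %s %s affected by %s"
                      % (adv["component"], version, adv["cve"]))

    if __name__ == "__main__":
        check_inventory()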

When all this is done, the software update needs to go to the OEM or Tier 1 vendor, be incorporated into an update of that entity's component and, ultimately, be applied to each consumer's vehicle.

Updating presents its own challenges. When security researchers demonstrated in 2015 that they could hack a Jeep over the Internet to hijack its brakes and transmission, it posed a security risk serious enough that Chrysler "recalled" 1.4 million vehicles to fix the bug that enabled the attack. "Recalled" is in quotes because Chrysler didn't actually require owners to bring their vehicles to a dealer. Instead, they were sent a USB drive with a software update they could self-install. But how many owners are comfortable updating the software in their cars?

Vehicles can be updated during routine service, of course, but probably only if the service is provided by an authorised dealer, a prospect that becomes less likely as a vehicle ages. Over-the-air software updates are still the exception rather than the rule, and may require that the vehicle be stopped for safety reasons. After all, we probably don't want a software reboot while a vehicle is moving at highway speed.

Your cell phone may have a practical life of two to three years, but it receives regular operating system updates and perhaps hundreds of app updates each year. The laptop I'm using to write this likewise receives regular updates and patches, and will likely be replaced after three to five years. This is the typical lifecycle software vendors are used to addressing.

A modern car, however, is in design for years prior to production, and the average vehicle may be on the road for 10-15 years. Supporting software over that period of time requires a different thought process. Vendors (and open source communities) need to be considered in light of the operational risk they present. Questions vendors need to ask include:

How sure are you that the components you are using will be supported by the open source community in the future?

Are you prepared to provide ongoing support for projects if the community (or vendor) abandons them?

What does the release cycle look like?

How many vulnerabilities has the component had over the last few years, compared to the size of the code base?

Is the community security-aware?

When a supplier or auto OEM is not aware of all the open source in use in its products' software, it can't defend against attacks targeting vulnerabilities in those open source components. Any organisation leveraging connected car technology will need to examine the software ecosystem it's using to deliver those features, and account for open source identification and management in its security program.

Both auto OEMs and their suppliers should adopt management practices that inventory open source software; map software against known vulnerabilities and alert to new security threats; identify potential licensing and code quality risks; and maximise the benefits of open source while effectively managing risk.

Sourced by Mike Pittenger, VP, Security Strategy, Black Duck Software


Webinar: Application Security in the Age of Open Source – IT Business Edge (blog)

Live Webinar: June 20, 2017 @ 1:00 p.m. ET / 10:00 a.m. PT

Register now to attend this event.

Open source software is the foundation for application development worldwide, comprising 80 to 90% of the code in today's applications. Its value in reducing development costs, speeding time to market and accelerating innovation is driving adoption, but the explosion in open source use has not been accompanied by effective security and management practices.

A 2017 Black Duck analysis of code audits conducted on 1,071 applications found that 97% contained open source, but 67% of the applications had open source vulnerabilities, half of which were categorized as "severe."

Join IT industry veteran Lenny Liebmann and Black Duck VP of Security Strategy Mike Pittenger for a discussion of best practices in open source security and management to reduce application security risk.

Topics discussed in the event will include:

Register now to attend this event.
