The open source licence debate: what we need to know – Open Source Insider – ComputerWeekly.com

As we have already noted on Computer Weekly Open Source Insider, open source grew, it proliferated and it became something that many previously proprietary-only software vendors embraced as a key means of development.

But the issue of how open source software is licensed is still the stuff of some debate.

Open Source Insider has already looked at the issues relating to dead projects (that are still walking and running) and the need for workable incentivisation models.

GitHub chief operating officer (COO) Erica Brescia noted that, from her perspective, there is increasing tension between open source projects and those building services on top of open source, such as cloud vendors with their database services.

Brescia notes that licences applied to open source projects a decade ago did not consider the possibility of a cloud vendor delivering a Software-as-a-Service (SaaS) layer using the project without contributing back to it, which is leaving some open source companies in a difficult position.

Computer Weekly's Cliff Saran wrote "With friends like AWS, who needs an open source business?" and noted that a New York Times article suggested that Amazon Web Services (AWS) was strip-mining open source projects by providing managed services based on open source code, without contributing back to the community.

We have also looked at the security aspects of open source licensing.

For Rado Nikolov, executive VP at software intelligence company Cast, the open source licensing debate also has a security element to it.

"Large organisations using open source code from GitHub, xs:code and other sources range from Walmart to NASA, collectively holding billions of pieces of sensitive data. Although open source code packages can be obtained at low or no cost, their various intellectual property and usage stipulations may lead to expensive legal implications if misunderstood or ignored," said Nikolov.

Ilkka Turunen, global director of solutions architecture at DevSecOps automation company Sonatype, further reminded us that there are 1001 ways of commercialising open source software, but when releasing open source, the developer has a choice of publishing it under a licence that is essentially a contract between them and the end user.

So there's security, there's fair and just contribution back to the community, there's layering over open source for commercial use, there's the complexity of just so many open source licences existing out there to choose from and there's even concern over whether trade sanctions can affect open source projects and see them becoming bifurcated along national borders.

Open source is supposed to be built around systems of meritocracy and to be for the benefit of all. We must work hard to ensure that we can do this and shoulder the nuances of licensing to keep open source software as good as it should be. Let the debate continue.


MongoDB: Riding The Data Wave – Seeking Alpha

MongoDB (MDB) is a database software company which is benefiting from the growth in unstructured data and leading the growth in non-relational databases. Despite MongoDB's recent rise in share price, its current valuation is modest given its strong position in a large and attractive market.

There has been an explosion in the growth of data in recent years, with this growth being dominated by unstructured data. Unstructured data is currently growing at a rate of 26.8% annually, compared to structured data, which is growing at a rate of 19.6% annually.

Figure 1: Growth in Data

(source: m-files)

Unstructured data refers to any data which, despite possibly having internal structure, is not organized via pre-defined data models or schemas. Unstructured data includes formats like audio, video and social media postings and is often stored in non-relational (NoSQL) databases. Structured data is suitable for storage in a traditional database of rows and columns and is normally stored in relational databases.

Mature analytics tools exist for structured data, but analytics tools for mining unstructured data are nascent. Improved data analytics tools for unstructured data will help to increase the value of this data and encourage companies to ensure they are collecting and storing as much of it as possible. Unstructured data analytics tools are designed to analyze information that doesn't have a pre-defined model and include tools like natural language processing.

Table 1: Structured Data Versus Unstructured Data

(source: Adapted by author from igneous)
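As a toy illustration of the distinction described above (not drawn from the article), structured rows can be aggregated directly, while unstructured text must first be mined; a crude keyword count stands in here for a natural language processing step:

```python
from collections import Counter

# Structured data: a fixed schema that maps directly to rows and columns.
orders = [
    {"order_id": 1, "customer": "Ada", "amount": 42.50},
    {"order_id": 2, "customer": "Grace", "amount": 17.00},
]

# Unstructured data: free-form text with no pre-defined model.
posts = [
    "Loving the new database release! #NoSQL",
    "Migration from our relational database took a weekend. #NoSQL #devops",
]

# Structured data supports direct aggregation over known columns...
total = sum(o["amount"] for o in orders)

# ...whereas unstructured text must first be mined for structure,
# e.g. by counting normalized keywords.
words = Counter(w.strip("#!.,").lower() for p in posts for w in p.split())

print(total)           # 59.5
print(words["nosql"])  # 2
```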

Unstructured data is typically stored in NoSQL databases, which can take a variety of forms, including key-value stores, document databases, wide-column stores and graph databases.

Unstructured data can also be stored in multimodel databases which incorporate multiple database structures in the one package.

Figure 2: Multimodel Database

(source: Created by author)

Some of the potential advantages of NoSQL databases include flexible schemas, horizontal scalability on commodity hardware, high throughput for simple queries and fault-tolerance through replication.

Common use-cases for NoSQL databases include web-scale, IoT, mobile applications, DevOps, social networking, shopping carts and recommendation engines.

Relational databases have historically dominated the database market, but they were not built to handle the volume, variety and velocity of data being generated today, nor were they built to take advantage of the commodity storage and processing power available today. Common applications of relational databases include ERP, CRM and ecommerce. Relational databases are tabular, highly dependent on pre-defined data definitions and usually scale vertically (a single server has to host the entire database to ensure acceptable performance). As a result, relational databases can be expensive, difficult to scale and concentrate risk in a small number of failure points. The solution for supporting rapidly growing applications is to scale horizontally, by adding servers instead of concentrating more capacity in a single server. Organizations are now turning to scale-out architectures using open software technologies, commodity servers and cloud computing instead of large monolithic servers and storage infrastructure.

Figure 3: Data Structure and Database Type

(source: Created by author)
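To make the scale-out idea concrete, here is a minimal, hypothetical sketch of hash-based sharding, in which documents are routed to commodity servers by a shard key; the server names are invented for illustration:

```python
import hashlib

# Hypothetical pool of commodity servers; capacity grows by adding
# entries here rather than by buying a bigger machine.
SERVERS = ["shard-0", "shard-1", "shard-2"]

def route(shard_key: str) -> str:
    """Deterministically pick the server responsible for a shard key."""
    digest = hashlib.md5(shard_key.encode()).hexdigest()
    return SERVERS[int(digest, 16) % len(SERVERS)]

# The same key always routes to the same server, so reads find
# the document where writes placed it.
placement = {key: route(key) for key in ("user:1001", "user:1002", "user:1003")}
print(placement)
```

A real system such as a sharded MongoDB cluster uses range- or hashed-sharding with a config service so that adding a server migrates only a fraction of the data, rather than the naive modulo scheme above, which reshuffles most keys.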

According to IDC, the worldwide database software market, which it refers to as structured data management software, was $44.6 billion in 2016 and is expected to grow to $61.3 billion in 2020, representing an 8% compound annual growth rate. Despite the rapid growth in unstructured data and the increasing importance of non-relational databases, IDC forecasts that relational databases will still account for 80% of the total operational database market in 2022.
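The IDC figures quoted above imply the stated growth rate; a quick sanity check:

```python
# IDC figures: $44.6bn in 2016 growing to $61.3bn in 2020 (four years).
start, end, years = 44.6, 61.3, 4
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # about 8.3%, consistent with the ~8% CAGR cited
```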

Database management systems (DBMS) cloud services were 23.3% of the DBMS market in 2018, excluding DBMS licenses hosted in the cloud. In 2017 cloud DBMS accounted for 68% of the DBMS market growth with Amazon Web Services (AMZN) and Microsoft (MSFT) accounting for 75% of the growth.

MongoDB provides document databases using open source software and is one of the leading providers of NoSQL databases to address the requirements of unstructured data. MongoDB's software was downloaded 30 million times between 2009 and 2017 with 10 million downloads in 2017 and is frequently used for mobile apps, content management, real-time analytics and applications involving the Internet of Things, but can be a good choice for any application where there is no clear schema definition.

Figure 4: MongoDB downloads

(source: MongoDB)

MongoDB has a number of offerings, including:

Figure 5: MongoDB Platform

(source: MongoDB)

Functionality of the software includes:

MongoDB's platform offers high performance, horizontal scalability, flexible data schema and reliability through advanced security features and fault-tolerance. These features are helping to attract users of relational databases with approximately 30% of MongoDB's new business in 2017 resulting from the migration of applications from relational databases.

MongoDB generates revenue through term licenses and hosted as-a-service solutions. Most contracts are one year in length, invoiced upfront, with revenue recognized ratably over the term of the contract, although a growing number of customers are entering multiyear subscriptions. Revenue from hosted as-a-service solutions is primarily generated on a usage basis and is billed either in arrears or paid up front. Services revenue is comprised of consulting and training services, which generally result in losses and are primarily used to drive customer retention and expansion.

MongoDB's open source business model has allowed the company to scale rapidly and they now have over 16,800 customers, including half of the Global Fortune 100 in 2017. Their open source business model uses the community version as a pipeline for potential future subscribers and relies on customers converting to a paid model once they require premium support and tools.

Figure 6: Prominent MongoDB Customers

(source: Created by author using data from MongoDB)

MongoDB's growth is driven largely by its ability to expand revenue from existing customers. This is shown by the expansion of Annual Recurring Revenue (ARR) over time, where ARR is defined as the subscription revenue contractually expected from customers over the following 12 months, assuming no increases or reductions in their subscriptions. ARR excludes MongoDB Atlas, professional services and other self-service products. The fiscal year 2013 cohort increased their initial ARR from $5.3 million to $22.1 million in fiscal year 2017, representing a multiple of 4.1x.

Figure 7: MongoDB Cohort ARR

(source: MongoDB)
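The cohort figures above can be checked directly:

```python
# Fiscal 2013 cohort, in $ millions, as quoted above.
initial_arr, final_arr = 5.3, 22.1
multiple = final_arr / initial_arr
print(f"{multiple:.1f}x")  # about 4.2x; 4.1x as reported, depending on rounding
```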

Although MongoDB continues to incur significant operating losses, the contribution margin of new customers quickly becomes positive, indicating that as MongoDB's growth rate slows, the company will become profitable. Contribution margin is defined as the ARR of subscription commitments from the customer cohort at the end of a period, less the associated cost of subscription revenue and estimated allocated sales and marketing expense.

Figure 8: MongoDB 2015 Cohort Contribution Margin

(source: MongoDB)
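As a worked illustration of that contribution margin definition (all figures hypothetical, not MongoDB's):

```python
# Contribution margin = cohort ARR less cost of subscription revenue
# and allocated sales and marketing expense. Figures are hypothetical.
arr = 10.0                 # $m cohort ARR at end of period
cost_of_revenue = 2.5      # $m cost of subscription revenue
sales_and_marketing = 4.0  # $m allocated sales and marketing expense
contribution = arr - cost_of_revenue - sales_and_marketing
margin = contribution / arr
print(f"{margin:.0%}")  # 35%
```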

MongoDB continues to achieve rapid revenue growth, driven by an increasing number of customers and increased revenue per customer. Revenue growth has shown little sign of decline, which is not surprising given the size of MongoDB's market opportunity. Revenue per customer is modest and MongoDB still has significant potential to expand the number of Global Fortune 100 customers.

Figure 9: MongoDB Revenue

(source: Created by author using data from MongoDB)

Figure 10: MongoDB Customer Numbers

(source: Created by author using data from MongoDB)

MongoDB's revenue growth has been higher than other listed database vendors since 2017 as a result of their expanding customer base and growing revenue per customer. The rise of cloud computing and non-relational databases has a large impact on relational database vendors with DBMS growth now dominated by cloud computing vendors and non-relational database vendors.

Figure 11: Database Vendor Revenue

(source: Created by author using data from company reports)

MongoDB's revenue growth is relatively high for its size when compared to other database vendors, but is likely to begin to decline in coming years.

Figure 12: Database Vendor Revenue Growth

(source: Created by author using data from company reports)

MongoDB's revenue is dominated by subscription revenue and this percentage has been increasing over time. This relatively stable source of income holds MongoDB in good stead for the future, particularly if customers can be converted to longer-term contracts.

Figure 13: MongoDB Subscription Revenue

(source: Created by author using data from MongoDB)

MongoDB generates reasonable gross profit margins for an enterprise software company from its subscription services, although these have begun to decline in recent periods, likely as a result of the introduction of the entry-level Atlas offering in 2016 and possibly also increasing competition.

Figure 14: MongoDB Gross Profit Margin

(source: Created by author using data from MongoDB)

MongoDB has exhibited a large amount of operating leverage in the past and is now approaching positive operating profitability. This is largely the result of declining sales and marketing and research and development costs relative to revenue. This trend is likely to continue as MongoDB expands, particularly as growth begins to decline and the burden of attracting new customers eases.

Figure 15: MongoDB Operating Profit Margin

(source: Created by author using data from MongoDB)

Figure 16: MongoDB Operating Expenses

(source: Created by author using data from MongoDB)

Although MongoDB's operating profitability is still negative, it is in line with other database vendors and should become positive within the next few years. This is supported by the positive contribution margin of MongoDB's customers after their first year.

Figure 17: Database Vendor Operating Profit Margins

(source: Created by author using data from company reports)

MongoDB is yet to achieve consistently positive free cash flow, although it appears to be on track as the business scales. This is to be expected given the high-margin nature of the business and its low capital requirements. Current negative free cash flow is largely a result of expenditure in support of future growth, in the form of sales and marketing and research and development.

Figure 18: MongoDB Free Cash Flow

(source: Created by author using data from MongoDB)

Competitors in the database vendor market can be broken into incumbents, cloud platforms and challengers. Incumbents are the current dominant players in the market, like Oracle (ORCL), who offer relational databases. Cloud platforms are cloud computing vendors like Amazon and Microsoft that also offer database software and services. Challengers are pure play database vendors who offer a range of non-relational database software and services.

Table 2: Database Vendors

(source: Created by author)

Incumbents

Incumbents offer proven technology with a large set of features, which may be important for mission-critical transactional applications. This gives incumbents a strong position, particularly as relational databases are expected to retain the lion's share of the database market in coming years. Incumbent players that lack a strong infrastructure-as-a-service platform, though, are poorly positioned to capture new applications and likely to be losers in the long run. This trend is evidenced by Teradata's (TDC) struggles since the advent of cloud computing and non-relational databases.

Cloud Platforms

Cloud service providers are able to offer a suite of SaaS solutions in addition to cloud computing, creating a compelling value proposition for customers. In exchange for reducing the number of vendors required and gaining access to applications designed to run together, database customers run the risk of being locked into a cloud vendor and paying significantly more for services which could potentially be inferior.

Challengers

Dedicated database vendors can offer best in breed technology, low costs and multi-cloud portability which helps to prevent cloud vendor lock-in.

The DBMS market is typically broken into operational and analytical segments. The operational DBMS market refers to databases that are tied to a live application, whereas the analytical market refers to the processing and analyzing of data imported from various sources.

Figure 19: Database Market Competitive Landscape

(source: Created by author)

Gartner assesses MongoDB as a challenger in the operational database systems market, due primarily to a lack of completeness of vision. The leaders are generally large companies which offer a broader range of database types in addition to cloud computing services. MongoDB's ability to succeed against these companies will depend on its ability to offer best-in-class and/or lower-cost services.


Kitware Offers Latest Innovations in Healthcare Simulation with Updates to Interactive Medical Simulation Toolkit and Pulse Physiology Engine -…

Clifton Park, NY, Jan. 17, 2020 (GLOBE NEWSWIRE) -- Kitware, a leader in open source software research and development, has released the latest versions of two of its popular medical training and simulation toolkits: the Interactive Medical Simulation Toolkit (iMSTK) 2.0 and the Pulse Physiology Engine (Pulse) 2.3. Updates to these toolkits include improved models and functionality based on feedback from user and developer communities. Kitware will showcase these latest features and improvements at the International Meeting on Simulation in Healthcare (IMSH) in San Diego, January 18-22 at booth 912.

Both iMSTK and Pulse provide the technology to build virtual simulators that can help practicing surgeons, medical students, residents, and nurses to rehearse or plan medical procedures. For example, iMSTK has been used to help medical professionals prepare for biopsies, resectioning, radiosurgery, and laparoscopy without compromising patient safety in the operating room. It can also help accredit potential surgeons in basic skills for laparoscopy, endoscopy or robotic surgery. Pulse provides necessary physiologic feedback for clinicians training to provide life-saving medical treatment, such as for hemorrhage, tension pneumothorax, airway trauma, ventilator use and settings, and anaphylaxis.

"Kitware's medical computing team is dedicated to advancing research solutions in the medical community," said Andinet Enquobahrie, the director of medical computing at Kitware. "Whether we are collaborating with a university on research, working with our communities to improve our software platforms, or partnering with another company to integrate our software into their products and projects, our goal is to provide application developers the tools they need to develop powerful applications for medical skill training."

iMSTK 2.0 Improves Features, Efficiency of Physics, Collision and Rendering Modules

iMSTK is a free, open source toolkit that offers product developers and researchers all the software components they need to build and test virtual simulators for medical training and planning. Release 2.0 offers improved functionality with many new features, as well as refactored modules that address the ease of use and extendability of the API. Specifically, it greatly improves the features and efficiency of the physics, collision and rendering modules.

Here are some release highlights:

Pulse 2.3 Improves Models and Functionality to Advance the Engine for Customer Needs

Pulse is a free, open source physiology engine that is used to rapidly prototype virtual simulation applications. These applications simulate whole-body human physiology through adult computational physiology models. Release 2.3 includes updates that were the result of Kitware's work with users to improve models and functionality of the engine.

Here are some release highlights:

For more information about iMSTK, visit the iMSTK website. For more information about Pulse, visit the newly redesigned Pulse website or sign up for the Pulse newsletter. To receive the latest updates on all of Kitware's software platforms, subscribe to our blog.

About Kitware

Since 1998, Kitware has been providing software research and development services to customers ranging from startups to Fortune 500 companies, including government and academic laboratories worldwide. Kitware's core areas of expertise are computer vision, data and analytics, high-performance computing and visualization, medical computing, and software process. The company has grown to more than 150 employees, with offices in Clifton Park, NY; Arlington, VA; Carrboro, NC; Santa Fe, NM; and Lyon, France. For more information visit kitware.com.


Qualys offers GPS guidance for developers at the application security crossroads – ComputerWeekly.com

All developers care deeply about application [development] security.

Okay, that's perhaps not always strictly true, so let's try again.

All developers care deeply about application functionality and speed, which they then carry through to a secondary level of concern related to Ops-level application manageability, flexibility and security.

How then should we engage with programmers on aspects of security, especially as it now straddles something of a crossroads brought about by the move to increasingly cloud-native, cloud-first application development?

Security specialist Qualys (pronounced: KWAL-IS) has attempted to address the application development security subject head-on by hosting what probably ranks as the first tech event of 2020.

Qualys Security Conference London 2020 ran this week in London with the tagline "application security at a crossroads", and isn't it just?

The company billed the event as an opportunity to explore the profound impact of digital transformation on the security industry and what it means for practitioners, partners and vendors.

Qualys is clearly focused on gaining attention from CIOs, CSOs and CTOs; but at ground level, the company says it works with network managers, cloud developers and security developers or, as they are known these days, DevSecOps practitioners.

So, for developers then: as we have noted before on the Computer Weekly Developer Network, the Qualys Web Application Scanning (WAS) 6.0 product now supports Swagger version 2.0 to allow programmers to streamline [security] assessments of REST APIs and get visibility of the security posture of mobile application backends and Internet of Things (IoT) services.

NOTE: Swagger is an open source software framework backed by a considerable ecosystem of tools that helps developers design, build, document and consume RESTful web services.
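To illustrate the kind of API description WAS can consume, here is a minimal, hypothetical Swagger 2.0 document; the title, host and path are invented for illustration:

```python
import json

# A bare-bones Swagger 2.0 (not OpenAPI 3.x) description of one
# REST endpoint. A scanner can walk "paths" to enumerate the API
# surface it should assess.
spec = {
    "swagger": "2.0",
    "info": {"title": "Device Telemetry API", "version": "1.0.0"},
    "host": "api.example.com",
    "basePath": "/v1",
    "schemes": ["https"],
    "paths": {
        "/devices/{id}": {
            "get": {
                "parameters": [
                    {"name": "id", "in": "path",
                     "required": True, "type": "string"},
                ],
                "responses": {"200": {"description": "Device record"}},
            },
        },
    },
}

print(json.dumps(spec)[:40])
```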

Qualys president and chief product officer Sumedh Thakar used his London keynote slot to deliver a piece he called The Evolution of the Qualys Platform: Unveiling the Latest Updates and Next-Gen Initiatives.

Speaking at the London show this January, Thakar suggested that the process of digital transformation has moved from being a prototyping, exploratory part of the business to, now in 2020, something that IT development teams are truly rolling out.

"Banks are now looking at technologies that would allow users to open an account simply by taking a selfie," said Thakar, and so these processes (which essentially run on applications) need to run on a secure backbone. The infrastructure that organisations run on has become super-hybrid in order to join all these new digital services together.

Cloud, containerisation and refactoring applications to be mobile friendly are just some of the major changes that need to happen in digitally disruptive environments.

Thakar is perhaps suggesting that if we can show developers that there are automated intelligence layers in place that will work across hybrid infrastructures and reduce the Mean Time To Remediation (MTTR), then developers might in fact take more interest in the security aspect of the systems they are working to engineer in the first place.

Thakar used a number of real world examples (from bank accounts that can be opened with nothing more than a selfie to intelligent motion-sensing doorbells) in an attempt to justify and validate the need for Qualys security technologies. With all examples tabled, Thakar led the audience forward to think about how system responses should be actioned.

He explained that the evolution of the Qualys platform has come about because SIEM, SOAR and log file analytics solutions (such as Splunk) were either never built to support a [security] data model that could be driven by Machine Learning (ML) or were not actually designed for security in the first place; log file analytics also acts on historical data, so it is very much after the event.

NOTE: Security Information & Event Management (SIEM) tools were always designed as log correlation specialists, while Security Orchestration, Automation & Response (SOAR) was again too much of a point solution (though Qualys is adding that function directly as a playbook anyway).

As programmers design and evolve an image in the cloud, these developers will only need to make one single API call to bring Qualys security layers to bear upon their cloud-native applications, due to the company's proximity to both Microsoft Azure and Google Cloud Platform.

New (in terms of products) in 2020 is Qualys Respond, which includes an agent to deploy patches automatically to users' devices, so, again, this allows applications to feature remediation controls more intuitively.

Other developer tools from the company include the ability to use Qualys Browser Recorder, a free Google Chrome browser extension, to review scripts for navigating through complex authentication and business workflows in web applications.

So then, will developers ever truly embrace security issues and allow DevSecOps to put the Ops in "operationalised"?

Qualys would like to think so. Engagement at the coal face, along with an option to explain how complex authentication, the use of optimised security agents and streamlined security assessments/audits can be made easy (dare we suggest almost joyful), will very arguably make a real difference for developers.


Expected Growth In Open Source Software Market And Its Global Impact – BulletintheNews

Headline: Global Open Source Software Market With Complete Insights On Key Players, Top Product Types, Applications Over 2019-2026

The Global Open Source Software Research Report 2019-2026 provides qualitative and quantitative data on key Open Source Software Market elements. A complete industry performance analysis and competitive landscape view are studied from 2014-2026. The key factors analyzed in this report are CAGR value and global, regional and country-level analysis, with opportunities in Open Source Software. All the valuable insights, such as production, capacity, ex-factory price, revenue and market share, are analyzed in this report. The entire focus is on the competitive landscape, major market players, strategies, SWOT analysis and PEST analysis to better gauge Open Source Software Industry insights.

Click here for a free PDF sample file of the report: https://reportscheck.biz/report/37101/global-open-source-software-industry-market-research-report/#sample-report

Major players covered in this report are:

Comiit, Alfresco, RethinkDB, Continuent, Transcend, Cleversafe, Oracle, Intel, ClearCenter, Canonical, Redpill Linpro, Acquia, Red Hat, IBM, Compiere, FOSSID, Astaro, OpenText

The market dynamics section analyzes the drivers, restraints, growth opportunities and various developmental aspects of the Open Source Software Market. For the top players, key data in terms of product portfolio, company overview, finances, investments and business strategies are offered. The quantitative analysis of the Open Source Software Industry covers the geographical presence, type, applications, players, sales and growth rates of the industry during the historical and forecast periods.

The revenue, growth rate, market size, application, sales revenue and year-on-year growth rate (base year) in the Open Source Software Market are studied. The import-export policies in the Open Source Software industry, the latest trends, investment plans and policies, and marketing analysis are covered. The segmented market study, based on top Open Source Software product types, applications and key players, will provide a classified outlook. Also, a Porter's five forces analysis identifies potential threats, while consumer analysis gauges the threat from new entrants.

The major product types of Open Source Software are classified as follows:

Shareware, Bundled Software, BSD (Berkeley Source Distribution), Other

The applications of Open Source Software are classified as follows:

BFSI, Manufacturing, Healthcare, Retail

Know more about this report or ask for custom content: https://reportscheck.biz/report/37101/global-open-source-software-industry-market-research-report/#table-of-content

Market Outlook And Research Method:

The regional market outlook in this industry is offered based on Open Source Software Market presence in North America, Europe, China, Japan, India, Southeast Asia, South America, MEA and the rest of the world. The entire market status, prospects and revenue share are studied over 2014-2026. The Open Source Software market concentration ratio, production sites, product types, revenue, trends and mergers & acquisitions, as well as expansion, are analyzed. Manufacturing analysis, raw materials analysis, price trends, key suppliers, the manufacturing chain and industry analysis are also covered.

We gather data on the Open Source Software Industry from primary and secondary research, which is then classified based on top-down and bottom-up approaches. Other sources, such as industry magazines, paid databases, SEC filings and government associations, are used to validate the data. We also offer an executive summary for our clients to gauge the latest trends and upcoming industry plans.

Lastly, the marketing channels, distributors and customers of Open Source Software are identified. The last section of the report offers region-wise production and revenue forecasts, as well as forecasts based on type and application, from 2019-2026.

Contact Us

Olivia Martin

T: 831-679-3317

Inquiry: [emailprotected]

Website:https://reportscheck.biz/


Is open source culture the answer to our technology woes? – RTE.ie

Opinion: open source means to be open about the source of knowledge that enables anyone to make something

Eventually, we will all get bored of seeing technology being depicted as either the saviour or the Lucifer of our own existence. There are only so many articles on the wonders of technological progress one could possibly digest and only so many episodes of Black Mirror to tell us how messed up our lives could get in the near future. When that time comes, we will begin to seriously consider the fact that technology is not just a tool or a service, but a medium through which individual and social identities can shape and be shaped.

But what remains unexplained for now is why such shaping power is blindly handed, over and over again, to private corporations regardless of the number of times they have misused it. We also need to explain why we keep pointing the finger at big tech corporations for the way things are, while also expecting them to solve the mess they enabled.

Delegating responsibilities to others will only perpetuate the problem, not resolve it. Pretending to solve the issue by addressing the psychosis of tech giants is likely to be a lost cause. We, on the other hand, could be better off addressing our own Stockholm syndrome.

From RTÉ Lyric FM's Culture File, Open Source Aran Sweater is a project to create an archive of Aran knitting

For that, open source culture is likely to be the most effective, if not the only, therapy. Open source means to be open about the source of knowledge that enables anyone to make something. With regard to technology, one important element of that source, but certainly not the only one, is represented by the code used to generate a given piece of software, AKA the source code.

But you would be mistaken to think that the ability to read and write code is a necessary requirement to access this alternative technological world. In fact, open source should be understood in its broader sense of open knowledge. Anyone who wishes can contribute in many ways, such as by sharing, translating and editing instructions, creating tutorials and engaging with the ethical issues at stake in our technological society. Contrary to how things were 30 years ago, open source software is today as user-friendly and good-looking as any proprietary, closed-source counterpart. The ability to read and write code is certainly useful, but not necessary when using open source alternatives.

Realising that technology is a social endeavour makes its communal ownership and development a necessity. The alternative would be reading series like Black Mirror as prophecies rather than high quality entertainment (yes, I am a big fan of Black Mirror, just in case it was missed).

If education is about enabling individuals to grow as free thinkers and responsible citizens, the adoption of open source technology within universities is a must

Creating the fertile ground for the flourishing of such an open culture cannot be left to chance. It must be actively sought and defended by society at large. Educational institutions, from primary schools to universities, have a big responsibility in that regard. They should be guarantors for the accessibility and openness of knowledge and despise the idea of a "used-user". I leave you to judge if this is the case today. In my experience as an academic, universities are light-years away from even considering this as an issue, let alone considering their role in it.

There are no compromises here. If education is about knowledge and enabling individuals to grow as free thinkers and responsible citizens, the adoption of open source technology within universities is a must. The Humboldtian model of holistic education comes once again to the fore as one of the most sensible to follow.

The International Conference on Live Coding takes place at the University of Limerick from February 5th to 7th and is an example of how a community of artists shapes itself through the open source culture. Attendance is free of charge

The views expressed here are those of the author and do not represent or reflect the views of RTÉ

Read more here:
Is open source culture the answer to our technology woes? - RTE.ie

Linus Torvalds, creator of the Linux operating system, warned developers not to use an Oracle-owned file system because of the company’s ‘litigious…

The renowned programmer Linus Torvalds has warned users of Linux, the popular open source operating system he built, not to use an Oracle file system because of possible legal actions, Phoronix's Michael Larabel first reported.

The file system, called ZFS, was built by Sun Microsystems, which has since been acquired by Oracle. Torvalds wrote in an online forum on Jan. 6 that he does not feel "safe" in adding ZFS code to the Linux project because of Oracle's tendency to file lawsuits against other companies, including an ongoing legal brawl between Google and Oracle.

"Other people think it can be ok to merge ZFS code into the kernel and that the module interface makes it ok, and that's their decision," Torvalds wrote. "But considering Oracle's litigious nature, and the questions over licensing, there's no way I can feel safe in ever doing so."

Oracle's lawsuit against Google, which is over "stealing" Java technology for Google's Android system and could cost Google as much as $9 billion, is set to go to the Supreme Court this year. Google and its supporters in the case have argued that an Oracle victory in this case would have a chilling effect on software innovation.

Through that lens, Torvalds' comments highlight how Oracle's standing in the matter might affect its reputation in the open source software world, potentially steering programmers away from its products and services.

What's more, ZFS and Linux carry different licenses (ZFS's code is under Sun's CDDL, while the Linux kernel is under the GPLv2), and according to an FAQ about ZFS on Linux, combining the two could cause problems by preventing users from using code available exclusively under one license together with code available exclusively under the other.

Torvalds wrote that there is "no way" he can merge any ZFS code until he gets an official letter from Oracle signed by its main legal counsel, or by executive chairman and CTO Larry Ellison saying that it's fine to do so.

Torvalds said that if other users choose to add ZFS, "they are on their own," as he can't maintain it.

"Don't use ZFS," Torvalds wrote. "It's that simple. It was always more of a buzzword than anything else, I feel, and the licensing issues just make it a non-starter for me. The benchmarks I've seen do not make ZFS look all that great."

Oracle declined comment.

Got a tip? Contact this reporter via email at rmchan@businessinsider.com, Signal at 646.376.6106, Telegram at @rosaliechan, or Twitter DM at @rosaliechan17. (PR pitches by email only, please.) Other types of secure messaging available upon request. You can also contact Business Insider securely via SecureDrop.

The rest is here:
Linus Torvalds, creator of the Linux operating system, warned developers not to use an Oracle-owned file system because of the company's 'litigious...

Gates Foundation mobile money projects in developing world to go live this year – ComputerWeekly.com

Major mobile money projects in Africa and Asia being assisted by the Bill & Melinda Gates Foundation will go live as early as the middle of this year, giving some of the world's poorest people access to financial services.

The foundation, set up by Microsoft founder Bill Gates and his wife Melinda, helps these projects in various ways. It provides philanthropic capital and technical assistance, as well as open source software in some cases.

Kosta Peric, deputy director of financial services for the poor at the Bill & Melinda Gates Foundation, said projects in Pakistan, Tanzania and eight West African countries could go live as early as the middle of this year. "These projects will this year move from deployment and development to actually serving the poor populations," he said.

For poor communities, access to financial services is vital to enable them to become economically active. For example, it can help farmers get paid for their produce, buy fertiliser and receive subsidies. Or a woman in Africa with a mobile phone could receive a salary payment, send money to other wallets, buy things, pay bills and receive social security.

"These projects will generate an impact on hundreds of millions of people that will be able to connect to and use payments to improve their lives and integrate with the economy," said Peric.

The foundation has already supported the recent go-live of bCash in Bangladesh. Mobile phones are a vital enabler, and Peric said that of the 1.7 billion unbanked people in the world, more than 90% have a simple mobile phone.

In mid-2020, a project to help create a national real-time payments platform in Pakistan is expected to go live, as is a platform in Tanzania. Later this year, a mobile project covering eight West African countries is expected to be rolled out.

Some of the projects use the Mojaloop open source software developed by Ripple, Dwolla, ModusBox, Software Group and Crosslake Technologies for the Gates Foundation. "This is great because it allows projects to kick-start development with something that already exists," said Peric. Other projects develop their own software.

Capital from the Gates Foundation de-risks projects and entices commercial players to get involved, with the aim for the projects to be self-sustaining.

"The services that are already live are being run commercially, which means they can be sustained without external contributions. These have proved that it is possible to serve even the poorest people at a profit," said Peric.

Huge progress has already been made, he pointed out. In Africa alone, the mobile money projects that the Gates Foundation has supported will enable 200 million people to connect to financial services. "The foundation wants to play its part in connecting all of the 400 million adults in Africa," said Peric.

Peric is a computer scientist by profession, and spent more than 23 years working for the Society for Worldwide Interbank Financial Telecommunication (Swift).

His last role at Swift was head of innovation in its Innotribe organisation, which innovates on behalf of the hundreds of banks that own it. He was the architect of the current Swift network, and first became interested in financial inclusion during his time at Innotribe.

Originally posted here:
Gates Foundation mobile money projects in developing world to go live this year - ComputerWeekly.com

Alteryx: Intuitive Data Analytics Solution Driven By Industry Tailwinds – Seeking Alpha

In recent years, the volume of data being generated and stored globally has exploded. Because companies with advanced analytic capabilities tend to outperform their peers across a broad range of metrics, analytics is now a required competency: firms must improve their analytic capabilities and access to advanced tools to remain competitive, which is driving rapid growth in areas like database, search and analytic software.

Figure 1: Analytic Capabilities and Performance

(source: Bain)

The amount and diversity of data (type, format, and source location) are rapidly increasing, driving the need for efficient tools to create and maintain data pipelines that convert raw data into monetizable insights. While there is now a wide range of open source tools available, like Python and R, with powerful and flexible analytic capabilities, they require some knowledge of programming and are more commonly used by people with a background in computer science or data science. Alteryx offers easy-to-use, intuitive self-service analytic software targeted at business analysts rather than data scientists, and it primarily aims to replace tools like spreadsheets, not specialized data science software. While there may be some trade-offs in capabilities such as flexibility, this is more than made up for by its ease of use for non-specialists, allowing access to data insights to be democratized.

Traditional tools have a number of weaknesses for use as self-service analytics tools by business analysts, including:

Figure 2: Percentage of Organizations Using Data Analysis Tools

(source: HBR)

Self-service data preparation using spreadsheet software remains common today, with approximately 8% of employees using spreadsheets for self-service analytics. There are an estimated 21 million advanced spreadsheet users worldwide, who on average spend 26 hours per week working on spreadsheets. Spreadsheets were not designed for modern big data requirements and as a result are inefficient, causing an estimated $60 billion of lost productivity in the U.S. every year among advanced spreadsheet users.
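A quick back-of-envelope check of those figures (using only the estimates quoted above, and noting that it mixes a U.S.-only productivity loss with a worldwide user count, so it is a rough illustration at best):

```python
# All inputs are the article's estimates, not measured data.
users = 21_000_000           # advanced spreadsheet users worldwide
hours_per_week = 26          # average hours spent in spreadsheets
lost_productivity = 60e9     # estimated annual U.S. lost productivity ($)

annual_hours = hours_per_week * 52            # 1,352 hours per user per year
per_user = lost_productivity / users          # implied annual loss per user
print(f"{annual_hours} hours/year, ~${per_user:,.0f} lost per user")
```

The implied figure of roughly $2,900 of lost productivity per user per year is what makes the replacement opportunity economically interesting.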

Table 1: Estimated Cost of Inefficient Use of Spreadsheets

(source: Created by author using data from Alteryx)

The global market for big data and analytics software is large and growing rapidly. IDC estimated the market size to be $49 billion in 2016 and projected it to grow at a rate of 10.5% annually through 2021. Within the big data and analytics software market Alteryx's software addresses the business intelligence and analytic tools, analytic data integration and spatial information analysis markets, which collectively represented approximately $19 billion in 2016 and were projected to grow at a rate of approximately 8.8% through 2021. Alteryx also estimates that there is an additional $10 billion opportunity their platform can address by replacing spreadsheets for advanced data preparation and analytics.
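Compounding those IDC figures forward gives the implied 2021 market sizes (a simple projection from the quoted base values and growth rates, not IDC's own forecast):

```python
# Project a market size forward by compounding a quoted CAGR.
def project(base, cagr, years):
    return base * (1 + cagr) ** years

total_2021 = project(49e9, 0.105, 5)     # big data & analytics software overall
alteryx_2021 = project(19e9, 0.088, 5)   # Alteryx-addressable segments
print(f"total: ${total_2021/1e9:.1f}bn, addressable: ${alteryx_2021/1e9:.1f}bn")
```

That works out to roughly $81 billion overall and $29 billion addressable by 2021, before adding the estimated $10 billion spreadsheet-replacement opportunity.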

Alteryx (AYX) offers self-service data analytics software which is designed to improve the productivity of business analysts by bringing a fragmented data analytic pipeline into one service. The functionality of Alteryx's platform includes accessing various data sources, cleaning and preparing data, and performing a variety of analyses. The software aims to replace traditional tools by offering ease of use, speed, sophistication of analysis and an intuitive user interface with a visual workflow. Alteryx's ultimate goal is to make their platform as ubiquitous in the workplace as spreadsheets are today.

Alteryx promotes the following virtues of their platform:

The software can be licensed for use on a desktop or server, or it can be delivered through a hosted model. Subscription periods for the platform generally range from one to three years with fees typically billed annually in advance and revenue recognized ratably over the term of the contract. Revenue is also generated from professional services, including training and consulting.

Alteryx's platform includes:

Alteryx continues to achieve high revenue growth through increased customer numbers and expansion of revenue per customer. This revenue growth is yet to show significant signs of decline, indicating Alteryx still has significant room to grow before reaching market saturation. Alteryx's revenue growth rate is broadly in line with other SaaS companies which offer software related to data.

Figure 3: Alteryx Revenue

(source: Created by author using data from company reports)

Figure 4: Alteryx Revenue Growth

(source: Created by author using data from company reports)

Alteryx is pursuing a number of growth strategies including:

Figure 5: Alteryx Customers

(source: Created by author using data from Alteryx)

Alteryx employs a "land and expand" business model which aims to increase revenue per customer over time. Customers are often offered a free trial which is then followed by an initial subscription to the platform. Alteryx then tries to increase use of the platform across departments, divisions, and geographies of the organization as the benefits of the platform are realized. This can be seen in the customer cohort data, where revenue for each cohort expands significantly over time.
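A dollar-based net expansion rate is the usual way this cohort dynamic is measured: revenue from the same set of customers, compared one year apart. A minimal sketch with hypothetical numbers:

```python
# Net expansion rate: same-cohort revenue this year vs. last year.
# A rate above 100% means existing customers are spending more over
# time, even before any new customers are added.
def net_expansion_rate(cohort_arr_last_year, cohort_arr_this_year):
    return cohort_arr_this_year / cohort_arr_last_year

# Hypothetical cohort: landed at $10m ARR, expanded to $13m a year later.
rate = net_expansion_rate(10e6, 13e6)
print(f"{rate:.0%}")
```

A sustained rate well above 100%, as the cohort chart suggests, means revenue compounds from the existing base alone.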

Figure 6: Annualized Subscription Revenue by Customer Cohort

(source: Alteryx)

Figure 7: Alteryx Net Expansion Rate

(source: Created by author using data from company reports)

Figure 8: Alteryx Revenue per Customer

(source: Created by author using data from company reports)

Alteryx's gross profit margin is high, even by the standards of enterprise software companies, and this is likely to lead to high operating profit margins as Alteryx continues to scale. Alteryx's ability to achieve and maintain such high gross margins is indicative of a strong competitive position in the market.

Figure 9: Alteryx Gross Profit Margin

(source: Created by author using data from company reports)

Alteryx has exhibited significant operating leverage in the past and is likely to achieve consistently positive operating profits going forward. Based on Alteryx's gross profit margin and typical operating expenses for enterprise software companies, Alteryx is likely to eventually achieve an operating profit margin above 30%, which, along with its high growth rate, supports current valuation multiples.
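The margin arithmetic behind that 30%+ estimate can be sketched directly. All inputs below are illustrative assumptions about a mature enterprise software cost structure, not Alteryx's reported figures:

```python
# Operating margin = gross margin minus operating expense ratios.
# Hypothetical mature-state inputs for an enterprise software company:
gross_margin = 0.90
opex_ratios = {"sales & marketing": 0.35, "R&D": 0.15, "G&A": 0.08}

operating_margin = gross_margin - sum(opex_ratios.values())
print(f"{operating_margin:.0%}")
```

Under these assumptions the margin lands at 32%; the key sensitivity is how far sales and marketing spend falls relative to revenue as the subscriber base matures.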

Figure 10: Alteryx Operating Profit Margin

(source: Created by author using data from company reports)

Most of Alteryx's operating leverage is being achieved through reduced sales and marketing expenses relative to revenue. This trend is likely to continue as Alteryx continues to grow and build a stable base of subscription customers. As a subscription software provider Alteryx's sales and marketing expenses should be expected to be a large burden as the company establishes a market presence, particularly while the company is growing rapidly.

Figure 11: Alteryx Operating Expenses

(source: Created by author using data from company reports)

Not only is Alteryx likely to achieve a high level of profitability as the business scales, it is also likely to generate significant free cash flow. Alteryx's business has relatively low capital expenditure requirements and in recent years Alteryx has exhibited the ability to consistently generate free cash flow despite their current high growth rate.

Figure 12: Alteryx Free Cash Flow

(source: Created by author using data from Alteryx)

Alteryx faces competition from a range of companies including incumbents offering traditional tools, specialized self-service data analytics software providers, open source software and cloud storage providers who will look to continue expanding their offerings to provide customers with holistic solutions.

Alteryx has been assessed by Gartner in the Data Science and Machine Learning Platforms Magic Quadrant, although there is overlap with the Analytics and Business Intelligence Platforms Magic Quadrant. Leaders in business intelligence include Microsoft (MSFT), Tableau (CRM) and Qlik whilst leaders in machine learning include KNIME, RapidMiner and TIBCO.

Figure 13: Gartner Magic Quadrant for Analytics and Business Intelligence Platforms

(source: Sisense)

Gartner rates Alteryx relatively low on completeness of vision as they are not a standout vendor in terms of automation, deep learning or the Internet of Things and they do not appeal to expert data scientists. This is a reflection of Alteryx's niche strategy, offering an intuitive and easy to use platform for non-experts. While reduced functionality may limit adoption amongst data scientists, Alteryx's strategy has the potential to help them reach a much broader base of users who place greater value on ease of use.

Figure 14: Gartner Magic Quadrant for Data Science and Machine Learning Platforms

(source: Alteryx)

The Forrester Wave assessment of the business intelligence and machine learning software markets is broadly similar to the Gartner assessment. Alteryx was not included in the Predictive Analytics and Machine Learning Solutions assessment as it is a data blending tool with the ability to run R scripts and does not have native machine learning capability.

Traditional Tools

Traditional tools like spreadsheets are likely to remain common for more basic applications but as software requirements increase specialized tools are likely to be more widely adopted.

Self-Service Data Analytics

There are a wide range of self-service data analytics tools available which are designed for different applications. Visualization tools like Tableau and Spotfire are designed primarily to allow users to visually explore datasets. Specialized self-service data analytics software like SAS are designed to allow powerful and flexible quantitative and visual analysis of data sets but tend to be less intuitive and have a steeper learning curve.

Open Source Software

There are a wide range of open source tools available for data analytics including R, Python, Pytorch and TensorFlow. These tools are less intuitive and require a basic knowledge of programming which limits their widespread adoption. It is possible that these tools will be made easier to use over time increasing their adoption by general business analysts. It is also likely that a basic level of programming knowledge will become more common amongst knowledge workers leading to more widespread adoption.
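The learning-curve point is easy to illustrate. Even a trivial data-preparation step, aggregating sales by region from a CSV, requires a short program in open source tooling; this is a hypothetical example (made-up data and column names) using only the Python standard library, and it is exactly the kind of step that visual self-service platforms replace with a drag-and-drop workflow:

```python
import csv, io
from collections import defaultdict

# Hypothetical input data, inlined for the sake of a runnable example.
raw = io.StringIO("region,sales\nEast,100\nWest,250\nEast,50\n")

# Group-and-sum: total sales per region.
totals = defaultdict(float)
for row in csv.DictReader(raw):
    totals[row["region"]] += float(row["sales"])

print(dict(totals))  # {'East': 150.0, 'West': 250.0}
```

Simple for a programmer, but the syntax, types and imports are precisely the barrier for the business analysts Alteryx targets.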

Cloud Computing

The major cloud computing providers already offer an assortment of business intelligence and data analytics services and I expect these offerings to be expanded and more closely integrated with their cloud offerings in the future as these companies seek to offer customers holistic solutions. I believe Microsoft and Amazon (AMZN) are likely to be the most competitive in this area. Google (GOOG) is also likely to have a strong service offering but their weak enterprise DNA may continue to hold them back even if they have technically superior solutions.

For Alteryx to be successful as a specialized provider of self-service data analytic software it must offer customers a compelling value proposition where the ease of use and efficiency of the software justify the higher cost relative to traditional tools. Alteryx must also ensure that their software offers customers sufficient benefit to justify the cost and complexity of having an additional service provider when they could use a cloud computing vendor as a one stop provider of all services.

Despite Alteryx's high EV/S ratio, I believe the company is significantly undervalued due to its strong prospects going forward. Both revenue and profit margins are likely to continue improving significantly in coming years, which I believe will result in an increase in the share price despite the inevitable multiple contraction as growth slows. Based on a discounted cash flow analysis, I estimate an intrinsic value of $195 per share.
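For readers unfamiliar with the method, a discounted cash flow values a company as the present value of its projected free cash flows plus a terminal value. The sketch below uses made-up inputs purely to show the mechanics; the assumptions behind the author's $195-per-share estimate are not disclosed in the article:

```python
# Minimal DCF: discount each year's free cash flow, then discount the
# terminal value at the final year. All figures here are hypothetical.
def dcf_value(cash_flows, terminal_value, discount_rate):
    pv = sum(cf / (1 + discount_rate) ** t
             for t, cf in enumerate(cash_flows, start=1))
    pv += terminal_value / (1 + discount_rate) ** len(cash_flows)
    return pv

# Five years of projected free cash flow ($m) plus a terminal value ($m):
value = dcf_value([100, 150, 220, 310, 420],
                  terminal_value=12_000, discount_rate=0.10)
print(f"${value:,.0f}m enterprise value")
```

Note how dominant the terminal value is in the total; for a high-growth company like this, the valuation is highly sensitive to the assumed long-run growth and discount rates.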

Disclosure: I am/we are long AYX. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.

Read more from the original source:
Alteryx: Intuitive Data Analytics Solution Driven By Industry Tailwinds - Seeking Alpha

US Bioeconomy Is Strong, But Faces Challenges; Expanded Efforts in Coordination, Talent, Security, and Fundamental Research Are Needed – National…

Jan. 14, 2020

WASHINGTON – The U.S. is a clear leader in the global bioeconomy landscape, but faces challenges from decentralized leadership, inadequate talent development, cybersecurity vulnerabilities, stagnant investment in fundamental research, and international competition, according to Safeguarding the Bioeconomy, a new report from the National Academies of Sciences, Engineering, and Medicine. As a major driver of scientific discoveries, spanning fields from agriculture to pharmaceuticals, a vulnerable bioeconomy puts the country's economy at risk. The report recommends steps the U.S. should take to mitigate these risks and sustain a strong bioeconomy, including forming a coordinating body within the Executive Office of the President to ensure coordination across the science, economic, regulatory, and security agencies.

While historical investments in the bioeconomy protect U.S. leadership in some areas, the report says maintaining that leadership will require careful analysis of policies and features that undergird the bioeconomy, continued commitment from the federal government to invest in science, and support for international cooperation and collaboration.

The bioeconomy is defined as economic activity that is driven by research and innovation in the life sciences and biotechnology, and that is enabled by technological advances in engineering and in computing and information sciences. An original analysis by the committee that authored the report values the bioeconomy at more than 5 percent of the gross domestic product, or more than $950 billion.

"Americans benefit every day from the bioeconomy in terms of the food we eat, the health care we receive, and the products we buy," said committee chair Thomas M. Connelly, executive director and CEO of the American Chemical Society. "With the right policies and coordination, we can keep the bioeconomy strong and maintain a competitive edge in the face of new threats and global challenges."

Establishing Bioeconomy Coordination

No one government agency currently has the mandate to monitor, assess, promote, or protect the bioeconomy. To make large-scale coordination possible, the report recommends the Executive Office of the President develop a government-wide strategic coordinating body to safeguard and realize the potential of the bioeconomy. This coordinating body, which should develop and update strategies for sustaining and growing the sector, should be presided over by senior White House leadership and include representation from science, economic, regulatory, and security agencies.

International Competition and Talent

The report says talent development at all levels should be a high priority for investment, and recommends attracting and retaining scientists from around the world. Inadequate funding for universities and training programs could diminish the country's ability to develop and retain a skilled technical workforce. The report emphasizes that policies created to counter risks from foreign talent could actually adversely affect the bioeconomy, as historically the U.S. has benefitted from the ability to attract and retain international students and scientists to its universities and STEM workforce. The report recommends that any policies to mitigate possible risks posed by foreign researchers should be formulated with input from leading scientists.

While the United States remains among the world's leaders in public investment in the biological sciences, federal investments have stagnated at a time when other countries, notably China and South Korea, are increasing theirs. To maintain competitiveness, the U.S. should place a high priority on investing in basic biological science, engineering, and computing and information sciences.

Global supply chains and reliance on single-sourced materials or components can disrupt bioeconomy value chains, the report says. The report also cites examples where non-domestic parties have invested in American companies with the goal of acquiring intellectual property. To guard against these risks, the report recommends, scientists and economists should work with the federal government to identify vital value chains and to assist the Committee on Foreign Investment in the United States in assessing the national security implications of foreign transactions involving the bioeconomy.

Differences in how nations approach regulation of bioeconomy products, data sharing agreements and practices, and industrial mergers and acquisitions also pose risks, the report says. Government agencies should facilitate bilateral and multilateral discussions among countries in order to drive economic growth, reinforce governance mechanisms, and create a level playing field.

Cybersecurity and Data Access

Inadequate cybersecurity practices and protections expose the bioeconomy to significant risks, because the sector is reliant on open source software, large and potentially sensitive data sets, and Internet communications. For example, the report notes, open source software that underpins many bioeconomy applications may not have been designed with bioeconomy security requirements in mind. The report says all stakeholders should adopt best practices for securing information systems from digital intrusion, exfiltration, or manipulation. To protect the value and utility of biological information databases, government agencies should invest in their modernization, curation, and integrity.

Investments in Fundamental Research

Limits to the U.S.'s ability to conduct fundamental research, whether through inadequate funding, restrictive research regulations, or the inability to develop and attract a skilled workforce, can erode our ability to produce breakthrough scientific results that fuel the bioeconomy. Falling behind in life science applications of computational and informational science is a particular risk. The report says erosion in support for U.S. government investment in fundamental research is a concern that must be addressed.

The study undertaken by the Committee on Safeguarding the Bioeconomy: Finding Strategies for Understanding, Evaluating, and Protecting the Bioeconomy While Sustaining Innovation and Growth was sponsored by the Office of the Director of National Intelligence. The National Academies are private, nonprofit institutions that provide independent, objective analysis and advice to the nation to solve complex problems and inform public policy decisions related to science, technology, and medicine. They operate under an 1863 congressional charter to the National Academy of Sciences, signed by President Lincoln. For more information, visit nationalacademies.org.

Resources: Interactive Highlights. Follow us: Twitter @theNASEM, Instagram @thenasem, Facebook @NationalAcademies. Follow the conversation using #bioeconomy

Contact: Megan Lowry, Media Relations Officer, Office of News and Public Information; 202-334-2138; e-mail news@nas.edu

Read the original here:
US Bioeconomy Is Strong, But Faces Challenges; Expanded Efforts in Coordination, Talent, Security, and Fundamental Research Are Needed - National...