MY TAKE: PKI, digital certificates now ready to take on the task of securing digital transformation – Security Boulevard

Just five years ago, the Public Key Infrastructure, or PKI, was seriously fraying at the edges and appeared to be tilting toward obsolescence. Things have since taken a turn for the better.

Related: Why PKI is well-suited to secure the Internet of Things

PKI is the authentication and encryption framework on which the Internet is built. The buckling of PKI a few years back was a very serious matter, especially since there was nothing waiting in the wings to replace it. Lacking a reliable way to authenticate identities during the data transfer process, and to keep data encrypted as it moves between endpoints, the Internet would surely atrophy and digital transformation would grind to a halt.

The retooling of PKI may not be sexy to anyone outside of tech geeks. Nonetheless, it is a pivotal chapter in the evolution of digital commerce. One of several notable contributors was DigiCert, the world's leading provider of digital certificates and certificate management solutions.

I had a chance to interview Brian Trzupek, DigiCert's senior vice president of emerging markets products, at the company's Security Summit 2020 in San Diego recently. For a full drill down on our discussion, please give the accompanying podcast a listen. Here are a few key takeaways:

PKI's expanding role

PKI revolves around the creation, distribution and management of digital certificates issued by companies known as certificate authorities, or CAs. In the classic case of a human user clicking to a website, CAs, like DigiCert, verify the authenticity of the website and encrypt the data at both ends.

Today, a much larger and rapidly expanding role for PKI and digital certificates is to authenticate devices and encrypt all sensitive data transfers inside highly dynamic company networks. We're not just talking about website clicks; PKI comes into play with respect to each of the millions of computing instances and devices continually connecting to each other: the stuff of DevOps and IoT. It can be as granular as a microservice in a software container connecting to a mobile app, for instance. Each one of these digital hookups requires PKI and a digital certificate to ensure authentication.
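To make that concrete, here is a minimal sketch of one such digital hookup: a client service authenticating a peer over mutually authenticated TLS, written with Python's standard ssl module. The hostname and certificate file paths are hypothetical, and a real deployment would automate issuance and rotation rather than reading static files:

```python
import socket
import ssl

# Hypothetical paths: a CA bundle that anchors trust, plus this
# service's own certificate and private key.
CA_BUNDLE = "ca.pem"
CLIENT_CERT = "service-client.pem"
CLIENT_KEY = "service-client.key"

# Build a TLS context that (a) verifies the peer's certificate chain
# against the CA and (b) presents our own certificate so the peer
# can authenticate us in return.
ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile=CA_BUNDLE)
ctx.load_cert_chain(certfile=CLIENT_CERT, keyfile=CLIENT_KEY)

with socket.create_connection(("api.internal.example", 8443)) as sock:
    # The handshake performs the authentication; every byte sent
    # afterward travels encrypted between the two endpoints.
    with ctx.wrap_socket(sock, server_hostname="api.internal.example") as tls:
        print(tls.version(), tls.getpeercert()["subject"])
```

The same pattern repeats at every hookup in the mesh, which is why issuing and rotating all of those certificates has to be automated at scale.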

Much like the Internet, PKI evolved somewhat haphazardly in the first two decades of this century to enable website activity, and it has come a long, long way since. PKI's core components derive from open source, corporate and entrepreneurial beginnings. By 2015 or so, the early pioneer PKI services companies had made their profits and had gotten themselves swallowed up by tech conglomerates in a wave of consolidation.

In late 2017, DigiCert announced it would acquire Symantec's PKI division for $1 billion. At the time, Symantec very much wanted out of having anything to do with PKI; Google had just announced plans to distrust all Symantec-issued certificates, after a long tussle with the security vendor over its failure to meet industry standards. DigiCert took the best of what Symantec had, combined it with tech that DigiCert did well, and worked feverishly to modernize PKI.


"Symantec just didn't spend a whole lot of time actually integrating those businesses," Trzupek told me. "They had acquired all of these PKI systems, order-entry systems, e-commerce systems, validation systems . . . it was like a million tiny freestanding companies, and we had to try to figure out how to consolidate all of that."

Platform challenges

A lot has transpired over the past two years. The CA/Browser Forum, an industry standards body founded in 2005, accelerated initiatives to drive better practices and guidelines. Outside of the CAB Forum, many industries, from healthcare to automotive to manufacturing, have created standards and implemented digital certificate protections through global PKI practices that strengthen device security.

Taken together, these efforts have brought a semblance of order to the topsy-turvy world of enterprise PKI. Companies had come to rely on a hodgepodge of systems to authenticate remote workers and contractors, while at the same time delving deeper into DevOps and pressing forward with wider use of IoT systems.

"What we saw across all of that was a platform problem," Trzupek says. "People were trying to use PKI and certificates in many different kinds of ways, and all of this was being jammed through very old legacy tools."

For its part, DigiCert responded by sending Trzupek on the road to visit 70 PKI customers in 12 nations and listen closely to what was on their minds. DigiCert used that feedback as the basis to design leading-edge PKI deployment and management tools and services, built on a flexible, scalable platform for speed and efficiency.

"The first step is to take a very manual inventory of what the parent company is doing with PKI, and what all of the sub-entities and subdivisions are doing with PKI, just figuring out who manages those projects and what PKI is being used for," Trzupek says. "Then there's an organizational component where you can consolidate management of PKIs and do things like standardizing tools."

Future use cases

Innovations to help companies more efficiently manage sprawling PKI deployments continue to advance, and none too soon. Large and mid-sized enterprises are stepping up their use of DevOps and embracing philosophies like "fail fast": the notion of quickly deploying minimally viable software to learn where it works or fails, and then iterating and remediating the shortcomings.

This is how dynamic services are getting spun up; such services are capable of scaling up to serve high-volume demand, cheaply and very quickly, and then winding down just as quickly. DigiCert is focusing on putting PKI at the nerve center of these types of scenarios, where short-lived certificates, with low latency and high availability, come into play.
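To illustrate what a short-lived certificate amounts to, here is a sketch that mints one with the open source Python cryptography package. This is a generic example, not DigiCert's API; the names and the 15-minute lifetime are invented for the sketch:

```python
import datetime

from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.x509.oid import NameOID

# Hypothetical issuer key; a real CA would keep this in an HSM,
# not generate it inline.
issuer_key = ec.generate_private_key(ec.SECP256R1())
issuer_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "demo-issuing-ca")])

# Key pair for the short-lived workload certificate.
leaf_key = ec.generate_private_key(ec.SECP256R1())

now = datetime.datetime.utcnow()
cert = (
    x509.CertificateBuilder()
    .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "scaling-worker-42")]))
    .issuer_name(issuer_name)
    .public_key(leaf_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    # Expires in 15 minutes: long enough for the instance's lifetime,
    # short enough that revocation barely matters.
    .not_valid_after(now + datetime.timedelta(minutes=15))
    .sign(issuer_key, hashes.SHA256())
)
print(cert.not_valid_after)
```

Because such a certificate expires on its own within minutes, a compromised instance simply ages out of the system, which is exactly the property a rapidly scaling service wants.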

"A lot of places need dynamic scale related to consumption, and they need that environment to be trusted, and that's where PKI comes in," Trzupek says. "As we look to the future, it's all about getting more dynamic so we can interoperate with that world and produce certificates as they need them."

It's encouraging that PKI is once again on solid footing; we're certainly going to need it going forward. "Data is the new oil," futurist and theoretical physicist Dr. Michio Kaku told attendees of DigiCert Security Summit 2020. Following the mainstreaming of steam power, then electricity and then the Internet, we are today on the brink of the fourth wave of giant technical leaps forward, observes Kaku, author of The Future of the Mind: The Scientific Quest to Understand, Enhance, and Empower the Mind.

Kaku argues that silicon chip-based computing has maxed out and will very soon be replaced by quantum computers, which manipulate atoms to make massive calculations. Quantum computers can rather easily break the strongest encryption we have today. The good news is that the tech community has factored this into long-term planning for the care, feeding and future viability of PKI.

A major public-private effort is underway to revamp classical cryptography and ultimately replace it with something called post-quantum cryptography, or PQC. DigiCert happens to be in the thick of this effort and has already begun offering strategies for companies to future-proof sensitive systems for the coming of quantum computing.

"Devices being put into service today, like cars and airplanes and IoT systems that have embedded sensors, have long-term life cycles," says Avesta Hojjati, DigiCert's head of research and development. "We're striving to protect those devices, right now, against threats that are coming in the next five to 10 years."

In an environment where "fail fast" is the philosophy ushering us into the quantum computing era, there is a huge role for robust, reliable and continually improving PKI. We appear to be on that path. I'll keep watch.


Pulitzer Prize-winning business journalist Byron V. Acohido is dedicated to fostering public awareness about how to make the Internet as private and secure as it ought to be.

(LW provides consulting services to the vendors we cover.)

*** This is a Security Bloggers Network syndicated blog from The Last Watchdog authored by bacohido. Read the original post at: https://www.lastwatchdog.com/my-take-pki-digital-certificates-now-ready-to-take-on-the-task-of-securing-digital-transformation/


SpeQtral, ITB and Kennlines Capital Group, Signs Memorandum of Understanding to Develop Quantum Secure Networks in Indonesia to thwart Eavesdroppers -…

Commencing with a signing ceremony of a Memorandum of Understanding on Thursday, 20th February 2020, in the Soesilo Soedarman Hall, Sapta Pesona Building, the three institutions officially started the collaboration. This ceremony focused on a commitment to foster awareness of the benefits of Quantum Communication Technology for broad telecommunications network security in Indonesia.

The seminar was attended by officials from the government, telecommunication companies, satellite-based companies, banks and academics from several universities. This seminar will be followed by a technical workshop involving experts in quantum communications and computing from ITB and SpeQtral, as well as government officials.

Quantum technologies make use of the ability to control and manipulate small objects governed by quantum physics, such as atoms, photons and electrons. The field includes quantum computing, quantum communications, and quantum sensing. Of these, quantum communications is the most advanced in terms of physical implementation and immediate application to secure communications. Quantum communications utilizes the quantum properties of photons to create a unique ability to detect eavesdroppers and to create quantum channels for secure communications.

Chune Yang Lum, Chief Executive Officer (CEO) of SpeQtral, said, "We believe that quantum communications have the power to transform the security of the world's networks." He also explained that one application of the new technology of quantum communication is Quantum Key Distribution (QKD), which can be used to securely distribute encryption keys that are safe from eavesdroppers.
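The eavesdropper-detection property can be shown with a toy simulation of BB84, the original QKD protocol. This is a pedagogical sketch, not SpeQtral's implementation: measuring a photon in the wrong basis randomizes it, so an interceptor leaves a statistical fingerprint that the two legitimate parties can spot by comparing a sample of their sifted key:

```python
import random

def bb84_error_rate(n_bits=20000, eavesdrop=False):
    """Toy BB84 round: Alice encodes random bits in random bases,
    Bob measures in random bases, and only matching-basis positions
    are kept (the sifted key). Returns that key's error rate."""
    alice_bits = [random.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [random.choice("+x") for _ in range(n_bits)]
    bob_bases = [random.choice("+x") for _ in range(n_bits)]

    errors = kept = 0
    for a_bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        bit, basis = a_bit, a_basis
        if eavesdrop:
            eve_basis = random.choice("+x")
            if eve_basis != basis:          # wrong-basis measurement...
                bit = random.randint(0, 1)  # ...randomizes the result
            basis = eve_basis               # photon re-sent in Eve's basis
        if b_basis != basis:                # Bob, too, can pick wrongly
            bit = random.randint(0, 1)
        if b_basis == a_basis:              # sifting: keep matching bases
            kept += 1
            errors += bit != a_bit
    return errors / kept

print(f"clean channel:    {bb84_error_rate():.1%} errors")                # ~0%
print(f"with interceptor: {bb84_error_rate(eavesdrop=True):.1%} errors")  # ~25%
```

If the sampled error rate jumps well above the channel's noise floor, Alice and Bob discard the key and start over; eavesdropping is detected rather than merely made difficult.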

Dean of SEEI-ITB, Prof. Dwi H. Widyantoro, added that QKD is a technology that can provide the best network security system for the protection of communication data. "Future networks need QKD to provide the best security. Fortunately, QKD systems are available today. QKD was invented thirty years ago and developed in universities and research labs. The technology is now finally ready to deliver in Indonesia."

Along with the advancing development of QKD technology throughout the world, Mirza Whibowo Soenarto, Chairman of the Kennlines Capital Group, said that this technology has the potential to be applied in the fields of government and defense, telecommunications, network security, banking and enterprise systems. Indonesia needs networks that meet the highest levels of security to minimize losses caused by unwanted parties, and QKD is a solution to these problems.

By drawing on SpeQtral's reputation as one of the companies that excel in the research of quantum communications technology and satellite telecommunications technology, this collaboration aims to widely introduce the technology and raise awareness of the benefits of quantum communications technology and telecommunications network security in Indonesia. "SpeQtral warmly welcomes the collaboration with ITB & Kennlines Capital Group and we hope that together, we can enhance the resilience of the communications infrastructure against future eavesdropping threats in Indonesia," Chune Yang said in the Quantum Information Seminars & Workshop 2020 press conference session.

Dwi also expressed enthusiasm for the collaboration itself. He revealed that through this collaboration, ITB could play a role as a center for research and development of Quantum Technology in Indonesia and even Southeast Asia. Mirza added, "The signing ceremony of the memorandum of understanding is expected to foster motivation for academics and the public in general to contribute to the development of Quantum Communication in Indonesia."

Most quantum development activity is being led by concerted state-sponsored efforts. But there are also commercial developments led by privately funded companies (including SpeQtral) and larger telecommunication and satellite operators. According to some independent research organizations, the quantum communications market is expected to top $5B in the next five years.


New report: Quantum Computing Market Size position and size report for 2019 to 2023 recently published – Instant Tech News

The Quantum Computing Market research report, now available at Brand Essence Research, encompasses an exhaustive study of this business space with regards to pivotal industry drivers, market share analysis, and the latest trends characterizing the Quantum Computing industry landscape. This report also covers details of market size, growth spectrum, and the competitive scenario of the Quantum Computing market in the forecast timeline.

This report on the Quantum Computing Market covers diverse topics such as regional market scope, product market applications, market size by specific product, sales and revenue by region, manufacturing cost analysis, industrial chain, market effect factors analysis, market size forecast, and more.

Request a Sample of this Report @ https://brandessenceresearch.biz/Request/Sample?ResearchPostId=153106&RequestType=Sample

Established key players in the market are:

D-Wave Systems Inc., QxBranch LLC, International Business Machines Corporation (IBM), Cambridge Quantum Computing Ltd, 1QB Information Technologies Inc., QC Ware Corp., MagiQ Technologies Inc., Station Q (Microsoft Corporation), Rigetti Computing, and Research at Google (Google Inc.)

Presenting an inherent outline of the competitive and geographical frames of reference pertaining to the Quantum Computing market:

Request Customization of this Report: https://brandessenceresearch.biz/Request/Sample?ResearchPostId=153106&RequestType=Methodology

Market segment by Type, the product can be split into

Type II-A, Type II

Market segment by Application, split into

Simulation, Optimization, Sampling

Market segment by Regions/Countries, this report covers

United States

Europe

China

Japan

Southeast Asia

India

Central & South America

The geographical spectrum of the business and its consequence on the Quantum Computing market:

More Details on this Report: https://brandessenceresearch.biz/Request/Sample?ResearchPostId=153106&RequestType=MarketShares

The report outlines the regulatory framework surrounding and governing numerous aspects of the market. At the end, it describes the competitive landscape of the Quantum Computing industry, the industry scenario, samples, and research conclusions. The analysis, incorporating data from 2014 to 2019 with forecasts through 2023, makes the report a helpful resource for industry officials, marketing and sales directors, experts, trade consultants, and others seeking key industry information, presented in clearly laid-out tables and charts.

Read More Report:

https://www.marketwatch.com/press-release/sperm-bank-market-size-industry-analysis-report-regional-outlook-application-development-potential-price-trends-competitive-market-share-forecast-20192025-2020-02-20

https://www.marketwatch.com/press-release/europe-regenerative-medicine-market-size-trends-by-top-manufacturers-cagr-status-demands-analysis-with-future-prospects-to-2025-2020-02-20

https://www.marketwatch.com/press-release/polyethylene-terephthalate-pet-market-size-share-revenue-business-growth-demand-and-application-market-research-report-to-2025-2020-02-20

About Us:

We publish market research reports & business insights produced by highly qualified and experienced industry analysts. Our research reports are available in a wide range of industry verticals including aviation, food & beverage, healthcare, ICT, construction, chemicals and many more. Brand Essence Market Research reports are best suited for senior executives, business development managers, marketing managers, consultants, CEOs, CIOs, COOs, directors, governments, agencies, organizations and Ph.D. students.

Contact Us:

https://brandessenceresearch.biz/

Brandessence Market Research & Consulting Pvt. Ltd.

Kemp House, 152-160 City Road, London EC1V 2NX

+44-2038074155



Quantum Computing Technologies Market: Industry Players Analysis, New Innovation, Growth Prospects, Size, Growth, Revenue, Development Policy,…

The Quantum Computing Technologies Market report provides an analysis of Quantum Computing Technologies industry share, development policy, size, growth, trends, regional outlook and a 2026 forecast. It also highlights the drivers, restraints, and opportunities of the market during the said period. The study provides a complete perspective on the evolution of the global Quantum Computing Technologies market throughout the above-mentioned forecast period in terms of revenue (US$ Bn).

Get Sample Copy of this Report https://prominentmarketresearch.com/sample-report/133980

The Quantum Computing Technologies Market report comprises a detailed value chain analysis, which provides a comprehensive view of the global market. A Porter's Five Forces analysis has also been included to help understand the competitive landscape in the Quantum Computing Technologies market.

The report also highlights opportunities and future scope in the Quantum Computing Technologies market at the global and regional level. The study encompasses market attractiveness analysis, wherein the service is benchmarked based on market size, growth rate, and general Quantum Computing Technologies industry share.

The key players covered in this study

Do You Have Any Query Or Specific Requirement? Ask to Our Industry Expert @ https://prominentmarketresearch.com/inquiry-report/133980

The study includes profiles of major companies operating in the global Quantum Computing Technologies market. Market players have been profiled in terms of attributes such as company overview, financial overview, business strategies, and recent developments.

Regional analysis is another highly comprehensive part of the research and analysis study of the global Quantum Computing Technologies market presented in the report. This section sheds light on the sales growth of different regional and country-level Quantum Computing Technologies markets.

The key regions and countries covered in this report are:

Market segment by Type, the product can be split into: Software, Hardware

Market segment by Application, split into: Government, Business, High-Tech, Banking & Securities, Manufacturing & Logistics, Insurance, Other

In order to compile the Quantum Computing Technologies market research report, we conducted in-depth interviews and discussions with a number of key industry participants and opinion leaders.

We reviewed key players' product literature, annual reports, press releases, and relevant documents for competitive analysis and Quantum Computing Technologies market understanding. Secondary research also included a search of recent trade publications, technical writing, internet sources, and statistical data from government websites, trade associations, and agencies.

This has proven to be the most reliable, effective, and successful approach for obtaining precise market data, capturing industry participants insights, and recognizing business opportunities.

Get Complete Report in Your Inbox @ https://prominentmarketresearch.com/checkout/133980

Quantum Computing Technologies Market Key Stakeholders:

Key Points from the Table of Contents:

1 Quantum Computing Technologies Market Overview

2 Global Quantum Computing Technologies Market Competition by Manufacturers

3 Global Quantum Computing Technologies Capacity, Production, Revenue (Value) by Region (2015-2020)

4 Global Quantum Computing Technologies Supply (Production), Consumption, Export, Import by Region (2015-2020)

5 Global Quantum Computing Technologies Production, Revenue (Value), Price Trend by Type

6 Global Quantum Computing Technologies Market Analysis by Application

7 Global Quantum Computing Technologies Manufacturers Profiles/Analysis

8 Quantum Computing Technologies Manufacturing Cost Analysis

9 Quantum Computing Technologies Industrial Chain, Sourcing Strategy and Downstream Buyers

10 Quantum Computing Technologies Marketing Strategy Analysis, Distributors/Traders

11 Quantum Computing Technologies Market Effect Factors Analysis

12 Global Quantum Computing Technologies Market Forecast (2020-2026)

13 Quantum Computing Technologies Market Research Findings and Conclusion

14 Appendix

If you need anything more than this, let us know and we will prepare the report according to your requirements.

About Us

Prominent Market Research offers extensive coverage of industry verticals with qualitative and quantitative reports across all industries. If your needs are not met by the syndicated reports offered by the foremost publishers, we can help by proposing a customized research solution, liaising with different research providers to save you valuable time and money. We have experienced and trained staff to help you navigate the different options and choose the best research solution at the most effective cost.

Contact Us

Michael, Sales Manager

Prominent Market Research

7309 Woodward Ave,

Apt 107, Woodridge, Illinois, USA, 60517

Phone: USA +1-630-361-6262



Global Deep Learning Chip Market (2019 to 2027) – Drivers, Restraints, Opportunities and Trends – ResearchAndMarkets.com – Business Wire

DUBLIN--(BUSINESS WIRE)--The "Deep Learning Chip Market to 2027 - Global Analysis and Forecasts By Chip Type; Technology; Industry Vertical" report has been added to ResearchAndMarkets.com's offering.

The global deep learning chip market accounted for US$ 2.04 Bn in 2018 and is expected to grow at a CAGR of 30.0% over the forecast period 2019-2027, to account for US$ 21.31 Bn in 2027.

The increasing investments in deep learning chip start-ups, the prominence of quantum computing, and real-time consumer behavior insights and increased operational efficiency are a few of the factors driving the deep learning chip market worldwide. However, a lack of infrastructure and technology know-how in third-world countries and a dearth of skilled workforce may restrain the future growth of the market. Despite these limitations, the rising adoption of cloud-based computing across industries is anticipated to offer ample growth opportunities for the players operating in the deep learning chip market during the forecast period.

The market for deep learning chips has been segmented on the basis of chip type, technology, industry vertical, and geography. By chip type, the deep learning chip market is led by the GPU segment, which is expected to continue its dominance in the forecast period. By technology, the market is segmented into system-on-chip, system-in-package, multi-chip module, and others.

The System-on-Chip technology led the deep learning chip market and is anticipated to continue its dominance during the forecast period. The market for deep learning chips by industry vertical is further segmented into media & advertising, BFSI, IT & telecom, retail, healthcare, automotive & transportation, and others. The BFSI sector held the lion's share in 2018 and is expected to continue its dominance till 2027.

Reasons to Buy

Key Topics Covered:

1. Introduction

2. Key Takeaways

3. Research Methodology

4. Deep Learning Chip Market Landscape

4.1 Market Overview

4.2 PEST Analysis

4.3 Ecosystem Analysis

4.4 Expert Opinions

5. Deep Learning Chip Market - Global Market Analysis

5.1 Global Deep Learning Chip Market Overview

5.2 Global Deep Learning Chip Market Forecast and Analysis

5.3 Market Positioning- Top Five Players

6. Deep Learning Chip market - Key Industry Dynamics

6.1 Key Market Drivers

6.1.1 Increasing investments in deep learning chip start-ups

6.1.2 Prominence of Quantum Computing

6.1.3 Real time consumer behaviour insights and increased operational efficiency

6.2 Key Market Restraints

6.2.1 Dearth of skilled workforce

6.2.2 Lack of infrastructure and technology know-how in third world countries

6.3 Key Market Opportunities

6.3.1 Rising adoption of cloud-based computing across industries

6.3.2 Adoption of deep learning chips in edge devices is expected to boom in the forecast period

6.4 Future Trends

6.4.1 ASICs and application-specific custom/hybrid deep learning chips will be the future of deep learning chip market

6.5 impact analysis of Drivers and restraints

7. Deep Learning Chip Market Analysis - By Chip Type

7.1 Overview

7.2 Deep Learning Chip Market Breakdown, By Chip Type, 2018 & 2027

7.3 GPU

7.4 ASIC

7.5 FPGA

7.6 CPU

7.7 Others

8. Deep Learning Chip Revenue and Forecasts to 2027 - Technology

8.1 Overview

8.2 Deep Learning Chip Market Breakdown, By Technology, 2018 & 2027

8.3 System-on-Chip

8.4 System-in-Package

8.5 Multi-Chip Module

8.6 Others

9. Deep Learning Chip Market Analysis - By Industry Vertical

9.1 Overview

9.2 Deep Learning Chip Market Breakdown, By Industry Vertical, 2018 & 2027

9.3 Media & Advertising

9.4 BFSI

9.5 IT & Telecom

9.6 Retail

9.7 Healthcare

9.8 Automotive & Transportation

9.9 Others

10. Deep Learning Chip Market - Geographic Analysis

10.1 Overview

10.2 North America Deep learning chip Market Revenue and Forecast to 2027

10.3 Europe Deep Learning Chip Market Revenue and Forecast to 2027

10.4 APAC Deep Learning Chip Market Revenue and Forecasts to 2027

10.5 Middle East and Africa Deep Learning Chip Market Revenue and Forecasts to 2027

10.6 South America Deep Learning Chip Market Revenue and Forecasts to 2027

11. Industry Landscape

11.1 Overview

11.2 Market Initiative

11.3 Merger and Acquisition

11.4 New Development

12. Deep Learning Chip Market - Company Profiles

12.1 Advanced Micro Devices, Inc.

12.1.1 Key Facts

12.1.2 Business Description

12.1.3 Products and Services

12.1.4 Financial Overview

12.1.5 SWOT Analysis

12.1.6 Key Developments

12.2 Alphabet Inc. (Google)

12.3 Amazon.com, Inc.

12.4 Baidu, Inc.

12.5 Huawei Technologies Co., Ltd.

12.6 Intel Corporation

12.7 NVIDIA Corporation

12.8 Qualcomm Incorporated

12.9 Samsung electronics Co., Ltd.

12.10 Xilinx, Inc.

For more information about this report visit https://www.researchandmarkets.com/r/oe4ufz


The Linux Foundation reveals the most commonly used open-source software components – SDTimes.com

The Linux Foundation is addressing structural and security complexities in today's modern software supply chains with the release of "Vulnerabilities in the Core," a preliminary report and Census II of open-source software.

The report was put together by the Linux Foundation's Core Infrastructure Initiative and the Laboratory for Innovation Science at Harvard (LISH).

RELATED CONTENT: Report: The benefits of open-source software go beyond cost | The realities of running an open-source community

"The Census II report addresses some of the most important questions facing us as we try to understand the complexity and interdependence among open source software packages and components in the global supply chain," said Jim Zemlin, executive director at the Linux Foundation. "The report begins to give us an inventory of the most important shared software and potential vulnerabilities and is the first step to understand more about these projects so that we can create tools and standards that result in trust and transparency in software."

Based on the foundation's and the lab's analysis, the team identified the following ten packages as the most used free and open-source software packages.

The report also details the most commonly used non-JavaScript packages, which include:

Based on these packages, the researchers were able to identify some common problems. For instance, they found that the naming schemas for software components were unique, individual and inconsistent. "The effort required to untangle and merge these datasets slowed progress on the current project significantly. Despite the considerable effort that went into creating the framework to produce these initial results for Census II, the challenge of applying it to other data sets with even more varied formats and naming standards still remains," the report stated.
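To see why that merging is hard, consider how one component can be named across ecosystems. The variants below are invented for illustration (they are not entries from the Census II data), and the normalization is deliberately crude:

```python
import re

# Hypothetical spellings of one component: Maven coordinates, a bare
# package name, and an inventory entry with an embedded version.
raw_names = [
    "org.apache.httpcomponents:httpclient",
    "httpclient",
    "HttpClient-4.5.13",
]

def normalize(name: str) -> str:
    """Reduce a component name to a crude canonical key by dropping
    group prefixes, trailing versions, case, and punctuation."""
    name = name.split(":")[-1]                     # drop Maven group id
    name = re.sub(r"[_-]?\d[\d.]*$", "", name)     # drop trailing version
    return re.sub(r"[^a-z0-9]", "", name.lower())  # fold case/punctuation

# All three variants collapse to one key, so their usage counts can
# be merged; real-world data needs far more rules than this.
print({normalize(n) for n in raw_names})  # {'httpclient'}
```

Multiply that by every packaging ecosystem and vendor inventory format, and the report's complaint about inconsistent naming schemas becomes a substantial engineering problem in its own right.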

"Open source is an undeniable and critical part of today's economy, providing the underpinnings for most of our global commerce. Hundreds of thousands of open source software packages are in production applications throughout the supply chain, so understanding what we need to be assessing for vulnerabilities is the first step for ensuring long-term security and sustainability of open source software," said Zemlin.

Additionally, the report points to the increasing importance of individual developer account security. A majority of top packages were found to be hosted under individual accounts, which can mean they are more vulnerable to attack.

Lastly, the researchers found a persistence of legacy software in the open source space. According to them, this can lead to compatibility problems, as well as financial and time-related costs.

"FOSS was long seen as the domain of hobbyists and tinkerers. However, it has now become an integral component of the modern economy and is a fundamental building block of everyday technologies like smartphones, cars, the Internet of Things, and numerous pieces of critical infrastructure," said Frank Nagle, a professor at Harvard Business School and co-director of the Census II project. "Understanding which components are most widely used and most vulnerable will allow us to help ensure the continued health of the ecosystem and the digital economy."

In order to determine the top packages and projects, the foundation worked with software composition analysis and app security companies like Snyk and Synopsys.

"Considering the ubiquity of open source software and the essential role it plays in the technology powering our world, it is more important than ever that we take a collaborative approach to maintain the long-term health of the most foundational open source components," said Tim Mackey, principal security strategist for the Synopsys Cybersecurity Research Center. "Identifying the most pervasive FOSS components in commercial software ecosystems, combined with a clear understanding of both their security posture and the communities who maintain them, is a critical first step. Beyond that, commercial organizations can do their part by conducting internal reviews of their open source usage and actively engaging with the appropriate open source communities to ensure the security and longevity of the components they depend on."


Open source growing within innovative companies – JAXenter

Red Hat has been at the forefront of the global open source discussion, fighting for software freedom in the U.S. Supreme Court and offering free tech products for cloud infrastructure, automation, AI, and much more. After conducting research and interviewing IT leaders from around the world, Red Hat released a report examining the state of enterprise open source in 2020.

950 IT leaders, unaware that Red Hat was the research sponsor, were surveyed about their practices and opinions on enterprise open source software.

Notably, 95% of IT leaders agree that enterprise open source software is important.

Let's review some of the statistics revealed in Red Hat's study.

SEE ALSO: Leading your team of young developers: 5 tips for helping them grow within their careers

With the rise of open source software in both private programming and the enterprise, proprietary software is declining. In two years, usage of proprietary software has dropped across the board, and IT decision-makers expect that this trend will continue in the next two years.

In 2020, just 42% of software used in the enterprise is proprietary, compared to 55% in 2019. IT-leaders anticipate that in two years, this number will decrease even further.

According to the report:

"Maybe it doesn't surprise you that proprietary software is losing favor: expensive and inflexible proprietary software licenses result in high capital expenditures (CapEx) and vendor lock-in. However, the rate at which organizations are abandoning proprietary software is notable, especially given how slowly change usually comes to the enterprise software space. Remarkably, enterprise open source is expected to rise from 36% to 44% over the next two years."

The benefits of using open source software seem obvious, namely that it is, of course, freely available. However, the lack of a price tag isn't the main thing that IT leaders love.

According to the survey, respondents believe that higher-quality software is the number one benefit of enterprise open source. FOSS is often better than proprietary options, with better security, cloud-native technologies, and cutting-edge solutions.

What important areas are businesses using FOSS for in 2020?

In particular, teams are using open source software to modernize their infrastructure, in application development, and in DevOps.

Despite the rise in usage and amount of solutions it provides, there are still some barriers and concerns about using open source code in the enterprise. While IT leaders agree that innovative companies turn to FOSS, some perceived concerns remain.

Even though security was listed as a top benefit, 38% of respondents cite that code security is a top barrier to using enterprise open source solutions and technologies.

SEE ALSO: DevOps report card: Security must be part of the software delivery cycle

What does the future hold for FOSS? If the past is any indication, it will continue to maintain its hold. Back in 2008, Linus Torvalds said in an interview with InformationWeek:

"I think that Open Source can do better, and I'm willing to put my money where my mouth is by working on Open Source, but it's not a crusade; it's just a superior way of working together and generating code. It's superior because it's a lot more fun and because it makes cooperation much easier (no silly NDAs or artificial barriers to innovation like in a proprietary setting), and I think Open Source is the right thing to do, the same way I believe science is better than alchemy."

Read the full report from Red Hat for all of the insights and commentary.


Databricks CEO: Managing open source in the cloud is hard – ComputerWeekly.com

This is a guest post for Computer Weekly Open Source Insider written by Ali Ghodsi in his capacity as co-founder and CEO at data science, big data processing and machine learning company Databricks.

Databricks was founded by the creators of Apache Spark. The company was founded to provide an alternative to the MapReduce system and provides a just-in-time cloud-based platform for big data processing clients.

The company's technology is used by developers, data scientists and analysts to integrate the fields of data science, engineering and the business behind them across the machine learning lifecycle.

Ghodsi writes as follows:

The open source community around the world is continuing to grow. According to an Octoverse report, over 1.3 million first time contributors joined the open source community in 2019. Plus, over 3.6 million software code repositories depend on each of the top 50 open source projects.

So although open source communities are thriving, how are the open source-powered software vendors faring?

Despite initial skepticism, today investors and end users see the value and potential of open source software, leading to a whole list of open source companies receiving billions in funding every year.

However, now that the open source business model has proven its worth, new challenges have become apparent.

But there's a red herring out there, i.e. the suggestion that public cloud vendors are hurting open source software.

There is a growing perception that cloud vendors are exploiting open source without giving anything back and that open source vendors are hitting back by changing licenses.

But the real issue is that it's extremely hard to manage and run a high-quality managed service in the cloud, and not all open source companies are good at it.

Red Hat enjoyed huge success by becoming the prime open source enterprise vendor at a time where on-premise was the only deployment method for businesses. However, the on-premise paradigm was fundamentally different from the SaaS paradigm. In the former, most of the value of the vendor came from support, training, and services.

They were heavily reliant on human expertise and services, which came with higher churn and lower upsells because these vendors could easily be replaced independently of the software. In contrast, the SaaS open source business model requires the SaaS vendor to be responsible for a host of additional things, like providing security and reliability guarantees, and automatic software upgrades.

The two business models are very different, as both the strategic relevance and the level of engineering required by a SaaS-based model are much higher.

Today, as cloud adoption continues to take off, open source vendors are realising that they need to shift to having a SaaS-based offering. But they know cloud vendors are naturally better at operating cloud hosted software. They perceive this as a threat and thus might attempt to block the cloud vendors out by changing the licensing terms.

The reality is open source software itself has zero intrinsic monetisation value because anyone can use it, so there will always be a requirement for open source vendors to determine the value beyond the software.

We believe this value lies in the vendor's ability to deliver open source software as a service. The cloud is an inevitability these vendors will need to embrace, and they will need to prove their performance at scale to cope with the increase in demand for edge computing. Instead of wasting time on pushing cloud providers away, they should be focused on building great SaaS offerings.

Limiting the license of their software will just lead to less adoption and community-driven innovation around those open source projects, which poses a far bigger existential threat to their business.

In recent years, Microsoft, Google and AWS have been very engaged with open source communities, and the positive approach from the world's biggest tech firms is a marked change from how they behaved in the past. In spring last year, Google announced seven partnerships with open source vendors, a landmark statement that open source has arrived for enterprises. Microsoft, fuelled by its strategic mandate, has also hand-picked open source companies to keep innovation vibrant.

Microsoft and Google's pioneering partnership approach is the benchmark for how the big tech giants can help open source tech companies. They are treating them as partners rather than third-party providers by directly integrating them on their cloud platforms and providing clarity in billing and support, all on one interface.

For example, one of the fastest-growing and most broadly used AI and data services on Azure is Azure Databricks, a service provided through a deep partnership between an open source vendor and Microsoft. Today, customers process over two exabytes per month on Azure Databricks with millions of server-hours spinning up every day.

The next frontier will be centred around how open source businesses handle the data they capture or create, its value, and the ecosystem built around it. We're only at the beginning of this, and it's exciting to see the emergence of economies forming around data itself.

Databricks CEO Ghodsi: "It's extremely hard to manage and run a high quality managed service in the cloud."


‘Community-based’ Open Source on the Rise – EnterpriseAI


As more enterprises embrace open source software for applications ranging from security and cloud management to databases and analytics, the steady shift away from proprietary software is coalescing around a community-based open source movement.

According to an annual snapshot on the state of enterprise open source tools released by open source leader Red Hat, expensive proprietary software licenses and fear of vendor lock-in are driving the enterprise embrace of open source code.

As more hyper-scalers contribute code to cloud management and other projects, the Red Hat survey estimates that community-based open source software usage will reach 21 percent of companies surveyed by 2022.

For instance, frequent code contributor Google (NASDAQ: GOOGL) released a stable version of a programming language called Dart in December 2019. Less than two months later, a recent survey of popular search queries found the object-oriented tool for running applications on multiple platforms was among the fastest growing programming languages.

At the same time, the rise of open source development centered around key stakeholders has raised concerns that the very same vendors of proprietary software are acting with what observers call enlightened self-interest to gain a foothold in burgeoning open source communities.

Microsoft's 2018 acquisition of the GitHub collaboration platform, for example, raised immediate concerns about the future direction of open source development on GitHub, which at the time of the $7.5 billion acquisition was used by more than 28 million developers. For the most part, Microsoft (NASDAQ: MSFT) has so far made good on its pledge to maintain GitHub's developer-first ethos.

Microsoft and other hyper-scalers have gravitated toward open source tools as community-based efforts have improved the quality and security of code. Indeed, the Red Hat survey found that more than half of respondents cited security and cloud management as the key reasons for shifting to open source tools. Along with cost, open source adopters also cited the growing number of cloud-native projects and the ability to safely leverage open source tech.

Infrastructure modernization was the leading use case for open source (60 percent), followed by application development (53 percent) and DevOps (52 percent). Those modernization efforts include the greater use of micro-services running on multi-cloud deployments. That enterprise trend has been driven by open source orchestration tools like the de facto standard Kubernetes platform, originally developed by Google.

Indeed, micro-services such as application containers and other cloud-native technologies are driving the enterprise shift to open source. The survey found that 56 percent of those polled expected to increase use of containers over the next 12 months.

While security was the top use case for open source, survey respondents also cited lingering concerns about code security as the leading barrier to open source adoption. Across the four geographic regions covered by the open source survey, an average of 38 percent of respondents cited code security as a concern.

Red Hat's response? "Security might refer to a belief that the availability of source code makes software more susceptible to attacks, although that's rarely the way in which vulnerabilities are exploited."

"The results indicate a market environment driven by collaborative innovation," said Red Hat CEO James Whitehurst, who was named president of parent company IBM (NYSE: IBM) in a management shakeup announced at the end of January.

Red Hat's enterprise open source study is based on interviews with 950 IT executives, including 400 in the U.S.


About the author: George Leopold

George Leopold has written about science and technology for more than 30 years, focusing on electronics and aerospace technology. He previously served as executive editor of Electronic Engineering Times. Leopold is the author of "Calculated Risk: The Supersonic Life and Times of Gus Grissom" (Purdue University Press, 2016).


Supporting an open source operating system: a Q&A with the FreeBSD Foundation – Techradar

When discussing alternative operating systems to Microsoft's Windows or Apple's macOS, Linux often comes to mind. However, while Linux is a recreation of UNIX, FreeBSD is more of a continuation. The free and open source operating system was initially developed by students at the University of California at Berkeley, which is why the BSD in its name stands for Berkeley Software Distribution.

FreeBSD runs on its own kernel, and all of the operating system's key components have been developed to be part of a single whole. This is where it differs the most from Linux, because Linux is just the kernel and the other components are supplied by third parties.

To learn more about FreeBSD and its ongoing development, TechRadar Pro spoke to the executive director of the FreeBSD Foundation, Deb Goodkin.

What excites me the most is getting a large financial contribution from a commercial user. Not only does it help us continue the work we are doing, but it also validates the work the community and the Foundation are doing.

Besides that, other aspects that excite me are working with this community, watching people grow within the community, advocating for FreeBSD, working with my team on developing new/improved ways to help the Project and community, and being able to constantly learn and grow in my job.

The FreeBSD Foundation's purpose is to support the FreeBSD Project. While we're an entirely separate entity, we step in to fill critical needs of the project. To support the development of FreeBSD, we have software developers on staff who can quickly step in to fix bugs, implement workarounds to hardware issues, and implement new features and functionality. They also review many of the software changes, providing constructive feedback to continuously help improve the code. In addition, we provide the FreeBSD infrastructure that is hosted around the world and provide staff to oversee continuous integration and quality assurance efforts, to improve testing and code coverage.

I'd be remiss if I didn't include our advocacy efforts in supporting development of FreeBSD. We attend technical conferences around the world, giving FreeBSD talks and workshops, to recruit more users and contributors to the Project. Increasing the number of contributors allows more people to step in to help in various areas of the Project, and this will help us with long-term sustainability. New users help test FreeBSD, with their various use cases, helping to identify issues or providing their input on how to improve FreeBSD.

We are 100% dependent on donations to support these efforts, so we are constantly reaching out to commercial users and the community to give a financial contribution.

I wouldn't refer to FreeBSD as the antithesis of Linux, since we have similarities and are both Unix-like. But the similarities do lead people to think that FreeBSD is a Linux distribution. However, that is not the case. FreeBSD is descended from the Unix developed at the University of California, Berkeley in the 1970s. Linux, on the other hand, was built as an open source alternative to UNIX. The similarities do make it easier for Linux developers to get involved with FreeBSD.

Currently there are over 400 active developers and thousands of contributors. FreeBSD works on 32- and 64-bit Intel / AMD x86, 32- and 64-bit Arm, RISC-V, PowerPC, Sparc64, and MIPS CPUs, and cloud providers like AWS, Azure, and GCP. There are tens of millions of deployed systems.

As with other BSDs, the FreeBSD base system is an integrated operating system distribution that is developed and released as a cohesive whole by a single team, which is in contrast to the Linux approach of distributions picking up the kernel from one source, the C library from another, the userland tools from another and so on.

FreeBSD operates on the Principle of Least Astonishment. In other words, don't break things that work. Because the OS doesn't change without good reason, if you are basing your code or product on it, you don't have to constantly catch up every time there is a new OS release. It also makes upgrading relatively painless. The licensing model is probably the biggest difference between the two. Linux is under the GNU General Public License (GPL), meaning in part that any derivative work of a product released under the GPL must also be supplied with source code if requested. FreeBSD, on the other hand, is under the copy-free BSD license. It's less restrictive: binary-only distributions are allowed and are particularly useful for embedded platforms.

FreeBSD doesn't include a GUI in the initial install because it follows the philosophy of starting out with only what you need to develop on FreeBSD. Since FreeBSD offers many GUIs through its ports and packages collection, this allows the user to select the one they want to use.

macOS uses a significant amount of FreeBSD in its kernel and userland. More specifically, they use the networking stack and a fair bit of userland, like libraries and utilities. For example, most of their command line is FreeBSD.

Some of the software development efforts going on include improving performance and scalability, increasing hardware support, adding OpenZFS RAID-Z expansion functionality, improving graphics and desktop support, improving OpenJDK on FreeBSD, and improving wifi support. In addition, there is exciting news coming out of the University of Cambridge with CHERI (Capability Hardware Enhanced RISC Instructions), a collaborative effort with Arm to create a CHERI/Arm processor (you can find out more here).

Other plans include increasing our advocacy efforts, by increasing the FreeBSD workshops and FreeBSD talks around the globe.

There's no limitation on who should consider using FreeBSD! It is perfect for someone who cares about rock-solid stability and high performance. It has ZFS for protecting your data. FreeBSD has a community that is friendly, helpful, and approachable, and it provides excellent documentation to easily find answers. There are over 30,000 open source software packages that are easy to install, allowing you to easily set up your environment without a lot of extras, and that includes many choices of popular GUIs. Finally, our philosophy of "don't break things that work" is very appealing.
