Quantum mechanics is immune to the butterfly effect – The Economist

That could help with the design of quantum computers

Aug 15th 2020

In Ray Bradbury's science-fiction story "A Sound of Thunder", a character time-travels far into the past and inadvertently crushes a butterfly underfoot. The consequences of that minuscule change ripple through reality such that, upon the time-traveller's return, the present has been dramatically changed.

The butterfly effect describes the high sensitivity of many systems to tiny changes in their starting conditions. But while it is a feature of classical physics, it has been unclear whether it also applies to quantum mechanics, which governs the interactions of tiny objects like atoms and fundamental particles. Bin Yan and Nikolai Sinitsyn, a pair of physicists at Los Alamos National Laboratory, decided to find out. As they report in Physical Review Letters, quantum-mechanical systems seem to be more resilient than classical ones. Strangely, they seem to have the capacity to repair damage done in the past as time unfolds.
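
The classical sensitivity in question is easy to demonstrate numerically. Below is a minimal sketch (my illustration, not from the paper) using the chaotic logistic map, a textbook classical system: two trajectories started a millionth apart end up completely uncorrelated.

```python
# Chaotic logistic map x -> 4x(1 - x): a classical system with extreme
# sensitivity to initial conditions, i.e. the butterfly effect.

def trajectory(x0, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(4.0 * xs[-1] * (1.0 - xs[-1]))
    return xs

a = trajectory(0.400000, 50)
b = trajectory(0.400001, 50)   # initial condition nudged by one part in a million

# Early on the trajectories agree; after enough iterations they decorrelate.
print(abs(a[5] - b[5]))                                 # still tiny
print(max(abs(x - y) for x, y in zip(a[30:], b[30:])))  # typically order one
```

The tiny initial difference is amplified roughly exponentially at each step, which is exactly the behaviour the quantum experiment below fails to reproduce.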

To perform their experiment, Drs Yan and Sinitsyn ran simulations on a small quantum computer made by IBM. They constructed a simple quantum system consisting of qubits, the quantum analogue of the familiar one-or-zero bits used by classical computers. Like an ordinary bit, a qubit can be either one or zero. But it can also exist in superposition, a chimerical mix of both states at once.
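
A qubit's state can be written as a normalised pair of complex amplitudes; here is a hand-rolled illustration (the names are mine, not IBM's API) of the one-or-zero bit versus an equal superposition:

```python
import math

# A qubit is a normalised pair of amplitudes (a, b): measurement returns
# 0 with probability |a|^2 and 1 with probability |b|^2 (the Born rule).

zero = (1.0, 0.0)                            # definitely 0, like a classical bit
one = (0.0, 1.0)                             # definitely 1
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))  # equal superposition of both

def outcome_probabilities(state):
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

print(outcome_probabilities(zero))   # (1.0, 0.0)
print(outcome_probabilities(plus))   # approximately (0.5, 0.5)
```

The superposition state reads zero half the time and one half the time, which is the "chimerical mix" the article describes.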

Having established the system, the authors prepared a particular qubit by setting its state to zero. That qubit was then allowed to interact with the others in a process called quantum scrambling which, in this case, mimics the effect of evolving a quantum system backwards in time. Once this virtual foray into the past was completed, the authors disturbed the chosen qubit, destroying its local information and its correlations with the other qubits. Finally, the authors performed a reversed scrambling process on the now-damaged system. This was analogous to running the quantum system all the way forwards in time to where it all began.

They then checked to see how similar the final state of the chosen qubit was to the zero-state it had been assigned at the beginning of the experiment. The classical butterfly effect suggests that the researchers' meddling should have changed it quite drastically. In the event, the qubit's original state had been almost entirely recovered. Its state was not quite zero, but it was, in quantum-mechanical terms, 98.3% of the way there, a difference that was deemed insignificant. "The final output state after the forward evolution is essentially the same as the input state before backward evolution," says Dr Sinitsyn. "It can be viewed as the same input state plus some small background noise." Oddest of all was the fact that the further back in simulated time the damage was done, the greater the rate of recovery, as if the quantum system was repairing itself with time.
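
The protocol can be caricatured in a few lines of linear algebra. The toy below is my construction, with a Haar-random matrix standing in for the scrambling dynamics; it is not the authors' IBM circuit and will not reproduce their 98.3% figure. It shows the qualitative effect: with no damage the qubit returns to zero exactly, and even after the qubit is disturbed in the "past", its probability of reading zero stays well above the 50% a fully randomised qubit would give.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 8                        # total qubits; qubit 0 is the one we protect
dim = 2 ** n

# Haar-random unitary as a stand-in for scrambling dynamics.
z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
q, r = np.linalg.qr(z)
u = q * (np.diag(r) / np.abs(np.diag(r)))  # phase fix -> Haar measure

psi0 = np.zeros(dim, complex)
psi0[0] = 1.0                # |00...0>: qubit 0 prepared in state zero

def prob_zero(psi):
    """Probability that measuring qubit 0 yields 0."""
    m = psi.reshape(2, -1)   # first axis is qubit 0
    return float(np.sum(np.abs(m[0]) ** 2))

# Backward evolution ("into the past"), damage qubit 0, forward evolution.
past = u.conj().T @ psi0
undamaged = u @ past                       # no damage: recovers |00...0>

damaged = past.reshape(2, -1).copy()
damaged[1, :] = 0.0                        # measure qubit 0 in the past, outcome 0
damaged = damaged.reshape(dim)
damaged /= np.linalg.norm(damaged)
recovered = u @ damaged

print(prob_zero(undamaged))   # 1.0 (up to rounding)
print(prob_zero(recovered))   # well above the 0.5 of a random state
```

The exact recovered value depends on the scrambler and on how the damage is modelled; the point is only that the local information is not wiped out, as classical intuition would suggest.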

The mechanism behind all this is known as entanglement. As quantum objects interact, their states become highly correlated, or entangled, in a way that serves to diffuse localised information about the state of one quantum object across the system as a whole. Damage to one part of the system does not destroy information in the same way as it would with a classical system. Instead of losing your work when your laptop crashes, having a highly entangled system is a bit like having back-ups stashed in every room of the house. Even though the information held in the disturbed qubit is lost, its links with the other qubits in the system can act to restore it.

The upshot is that the butterfly effect seems not to apply to quantum systems. Besides making life safe for tiny time-travellers, that may have implications for quantum computing, too, a field into which companies and countries are investing billions of dollars. "We think of quantum systems, especially in quantum computing, as very fragile," says Natalia Ares, a physicist at the University of Oxford. That this result demonstrates that quantum systems can in fact be unexpectedly robust is an encouraging finding, and bodes well for potential future advances in the field.

This article appeared in the Science & technology section of the print edition under the headline "A flutter in time"

6 new degrees approved, including graduate degrees in biostatistics and quantum information science: News at IU – IU Newsroom

The Indiana University Board of Trustees has approved six new degrees, four of which are graduate level.

All of the new graduate degrees are on the Bloomington campus.

Also approved were a Bachelor of Arts in theater, film and television at IUPUI and a Bachelor of Science in accounting at IU East.

The master's and doctoral degrees in biostatistics are offered by the Department of Epidemiology and Biostatistics in the School of Public Health-Bloomington. They will focus on rural public health issues and specialized areas in public health research, such as the opioid epidemic.

Biostatistics is considered a high-demand job field. Both degrees are intended to meet the labor market and educational and research needs of the state, which is trying to reduce negative health outcomes. Biostatisticians typically are hired by state and local health departments, federal government agencies, medical centers, medical device companies and pharmaceutical companies, among others.

The Master of Science in quantum information science will involve an intensive, one-year, multidisciplinary program with tracks that tie into physics, chemistry, mathematics, computer science, engineering and business. It's offered through the Office of Multidisciplinary Graduate Programs in the University Graduate School. The degree was proposed by the College of Arts and Sciences, the Luddy School of Informatics, Computing and Engineering, and the Kelley School of Business.

Most of the faculty who will teach the classes are members of the newly established IU Quantum Science and Engineering Center.

Students who earn the Master of Science in quantum information science can pursue careers with computer and software companies that are active with quantum computation, and national labs involved in quantum information science, among other opportunities.

The Master of International Affairs is a joint degree offered by the O'Neill School of Public and Environmental Affairs and the Hamilton-Lugar School of Global and International Studies. The degree is the first of its kind offered by any IU campus and meets student demand for professional master's programs with an international focus.

Featured components of the degree include the study of international relations and public administration. Graduates can expect to find employment in the federal government, such as the Department of State, the Department of Treasury or the U.S. intelligence community, or with private-sector firms in fields such as high-tech, global trade and finance.

The Bachelor of Arts in theater, film and television combines existing programs and provides them a more visible home in the School of Liberal Arts at IUPUI. The degree features three distinct concentrations:

Applied theater is a growing field that works with organizations around issues of social justice, social change, diversity and inclusion.

IU East's Bachelor of Science in accounting degree, offered through the School of Business and Economics, helps meet projected high demand in the accounting industry. It also will prepare students to take the certified public accountant or certified managerial accountant exams, or enter graduate programs in accounting or business.

Quantum Computing Market Size by Top Companies, Regions, Types and Application, End Users and Forecast to 2027 – Bulletin Line

New Jersey, United States - Verified Market Research has recently published an extensive report on the Quantum Computing Market to its ever-expanding research database. The report provides an in-depth analysis of the market size, growth, and share of the Quantum Computing Market and the leading companies associated with it. The report also discusses technologies, product developments, key trends, market drivers and restraints, challenges, and opportunities. It provides a forecast until 2027. The research report is examined and validated by industry professionals and experts.

The report also explores the impact of the COVID-19 pandemic on the segments of the Quantum Computing market and its global scenario. The report analyzes the changing dynamics of the market owing to the pandemic and subsequent regulatory policies and social restrictions. The report also analyses the present and future impact of the pandemic and provides an insight into the post-COVID-19 scenario of the market.

Quantum Computing Market was valued at USD 193.68 million in 2019 and is projected to reach USD 1379.67 million by 2027, growing at a CAGR of 30.02% from 2020 to 2027.
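
As a sanity check on figures like these, the implied compound annual growth rate follows from the standard formula; here is a quick sketch (the number of compounding years is an assumption, since the report does not state how its base year is counted):

```python
# CAGR = (end_value / start_value) ** (1 / years) - 1

def cagr(start, end, years):
    return (end / start) ** (1.0 / years) - 1.0

# Report figures: $193.68m (2019) growing to $1,379.67m (2027).
# Whether that span counts as 7 or 8 growth years changes the implied rate.
print(round(cagr(193.68, 1379.67, 7) * 100, 2))  # roughly 32.4% over 7 years
print(round(cagr(193.68, 1379.67, 8) * 100, 2))  # roughly 27.8% over 8 years
```

Neither span reproduces the quoted 30.02% exactly, which suggests the report compounds from an unstated intermediate base value rather than the 2019 figure.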

The report further studies potential alliances such as mergers, acquisitions, joint ventures, product launches, collaborations, and partnerships of the key players and new entrants. The report also studies any development in products, R&D advancements, manufacturing updates, and product research undertaken by the companies.

Leading key players of the Quantum Computing Market are:

Competitive Landscape of the Quantum Computing Market:

The market for the Quantum Computing industry is extremely competitive, with several major players and small scale industries. Adoption of advanced technology and development in production are expected to play a vital role in the growth of the industry. The report also covers their mergers and acquisitions, collaborations, joint ventures, partnerships, product launches, and agreements undertaken in order to gain a substantial market size and a global position.

Quantum Computing Market, By Offering

Consulting solutions
Systems

Quantum Computing Market, By Application

Machine Learning
Optimization
Material Simulation

Quantum Computing Market, By End-User

Automotive
Healthcare
Space and Defense
Banking and Finance
Others

Regional Analysis of Quantum Computing Market:

A brief overview of the regional landscape:

From a geographical perspective, the Quantum Computing Market is partitioned into

North America (U.S., Canada, Mexico)
Europe (Germany, UK, France, Rest of Europe)
Asia Pacific (China, Japan, India, Rest of Asia Pacific)
Rest of the World

Key coverage of the report:

Other important inclusions in Quantum Computing Market:

About us:

Verified Market Research is a leading global research and consulting firm serving more than 5,000 customers. Verified Market Research provides advanced analytical research solutions while offering information-enriched research studies. We offer insights into strategic and growth analyses, the data necessary to achieve corporate goals, and critical revenue decisions.

Our 250 analysts and SMEs offer a high level of expertise in data collection and governance, and use industrial techniques to collect and analyze data on more than 15,000 high-impact and niche markets. Our analysts are trained to combine modern data collection techniques, superior research methodology, expertise, and years of collective experience to produce informative and accurate research.

Contact us:

Mr. Edwyne Fernandes

US: +1 (650)-781-4080
UK: +44 (203)-411-9686
APAC: +91 (902)-863-5784
US Toll-Free: +1 (800)-7821768

Email: [emailprotected]

Revenues from Quantum Key Distribution to Reach Almost $850 Million by 2025 – Quantaneo, the Quantum Computing Source

More details on the report, Quantum Key Distribution: The Next Generation A Ten-year Forecast and Revenue Assessment: 2020 to 2029 can be found at: https://www.insidequantumtechnology.com/product/quantum-key-distribution-the-next-generation-a-ten-year-forecast-and-revenue-assessment-2020-to-2029/

About the Report: Inside Quantum Technology has covered the Quantum Key Distribution (QKD) market since 2014. We were the first industry analysis firm to predict that quantum security in mobile phones would become a significant revenue earner in the short term. This report has been compiled from interviews with key players in the industry, as well as with the assistance of government intelligence experts.

There have been some big developments in the QKD space since our previous report. The ITU-T standardization is near complete, both the US and UK governments have announced funding for large-scale quantum networks with QKD as a central component, and the QuantumCTek IPO may be the beginning of new public companies in this space.

This report contains ten-year forecasts of QKD for each of the major application sectors, including national and civil government, the financial sector, telecommunications, data centers, utilities, infrastructure, mobile communications and possible consumer markets. There are also forecasts broken out by end-user country and transmission type (satellite, fiber optic and free space). In addition, the report contains strategic profiles of the "A list" of QKD companies, including ABB, Cambridge Quantum Computing, ID Quantique, KETS Quantum, MagiQ Technologies, Nokia, QuantumCTek, QuantumXChange, Qubitekk, Quintessence Labs, SK Telecom and Toshiba.

From the Report

QKD for the data center, with us soon: Adoption of QKD for conventional business communications is a small opportunity right now, and we don't expect a real take-off until 2025. By 2029 we expect the market for data center QKD to reach about $180 million. Early opportunities in data center QKD will be found in private firms that do business with governments, where QKD may actually be mandated someday. Another early target market is R&D centers, where so much high-tech data is vulnerable to theft.

QKD and the rise of China: SmarTech's estimates are that China currently accounts for about 36 percent of worldwide QKD revenues and will account for $1,329.0 of QKD revenues by 2029. The growing political and military rivalry between China and the US is a key driver for QKD deployment in both countries, especially as China is recording significant steps forward in this area, as the recent QuantumCTek IPO shows. For example, Chinese researchers have successfully performed full QKD between two ground stations located 1,200 km from each other with the aid of a satellite.

QKD and PQC, together at last: Until quite recently, Post-Quantum Cryptography (PQC) was marketed as a rival to QKD. However, the two now seem synergistic; so much so that some firms are offering both. What the IT world is coming to recognize is that PQC itself could ultimately succumb to the quantum threat if powerful enough quantum computers are built, so PQC may ultimately need QKD to survive. Note that Inside Quantum Technology has also recently published an analyst report on PQC markets [https://www.insidequantumtechnology.com/product/post-quantum-cryptography-pqc-a-revenue-assessment/]

Global Quantum Key Distribution Market 2020-2029: Ten-year Forecasts and Revenue Assessments – PRNewswire

DUBLIN, Aug. 12, 2020 /PRNewswire/ -- The "Quantum Key Distribution: The Next Generation - A Ten-year Forecast and Revenue Assessment: 2020 to 2029" report has been added to ResearchAndMarkets.com's offering.

This report provides forecasts and analysis of key QKD industry developments. The author was the first industry analyst to predict that quantum security in mobile phones would become a significant revenue earner in the short term. Phones using QRNGs were announced earlier this year, and this report discusses how the mobile QRNG market will evolve.

There have been some big developments in the QKD space. In particular, the regulatory and financial framework for the development of a vibrant QKD business has matured. On the standardization and funding front, the ITU-T standardization is near complete, while both the US and UK governments have announced major funding for large-scale quantum networks with QKD as a central component. And the QuantumCtek IPO may be just the beginning of new public companies in this space.

The report contains forecasts of the hardware and service revenues from QKD in all the major end-user groups. It also profiles all the leading suppliers of QKD boxes and services. These profiles are designed to provide the reader of this report with an understanding of how the major players are creating QKD products and building marketing strategies for QKD as quantum computers become more ubiquitous.

Key Topics Covered:

Executive Summary
E.1 Key Developments Since our Last Report
E.2 Specific Signs that the Market for QKD is Growing
E.3 Evolution of QKD Technology and its Impact on the Market
E.3.1 Reach (Transmission Distance)
E.3.2 Speed (Key Exchange Rate)
E.3.3 Cost (Equipment)
E.4 Summary of Ten-year Forecasts of QKD Markets
E.4.1 Forecasts by End-user Segment
E.5 Five Firms to Watch Closely in the QKD Space

Chapter One: Introduction
1.1 Why QKD is a Growing Market Opportunity
1.2 Overview of QKD Technological Challenges
1.3 Goals and Scope of this Report
1.4 Methodology of this Report
1.5 Plan of this Report

Chapter Two: Technological Assessment
2.1 Setting the Scene: QKD in Cryptography-land
2.2 Why QKD: What Exactly does QKD Bring to the Cryptography Table?
2.3 PQC's Love-Hate Relationship with QKD
2.4 QKD's Technological Challenges
2.5 QKD Transmission Infrastructure
2.6 Chip-based QKD
2.7 QKD Standardization: Together we are Stronger
2.8 Key Takeaways from this Chapter

Chapter Three: QKD Markets - Established and Emerging
3.1 QKD Markets: A Quantum Opportunity Being Driven by Quantum Threats
3.2 Government and Military Markets - Where it all Began
3.3 Civilian Markets for QKD
3.4 Key Points from this Chapter

Chapter Four: Ten-year Forecasts of QKD Markets
4.1 Forecasting Methodology
4.2 Changes in Forecast Since Our Last Report
4.2.1 The Impact of COVID-19
4.2.2 Reduction in Satellite Penetration
4.2.3 Faster Reduction in Pricing
4.2.4 Bigger Role for China?
4.2 Forecast by End-User Type
4.3 Forecast by Type of QKD Infrastructure: Terrestrial or Satellite
4.4 Forecast of Key QKD-related Equipment and Components
4.5 Forecast by Geography/Location of End Users

Chapter Five: Profiles of QKD Companies
5.1 Approach to Profiling
5.2 ABB (Switzerland/Sweden)
5.3 Cambridge Quantum Computing (United Kingdom)
5.4 ID Quantique (Switzerland)
5.5 KETS Quantum Security (United Kingdom)
5.6 MagiQ Technologies (United States)
5.7 Nokia (Finland)
5.8 QuantumCtek (China)
5.9 Quantum Xchange (United States)
5.10 Qubitekk (United States)
5.11 QuintessenceLabs (Australia)
5.12 SK Telecom (Korea)
5.13 Toshiba (Japan)

For more information about this report visit https://www.researchandmarkets.com/r/lajrvk

About ResearchAndMarkets.com
ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.

Research and Markets also offers Custom Research services providing focused, comprehensive and tailored research.

Media Contact:

Research and Markets
Laura Wood, Senior Manager
[emailprotected]

For E.S.T Office Hours Call +1-917-300-0470
For U.S./CAN Toll Free Call +1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

U.S. Fax: 646-607-1907 Fax (outside U.S.): +353-1-481-1716

SOURCE Research and Markets

http://www.researchandmarkets.com

Digitalisation in Reliance Jio times: IoT, mobile internet to be key drivers of $5 trillion economy dream – Financial Express

By Shailesh Haribhakti and Arumugam Govindasamy

With a vertically integrated offering traversing digital infrastructure, 5G, customer interfaces, devices and technology, the stage is set for India to exploit the full power of digitalisation.

The coming quantum computing world will bring the Internet of Things (IoT) alive in India. IoT fuels a world of connected devices to make our lives easier: smart cities, fleet tracking, temperature monitoring and the digital transformation of agriculture. IoT has the potential to disrupt both business and policy. For instance, Amazon is using connected robots to locate products on its warehouse shelves and to bring them to workers, saving time and money. Similarly, the medical field is being transformed by the use of connected devices to monitor the real-time health of patients.

During the current pandemic, IoT is a need-to-have. People and businesses are relying on IoT products such as remote connected health monitoring solutions, packaging and shipping trackers, and streaming devices, the devices that are enabling remote work, telehealth, and distance learning. It also means that a tremendous amount of data is being transmitted, received, stored, and analyzed at the edge, or on devices. IoT devices are making it possible to eliminate dense gatherings of workers and so avoid virus transmission.

Interwoven with the rise of IoT is an even stronger demand for processing and storage. As the pandemic built up, the need for latency-free data transmission became clear. For example, business video conferencing requires low-latency, immersive HD video. The pipes simply aren't large enough to do that with acceptable performance. At the same time, very little of the data you send and receive is worth storing, and most of it only has value for a small period of time. Of course, some of that data is actually tremendously valuable, but it might require added AI software to extract it, and of course that is immensely compute-intensive.

With travel restrictions due to COVID-19, the use of virtual meetings is on the rise, as multinational companies developing new technologies have the required expertise in different locations. This makes the move towards AR/VR and virtual meetings a practical reality.

The role of data infrastructure is to ensure that mission-critical data can be transmitted, received, stored, and analyzed where and when it's needed. Most important is a boost in connectivity. That kind of internet connectivity and speed is becoming increasingly available, and availability has begun to accelerate as demand continues to increase. Then there's 5G.

For several years, 5G has been getting a lot of hype because users want the ability to connect anywhere, and share large data files and videos, and 5G seemed set to deliver.

The pandemic has shone a light on ways that 5G, were it fully deployed globally, could help home-based workers and workers still onsite who are focused on mission-critical manufacturing and other work. 5G is a key driving force in helping IoT move forward, enabling more reliable autonomous manufacturing processes via new standards for ultra-low latency in factories. The processing power required for 5G is tremendous, and along with that come requirements for data storage.

Every crisis leaves a long-lasting legacy in terms of faster innovation and a new normal. COVID-19 will accelerate the move to digital and to companies adopting IoT, AI/ML and 5G amongst other converging technologies to drive digital transformation.

According to findings by the McKinsey Global Institute, IoT combined with the mobile internet will have a substantial global economic impact of up to $20 trillion by 2025 and will be the key economic driver among disruptive technologies. For India to surpass the projected $5-trillion economy, the Digital India initiative, with its focus on IoT and the mobile internet, will be the key force. With the economy frozen by the pandemic, IoT will need to be empowered. Technological disruption based on IoT and the mobile internet will create innovation and entrepreneurship simultaneously.

IoT requires localized innovation, which requires Indian government support in terms of appropriate funding, mobilizing the private and public sectors for innovation, and a push to use homegrown technologies for better security, privacy and sustainability. All this and more will need to be catalysed by retraining, unlearning and relearning, and a new pace of innovation. India's trained manpower is capable of delivering this growth. From a negative 9.5% to a positive 5% is a journey that only converging exponential technologies can deliver.

More reasons SA is making itself a basket case in manufacturing – @AuManufacturing

Comment by Peter Roberts

I have previously argued that the South Australian government, once the centre of the sector, has given up on manufacturing.

It has no manufacturing minister, no manufacturing section in any department, no bureaucrat with that responsibility and no focus on the sector other than a few sexy areas and Canberra's pet growth industries of defence and space.

At the time I had a back-and-forth with the Premier, Steven Marshall, which convinced me that he didn't see the problem in his government dropping an Industry and Skills department in favour of a renamed Innovation and Skills one.

He didn't get that no specific focus on manufacturing meant, well, no focus on manufacturing.

Now the state's 300 companies in the electronics sector have been excluded from the SA State Growth Plan, a 10-year plan to be launched later this year.

The plan contains all the trendy and sexy buzzwords: Artificial Intelligence, Machine Learning, Data Analytics, Blockchain, Computer Vision, Virtual, Cyber Security, Internet of Things, Quantum Computing and Photonics.

But none of this is building on what the state has got in spades, and that is electronics.

This is a sector of 300 companies that employs 11,000 staff with $4 billion annual revenue and productivity at $343,600 per person, according to Electronics Industry Development Adelaide.

This is about three times that of all other SA manufacturing at $113,600.

Not only do we have an electronics sector in Adelaide, we are also building close to $100 billion worth of high tech defence equipment such as frigates and submarines in new facilities at Port Adelaide (pictured), which one might expect could include some electronics equipment.

Adding insult to injury, my article also bemoaned SA dumping its bid for the 2026 Commonwealth Games because it was deemed too expensive at $4 billion.

Since then a big group led by former federal minister Christopher Pyne and Olympians Anna Meares and Kyle Chalmers has been agitating for Marshall to show some vision and embrace the Games.

They presented updated figures to the government which showed it could be done for $1.1 billion.

The government has rejected their move, better perhaps to chase mirages of a quantum computing industry on North Terrace.

(Readers might excuse me my mentioning Adelaide again, but I recently relocated to my home state.)

Picture: Australian Naval Infrastructure//Hunter class frigate construction hall, Osborne

news digest: Microsoft launches open source website, TensorFlow Recorder released, and Stackery brings serverless to the Jamstack – SD Times

Microsoft launched a new open source site, which aims to help people get involved, explore projects and join the ecosystem.

The site also offers a near real-time view of activity across Microsoft's projects on GitHub.

In addition, the site highlights Microsoft's open-source projects such as Accessibility Insights, PowerToys and Windows Terminal.

More information is available here.

TensorFlow Recorder released
Google announced that it open sourced the TensorFlow Recorder last week to make it possible for data scientists, engineers, or AI/ML engineers to create image-based TFRecords with just a few lines of code.

"Before TFRecorder, users would have had to write a data pipeline that parsed their structured data, loaded images from storage, and serialized the results into the TFRecord format. Now, TFRecorder allows users to write TFRecords directly from a Pandas dataframe or CSV without writing any complicated code," according to Google in a post.

Data loading performance can be further improved by implementing prefetching and parallel interleave along with using the TFRecord format.

Stackery brings serverless to the Jamstack
Stackery announced that it added the website resource to simplify the build process for static site generators like Gatsby.

This automates a lot of machinery within AWS to retrieve application source and build it with references to external sources, including: AWS Cognito Pools, GraphQL APIs, Aurora MySQL databases, and third-party SaaS services like GraphCMS.

"The combination of JAMstack and serverless allows for powerful, scalable, and relatively secure applications which require very little overhead and low initial cost to build," Stackery wrote in a post.

Visual Studio Code update
Visual Studio Code version 1.48 includes updates such as Settings Sync now available for preview in stable, an updated Extensions view menu, and a refactored overflow menu for Git in the Source Control view.

It also includes the option to publish to a public or private GitHub repository and to debug within the browser without writing a launch configuration.

"Preview features are not ready for release but are functional enough to use," Microsoft wrote in a post that contains additional details on the new release.

Source dependency reporting in Visual Studio 2019 16.7
The new switch for the compiler toolset enables the compiler to generate a source-level dependency report for any given translation unit it compiles.

Additionally, the use of /sourceDependencies is not limited to C++; it can also be used in translation units compiled as C. "The switch is designed to be used with multiple files and scenarios under /MP," according to Microsoft in a post.

"C++20 demands a lot more from the ecosystem than ever before. With C++20 Modules on the horizon, the compiler needs to work closely with project systems in order to provide rich information for build dependency gathering and making iterative builds faster for inner-loop development," Microsoft stated.

The Risks Associated with OSS and How to Mitigate Them – Security Boulevard

Open source has become nearly ubiquitous with Agile and DevOps. It offers development teams the ability to quickly and easily scale their software development life cycles (SDLC). At the same time, open-source software (OSS) components can introduce security vulnerabilities, licensing issues, and development workflow challenges. Open-source risks include both licensing challenges and cyber threats from poorly written code that leads to security gaps. With the number of Common Vulnerabilities and Exposures (CVE) growing rapidly, organizations must define actionable OSS policies, monitor OSS components, and institute continuous integration/continuous deployment (CI/CD) controls to improve OSS vulnerability remediation without slowing release cycles.

Due to the need for rapid development and innovation, developers are increasingly turning to open-source frameworks and libraries to accelerate software development life cycles (SDLC). Use of open-source code by developers grew 40% and is expected to expand 14% year on year through 2023.

Agile and DevOps enable development teams to release new features multiple times a day, making software development a competitive differentiator. The demand for new and innovative software is brisk: 64% of organizations report an application development backlog (19% have more than 10 applications queued).

Beyond helping to accelerate development cycles, OSS enables organizations to lower costs and reduce time to market in many ways. Rather than writing custom code for large segments of applications, developers are turning to OSS frameworks and libraries. This reduces cost while enabling much greater agility and speed.

Despite all its benefits, OSS can present an array of risks, spanning licensing limitations as well as security exposures. What follows is a quick look at some of these.

An area that organizations should not overlook in terms of risk is OSS licensing. Open source can be issued under a multitude of different licenses, or under no license at all. Not knowing the obligations that fall under each particular license (or not abiding by those obligations) can cause an organization to lose intellectual property or suffer a monetary loss. While OSS is free, that does not mean it can be used without complying with its license obligations. Indeed, there are over 1,400 open-source licenses that software can fall under, with a variety of stipulations restricting and permitting use.

With shift-left methodologies gaining traction, organizations are focused on finding and preventing vulnerabilities early in the software delivery process. However, open-source licensing issues will not show up at this stage unless software composition is analyzed. Waiting until right before release cycles to check on open-source licensing issues can incur significant development delays: time spent reworking code and checking it for vulnerabilities and bugs. Additionally, as development teams are measured on the speed and frequency of releases, these delays can be particularly onerous.

With the use of OSS, it is possible to introduce an array of vulnerabilities into the source code. The reality is that developers are under increasing pressure to write feature-rich applications within demanding release windows. When the responsibility of managing application security workflows and vulnerability management is added, including analysis of OSS frameworks and libraries, it becomes increasingly difficult for them to ensure, efficiently and effectively, that security remains top of mind. In addition, in legacy application security models, code scanning, as well as triage, diagnosis, and remediation of vulnerabilities, requires specialized skill sets that developers are not commonly trained in.

A critical part of the problem is that legacy application security uses an outside-in model where security sits outside of the software and SDLC. However, research shows that security must be built into development processes from the very start, and this includes the use of open-source frameworks and libraries.

Since OSS is publicly available, there is no central authority to ensure quality and maintenance. This makes it difficult to know which types of OSS are most widely in use. In addition, OSS has numerous versions, and older versions may contain vulnerabilities that were fixed in subsequent updates. Indeed, according to the Open Web Application Security Project (OWASP), using old versions of open-source components with known vulnerabilities is one of the most critical web application security risks. Since security researchers can manually review code to identify vulnerabilities, each year thousands of new vulnerabilities are discovered and disclosed publicly, often with exploits used to prove the vulnerability exists.
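The version problem can be made concrete: once an advisory names the release that first contains a fix, deciding whether an installed component is exposed is a simple ordered comparison. The component names and fixed versions below are invented for illustration; real workflows would draw them from an advisory feed.

```python
def parse_version(v):
    """Turn a dotted version string like '2.9.10' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

# Hypothetical advisory data: component -> first release containing the fix.
FIXED_IN = {
    "examplelib": "2.9.10",
    "otherlib": "1.4.2",
}

def is_vulnerable(component, installed_version):
    """A component is exposed if it predates the first fixed release."""
    fixed = FIXED_IN.get(component)
    if fixed is None:
        return False  # no known advisory for this component
    return parse_version(installed_version) < parse_version(fixed)

vulnerable = is_vulnerable("examplelib", "2.9.3")  # predates the 2.9.10 fix
patched = is_vulnerable("examplelib", "2.10.0")    # already past the fix
```

Note that tuple comparison handles the case string comparison gets wrong: "2.9.10" is newer than "2.9.3" even though it sorts earlier alphabetically.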

But Common Vulnerabilities and Exposures (CVEs) are just the tip of the iceberg. Open source contains a plethora of unknown or unreported vulnerabilities, which can pose an even greater risk to organizations. Due to its rapid adoption and use, open source has become a key target for cyber criminals.

To effectively realize the many OSS benefits, development teams must implement the right application security strategies. It all starts with setting up the right policies.

Organizations use policies and procedures to provide guidance for proper usage of OSS components. This includes which types of OSS licensing are permitted, which types of components to use, when to patch vulnerabilities, and how to prioritize them.

To minimize the risk associated with licensing, organizations need to know which licenses are acceptable by use case and environment. And when it comes to security, application security teams need policies to help disclose vulnerabilities. For example, a component with a high severity vulnerability may be acceptable in an application that manages data that is neither critical nor sensitive and that has a limited attack surface. However, according to a documented policy, that same vulnerability is unacceptable in a public-facing application that manages credit card data and should be remediated immediately.
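A policy like the one described can be encoded so a pipeline applies it mechanically. The sketch below is hypothetical: the severity levels and application attributes are illustrative choices mirroring the article's example, not taken from any standard or tool.

```python
from dataclasses import dataclass

@dataclass
class AppProfile:
    """Illustrative attributes a remediation policy might weigh."""
    public_facing: bool
    handles_sensitive_data: bool

def must_remediate(severity, app):
    """Apply the documented policy: a high-severity finding is tolerable
    only in an internal app holding no critical or sensitive data."""
    if severity == "high":
        return app.public_facing or app.handles_sensitive_data
    return severity == "critical"

internal_tool = AppProfile(public_facing=False, handles_sensitive_data=False)
payment_portal = AppProfile(public_facing=True, handles_sensitive_data=True)
```

Encoding the policy this way makes the article's point testable: the same vulnerability yields different verdicts depending on the application's exposure and data sensitivity.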

According to Gartner, one of the first steps to improving software security is to ensure that a software bill of materials (SBoM) exists for every software application. An SBoM is a definitive list of all serviceable parts (including OSS) needed to maintain an application. Since software is usually built by combining components (development frameworks, libraries, and operating system features), it has a bill of materials that describes the bits that comprise it, just as much as hardware does.

A critical aspect of maintaining an effective software inventory is to ensure that it accurately and dynamically represents the relationships between components, applications, and servers, so that development teams always know what is deployed, where each component resides, and exactly what needs to be secured. Once an SBoM is built, it needs to map to a reliable base of license, quality, and security data.
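As a minimal illustration of such an inventory, an SBoM entry can record each component together with its license, the applications that embed it, and the servers it is deployed on. The field names and data here are invented for the sketch; real SBoM formats (such as SPDX or CycloneDX) define richer schemas.

```python
from dataclasses import dataclass, field

@dataclass
class SbomEntry:
    """One serviceable part of an application (field names illustrative)."""
    component: str
    version: str
    license: str
    used_by_apps: list = field(default_factory=list)
    deployed_on: list = field(default_factory=list)

sbom = [
    SbomEntry("examplelib", "2.9.3", "MIT",
              used_by_apps=["billing", "reporting"],
              deployed_on=["web-01", "web-02"]),
    SbomEntry("otherlib", "1.4.2", "GPL-3.0",
              used_by_apps=["reporting"],
              deployed_on=["batch-01"]),
]

def where_is(component, entries):
    """Answer the operational question: where does this component run?"""
    return [s for e in entries if e.component == component
            for s in e.deployed_on]

locations = where_is("examplelib", sbom)
```

When an advisory lands for a component, this mapping is what turns "are we affected?" into a lookup rather than an investigation.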

Since cyber criminals often launch attacks on newly exposed vulnerabilities in hours or days, an application security solution is needed to immediately protect against exploitation of open-source vulnerabilities. Security instrumentation embeds sensors within applications so they can protect themselves from the most sophisticated attacks in real time. This enables an effective open-source risk management program: the ability to deliver the quickest possible turnaround for resolving issues once they emerge. This includes providing insight into which libraries are in use by the application, which helps development teams prioritize fixes for the vulnerabilities most likely to be exploited. Security teams can also leverage this functionality to foster goodwill with developers; too often, developers are overwhelmed by the sheer volume of findings presented by legacy software composition analysis (SCA) tools.

It is no surprise that automating some application security processes improves an organization's ability to analyze and prioritize threats and vulnerabilities. Last year's Cost of a Data Breach Report from Ponemon Institute and IBM Security finds that organizations without security automation experience breach costs that are 95% higher than breaches at organizations that have fully deployed automation.

Another approach in securing the use of OSS in DevOps environments is to embed automated controls in continuous integration/continuous deployment (CI/CD) processes. OSS elements often do not pass the same quality and standards checks as proprietary code. Unless each open-source component is evaluated before implementation, it is easy to incorporate code containing vulnerabilities.

When properly operationalized, an open-source management solution can automatically analyze all dependencies in a project. If vulnerable components are detected in an application build, an automated policy check should trigger a post-build action that fails the build or marks it as unstable, based on set parameters. Regardless of the specific process and tooling an organization has in place, the goal should always be to deliver immediate and accurate feedback to developers so that they can take direct action to keep the application secure and functional.
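The post-build gate described above can be sketched as a small function mapping scan findings to a build verdict. The thresholds and the three-way fail/unstable/pass outcome are assumptions mirroring the article's description, not any specific tool's API.

```python
def build_verdict(findings, fail_at="critical", unstable_at="high"):
    """Map dependency-scan findings to a CI verdict.
    findings: list of (component, severity) tuples from a scanner."""
    order = {"low": 0, "medium": 1, "high": 2, "critical": 3}
    worst = max((order[sev] for _, sev in findings), default=-1)
    if worst >= order[fail_at]:
        return "fail"      # block the release outright
    if worst >= order[unstable_at]:
        return "unstable"  # keep artifacts, but flag for review
    return "pass"

clean = build_verdict([("examplelib", "low")])
risky = build_verdict([("examplelib", "high"), ("otherlib", "medium")])
broken = build_verdict([("otherlib", "critical")])
```

Keeping the thresholds as parameters lets each team tune the gate to its own risk tolerance without changing the pipeline itself.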

The many advantages of using open-source components in applications come with a cost: risk exposures in both licensing and cybersecurity. As a favorite target of cyber criminals, open-source code vulnerabilities can become a moving target requiring constant vigilance to prevent bad actors from taking advantage. Successfully managing OSS increasingly depends on automated application security processes. Automation helps organizations track all the open-source components in use, identify any associated risks, and enable effective mitigation actions so that teams can safely use open source without inhibiting development and delivery.

For more information on what organizations need to seek when securing open source, read the eBook, The DevSecOps Guide to Managing Open-Source Risk.

Go here to read the rest:

The Risks Associated with OSS and How to Mitigate Them - Security Boulevard

Open Source: What’s the delay on the former high/middle school on North Mulberry? – knoxpages.com

EDITOR'S NOTE: This story is in response to a reader-submitted question through Open Source, a platform where readers can submit questions to the staff.

MOUNT VERNON - When a reader asked through Open Source about the stoppage of demolition on the old high/middle school on North Mulberry Street, he wasn't the only one wondering what is going on. Councilmember Tammy Woods asked the same question during Monday night's city council meeting.

After years of uncertainty, promises, and unrealized plans, demolition finally began on June 19, only to come to a halt a few days later. After a seven-week hiatus, activity resumed this week.

When Safety-Service Director Richard Dzik asked developer Joel Mazza earlier this week about the reason for the delay, Mazza cited two: vacation and illness.

The initial stoppage was due to the contractor, Jeff Page of Lucas-based Page Excavating, being on vacation for a couple of weeks. Mazza has been on vacation for the last couple of weeks, and the contractor has had employees out sick.

"I have not seen any more of the building come down, but there has been activity," Dzik told Knox Pages on Thursday. "They continue to deal with the debris."

When initially contacted, Page declined to comment other than to say crews were working at the school this week. In a series of text messages, however, he explained that his crew is separating the wood from the brick and block on the part of the building already demolished. When the current pile of rubble is sorted and removed, another section will be demolished, and the process resumed.

According to the demolition permit signed on Dec. 5, 2019, the proposed start date for demolition was Dec. 15, 2019, with a completion date of Mar. 31, 2020. Dzik said the contractor is given six months after signing the contract to start the work. An extension is possible.

"According to our code, from the time the permit is issued, the contractor has 12 months to substantially complete the project," he said. "I would hope it wouldn't drag out that long."

The current permit is only for demolition. Dzik said that to begin construction, Mazza will have to apply for a zoning permit and present plans for the project. Mazza plans to build an affordable housing option for renters that will include two-to-three-story town homes, flats, and a three-to-four-story apartment complex.


Read the original post:

Open Source: What's the delay on the former high/middle school on North Mulberry? - knoxpages.com