Linux Foundation Public Health projects first focused on Google-Apple Exposure Notification API – Healthcare IT News

The Linux Foundation has launched a new project to harness open source software to help global public health agencies fight back against COVID-19 and prepare for future public health challenges.

WHY IT MATTERS
The Linux Foundation Public Health initiative has signed on seven premier members (Cisco, doc.ai, Geometer, IBM, NearForm, Tencent and VMware) as it works on two initial projects.

Covid Watch, Kiel University of Applied Sciences, and US Digital Response have also joined as nonprofit associate members.


The collaborative is focusing first on exposure notification apps that use the Google Apple Exposure Notification system API and will subsequently expand to other testing, tracing and isolation challenges.

Its first initiatives, COVID Shield and COVID Green, are deployed right now in Canada and Ireland, respectively, and in a few U.S. states.

Both apps are available for other public health agencies and their IT partners to customize to their own individual needs, and will soon be joined by other open source projects hosted by LFPH, officials said.

THE LARGER TREND
The goal is for open source developers and others in the tech industry to help public health authorities, many of which, despite years of underinvestment, are now having to scale up their tracking capabilities during the pandemic, say Linux Foundation Public Health officials.

Linux Foundation Public Health recently hosted a GAEN Symposium focused on the Google Apple Exposure Notification system, and plans to host similar events soon, as well as others focused on UI and UX, localization, and privacy and security, it says. More information can be found at lfph.io.

ON THE RECORD
"To catalyze this open source development, Linux Foundation Public Health is building a global community of leading technology and consulting companies, public health authorities, epidemiologists and other public health specialists, privacy and security experts, and individual developers," said Dan Kohn, LFPH general manager, in a statement. "While we're excited to launch with two very important open source projects, we think our convening function to enable collaboration to battle this pandemic may be our biggest impact."

"During this grave global crisis, I'm committed to having all parts of the Linux Foundation community support LFPH," added Linux Foundation executive director Jim Zemlin. "Open source provides an architecture for global collaboration and that's what's needed to build, secure, and sustain critical components of our stressed public health infrastructure."

Twitter: @MikeMiliardHITN
Email the writer: mike.miliard@himssmedia.com
Healthcare IT News is a publication of HIMSS Media


A Staggering 21TB of Source Code Were Just Buried in The Arctic For an Unknown Future – ScienceAlert

If doomsday comes, know this: precautions have been taken. On an isolated Arctic archipelago, the Svalbard Global Seed Vault (aka Norway's 'Doomsday Vault') holds over 1 million seed samples in a fortress-like bunker designed to be the most invulnerable seed bank in the world.

Svalbard protects more than just seeds, though. On the same remote mountain, an abandoned coal mine now exists as another vital safe-house: the Arctic World Archive, preserving the world's data of today for an uncertain tomorrow. And the facility just received a contribution that's truly mind-boggling in scope.

GitHub, often billed as the world's largest host of open source code, has successfully transported all of its active public code repositories (as of February this year) to the Arctic World Archive, as part of the company's ongoing efforts to establish the GitHub Arctic Code Vault.


"Our mission is to preserve open source software for future generations by storing your code in an archive built to last a thousand years," Julia Metcalf, GitHub's director of strategic programs, explains on the company's blog.

The project, first announced last year, already saw one shipment to Svalbard in late 2019, with a deposit of 6,000 of the platform's most significant repositories of open source code.

The new shipment, painstakingly managed during the shutdowns and border closures of the coronavirus pandemic, goes even further, preserving a massive haul amounting to 21 terabytes of data, written onto 186 reels of a digital archival film called piqlFilm.

This purpose-built medium, designed to last for 500 years (with simulations suggesting it should last twice as long), is now stored 250 metres deep, in a steel-walled container inside a sealed chamber in the Arctic World Archive.

The film, composed of silver halides on polyester, looks like a miniaturised print of QR codes, except every frame squeezes in some 8.8 million microscopic pixels, and each reel runs for almost 1 kilometre (about 3,500 ft), such is the gargantuan size of the data being stored.
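As a rough back-of-the-envelope check of these figures, the per-reel numbers work out as follows. Note the assumptions: decimal terabytes, and, purely for illustration, one bit of payload per pixel (the article does not state piql's actual encoding density, which in practice is higher per frame due to error correction and multi-level encoding).

```python
# Rough capacity arithmetic for the GitHub Arctic deposit.
# Assumptions (not from the article): decimal units (1 TB = 10**12 bytes)
# and, purely for illustration, 1 bit of payload per pixel.
TOTAL_BYTES = 21 * 10**12      # 21 TB of archived data
REELS = 186                    # reels of piqlFilm
PIXELS_PER_FRAME = 8_800_000   # ~8.8 million pixels per frame

bytes_per_reel = TOTAL_BYTES / REELS
frames_per_reel = bytes_per_reel * 8 / PIXELS_PER_FRAME

print(f"{bytes_per_reel / 1e9:.0f} GB per reel")    # ~113 GB
print(f"{frames_per_reel:,.0f} frames per reel")    # ~100,000 frames
```

Roughly 113 GB per reel, spread across on the order of a hundred thousand frames, which is consistent with each reel running nearly a kilometre of film.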


"It can withstand extreme electromagnetic exposure and has undergone extensive longevity and accessibility testing," the piql company claims.

It's hoped that this extremely long-life medium, in conjunction with the Archive's natural isolation and engineered security, will give the world's open source software the best chance of seeing a distant future, where it may one day be needed by upcoming generations.

"It is easy to envision a future in which today's software is seen as a quaint and long-forgotten irrelevancy, until an unexpected need for it arises," the GitHub Archive website explains.

"Like any backup, the GitHub Archive Program is also intended for currently unforeseeable futures as well."

In those unforeseeable futures, it's hard to know exactly what future humans will make of the archive's coded contents, or how they might be able to access and use them.

For that reason, the vault will also contain a separate, human-readable reel, called the Tech Tree, explaining the technical history and cultural context of the archive's contents.

The Tech Tree won't just throw future humans into the world of 21st century open source code, but serve as a primer for what these programs are, and what kind of technology they run on.

"It will also include works which explain the many layers of technical foundations that make software possible: microprocessors, networking, electronics, semiconductors, and even pre-industrial technologies," Metcalf explains.

"This will allow the archive's inheritors to better understand today's world and its technologies, and may even help them recreate computers to use the archived software."


Opening up new AI applications – ComputerWeekly.com

There seems no doubt that the open-systems, open-platform approach through the latter part of the last century liberated IT from its once proprietary and lofty perch, ushered in the universal connectivity of the internet, and put technology into the hands and pockets of everybody.

This is a good thing, not because of altruism, but because it fundamentally stimulates and encourages innovation. Technology is one place where levelling up can be seen in action, producing good results. Programming has long since ceased to be the preserve of a select few assembly language programmers crafting the control systems of Apollo moon shots; now anyone, anywhere, can visually or naturally create and combine logic to generate new outcomes.

The compute, connectivity and capacity power of technology enables it to do more of the digital mundane, presenting people with an ever-higher level of abstraction and empowering individuals to be more creative and productive. While it sometimes seems that new technology will simply replace people, employed properly, it can augment and enhance human endeavours, as any effective tool should.

Why, then, does artificial intelligence (AI) so often get portrayed as something far more insidious and threatening?

Part of the problem is the mystique. The IT industry, like many others, is prone to build up hype, which can portray some technology as so special that it needs to be restricted to a certain group of users, imbued with almost magical powers. Although specialist skills and training are vital, especially in the early days, most innovation only really blooms when new technology moves into mass adoption and use, often with surprising applications and unexpected consequences.

AI has been through a long gestation phase, but with machine learning applications in particular now being much better understood, it seems ripe for wider use. Supporting this requires the increased accessibility and diversity that come from ease of use, but also from novel examples and use cases, and reduced financial barriers to entry. Increased usage breeds familiarity, then acceptance and understanding of what it can (and can't) do and where best to apply it.

Driving this process of democratising AI involves creating communities of trusted partners to make a positive impact. Flexible technology models, such as the use of open source software and the delivery of capabilities dynamically (such as AI as a service), form part of the process, but alone are insufficient. It also requires a fundamental attitude shift towards ease and speed of adoption, and the willingness to build on, and combine, the expertise of others, to deliver easily interpreted results that add clear value.

This democratisation process is not easy for some companies because it involves significant give, as well as take. One company on a mission to make AI accessible to everyone is the open source software company H2O.ai, headquartered in Mountain View, California. While its workforce is heavily oriented around highly experienced data scientists, it understands the mantra that "data is a team sport" and increasingly relies on a broader cohort of roles.

To be properly understood, AI needs to shift away from technical discussions about data, machine learning and complex interpretation, towards stories or business use cases with immediate applicability.

Some verticals are frequently the earlier adopters, but as the technology opens up, patterns change. H2O.ai's financial services customers have already applied AI to fraud detection, insurance underwriting and predicting customer actions, but right now, when business models are more uncertain and difficult to plan than ever, many processes in all types of enterprise would benefit from a little guidance.

Healthcare seems an obvious place to apply AI and get results quickly, for example to assist with staffing predictions, modelling virus spread and consequent demands on facilities. But recent step-changes in human behaviour, from shopping habits to working from home, make planning for logistics, supply and consumption more complex and introduce too many new variables.

It is here where a more open model of AI, able to encompass new models and multiple external data sets, and bring together diverse data science and domain expertise, can make a difference.

Along with many companies in the evolving AI sector, H2O.ai has built its success so far on targeting large organisations and data scientists, leading to a plethora of machine learning models and information dashboards. The next step in the democratisation process is the story of AI being told in the heart of businesses large and small, with intuitive and extensible AI-enabled services being applied to enhance business processes.

It can add most value by guiding some tasks through real-time analysis that creates actionable advice, and by automating the most mundane activities, allowing more time for the application of the human mind.

AI is not about humans being replaced by super-smart machines, but being better supported by them. Ironically, the most important thing about AI is getting more humans using their own intelligence to apply it to an increasingly diverse range of applications. Open and business-oriented AI seems to be the way to go.

Follow this link:
Opening up new AI applications - ComputerWeekly.com

Adobe, IBM and Red Hat Announce Strategic Partnership to Advance Customer Experience Transformation – PRNewswire

SAN FRANCISCO and ARMONK, N.Y. and RALEIGH, N.C., July 21, 2020 /PRNewswire/ -- Adobe (Nasdaq: ADBE), IBM (NYSE: IBM) and Red Hat today announced a strategic partnership to help accelerate digital transformation and strengthen real-time data security for enterprises, with a focus on regulated industries. The intent of the partnership is to enable companies to deliver more personalized experiences across the customer journey, driving improved engagement, profitability and loyalty.

As companies undergo their digital transformations and move core workloads to the cloud, the entire C-suite is facing a re-framing of their roles to meet customer demands, all while keeping security front and center. Chief Marketing Officers and Chief Digital Officers, particularly those working in regulated industries such as banking and healthcare, are finding that with the emphasis on data-driven marketing, they are now becoming stewards of critical enterprise and customer information. For these executives, the need to protect data while delivering meaningful customer experiences is paramount.

The partnership will initially focus on:

"Now more than ever companies are accelerating their efforts to engage customers digitally," said Anil Chakravarthy, executive vice president and general manager, Digital Experience, Adobe. "We are excited to partner with IBM and Red Hat to enable companies in regulated industries to meet this moment and use real-time customer data to securely deliver experiences across any digital touchpoint, at scale and compliant with regulations."

"The reality is that today, businesses across industries are operating in an experience-first world where it is possible to gain immense value from data if trust and technology flexibility are central to the equation," said Bridget van Kralingen, senior vice president, IBM Global Markets. "It is with these principles as the focus of our partnership, bringing Adobe's marketing expertise, IBM's industry domain knowledge and the open innovation of Red Hat, that will give clients the confidence to use their data for new competitive advantage."

"Being competitive in the digital economy requires delivering innovation quickly," said Ashesh Badani, senior vice president, Cloud Platforms, Red Hat. "Through this collaboration, Adobe, IBM and Red Hat are enabling organizations to deliver great digital experiences, in any environment, with flexibility and speed across the hybrid cloud, whether in on-premises data centers or across multiple public clouds."

As part of the partnership, IBM has named Adobe its "Global Partner for Experience" and will begin adopting Adobe Experience Cloud and its enterprise applications to transform its own global marketing.

About Adobe Experience Manager
Adobe Experience Manager is recognized by major industry analyst firms as the most advanced enterprise application for digital experience management, integrating scalable, secure and agile content management (CMS), digital asset management (DAM), digital signage management and customer communication (CCM) applications. Industry analysts have named Adobe a leader in over 25 major reports focused on experience, more than any other technology company.

About Adobe
Adobe is changing the world through digital experiences. For more information, visit http://www.adobe.com.

About IBM
To learn more about how IBM is working with Adobe to enable enterprises to transform with cloud and AI technologies and services, visit www.ibm.com/adobe-partnership or engage with us on Twitter @ibm.

About Red Hat
Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver reliable and high-performing Linux, hybrid cloud, container, and Kubernetes technologies. Red Hat helps customers integrate new and existing IT applications, develop cloud-native applications, standardize on our industry-leading operating system, and automate, secure, and manage complex environments. Award-winning support, training, and consulting services make Red Hat a trusted adviser to the Fortune 500. As a strategic partner to cloud providers, system integrators, application vendors, customers, and open source communities, Red Hat can help organizations prepare for the digital future.

Press Contacts
Stefan Offermann, Adobe, 650-743-1326, [emailprotected]

Hannah Slocum, IBM, [emailprotected]

John Terrill, Red Hat, [emailprotected]

SOURCE IBM

http://www.ibm.com


Open Source Software Market 2020: Potential Growth, Challenges, and Know the Companies List Could Potentially Benefit or Lose out From the Impact of…

InForGrowth has added a latest research report on the Open Source Software Market 2020: Future Growth Opportunities, Development Trends, and Forecast 2026. The Global Open Source Software Market report covers an overview of the segments and sub-segmentations, including product types, applications, companies and regions. This report describes overall Open Source Software Market size by analyzing historical data and future projections.

The report features unique and relevant factors that are likely to have a significant impact on the Open Source Software market during the forecast period. This report also includes an analysis of the COVID-19 pandemic's impact on the Open Source Software market. It includes a detailed and considerable amount of information, which will help new providers better understand the market. The report elaborates the historical and current trends molding the growth of the Open Source Software market.

An exclusive sample report on the Open Source Software Market is available at https://inforgrowth.com/sample-request/6210788/open-source-software-market

Market Segmentation:

The segmentation of the Open Source Software market has been offered on the basis of product type, application, major key players and region. Every segment has been analyzed in detail, and data pertaining to the growth of each segment has been included in the analysis.

Top players listed in the Open Source Software Market report are Intel

Based on type, the report is split into Shareware

Based on application, the Open Source Software market is segmented into BMForum

Impact of COVID-19: The Open Source Software Market report analyses the impact of Coronavirus (COVID-19) on the Open Source Software industry. Since the COVID-19 outbreak in December 2019, the disease has spread to more than 180 countries around the globe, with the World Health Organization declaring it a public health emergency. The global impacts of the coronavirus disease 2019 (COVID-19) are already starting to be felt, and will significantly affect the Open Source Software market in 2020.

COVID-19 can affect the global economy in 3 main ways: by directly affecting production and demand, by creating supply chain and market disturbance, and by its financial impact on firms and financial markets.

Download the sample ToC to understand the COVID-19 impact and be smart in redefining business strategies: https://inforgrowth.com/CovidImpact-Request/6210788/open-source-software-market

Open Source Software Market: Key Questions Answered in Report

The research study on the Open Source Software market offers inclusive insights about the growth of the market in the most comprehensible manner for a better understanding of users. Insights offered in the Open Source Software market report answer some of the most prominent questions that assist the stakeholders in measuring all the emerging possibilities.

Get a special discount of up to 50%: https://inforgrowth.com/discount/6210788/open-source-software-market

FOR ALL YOUR RESEARCH NEEDS, REACH OUT TO US AT:
Address: 6400 Village Pkwy, Suite 104, Dublin, CA 94568, USA
Contact Name: Rohan S.
Email: [emailprotected]
Phone: +1-909-329-2808
UK: +44 (203) 743 1898


The Linux Foundation Is Making It Even Easier for Health Agencies to Use Apple's COVID-19 Exposure Notification System – iDrop News

The battle against COVID-19 is far from over, and while some countries around the world have found themselves able to safely and slowly start easing lockdown restrictions, others have been facing second waves or even unfinished first waves, often as a result of moving too soon or not having the necessary health infrastructure in place to help prevent the spread of the novel coronavirus.

As countries like South Korea and Singapore discovered early on, contact tracing has become an important part of controlling the pandemic, and most health authorities around the world have been doing it in some form or another. Usually, this is just done the old-fashioned way, which involves interviewing those who are diagnosed with COVID-19 to find out who they may have been in contact with, and then attempting to notify those individuals of their possible exposure to the virus so they can come in and get tested.

Due to the complexities of adopting this on a wider scale, however, several government health agencies began to adopt digital methods of contact tracing. Singapore was among the first, with its TraceTogether app, which users could install on their iPhone or Android smartphone, where it would use Bluetooth to keep track of other smartphones that it came into close contact with.

The downside to apps like TraceTogether, however, is that not only did everybody need to download and install the app, but it also needed to be left running in the foreground to be truly effective, impacting battery life in the process. To address these kinds of problems, Apple and Google forged a landmark partnership to develop a way to do this kind of tracing that would effectively work in the background while also promising to protect users' privacy.

Apple and Google did this by focusing on a decentralized approach, where the list of devices that you came into contact with would be stored only on your actual iPhone or Android smartphone, in a way that would not give up any personally identifying information, only randomized Bluetooth IDs. In the event a patient received a positive COVID-19 diagnosis, the system could use the randomized Bluetooth IDs stored on that person's iPhone to notify other users who may have been exposed to the virus.
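The decentralized matching just described can be illustrated with a toy model. This is only a sketch: the real GAEN protocol derives its rotating Bluetooth identifiers cryptographically from daily keys and handles timing, range estimation and upload authorization, none of which is modeled here, and the names below (`Phone`, `broadcast`, `check_exposure`) are invented for illustration, not part of any real API.

```python
import secrets

def new_rolling_id() -> str:
    """Generate a fresh random proximity identifier.
    (The real protocol derives rotating IDs from daily keys
    cryptographically; a random token stands in for that here.)"""
    return secrets.token_hex(16)

class Phone:
    def __init__(self):
        self.my_ids = []       # IDs this phone has broadcast
        self.seen_ids = set()  # IDs observed from nearby phones

    def broadcast(self) -> str:
        rid = new_rolling_id()
        self.my_ids.append(rid)
        return rid

    def observe(self, rid: str):
        self.seen_ids.add(rid)

    def check_exposure(self, published_ids) -> bool:
        # Matching happens entirely on-device: the server only
        # distributes the IDs of users who reported a diagnosis.
        return any(rid in self.seen_ids for rid in published_ids)

# Two phones come into Bluetooth range of each other
alice, bob = Phone(), Phone()
bob.observe(alice.broadcast())

# Alice tests positive and consents to publishing her broadcast IDs
published = alice.my_ids
print(bob.check_exposure(published))      # True: Bob was near Alice
print(Phone().check_exposure(published))  # False: no contact
```

The key property the sketch captures is that no central party ever learns who met whom; each phone only compares a public list of random tokens against the tokens it has seen locally.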

Unfortunately, not everybody liked Apple and Google's decentralized approach, and countries like France actually got a bit hostile over the fact that Apple wasn't doing anything to help them do contact tracing their own way, which generally meant collecting a centralized database to track the movements of their citizens. By contrast, Apple and Google's system doesn't use location services at all, and apps that plug into it are even forbidden from including their own location tracking features; in fact, that's one of a whole list of extra rules that governments and health agencies have to agree to before they're allowed to use the Apple-Google API.

In fact, Apple went so far as to eschew the name "contact tracing" for its API to avoid the negative privacy connotations associated with the term, choosing instead to call it an Exposure Notification System, since that is, of course, its real purpose.

This has led to a mishmash of contact tracing apps around the world, with some using the Apple-Google API and others using their own systems, with varying levels of success. Not surprisingly, however, it's those that use the Apple-Google API that have been the most popular with end users, even though only a handful of countries have embraced it so far.

Leaving aside political issues in countries like France and the U.K., one other reason for this relatively slow uptake may simply be the complexity of getting exposure notification apps up and running. Many government health agencies don't have the technical expertise to build these apps, and may even have a hard time finding developers with the necessary chops to pull it off, especially if they're restricted to looking only within their own borders.

Now, however, The Linux Foundation is stepping up to help fill this gap in capabilities by partnering with several big players and two existing COVID-19 apps that use the Apple-Google API to give health authorities around the world a big head start in putting together their own apps.

Despite its obvious association with the Linux operating system, the Linux Foundation is a non-profit organization with the goal of promoting open source software technologies around the world in general, technologies for which Linux is, of course, the main poster child. Now it has announced the new Linux Foundation Public Health initiative (LFPH), with seven Premier members, to help public health authorities (PHAs) worldwide not only in the fight against COVID-19, but also to empower them to deal with future pandemics.

The new initiative involves some heavy players, including Cisco, doc.ai, Geometer, IBM, NearForm, Tencent, and VMware, plus the exposure notification apps COVID Shield and COVID Green. The initial focus of LFPH will be to empower applications to use the Google Apple Exposure Notification (GAEN) system, but it plans to expand beyond that to support all aspects of PHAs' testing, tracing, and isolation activities.

The first of the two apps involved, COVID Shield, was developed by a volunteer team at Ottawa-based e-commerce firm Shopify in cooperation with the Canadian government and the Province of Ontario, and is already in the process of being rolled out in Canada; it was supposed to arrive for residents of Ontario on July 2, but has been pushed back to July 24, due to delays in getting it approved by the Canadian federal government, according to iPhone in Canada.

The second app, COVID Green, was developed by a team at NearForm, one of the Foundation's Premier members, on behalf of the Irish Government. It was deployed by Ireland's Health Services Executive two weeks ago, and has already been adopted by over 30 percent of the country's adults.

The source code for both apps is being made available for public health agencies and their technology partners to use as templates for building their own exposure notification apps, and the Linux Foundation adds that other organizations are also expected to contribute the source code for their apps to LFPH in the coming weeks. In every case, however, the apps will be using the exposure notification system that was jointly developed by Apple and Google, which should hopefully help to spur even wider adoption of this technology.

At this point, while a few U.S. states have been working on their own apps, there has yet to be an app in the U.S. that uses the Apple and Google exposure notification API. In fact, although the Canadian province of Alberta was actually the first agency in North America to release a contact tracing app of any kind, Canadas COVID Shield will be the first to use the Apple-Google system.


GitHub Just Sealed All Its Open Source Code in an Apocalypse-Proof Vault – Futurism

Locked Up

Earlier this month, the code management platform GitHub sealed away its archive of open source software in an Arctic vault so deep that they say it could survive a nuclear blast.

The mildly outlandish idea behind the move, Engadget reports, is to give a boost to future generations after a hypothetical civilization-ending catastrophe. Should that happen, whatever civilization emerges from the ashes won't have to start from scratch and could instead tap the knowledge of modern-day coders and engineers.

It's been almost a year since GitHub announced its plan to store the code in the Arctic World Archive, an abandoned Norwegian coal mine protected by hundreds of meters of permafrost. The cache is stored on a type of microfilm that can be read with a physical magnifying glass.

Also sealed in the same mine are Vatican records, movies, and a vast array of other digital archives. And they're in good company: the Doomsday Seed Vault is located on the same island of Spitsbergen.

It's difficult to imagine a societal catastrophe that's just cataclysmic enough that the most pressing need for a new society is to recover lost software. But it doesn't hurt to have a copy backed up, just in case.

Still, as Engadget reports, the most obvious benefit for archiving the open-source software may be for the developers involved: Anyone who contributed to a project that made its way into the Arctic World Archive gets to display a little badge next to their username on GitHub.

READ MORE: GitHub is done depositing its open source codes in the Arctic [Engadget]

More on arctic vaults: The Melting Arctic Is Releasing Poison, Disease and Nuclear Waste


Advanced Cloud Engineer Bootcamp from The Linux Foundation Helps IT Professionals Move Into Cloud Careers – PRNewswire

SAN FRANCISCO, July 21, 2020 /PRNewswire/ -- Building on the popularity of its beginner Cloud Engineer Bootcamp launched last month, The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced the availability of an Advanced Cloud Engineer Bootcamp program, designed to help experienced IT professionals move into cloud engineering roles in as little as six months. Additionally, the foundation has announced a new training course, LFS243 - Service Mesh Fundamentals, which will be available beginning July 31 and will also be part of this new bootcamp.

The Linux Foundation Advanced Cloud Engineer Bootcamp bundles self-paced eLearning courses with certification and dedicated instructor support for a comprehensive and well-rounded educational program. The training covers advanced IT concepts related to cloud, containers, service mesh, monitoring, logging and more, providing all the knowledge needed to move roles into cloud engineering. The specific courses and certification exam included, all of which are taken online, are:

Participants will also have access to a bootcamp-specific online forum to interact with other students and instructors, as well as live virtual office hours with course instructors five days per week. Candidates have unlimited access to the program for 12 months. Those who spend approximately 10 hours per week on the materials should expect to complete the bootcamp in about six months, enabling working professionals to take advantage of the program. Upon completion, participants will receive badges for the CKA certification exam and the entire bootcamp. Badges can be independently verified by potential employers at any time.

"Upon launching our Cloud Engineer Bootcamp last month, we received extensive feedback that more seasoned professionals would like a similarly defined pathway to becoming a cloud engineer," said Clyde Seepersad, SVP and general manager of training & certification at The Linux Foundation. "The Advanced Cloud Engineer Bootcamp is able to skip over many of the beginner concepts and go directly into advanced topics that require experience administering IT systems. This program will enable many professionals to gain new, highly in demand skills that will help them advance their careers. We also anticipate many companies taking advantage of this program to fill skills gaps within their organizations as they increase their use of cloud native tools."

"Cloud native tools like Kubernetes, Prometheus, Helm, and service meshes are enabling companies to deploy software efficiently and operate at unprecedented scale," said Priyanka Sharma, general manager, Cloud Native Computing Foundation. "This means the demand for talented cloud engineers is soaring, especially as business workflows continue to move online. We're eager to help our community advance their cloud native knowledge, and the Advanced Cloud Engineer Bootcamp is the most affordable and streamlined way to do so."

LFS243 - Service Mesh Fundamentals is a new training course that is also now open for enrollment, with course materials becoming available to enrolled students on July 31. The course is included as part of the Advanced Cloud Engineer Bootcamp, or is available to take standalone. With the growth of microservices and Kubernetes production environments, there is a growing need for tools to monitor and manage network traffic. This course explores the use of Envoy Proxy and Istio to take control of network access.
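For a flavor of the kind of traffic control a service mesh provides, here is a minimal sketch of an Istio VirtualService that splits traffic between two versions of a service; the service name and subsets are illustrative placeholders, not taken from the course materials:

```yaml
# Hypothetical example: send 90% of requests to v1 of a "reviews"
# service and 10% to v2 -- a canary-style split declared in Istio,
# with the Envoy sidecar proxies enforcing the weights on the wire.
apiVersion: networking.istio.io/v1alpha3
kind: VirtualService
metadata:
  name: reviews
spec:
  hosts:
    - reviews
  http:
    - route:
        - destination:
            host: reviews
            subset: v1
          weight: 90
        - destination:
            host: reviews
            subset: v2
          weight: 10
```

Shifting the weights in a manifest like this, rather than redeploying applications, is the sort of declarative network control the course's Envoy and Istio material addresses.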

The Advanced Cloud Engineer Bootcamp is available for immediate enrollment. The standard $999 bootcamp fee provides unlimited access to the course for one year, including all content and labs. Through July 31, 2020, the bootcamp is being offered at an introductory fee of $599. Interested individuals may enroll here. Enterprises interested in purchasing bulk bootcamp enrollments can request more information here.

About the Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world's leading home for collaboration on open source software, open standards, open data, and open hardware. The Linux Foundation's projects are critical to the world's infrastructure, including Linux, Kubernetes, Node.js, and more. The Linux Foundation's methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see its trademark usage page: www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

Media Contact: Dan Brown, The Linux Foundation, 415-420-7880

SOURCE The Linux Foundation

http://www.linuxfoundation.org

Advanced Cloud Engineer Bootcamp from The Linux Foundation Helps IT Professionals Move Into Cloud Careers - PRNewswire

OpenStack Community Delivers Future of Bare Metal: White Paper Details Maturity and Adoption of Ironic Bare Metal as a Service – PR Web

The white paper includes case studies from users including StackHPC, SuperCloud, Red Hat, VEXXHOST and more.

AUSTIN, Texas (PRWEB) July 20, 2020

Today, the Ironic community published a white paper that highlights the scope, growth and maturity of the bare metal provisioning software. The white paper was developed by more than 26 contributors over 12 months and details all aspects of bare metal provisioning and lifecycle management via the OpenStack project. It provides information on performance, security, compliance and stack independence, as well as non-virtualizable resources associated with bare metal.

The white paper is a deep dive into the tools, clients and automation that demonstrate how the mature Ironic software delivers stable, production-proven bare metal compute instances. Bare metal is a popular option for deploying container-based workloads because it avoids the overhead and performance penalties common with full-featured hypervisors such as KVM.
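For a sense of what Ironic's bare metal lifecycle management looks like in practice, here is a rough sketch using the standard `openstack baremetal` CLI; the node name, driver choice and IPMI credentials are all placeholders, and the exact flags will depend on your deployment:

```
# Sketch only: enroll a physical server with Ironic, then make it
# available for provisioning. Assumes a configured OpenStack client
# environment; every name and address below is a placeholder.

# Enroll the node with its out-of-band management (IPMI) details.
openstack baremetal node create \
  --name node-01 \
  --driver ipmi \
  --driver-info ipmi_address=10.0.0.5 \
  --driver-info ipmi_username=admin \
  --driver-info ipmi_password=secret

# Walk the node through Ironic's state machine: take it under
# management, then mark it available to the scheduler.
openstack baremetal node manage node-01
openstack baremetal node provide node-01

# Inspect the provisioning state of all enrolled nodes.
openstack baremetal node list
```

Automating this enrollment-to-available workflow across whole racks is what turns a two-week manual provisioning job into the few-hours task the paper describes.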

Download now: Building the Future on Bare Metal: How Ironic Delivers Abstraction and Automation using Open Source Infrastructure, at https://www.openstack.org/bare-metal/how-ironic-delivers-abstraction-and-automation-using-open-source-infrastructure

Ironic Case Studies Highlighted

The white paper includes case studies from users including StackHPC, SuperCloud, Red Hat, VEXXHOST and more. Use cases highlighted in these stories include:

Julia Kreger, Ironic Project Team Lead, recalled an anecdote about hearing first-hand about the value of the Ironic software: "At a conference a few years ago, I sat down to dinner next to someone I did not know. He started to tell me about his job and his long hours in the data center. He asked me what I did, and I told him I worked as a software engineer in open source. And he started talking about some tooling he had recently found that took tasks that would normally take nearly two weeks for racks of servers down to just a few hours. He simply glowed with happiness, because his quality of life and work happiness had exploded since finding this Bare Metal as a Service tooling called Ironic. As a contributor, this is why we contribute: to make those lives better."

The paper explores how the Open Infrastructure community has addressed the bare metal provisioning problem with entirely free open source software. It discusses the issues operators face in discovering and provisioning servers, how the OpenStack community has solved these issues with Ironic, and the future of open infrastructure and hardware management, emphasizing the necessity of open source and the value of contributors continuing to build on top of strong foundations. Operators interested in deploying Ironic can select a partner from the dozens of vendors in the Ironic Bare Metal Program.

About the OpenStack Foundation and Ironic

Ironic is an open source project that fully manages bare metal infrastructure and is part of OpenStack. The OpenStack Foundation (OSF) supports the development and adoption of open infrastructure globally, across a community of over 100,000 individuals in 187 countries, by hosting open source projects and communities of practice, including datacenter cloud, edge computing, NFV, CI/CD and container infrastructure.

###



GitHub is done depositing its open source codes in the Arctic – Yahoo Finance Australia

Last year, GitHub revealed its plan to store all of its open source software in an Arctic vault as part of its Archive Program. Now the code-hosting platform is done making sure future generations can access that code even if civilization collapses within the next 1,000 years. In a blog post celebrating the undertaking's success, GitHub's Director for Strategic Programs Julia Metcalf revealed that the service's code collection was deposited into the vault on July 8th, 2020, after delays caused by the coronavirus pandemic.

GitHub's archive partner Piql wrote 21TB of repository data onto 186 reels of piqlFilm, a digital photosensitive archival film that can be read by a computer, or by a human with a magnifying glass. You know, in case humanity suffers a global power outage. The service originally hoped to be done with the task by February, but it had to wait until the Piql team could travel to the Norwegian archipelago of Svalbard, which only recently re-opened its borders. It also had to drop its plans to send its own team to the Arctic.
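As a quick back-of-envelope check on those figures (assuming decimal units, 1 TB = 1000 GB), the payload per reel works out to roughly 113 GB:

```python
# Back-of-envelope: average repository data per reel of piqlFilm,
# assuming decimal units (1 TB = 1000 GB).
total_gb = 21 * 1000      # 21 TB of repository data
reels = 186               # reels of film in the deposit
per_reel_gb = total_gb / reels
print(f"{per_reel_gb:.1f} GB per reel")  # prints "112.9 GB per reel"
```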

The collection now sits inside a chamber within a decommissioned coal mine, under hundreds of meters of permafrost. To recognize everyone who contributed to the software stored in the vault, GitHub is also rolling out a special badge that's displayed in the highlights section of a developer's profile. Hovering over the badge shows the projects they contributed to, which ultimately became part of the Arctic Vault.
