GitHub will preserve open source code in an Arctic vault – Techworm

GitHub on Wednesday announced its plans to launch the Arctic Code Vault, with the aim of preserving open-source software for future generations for at least 1,000 years.

The code-sharing site is partnering with the Long Now Foundation, the Internet Archive, the Software Heritage Foundation, Arctic World Archive, Microsoft Research, the Bodleian Library, and Stanford Libraries to ensure the long-term preservation of the world's open-source software.


"There is a long history of lost technologies from which the world would have benefited, as well as abandoned technologies which found unexpected new uses, from Roman concrete, or the anti-malarial DFDT, to the hunt for mothballed Saturn V blueprints after the Challenger disaster," according to the GitHub announcement. "It is easy to envision a future in which today's software is seen as a quaint and long-forgotten irrelevancy until an unexpected need for it arises. Like any backup, the GitHub Archive Program is also intended for currently unforeseeable futures as well."

The company plans to store and preserve open-source software like Flutter and TensorFlow in an abandoned coal mine in Svalbard, Norway, as a safeguard against possible doomsday scenarios. The GitHub Arctic Code Vault is a data repository preserved in the Arctic World Archive (AWA), a very-long-term archival facility 250 meters deep in the permafrost of an Arctic mountain. The archive is located in a decommissioned coal mine in the Svalbard archipelago, closer to the North Pole than the Arctic Circle.

GitHub stores its data on specialized ultra-durable film coated in iron oxide powder. The data can be read by a computer, or by a human with a magnifying glass in the event of a global power outage. Remarkably, this film is expected to last for 1,000 years.

Piql AS, a Norwegian data storage tech company that makes the special film reels, said that they should last for up to 750 years in normal conditions, and perhaps even 2,000 years if stored in a cold, dry, and low-oxygen cave.

The reels are stored in a white container, and GitHub plans to leave 200 of them, each carrying 120 gigabytes of open-source code, in the vault. Among the first data deposits to be stored at the vault are the Linux and Android operating systems and 6,000 other important open source applications.

"We're excited to partner with Piql to help preserve open-source software for future generations," said Kyle Daigle, director of special projects at GitHub.

"Piql's custom film and archiving technologies will allow us to store terabytes of data on a durable medium designed to last for over 1,000 years. We're delighted that next year every active public GitHub repository will be written to this film, and safeguarded in the Arctic World Archive in Svalbard, for the centuries and generations to come."

GitHub is planning to capture a snapshot of every active public repository on 02/02/2020 and preserve that data in the Arctic Code Vault. The snapshot will include public code repositories as well as significant dormant repos as determined by stars, dependencies, and an advisory panel, according to GitHub.

"The snapshot will consist of the HEAD of the default branch of each repository, minus any binaries larger than 100kB in size. Each repository will be packaged as a single TAR file," the company adds.
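The packaging rule GitHub describes (the HEAD of the default branch, large binaries dropped, one TAR per repository) can be pictured with a short sketch. This is an illustration only, not GitHub's actual pipeline: it skips every file over 100 kB rather than performing the binary detection the announcement implies.

```python
import os
import tarfile

MAX_SIZE = 100 * 1024  # GitHub's stated 100 kB cutoff

def snapshot_repo(repo_dir: str, out_tar: str) -> None:
    """Package a checked-out default branch as a single TAR,
    skipping files over the size cutoff. (Simplification: the
    real rule drops only *binaries* over 100 kB, which would
    require a binary-detection step.)"""
    with tarfile.open(out_tar, "w") as tar:
        for root, _dirs, files in os.walk(repo_dir):
            for name in files:
                path = os.path.join(root, name)
                if os.path.getsize(path) > MAX_SIZE:
                    continue  # drop oversized files
                tar.add(path, arcname=os.path.relpath(path, repo_dir))
```

In practice such a snapshot would also exclude the `.git` directory and follow the repository's checked-out tree only; the sketch keeps whatever is on disk.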

For greater data density and integrity, most of the data will be stored QR-encoded. A human-readable index and guide will itemize the location of each repository and explain how to recover the data.

The advisory panel will include experts from a range of fields, including anthropology, archaeology, history, linguistics, archival science, and futurism.

Besides the GitHub Archive Program, the company is also working with Microsoft's Project Silica to archive all active public repositories for over 10,000 years by writing them into quartz glass platters using a femtosecond laser.


Global Database Software Market Outlook to 2030 – Market Projected to Reach $99.36 Billion by 2022 – ResearchAndMarkets.com – Business Wire

DUBLIN--(BUSINESS WIRE)--The "Database Software Global Market Report 2020" report has been added to ResearchAndMarkets.com's offering.

The global database software market was valued at about $67.52 billion in 2018 and is expected to grow to $99.36 billion at a CAGR of 10.1% through 2022. Major players in the market are Oracle, Microsoft, IBM, SAP and Amazon.

The increase in the amount of data generated by industries in their regular operations, and the integration of technologies such as the Internet of Things (IoT) into their processes, are expected to drive growth over the forecast period, as they raise the need for database software that can handle vast amounts of data with confidentiality, integrity, and availability. In addition, the introduction of customer-interface applications and the adoption of cloud computing technologies across industries, including small-scale enterprises, have increased demand for database software. For example, companies such as Coca-Cola have invested in FileMaker, a custom database platform, to handle large amounts of data more efficiently. FileMaker enables managers to analyze reports for a glimpse of what is happening company-wide in real time.

A growing number of laws are expected to limit the growth of companies in the database software market during the forecast period. These include intellectual property (IPR) regimes that restrict software companies from developing countries from expanding overseas, and software laws in a number of countries that do not permit the commercial distribution of certain Open Source Software (OSS). The industry must also adhere to changes in the EU's General Data Protection Regulation (GDPR), in effect since May 25, 2018. According to a recent study, around 60% of tech companies are not ready to comply with the GDPR, which they say restricts innovation in their products.

One of the major trends gaining traction in the database software industry is the use of hybrid transactional/analytical processing (HTAP) systems to reduce data processing time. HTAP is an in-memory database architecture that allows users to perform online transaction processing and online analytical processing on the same data without duplicating it or compromising either workload. This reduces data processing and retrieval time and supports real-time decision making, since analysis runs simultaneously with transaction processing. In 2016, IBM released IBM DB2 12, an HTAP platform with in-memory capabilities that boosts real-time analytic processing 100 times over previous versions.

The EU's General Data Protection Regulation (GDPR) harmonizes the previously fragmented rules for the security of transactions across the European Union (EU). GDPR requires businesses to protect the personal data and privacy of EU citizens for transactions that occur within EU member states. The regulation came into effect in May 2018 and mandates the implementation of data protection practices; breaches can be fined up to €20 million or 4 percent of global annual turnover. Patent laws also apply to database software, enabling companies to protect and leverage the commercial value of their innovations. For example, the ability to protect software innovations through patent rights has compelled IBM to apply for a patent on a database management system that uses blockchain technology.

In January 2019, Microsoft acquired Citus Data for an undisclosed amount. The acquisition is expected to enhance and scale Microsoft's capabilities in high-performance PostgreSQL databases. It also helps application developers improve performance at scale, as Citus is an open source extension that turns PostgreSQL into a distributed database. Citus Data develops and distributes database products, focusing on open source PostgreSQL and on enhancing enterprise performance through a horizontally scalable database. Citus Data was established in 2010 and is headquartered in San Francisco, California, USA.

Key Topics Covered

1. Executive Summary

2. Database Software Market Characteristics

3. Database Software Market Size and Growth

3.1. Global Database Software Historic Market, 2015 - 2019, $ Billion

3.1.1. Drivers of the Market

3.1.2. Restraints on the Market

3.2. Global Database Software Forecast Market, 2019 - 2022F, 2025F, 2030F, $ Billion

3.2.1. Drivers of the Market

3.2.2. Restraints on the Market

4. Database Software Market Segmentation

4.1. Global Database Software Market, Segmentation By Type, Historic and Forecast, 2015-2019, 2022F, 2025F, 2030F, $ Billion

4.2. Global Database Software Market, Segmentation By End User, Historic and Forecast, 2015-2019, 2022F, 2025F, 2030F, $ Billion

4.3. Global Database Software Market, Segmentation By Deployment, Historic and Forecast, 2015-2019, 2022F, 2025F, 2030F, $ Billion

5. Database Software Market Regional & Country Analysis

5.1. Global Database Software Market, Split By Region, Historic and Forecast, 2015-2019, 2022F, 2025F, 2030F, $ Billion

5.2. Global Database Software Market, Split By Country, Historic and Forecast, 2015-2019, 2022F, 2025F, 2030F, $ Billion

Companies Mentioned

For more information about this report visit https://www.researchandmarkets.com/r/58ayiy


Global Open Source Software Market 2019 Growth and Share Analysis By Top players, Application, and Types and Regional Forecast 2023 – BeetleVersion

Global Open Source Software Market Growth, Size, Share & Trend Analysis By Type (Shareware, Bundled Software, BSD (Berkeley Source Distribution), Other), Applications (Phpbb, BMForum, Phpwind, Other), Region, Competitive Insights, And Segment Forecasts, 2019-2023

The Global Open Source Software Market report mainly studies the market size, recent trends, and development status of the Open Source Software market, as well as investment opportunities, government policy, market dynamics (drivers, restraints, opportunities), supply chain, and competitive landscape. Technological innovation and advancement will further optimize the performance of the product, making it more widely used in downstream applications. Moreover, a Porter's Five Forces analysis (potential entrants, suppliers, substitutes, buyers, industry competitors) provides crucial information for understanding the Open Source Software market.

Get Free PDF Sample Report: https://www.globalmarketers.biz/report/semiconductor-and-electronics/global-open-source-software-industry-market-research-report/8508#request_sample

Major Players Of Global Open Source Software Market

Companies:

Astaro Corp, IBM, RethinkDB, Cleversafe, Oracle, Intel, Epson, Transcend, Canonical, Actuate, Alfresco Software Inc, Acquia, ClearCenter, Continuent Inc., Compiere Inc.

This report covers Type and Application data for the Open Source Software market, along with country-level information, for the period 2013-2023.

Global Open Source Software Market Segmented By Types and By its Applications:

Type:

Shareware, Bundled Software, BSD (Berkeley Source Distribution), Other

Application:

Phpbb, BMForum, Phpwind, Other

Any questions or unique requirements? Ask our industry professionals: https://www.globalmarketers.biz/report/semiconductor-and-electronics/global-open-source-software-industry-market-research-report/8508#inquiry-before-buying

Global Open Source Software Market Scope and Features

Global Open Source Software Market Introduction and Overview: includes the Open Source Software market definition, market scope and market size estimation, region-wise value and growth rate history from 2013-2023, market dynamics (drivers, limitations, and challenges faced), emerging countries, and industry news and policies by region.

Industry Chain Analysis: describes upstream suppliers and the cost structure of Open Source Software, major players with company profiles, manufacturing bases and market share, manufacturing cost structure analysis, market channel analysis, and major downstream buyers.

Global Open Source Software Market Analysis by Product Type and Application: gives market share, value, status, production, and value and growth rate analysis by type from 2013 to 2018, along with a downstream market overview covering consumption, market share, and growth rate by application (2013-2018).

Regional Analysis: this segment of the report covers Open Source Software production, consumption, import, export, market value, revenue, market share and growth rate, market status and SWOT analysis, and price and gross margin analysis by region.

Competitive Landscape, Trends And Opportunities: includes the competitive situation and market concentration status of major Open Source Software players, with basic information such as company profile, product introduction, market share, value, price, and gross margin (2013-2019).

Open Source Software Market Analysis and Forecast by Region: includes market value and consumption forecasts (2013-2023) for the following regions and sub-regions: North America; Europe (Germany, UK, France, Italy, Spain, Russia, Poland); China; Japan; Southeast Asia (Malaysia, Singapore, Philippines, Indonesia, Thailand, Vietnam); Middle East and Africa (Saudi Arabia, United Arab Emirates, Turkey, Egypt, South Africa, Nigeria); India; and South America (Brazil, Mexico, Colombia).

Browse the full report: https://www.globalmarketers.biz/report/semiconductor-and-electronics/global-open-source-software-industry-market-research-report/8508#table_of_contents

Table Of Content

1 Open Source Software Introduction and Market Overview

2 Industry Chain Analysis

3 Global Open Source Software Value (US$ Mn) and Market Share, Production, Value (US$ Mn), Growth Rate and Average Price (US$/Ton) Analysis by Type (2013-2019)

4 Open Source Software Consumption, Market Share and Growth Rate (%) by Application (2013-2019)

5 Global Open Source Software Production, Value (US$ Mn) by Region (2013-2019)

6 Global Open Source Software Production (K Units), Consumption (K Units), Export (%), Import (%) by Regions (2013-2019)

7 Global Open Source Software Market Status by Regions

8 Competitive Landscape Analysis

9 Global Open Source Software Market Analysis and Forecast by Type and Application

10 Open Source Software Market Analysis and Forecast by Region

11 New Project Feasibility Analysis

12 Research Finding and Conclusion

13 Appendix

13.1 Methodology, Research Data Source


Object Storage Company OpenIO Joins the iRODS Consortium – PR Web

LILLE, France (PRWEB) November 15, 2019

The iRODS Consortium, the foundation that leads development and support of the integrated Rule-Oriented Data System (iRODS) data management software, welcomes OpenIO as its newest Consortium member.

OpenIO is a software-defined, open source object storage solution designed for big data and artificial intelligence applications. OpenIO scales easily while delivering consistent performance and can be deployed on-premise, cloud-hosted or at the edge, and on any hardware mix that customers choose.

iRODS is free open source software for data discovery, workflow automation, secure collaboration, and data virtualization used by research and business organizations around the globe. iRODS allows users to automate data management by creating a unified namespace and a metadata catalog of all the data and users within the storage environment.

"High-performance storage solutions like OpenIO derive a great deal of value from iRODS' ability to efficiently organize data," said Jason Coposky, Executive Director, iRODS Consortium. "Due to their focus on open source, our organizations' goals are well aligned. Given that, we are well positioned to closely integrate iRODS and OpenIO as a competitive solution, which will benefit the broader open source community."

OpenIO plans to use iRODS to orchestrate the movement of data while taking advantage of all of the performance benefits and features of the OpenIO storage solution. Given iRODS' strong track record managing academic research data, OpenIO Co-Founder and CTO Jean-François Smigielski also sees the collaboration as an opportunity to enhance OpenIO's ability to address the needs of academic researchers through an integration with iRODS.

"OpenIO is an open core solution and we are firm believers in open source technology," said Smigielski. "We decided to become an iRODS Consortium member because iRODS has the same open source philosophy, and because of the strong link that exists between storage tiers and orchestrators. We wish to work closely with iRODS so that we can take advantage of their evolving features and technology, which will enable OpenIO to propose new and innovative features as early as possible."

The iRODS Consortium guides development and support of iRODS, along with providing production-ready iRODS distribution and iRODS professional integration services, training, and support. The consortium is administered by founding member RENCI, a research institute for applications of cyberinfrastructure located at the University of North Carolina at Chapel Hill.

In addition to OpenIO, current members of the iRODS Consortium include Agriculture Victoria, Bayer, Cloudian, University of Colorado, Boulder, DataDirect Networks, Maastricht University, MSC, the U.S. National Institute of Environmental Health Sciences, NetApp, Quantum, RENCI, SURF, the Swedish National Infrastructure for Computing, SUSE, Texas Advanced Computing Center (TACC), University College London, University of Groningen, Utrecht University, the Wellcome Sanger Institute, and Western Digital.

About OpenIO

Founded in 2015, OpenIO develops a hyper-scalable, open source object storage solution that is ideal for Big Data, AI and HPC use cases. OpenIO's grid-of-nodes architecture and unique ConscienceGrid technology enable its object store to scale instantly without rebalancing data, while maintaining consistent high performance. OpenIO supports S3 and can be deployed on-premise, cloud-hosted or at the edge. It can run on any hardware and has already attracted more than 40 customers worldwide, including Dailymotion, the CEA, and the service provider IIJ (Internet Initiative Japan).

OpenIO, which received initial support from Georges Lotigier (CEO of Vade Secure), raised $5 million from Elaia, Partech Partners and Nord France Amorçage in October 2017. In July 2019, the startup won the Pass French Tech 2018-2019, a national program to support hyper-growth companies. Based in Lille, France, OpenIO also has offices in Paris and Tokyo. For more information please visit: http://www.openio.io

About iRODS

iRODS is open source data management software used by research, commercial, and governmental organizations worldwide. iRODS is released as a production-level distribution aimed at deployment in mission critical environments. It virtualizes data storage resources, so users can take control of their data, regardless of where and on what device the data is stored. Find out more at http://www.irods.org.

Press Contact: Teena Touch, Waters Communications, 415-310-3125, teenat@waterscomms.com



GitHub will store all of its public open source code in an Arctic vault – Engadget

At its Universe Developer Conference two days ago, GitHub announced its Archive Program -- its plan to preserve all of its open source software for future generations. The program will see this data stored on an ongoing basis across various data formats and locations, including in the Arctic World Archive, a vault hidden 250 meters within an Arctic mountain in Svalbard. The Doomsday seed vault is just around the corner.

The data is stored on reels of film coated with iron oxide powder. It can be read by a computer or -- in the event of a global power outage -- a human with a magnifying glass. Crucially, this film will last for 1,000 years. Among the first data deposits at the vault is the source code for the Android and Linux operating systems, as well as a range of programming languages, web platforms, cryptocurrencies and AI tools. GitHub is planning on having all active public repositories stored by February 2020.

The data will sit alongside digitally preserved national archives from around the world, including artworks, music, scientific breakthroughs, historical manuscripts and archaeological finds. Should some kind of apocalyptic event take place, all this data could well be used to help rebuild a global society. If not, it will at least act as a valuable time capsule. After all, just 20 years ago open source code was a very fringe idea -- now the world all but depends on it. Who knows what technology will look like in 1,000 years' time?


What is a software bill of materials? – Security Boulevard

With a software bill of materials (software BOM), you can respond quickly to the security, license, and operational risks that come with open source use.

A software bill of materials is a list of all the open source and third-party components present in a codebase. A software BOM also lists the licenses that govern those components, the versions of the components used in the codebase, and their patch status.
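As a concrete sketch of those fields, a minimal BOM entry might look like the following. The class and field names are illustrative only; real BOMs typically follow a standard format such as SPDX or CycloneDX.

```python
from dataclasses import dataclass, field

@dataclass
class BomEntry:
    """One third-party or open source component in a codebase."""
    name: str
    version: str
    license: str
    patched: bool  # is the component at its latest patch level?

@dataclass
class SoftwareBom:
    """A software bill of materials: the component inventory for one project."""
    project: str
    components: list = field(default_factory=list)

    def unpatched(self):
        """Components carrying known, unapplied fixes -- the first
        things to review when a new vulnerability is announced."""
        return [c for c in self.components if not c.patched]

bom = SoftwareBom("example-app")
bom.components.append(BomEntry("openssl", "1.1.1k", "Apache-2.0", patched=False))
bom.components.append(BomEntry("zlib", "1.2.13", "Zlib", patched=True))
print([c.name for c in bom.unpatched()])  # → ['openssl']
```

The point of keeping license, version, and patch status alongside each name is that every risk question later in this article (legal, operational, security) becomes a simple query over this one inventory.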

Any organization that builds software needs to maintain a software BOM for its codebases. Organizations typically use a mix of custom-built code, commercial off-the-shelf code, and open source components to create software. As one principal architect of a leading software supply chain provider notes, "We have over a hundred products, with each of those products having hundreds to thousands of different third-party and open source components." A software bill of materials allows organizations to track all the components in their codebases.

The concept of a software bill of materials derives from manufacturing, where a bill of materials is an inventory detailing all the items included in a product. In the automotive industry, for example, manufacturers maintain a detailed bill of materials for each vehicle. This BOM lists both parts built by the original equipment manufacturer itself and parts from third-party suppliers. When a defective part is discovered, the auto manufacturer knows precisely which vehicles are affected and can notify vehicle owners of the need for repair or replacement.

Similarly, smart organizations that build software maintain an accurate, up-to-date software BOM that includes an inventory of third-party and open source components to ensure their code is high quality, compliant, and secure.

Have your developers used open source components in your code? There's better than a 90% chance that they have. Open source helps you shorten development time, increase speed of execution, and profitably deliver your products to your customers. Analysts such as Forrester and Gartner note that the vast majority of IT organizations use open source software for mission-critical workloads and that some applications comprise up to 90% open source components.

But few companies have much visibility into the open source they use. Even fewer can produce an accurate, up-to-date software bill of materials that includes open source components. A comprehensive software BOM lists all open source components in your applications as well as those components' licenses, versions, and patch status.

Do you know whether the licenses for the open source components your applications include are permissive or viral? Are you using one of the top open source licenses or a one-off variant?

Failure to comply with open source licenses can put businesses at significant risk of litigation and compromise of intellectual property (IP). In 95% of the scans our software audit services team conducts, we find open source that the target doesn't know was there. Furthermore, 68% of the codebases we audited in 2018 contained components with license conflicts. A software bill of materials lists the open source licenses that govern the components you use, allowing you to assess your legal and IP risk.

Do you know whether the open source components in your codebase are being maintained? Operational risk is an important consequence of open source use. Many open source components are abandoned. In other words, they no longer have a community of developers contributing to, patching, or improving them. In fact, a recent Gartner survey found that the long-term viability of open source projects was a top concern of development organizations.

When a component is inactive and no one is maintaining it, no one is addressing its potential weaknesses and vulnerabilities. Our audit services team found that 85% of the codebases we scanned in 2018 contained open source components that were more than four years out of date or had no development activity in the last two years. A software bill of materials lists the versions of the open source components in your codebase, so you can determine whether you're using any outdated, potentially insecure code.
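The "more than four years out of date" test is easy to picture as a query over a BOM that records release dates. The function below is a hypothetical illustration of that heuristic, not the audit team's actual tooling:

```python
from datetime import date, timedelta

def stale_components(release_dates: dict, today: date, years: int = 4) -> list:
    """Return names of components whose recorded release date is more
    than `years` years old -- a rough proxy for the 'outdated,
    potentially insecure' check described above."""
    cutoff = today - timedelta(days=365 * years)
    return sorted(name for name, released in release_dates.items()
                  if released < cutoff)

# Hypothetical BOM data: component name -> release date of the pinned version.
bom_dates = {
    "libfoo": date(2012, 6, 1),   # long abandoned
    "libbar": date(2019, 3, 15),  # recently released
}
print(stale_components(bom_dates, today=date(2019, 11, 15)))  # → ['libfoo']
```

A real check would also look at commit and release activity in the upstream project, not just the age of the version you happen to pin.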

Do you know whether the open source components you're using have any known vulnerabilities? While the number of vulnerabilities in open source is small compared to proprietary software, over 7,000 open source vulnerabilities were discovered in 2018 alone. Over 50,000 have emerged over the past two decades. Our audit services team found that 60% of the codebases scanned in 2018 contained at least one open source vulnerability, and over 40% contained high-risk vulnerabilities.

Only a handful of open source vulnerabilities, such as those infamously affecting Apache Struts or OpenSSL, are ever likely to be widely exploited. But when such an exploit occurs, the need for open source security becomes front-page news, as it did with the Equifax data security breach of 2017. A major contributing factor to Equifax's breach was the company's lack of a comprehensive IT asset inventory; in other words, a software bill of materials. This "made it difficult, if not impossible, for Equifax to know if vulnerabilities existed on its networks," a report on the incident concluded. "If a vulnerability cannot be found, it cannot be patched."

Software composition analysis tools can generate a complete software BOM that tracks third-party and open source components and identifies known security vulnerabilities, associated licenses, and code quality risks.

Given that open source is an essential component of application development today, every software development team should use an effective software composition analysis (SCA) tool to inventory the open source and third-party components in their code.
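The inventory step at the heart of an SCA tool can be illustrated, in heavily simplified form, with a parser for a Python-style dependency manifest. Real SCA tools do far more: they fingerprint vendored source, resolve transitive dependencies, and match each component against vulnerability and license databases.

```python
def inventory_requirements(text: str) -> dict:
    """Parse 'name==version' pins from a requirements.txt-style
    manifest into a {name: version} inventory -- the raw material
    of a software BOM. Comments and unpinned lines are skipped."""
    inventory = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if "==" in line:
            name, version = line.split("==", 1)
            inventory[name.strip().lower()] = version.strip()
    return inventory

manifest = """\
# production deps
requests==2.31.0
urllib3==2.0.7
flask  # unpinned, so not inventoried here
"""
print(inventory_requirements(manifest))
# → {'requests': '2.31.0', 'urllib3': '2.0.7'}
```

Note what the toy version misses: the unpinned `flask` line is exactly the kind of component a real SCA tool must still identify and version, typically by inspecting the installed environment or lockfile rather than the manifest alone.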

Maintaining a software bill of materials is vital if you want to respond quickly to the security, license, and operational risks that can accompany open source software use.

Learn more about software composition analysis tools


GitHub gathers friends for a security code cleanse to scrub that software up to spec – The Register

GitHub, Microsoft's cloud version control service and gripe forum, has joined with a handful of like-minded partners to form GitHub Security Lab (GSL) to better find bugs in open source software.

Consisting of GitHub security researchers, third-party code maintainers and interested parties from partner companies, GSL aspires to provide a bit more organization to the daunting task of securing open source code.

In a phone interview with The Register, Jamie Cool, VP of product security at GitHub, said the overall theme of GitHub's security announcements involves focusing the power of community.

"We recognize this is a problem that no one company can solve, including GitHub," he said. "But it's incredibly important, which is why we're investing so much in it."

The core of the Security Lab is an internal GitHub security team. The partner companies, said Cool, all share a common interest in better software.

"We basically wanted to combine the energy these companies have to secure open source software," he said, explaining that they all have something tangible to contribute, such as expertise, research or tools.

For example, Google, he said, is bringing software fuzzing tools, while Trail of Bits, a security consultancy, has committed to devoting time to open source bug hunting when not otherwise engaged.

Initially, GSL intends to lead by example, having already driven the creation of more than 100 CVEs detailing flaws that need fixing. The hope is other companies signed on as partners will participate by contributing security research findings to the open source community.

These partners include: F5, Google, HackerOne, IOActive, JP Morgan, LinkedIn, Microsoft, Mozilla, NCC Group, Oracle, Trail of Bits, Uber, and VMware.

GSL brings with it two practical tools: CodeQL, a language for querying databases generated from source code to find variations of vulnerable code patterns, and the GitHub Advisories Database, a public data set of security advisories from GitHub and accompanying remediation info.

Back in September, GitHub bought software analysis biz Semmle for its LGTM (Looks Good To Me) vulnerability query platform. CodeQL, used to search data from LGTM, represents the reconsidered branding for that tool.

With CodeQL now open to the public, any developer able to recognize a vulnerable code pattern can search for variations on that theme in source files converted to a CodeQL database. And Mozilla is offering an incentive for doing so: Firefox's maker, a GSL partner and admitted user of Semmle's tech, has just expanded its bug bounty program to accept static analysis queries crafted in CodeQL or as Clang-based checkers.

Cool said the hope is that CodeQL will not only help people find vulnerabilities but also avoid re-implementing those same flawed patterns.

In a blog post on Thursday, Cool explained the need to make security information easier to find and more comprehensive.

"Forty per cent of new vulnerabilities in open source don't have a CVE identifier when they're announced, meaning they're not included in any public database," Cool said. "Seventy per cent of critical vulnerabilities remain unpatched 30 days after developers have been notified."

To help address that gap, GitHub has a service called Security Advisories, just promoted from beta to general availability, that lets developers draft advisories, coordinate private discussion in the applicable project, collaborate on a fix in a temporary private project fork, and then publish the advisory, with a CVE number bestowed by GitHub, alongside patched code.

Along similar lines, Github's automated security update scheme, by which auto-detected vulnerabilities elicit a pull request fix, has graduated to general availability.

Also, token scanning, introduced last fall in public beta, has attracted still more partners: GoCardless, HashiCorp, Postman, and Tencent. The service, which helps developers avoid publishing sensitive tokens in git commits, previously marked the participation of Alibaba Cloud, Atlassian, AWS, Azure, Discord, Dropbox, Google Cloud, Mailgun, npm, Proctorio, Pulumi, Slack, Stripe, and Twilio.
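The mechanism behind token scanning can be sketched simply: match known credential formats in content before (or as) it is published. The regexes below are simplified illustrations, not the exact patterns providers register with GitHub, and real services additionally verify hits with the issuing partner:

```python
# Toy secret scanner: flag strings that look like well-known credential
# formats. Patterns here are simplified illustrations only.
import re

TOKEN_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "slack_token":       re.compile(r"\bxox[baprs]-[0-9A-Za-z-]{10,}\b"),
    "generic_api_key":   re.compile(r"api[_-]?key\s*=\s*['\"][0-9A-Za-z]{20,}['\"]"),
}

def scan_for_tokens(text: str):
    """Return (pattern_name, matched_text) pairs found in the given text."""
    findings = []
    for name, pattern in TOKEN_PATTERNS.items():
        for match in pattern.finditer(text):
            findings.append((name, match.group()))
    return findings

# AWS's own documentation example key, safe to use for testing.
commit_diff = 'aws_key = "AKIAIOSFODNN7EXAMPLE"\n'
for name, value in scan_for_tokens(commit_diff):
    print(name, value)  # prints: aws_access_key_id AKIAIOSFODNN7EXAMPLE
```

When a partnered provider's token is spotted in a public commit, GitHub notifies the provider, which can then revoke the credential before it is abused.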


See more here:
GitHub gathers friends for a security code cleanse to scrub that software up to spec - The Register

Geek of the Week: If there's roadwork ahead, Kurt Stiles uses 3D modeling and more to drive project – GeekWire

Kurt Stiles and his team at the Washington State Department of Transportation often take to the air to better illustrate the stories they're telling on the ground. (Photo courtesy of Kurt Stiles)

If a new roadway or bridge or other infrastructure element in Washington state looks and drives exactly like you'd hoped it would, perhaps Kurt Stiles and his team at the Washington State Department of Transportation are to thank.

Stiles and the Visual Engineering Resource Group (VERG) are the visual media professionals who use a variety of tools, such as aerial photography, 3D modeling and animation, to communicate the stages of all types of projects.

Our latest Geek of the Week spent 10 years in the military before going to school for civil engineering. He was helping to raise three boys and working full time at WSDOT when he discovered the world of 3D modeling and visualization in 1998. Today he leads the group he helped develop at the agency in 2008.

"The tech for 3D modeling has grown tremendously. There is no excuse now; we have tremendous tools to visually communicate infrastructure change," Stiles said. "Our productions can tell any story, to any audience and at any scale. Decision-making processes have improved, saving time and money. All stakeholders and the public alike have a deeper understanding, which translates to improved consent."

Stiles points to a variety of projects which VERG has had a hand in, whether it's photography work showing everything from highway overpasses to rest areas to ferry terminals, or drone footage of a mudslide. Video production and animation is especially useful to show renderings of completed projects, such as this video-game-like fly-by of Interstate 90 near Snoqualmie Pass:

Stiles is particularly proud of the team's 3D modeling work for what's called a diverging diamond interchange, a project being implemented for the first time in Washington, in Lacey.

"This retrofitted interchange will handle much more daily traffic volume and do so in a much safer way," Stiles said. "Moreover, the new interchange will provide improved, safer pedestrian and bike travel, too, much better than what was there originally. This type of interchange design is very progressive and will be a hallmark project for other interchange retrofits to follow in Washington."

Modeling cars and trucks on conventional roadways is all fine and good, but what is VERG going to do when we get the flying vehicles we're all waiting for?

"That will be fun! I'm sure we can animate all sorts of flying objects," Stiles said. "But we will have to make sure there is a solid tax structure to handle all those landing pads that are going to have to be built; everyone will want one! Perhaps a new tax on leather flying jackets and goggles? I'm sure that will work."

Learn more about this week's Geek of the Week, Kurt Stiles:

What do you do, and why do you do it? I built and lead a visual communication content development group that is centered in 3D computer modeling, video production and commercial photography. We provide strategic communication content for infrastructure decision makers. They use it so they can get understanding, consent, funding, etc. from their stakeholders and constituents when building civil projects.

What's the single most important thing people should know about your field? Civil infrastructure change needs to be first and foremost communicated correctly, so all parties understand what the change is and why it has to happen. Twenty-first-century problems of the built environment cannot be fixed with 20th-century technology. By using 3D modeling and other tools, tremendous insight can be gained in a precognitive way. A future view can be displayed showing the pros and cons, and decisions can be made quicker and with increased understanding. Time and money are saved while the project moves forward in an accelerated way.

Where do you find your inspiration? Watching an underdog, any underdog, work hard, work long and then beat the ass off some self-righteous, privileged SOB.

What's the one piece of technology you couldn't live without, and why? Blender. Open source software that you can make a living with. You can model anything the built environment needs. Remember to give back, though, with donations; keep Blender open source!

What's your workspace like, and why does it work for you? VERG works in an office like a lot of geeks. We also have a lot of outside field work, too: video shoots, helicopter photography, flying drones, etc. It's never dull in VERG.

Your best tip or trick for managing everyday work and life. (Help us out, we need it.) Setting and managing production expectations. Lead the conversation with your clients based upon their spoken need and you'll never go wrong.

Mac, Windows or Linux? Windows.

Kirk, Picard, or Janeway? Picard.

Transporter, Time Machine or Cloak of Invisibility? Time machine. I wanna go back so I can get it right the second time.

If someone gave me $1 million to launch a startup, I would Run to the hills with the dough? No, I'd do it, but it's gotta be MY startup.

I once waited in line for A warm Coke in the Philippines, which I drank with fevered intent.

Your role models: Napoleon Bonaparte: "Capability is worthless without opportunity." Gen. George Patton: "Lead me, follow me, or get the hell outta the way." Tony Robbins: "There are only two options: make progress or make solutions."

Greatest game in history: Chess.

Best gadget ever: Theyre all great, but not without WD-40.

First computer: Compaq Portable.

Current phone: Android S7 or Motorola DynaTAC CellStar, I can't remember which.

Favorite app: Waze.

Favorite cause: Dog rescues for any dog.

Most important technology of 2019: Gaming engines.

Most important technology of 2021: Gaming engines.

Final words of advice for your fellow geeks: Just do it. Suck it up, stand for something and take the risk. Feel free to draw a line in the sand, just be able to defend it. Take ownership, because no one else will, and you'll impress the hell out of people for it.

Website: Visual Engineering Resource Group

LinkedIn: Kurt Stiles

Read more:
Geek of the Week: If there's roadwork ahead, Kurt Stiles uses 3D modeling and more to drive project - GeekWire

‘Nu’? A Software Program That Reads Yiddish – New England Public Radio

New software for searching words in digitized Yiddish books, many originally written in the 19th and early 20th centuries, is about to be unveiled.

The search tool will be available via the Yiddish Book Center in Amherst, Massachusetts. Its digital library includes more than 10,000 books in Yiddish, but the current ability to search them is limited.

Amber Clooney, the center's digital librarian, recently demonstrated just how limited the current search software is. She pulled up a website, and a Yiddish keyboard appeared on the screen. She typed in "tsig," the Yiddish word for goat.

Because it can search only titles and authors, the software delivered just three references.

Then Clooney pulled up a different screen that showed a beta version of the new search software.

"On the [new] site, we can search anywhere in the text of a book," she said. "It's great."

Within just a few seconds, about 6,000 references came up.
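The difference Clooney demonstrated, metadata-only search versus full-text search, can be sketched in a few lines. The tiny catalog below is invented for illustration; it is not the center's actual data:

```python
# Contrast: the old tool searched only titles/authors; the new one
# searches every page of OCR'd text. Catalog contents are made up.
catalog = [
    {"title": "Mayses far kinder", "author": "Sholem Aleichem",
     "pages": ["... di tsig shteyt in hoyf ...", "... a kleyne tsig ..."]},
    {"title": "Di tsig", "author": "Anon.",
     "pages": ["... once upon a time ..."]},
]

def metadata_search(term):
    """Old-style search: titles and authors only."""
    return [b["title"] for b in catalog
            if term in b["title"].lower() or term in b["author"].lower()]

def fulltext_search(term):
    """New-style search: (title, page index) for hits anywhere in the text."""
    return [(b["title"], i) for b in catalog
            for i, page in enumerate(b["pages"]) if term in page.lower()]

print(metadata_search("tsig"))   # -> ['Di tsig']
print(fulltext_search("tsig"))   # -> [('Mayses far kinder', 0), ('Mayses far kinder', 1)]
```

Scaled up from three titles to thousands of books, that shift from metadata matches to page-level hits is what turns three results into roughly 6,000.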

A decade in development

The search software is called Jochre, for Java Optical Character Recognition. To date, it's the most comprehensive Yiddish word search tool available, according to several computer scientists and scholars.

The Yiddish Book Center, which is a repository for more than a million hard copy books in Yiddish, expects to have Jochre up and running on its site by the end of the year.

Aaron Lansky founded the Yiddish Book Center in 1980. He once thought writing optical character recognition (OCR) for Yiddish was a pipe dream.

"It was going to be inordinately complicated, and cost $10 million, minimum," Lansky said. "We figured this was never going to happen."

And it happened by chance.

An email arrived 10 years ago, out of the blue, Lansky said, from a benevolent software engineer in France named Assaf Urieli.

Lansky said Urieli wrote that he was a computational linguist living in the French Pyrenees, and had just invented Yiddish OCR. And Urieli wanted to donate it to the Yiddish Book Center so its books could be searchable.

Urieli remembered telling Lansky he wanted to give him a demo.

"So we can talk about it, so we can see if we can maybe do a project together," Urieli said, speaking from France.

Both Lansky and Urieli are on a mission to make Jochre and Yiddish available to the world, a world they say doesn't know much about how many millions of people once spoke the 1,000-year-old language.

Yiddish was almost erased by Stalin and Hitler, then almost lost when many Jews left Europe after World War II for the U.S., and when Israel didn't make Yiddish its national language.

Urieli's work began with an interest in his own family history. He grew up between South Africa, Ohio and Israel, and knew almost nothing about Yiddish until he was a young adult. He was already multilingual when he learned his great-grandparents and generations before them spoke Yiddish. He decided to learn the language and research his family.

That's how Urieli discovered the Yiddish Book Center's digital library.

"Among other things, I was reading about the town where my grandmother was born in Lithuania, and I was thinking how nice it would be if I could actually perform a search among older books to find all of the references to this town," Urieli said. "And I thought: well, why not write the software?"

Urieli figured it would take the summer. But after three months, he said he hadn't made much progress.

"But I suppose I couldn't abandon it, either," he said. "The idea was too fascinating."

OCR is complex

Many of the Yiddish documents are old and carry stray marks that software can easily misread as characters. Jochre had to be trained and retrained to avoid those mistakes.

Once Urieli got past some common code challenges, the Yiddish Book Center and other libraries began to build a dictionary of words and proper names.

Urieli has not been the only one trying to build a Yiddish OCR. In roughly the same time frame, computer scientist Raphael Finkel at the University of Kentucky had been developing his own version, one that requires more human editing than Jochre.

Finkel has already used Urieli's program and he's impressed.

"It is searchable in a very fast way, which is nice," Finkel said. "It makes about the same rate of OCR mistake as mine does. It's the nature of the problem mistakes in understanding the text."

One example of how a simple Yiddish phrase can be misunderstood, Finkel said, is the phrase "This bothers me."

In different Yiddish dialects, the sound and letters change. In the word for "me," the first Yiddish letter is the same, but the last letters are completely different to the eye and ear, as well as to an OCR program.

While no software is perfect, Finkel said, Jochre is "a great advance."

The expectation is that scholars, cultural anthropologists, families and others can dig deeper into Yiddish history and language. Digital libraries and researchers may be able to use the software on their own sites, with a little setup on their end to make it possible.

Urieli is an enthusiast of open source software.

"You do something not because you want to get rich off of it," he said, "but because it's something that you're passionate about, and you want to share with the world."

Jochre is designed to get smarter over time through use and corrections.

The Yiddish Book Center reports they're getting about five or six corrections a day from Yiddish speakers living around the world.

Original post:
'Nu'? A Software Program That Reads Yiddish - New England Public Radio

GitHub Universe 2019: GitHub for mobile, GitHub Archive Program and more announced amid protests against GitHub's ICE contract – Packt Hub

Yesterday, GitHub commenced its popular product conference, GitHub Universe 2019, in San Francisco. The two-day annual conference celebrates the contributions of GitHub's 40+ million developers to the open source community. Day 1 of the conference had many interesting announcements, like GitHub for mobile, the GitHub Archive Program, and more.

Let's look at some of the major announcements from the GitHub Universe 2019 conference.

GitHub for mobile is a beta app that aims to give users the flexibility to work and interact with their team anywhere they want. It will enable users to share feedback on a design discussion or review code outside a full development environment. The native app adapts to any screen size and also works in dark mode based on device preference. Currently it is available only on iOS; the GitHub team has said an Android version will follow soon.

"Our world is powered by open source software. It's a hidden cornerstone of our civilization and the shared heritage of all humanity. The mission of the GitHub Archive Program is to preserve it for generations to come," states the official GitHub blog.

GitHub has partnered with the Stanford Libraries, the Long Now Foundation, the Internet Archive, the Software Heritage Foundation, Piql, Microsoft Research, and the Bodleian Library to preserve all the available open source code in the world. It will safeguard all the data by storing multiple copies across various data formats and locations. This includes a very-long-term archive called the GitHub Arctic Code Vault which is designed to last at least 1,000 years.

Read More: GitHub Satellite 2019 focuses on community, security, and enterprise

Last year, at the GitHub Universe conference, GitHub Actions was announced in beta. This year, GitHub has made it generally available to all users. In the past year, GitHub Actions has received contributions from developers at AWS, Google, and others, and has grown into a new standard for building and sharing automation for software development, including a CI/CD solution and native package management. GitHub has also announced the free use of self-hosted runners and artifact caching.
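To give a flavor of the CI/CD side of Actions, here is a minimal workflow sketch; the file path follows the Actions convention, but the step names and tool choices (a Python project tested with pytest) are illustrative assumptions, not taken from the announcement:

```yaml
# .github/workflows/ci.yml -- minimal CI workflow sketch for GitHub Actions.
# Runs the test suite on every push and pull request.
name: CI
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1       # fetch the repository contents
      - uses: actions/setup-python@v1   # install a Python toolchain
        with:
          python-version: '3.8'
      - run: pip install -r requirements.txt
      - run: pytest
```

Workflows like this, published as reusable actions, are the "automation for software development" that the community shares and composes.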

In May this year, GitHub announced the beta version of the GitHub Package Registry, its new package management service. Later, in September, after gathering community feedback, GitHub announced that the service had gained proxy support for the primary npm registry.

Since its launch, GitHub Packages has received over 30,000 unique packages, serving the needs of over 10,000 organizations. Now, at GitHub Universe 2019, the GitHub team has announced the general availability of GitHub Packages and added support for using the GitHub Actions token.

These were some of the major announcements from day 1 of the GitHub Universe 2019 conference; head over to GitHub's blog for more details of the event.

Major product announcements aside, one thing that garnered a lot of attention at the GitHub Universe conference was the protest conducted by GitHub workers along with the Tech Workers Coalition to oppose GitHub's $200,000 contract with Immigration and Customs Enforcement (ICE). Many high-profile speakers dropped out of the GitHub Universe 2019 conference, and at least five GitHub employees have resigned from GitHub over its support for ICE.

Read More: Largest women in tech conference, Grace Hopper Celebration, renounces Palantir as a sponsor due to concerns over its work with the ICE

Yesterday at the event, the protesting tech workers brought a giant cage to symbolize the cages ICE uses to detain migrant children.

Tech workers around the world have extended their support to the protest against GitHub.

GitHub along with Weights & Biases introduced CodeSearchNet challenge evaluation and CodeSearchNet Corpus

GitHub acquires Semmle to secure open-source supply chain; attains CVE Numbering Authority status

GitHub Package Registry gets proxy support for the npm registry

GitHub updates to Rails 6.0 with an incremental approach

GitHub now supports two-factor authentication with security keys using the WebAuthn API

Original post:
GitHub Universe 2019: GitHub for mobile, GitHub Archive Program and more announced amid protests against GitHub's ICE contract - Packt Hub