Nominations open for 2021 Red Hat Innovation Awards – Yahoo Finance

Awards program recognizes Red Hat customers for their innovative and creative use of Red Hat technologies; Award recipients to be honored during Red Hat Summit 2021

Red Hat, Inc., the world's leading provider of open source solutions, today announced that it is accepting nominations for the 2021 Red Hat Innovation Awards.

Since 2007, the Red Hat Innovation Awards have recognized organizations from around the world and across industries for the transformative projects and outstanding results they have experienced with Red Hat's open source solutions. Open source has helped transform technology from the datacenter to the cloud and the Red Hat Innovation Awards showcase its transformative impact in organizations around the world.

Nominations should showcase successful IT implementations and projects that made a difference in organizations using open source. Entries for the Innovation Awards will be judged in five areas:

Submissions will be accepted until Sept. 13, 2020, and will be evaluated by a panel of business and open source technology experts including Ashesh Badani, senior vice president, Cloud Platforms, Red Hat; Caroline Chappell, research director and industry analyst, Analysys Mason; Stefanie Chiras, vice president and general manager, Red Hat Enterprise Linux business unit, Red Hat; Leigh Day, vice president, Marketing Communications, Red Hat; Alberto Iglesias Fraga, deputy director, INNOVADORES Supplement, La Razon; Sean Michael Kerner, freelance technology journalist; and Chris Wright, vice president and chief technology officer, Red Hat.

A total of five winners will be chosen. From those five winners, the 2021 Red Hat Innovator of the Year will be selected by the community through online voting, and will be announced during Red Hat Summit 2021 taking place in April of 2021. In addition, up to five honorable mention submissions may be recognized by the judges.

Additional Resources

Connect with Red Hat

About Red Hat, Inc.

Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver reliable and high-performing Linux, hybrid cloud, container, and Kubernetes technologies. Red Hat helps customers integrate new and existing IT applications, develop cloud-native applications, standardize on our industry-leading operating system, and automate, secure, and manage complex environments. Award-winning support, training, and consulting services make Red Hat a trusted adviser to the Fortune 500. As a strategic partner to cloud providers, system integrators, application vendors, customers, and open source communities, Red Hat can help organizations prepare for the digital future.

Forward-Looking Statements

Certain statements contained in this press release may constitute "forward-looking statements" within the meaning of the Private Securities Litigation Reform Act of 1995. Forward-looking statements provide current expectations of future events based on certain assumptions and include any statement that does not directly relate to any historical or current fact. Actual results may differ materially from those indicated by such forward-looking statements. The forward-looking statements included in this press release represent the Company's views as of the date of this press release and these views could change. However, while the Company or its parent International Business Machines Corporation (NYSE:IBM) may elect to update these forward-looking statements at some point in the future, the Company specifically disclaims any obligation to do so. These forward-looking statements should not be relied upon as representing the Company's views as of any date subsequent to the date of this press release.

Red Hat and the Red Hat logo are trademarks or registered trademarks of Red Hat, Inc. or its subsidiaries in the U.S. and other countries. Linux is the registered trademark of Linus Torvalds in the U.S. and other countries.

View source version on businesswire.com: https://www.businesswire.com/news/home/20200803005414/en/

Contacts

Media: Allison Showalter, Red Hat, Inc., ashowalt@redhat.com, +1 919-716-5113


Comprehensive Analysis On Open Source Software Market Based On Types And Application – Owned

Open Source Software Market Forecast 2020-2026

The Global Open Source Software Market research report provides an in-depth analysis of an industry- and economy-wide database for business management that could potentially offer development and profitability for players in this market. This latest report covers the current COVID-19 impact on the market. The coronavirus (COVID-19) pandemic has affected every aspect of life globally and has brought several changes in market conditions. The rapidly changing market scenario, along with an initial and future assessment of the impact, is covered in the report. It offers critical information pertaining to the current and future growth of the market and focuses on technologies, volume, and materials, with in-depth analysis of the market. The study has a section dedicated to profiling key companies in the market along with the market shares they hold.

The report covers trends that are anticipated to impact the growth of the Open Source Software Market during the forecast period between 2020 and 2026. An evaluation of these trends is included in the report, along with related product innovations.

Get a PDF Copy of the Sample Report for free @ https://www.upmarketresearch.com/home/requested_sample/33063

The Report Covers the Following Companies:

Open source licenses allow users to use, change, and distribute the software to anyone and for any purpose. New software industry fields such as cloud, big data, and IoT (Internet of Things) increasingly use open source and expand its range. Globally, the introduction and application of open source continue to grow, and there is strong competition in the worldwide market. In 2017 the global Open Source Software market size was xx million US$ and it is expected to reach xx million US$ by the end of 2025, with a CAGR of xx% during 2018-2025. The key players covered in this report are profiled in the study.

By Types: Shareware, Bundled Software, BSD (Berkeley Source Distribution)

By Applications: BMForum, phpBB, PHPWind

Furthermore, the report includes the growth rate of the global market, consumption tables, facts, figures, and statistics of key segments.

By Regions:

Grab Your Report at an Impressive Discount! Please click here @ https://www.upmarketresearch.com/home/request_for_discount/33063

Years Considered to Estimate the Market Size:
History Year: 2015-2019
Base Year: 2019
Estimated Year: 2020
Forecast Year: 2020-2026

Important Facts about Open Source Software Market Report:

What Our Report Offers:

Make an Inquiry of This Report @ https://www.upmarketresearch.com/home/enquiry_before_buying/33063

About UpMarketResearch: Up Market Research (https://www.upmarketresearch.com) is a leading distributor of market research reports with more than 800 global clients. As a market research company, we take pride in equipping our clients with insights and data that hold the power to truly make a difference to their business. Our mission is singular and well-defined: we want to help our clients envisage their business environment so that they are able to make informed, strategic and therefore successful decisions for themselves.

Contact Info: UpMarketResearch
Name: Alex Mathews
Email: [emailprotected]
Organization: UpMarketResearch
Address: 500 East E Street, Ontario, CA 91764, United States.


Government and IBM strike deal for public cloud access – Digital Health

The Government has struck a deal with IBM to provide public sector organisations including the NHS with quicker access to cloud services.

The three-year public cloud agreement between Crown Commercial Service, the UK Cabinet Office and IBM will allow public sector organisations to innovate with digital solutions and services.

Under the agreement, all Central Government organisations, including local authorities, education and the NHS, can benefit from preferential commercial terms to help transform mission-critical workloads and develop new services.

It will allow the public sector to capitalise on the speed and agility of the public cloud while balancing the need for compliance and security.

Simon Tse, chief executive of Crown Commercial Service (CCS), said: "This agreement with IBM provides great value for public sector organisations as they continue to innovate and improve essential services for citizens throughout the UK."

IBM's public cloud is built on a foundation of open source software with more than 190 cloud-native APIs, such as AI, blockchain, Internet of Things, serverless and DevOps.

This will give organisations greater flexibility to access services without vendor lock-in, helping the public sector manage higher-value technology alongside key issues like data, security, services, and workflows.

Organisations will have access to a suite of solutions including IBM Multicloud Manager, IBM Cloud Paks, Red Hat OpenShift, Cloud Garages, Power Virtual Servers on Cloud, VMware and Cloud Migration Services.

Janine Cook, vice president for public sector at IBM UK and Ireland, said: "As the public sector continues its rapid digital transformation, government organisations crossing many industries need a reliable, resilient and secure technology environment to meet the needs of citizens and address complex security and regulatory requirements.

"An open hybrid cloud platform, built and managed with IBM's deep industry expertise, can allow the public sector to accelerate its innovation and offer a more agile way to develop new digital services and take the next step along their cloud journeys."

This follows Google Cloud signing a Memorandum of Understanding (MoU) with the UK government in June 2020, which will see public bodies, such as the NHS, benefit from discounts on products.


The Value Proposition for Open Source Software at the Edge – ARC Viewpoints

Overview

It is becoming increasingly clear that for companies to make the full digital transformation and conduct their manufacturing and business operations across the Industrial IoT, cloud computing, and the operational intelligent edge, they must rely on a more open environment. This includes open source software, open standards, and certain degrees of interoperability, all with unique characteristics and distinct differences. The expansion and proliferation of open source organizations, such as the Linux Foundation and the OpenStack Foundation, attest to how much open source projects have become mainstream and a primary driver of IoT and intelligent edge applications and solutions.

The overall architecture of the digital transformation and the technologies, products and services, and business strategies that define it are gelling, with clear and distinct delineations. These include:

Much of this is being accomplished through open source projects and the suppliers and users that are developing applications and solutions for the Industrial IoT.

To operate successfully in this landscape, industrial and other organizations will require an understanding of open source software, its benefits and limitations, and how to leverage the value proposition of an open source development environment. For the most part, open source software and the Industrial IoT development community that uses it adhere to the open standards movement that evolved along with the software. Open standards often are a requirement for doing business; open source, in turn, is a choice made by users.

An IoT solution is an amalgamation of hardware, software, and networking capabilities. Emerging technologies and IoT devices will lead to the creation of more IoT applications for both consumer and industrial use cases. Edge computing is required for any successful and optimized IoT solution. There are various open source frameworks available in the IoT space, and cloud service providers also offer rich services for IoT solutions and edge computing. ARC recommends IoT solutions that are loosely coupled, modular, platform-independent, and based on open standards.

Data is at the heart of an IoT system and should be processed as quickly as possible so the system operates efficiently and as designed. An IoT system can generate very large amounts of data within seconds, depending on the various IoT devices assembled under the system to meet the business needs. The huge amounts of data generated from IoT devices or sources can easily consume the available network bandwidth and require excess data storage. It is crucial to aggregate and digitize the data at the periphery of the system, which can then be communicated to back-end systems. Today, with the emergence of AI algorithms embedded in chips, processing can take place at the edge, significantly reducing the need to transport data to edge computing server layers.

Edge computing handles this, helping reduce the amount of data transported across the cloud computing infrastructure. These edge computing systems reside close to the IoT devices and data sources and enforce the required security. A major benefit of edge computing is that it improves time to action, reduces response time, and optimizes the use of network resources. It also helps reduce latency and network bottlenecks.
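To make the bandwidth point concrete, here is a minimal, hypothetical Python sketch (not from the report) of edge-side aggregation: raw sensor samples are summarized locally so that only a compact record crosses the network. The read_sensor and send_to_backend functions are invented placeholders standing in for a real device driver and uplink.

```python
import random
import statistics

def read_sensor():
    # Simulated temperature reading; a real edge node would query the device driver here.
    return 20.0 + random.uniform(-0.5, 0.5)

def send_to_backend(record):
    # Stand-in for the real uplink (e.g. an MQTT publish or HTTPS POST to the back end).
    print("forwarding summary:", record)

def aggregate_window(samples_per_window=60):
    """Collect a window of raw readings and forward only a compact summary."""
    window = [read_sensor() for _ in range(samples_per_window)]
    summary = {
        "count": len(window),
        "mean": round(statistics.mean(window), 3),
        "min": round(min(window), 3),
        "max": round(max(window), 3),
    }
    send_to_backend(summary)  # one small record instead of 60 raw samples

if __name__ == "__main__":
    aggregate_window()
```

Whether the local summary is a mean, a histogram, or an AI inference result, the pattern is the same: decide at the periphery and forward only what the back-end systems actually need.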

Unlike proprietary software, open source technologies are completely customizable and scalable: because the code is open, it can be adjusted and modified to the business's needs. Assuming the necessary toolkit is provided, OSS allows developers and enterprises to move between different frameworks without complications.

With a great number of automated protocols and functions, open source frameworks can save time for IoT engineers and technology professionals. While data privacy and security are among the primary concerns of any business, companies need to be aware of some remaining challenges when using OSS. These include:

ARC Advisory Group clients can view the complete report at ARC Client Portal

If you would like to buy this report or obtain information about how to become a client, please Contact Us

Keywords: Open Source Software (OSS), Edge Computing, Linux Foundation, OpenStack Foundation, IoT, Open Source Frameworks, ARC Advisory Group.


Hindi user guide, film with barcode: Here's how Github is archiving code for a thousand years – The Indian Express

Written by Nandagopal Rajan | New Delhi | Updated: July 31, 2020 6:56:41 pm

A few centuries from now, someone could dig up a silver halide film plate in an ancient coal mine in the Arctic Circle and stare at the code of Microsoft MS-DOS with the same curiosity millennials today have for Egyptian hieroglyphics. The Github Archive Program at the Arctic Vault in Norway has just been loaded with a backup of the open source software repository to last at least 1,000 years.

"We believe it is worthwhile to preserve all the open source software because so much of our life today depends on open source software, whether it is my cell phone that I use to order groceries or watch a movie or communicate with my family and my friends all around the world," reasoned Thomas Dohmke, Vice President for Special Projects at GitHub.

In a video call with indianexpress.com, Dohmke said the vault in the archipelago of Svalbard, next to the Global Seed Vault, was part of Github's layered approach to archiving all open source software. The vault deep in the permafrost forms the cold layer, where a snapshot taken on February 2, 2020 (2/2/2020) has been stored. Dohmke said the backup taken that day was saved on a couple of hard drives and shipped to Github's Norwegian partner Piql for printing on the film reels. "Two weeks back, they shipped the files to the vault," he said, expressing a bit of disappointment at not being able to be there for the occasion because of the Covid travel restrictions. There is also a hot layer, which is a live streaming backup. In the warm layer, backups are saved monthly and quarterly.

But planning an archive to last a millennium is much more than a tech challenge. While you need to keep in mind that the tech of today might make no sense to future generations, there is even more basic stuff to consider, like the language to use to future-proof the concept. "At GitHub we are all software developers and not archiving experts. So we looked for a panel of advisors and partners who have been advising us for the last year on what the right approaches to archiving data would be," explained Dohmke.

So Github is now working with archaeologists, archivists, linguists and scientists to figure out the best way to approach the problem at hand. It also gets help from the Long Now Foundation, the Internet Archive, the Software Heritage Foundation, Arctic World Archive, Microsoft Research, the Bodleian Library, and Stanford Libraries, he added.

"We looked at who else was doing something similar and found this company in Norway already offering archiving solutions. So we didn't invent the archive, we found someone already doing it." The old coal mine where the vault is situated already holds archives from Unicef and the Vatican Library. It is also on a hill, preventing any eventuality of flooding from, say, rising water levels or melting Arctic ice. The film reels used in the Github archiving project were stress tested to see if they could survive a thousand years.

To ensure that whoever finds these archives after centuries is able to make sense of the project, Github got advisors from the Library of Alexandria in Egypt to help understand the cues archaeologists and linguists will look for. "Another benefit is that the film we have used is readable to the human eye," he explained, so that you do not need compatible tech to make sense of the code itself. Then there is an introductory user guide to the archive, which along with English has Hindi, simplified Chinese, Arabic and Spanish translations. This note gives an insight into what is stored and an index of what is where.

Dohmke said that in a thousand years software development would be very different and people would have forgotten how things are done now, which is why they have provided a tech tree to introduce them to how software development is done today. "It's basically like a whole library of books that explain all those basics. So in theory, you know, in a thousand years you should be able to make good use of all the software in the tech stack that we have today," he explained.

Also, to make it easier to save, the code has been converted into a type of binary barcode that can be decoded without a machine. "Even if they have no idea about software development and no idea about technology, they should be able to understand what's in the archives."

While the archive does have forks of software, they have filtered out some specific file types like binary and exe files. "All open source code that has had any activity in the last year, or some likes, then they are in the archive," he said of what has been added to the archive. All of that adds up to about 180 film reels and 20 terabytes on hard drives.


Open Source Software Market 2020: Applications, Types and Growing Trends in Mar – News by aeresearch

The Global Covid-19 Impact on Open Source Software market report gives a detailed evaluation of all the important aspects related to the marketplace. The analysis of the global Covid-19 Impact on Open Source Software market offers profound insights covering all of the crucial aspects of the market. Moreover, the report offers historical information with future predictions over the forecast period. Various important factors such as market trends, earnings growth patterns, market shares, and demand and supply are contained in almost all market research reports for every industry. A number of the vital facets analyzed in the report include market share, production, key regions, and earnings rate, in addition to key players.

The business intelligence summary of Open Source Software market is a compilation of the key trends leading the business growth related to the competitive terrain and geographical landscape. Additionally, the study covers the restraints that upset the market growth and throws light on the opportunities and drivers that are anticipated to foster business expansion in existing and untapped markets. Moreover, the report encompasses the impact of the COVID-19 pandemic, to impart a better understanding of this industry vertical to all the investors.

Key highlights from COVID-19 impact analysis:

Request Sample Copy of this Report @ https://www.aeresearch.net/request-sample/260243

A gist of the regional landscape:

Other highlights from the Open Source Software market report:

The main questions answered in the report are:

Request Customization on This Report @ https://www.aeresearch.net/request-for-customization/260243


Want to dare to be different? Start here, with SUSE – Armenian Reporter

The old adage in business technology says that "no-one ever got fired for choosing IBM." That type of sentiment may well have held water a generation ago, when technology itself was something of a gamble to invest in, and an area only ever really ventured into by the cutting-edge corporate entity, or the generally foolhardy.

Today, choosing a stalwart solution provider might be relatively safe, but in many cases it's not indicative of the type of agility and foresight that today's businesses need. Of course, the older workhorses of technology (think Salesforce and Oracle) still offer significant value to many. Still, no-one ever chose those platforms and vendors to reflect their differences or to better promote their unique offering to markets that are open for disruption. But conversely, organizations today need not choose technology that no-one's heard of, or that's unproven. There's a significant path being trodden by large, well-established companies that give organizations the type of agility and scalability they crave, without sacrificing stability, security, interoperability, and standards.

In this new age of software and hardware, open-source vendors provide cutting-edge technological platforms on which the next generation of businesses can base their offerings. But underpinning those platforms are the support, security, and developmental momentum that allow companies to become something different, without going out on a limb.

In the webinar above (sign up in the embedded form), the type of tech we're talking about gets showcased by SUSE, one of the world's biggest providers of open source solutions. Open source software already runs the internet, powers the vast majority of all cloud services, and is the technology on which companies base their entire technology stacks. Twenty years ago, people were asking, "How can software that's free provide value?" Today's business professionals know that open source solutions are superior. Here's why:

The future of software lies in the community of its users, its developers, testers, security overseers, and visionaries. That broad, varied, and resourceful community is only available in the open source arena. In a webinar presented by Brent Schroeder (CTO, SUSE) and Daniel Nelson (VP of Products & Solutions, SUSE), new business theory concepts are laid out, alongside how the balance is struck between open source software capabilities and a commercial mindset.

With the cloud and the enterprise now being run the open-source way, isn't it time you too learned about the IBM alternatives and dared to be different?


Ethereum (ETH) Down $1.17 Over Past 4 Hours, Started Today Up 0.25%; in an Uptrend Over Past 30 Days – CFDTrading

Ethereum 4 Hour Price Update

Updated July 30, 2020 03:20 AM GMT (11:20 PM EST)

Ethereum came into the current 4-hour candle down 0.37% ($1.17) from the open of the last 4-hour candle, marking the 4th candle in a row in which a decrease has occurred. On a relative basis, the last 4-hour candle was pretty good: Ethereum bested all 5 of the assets in the Top Cryptos class.

The choppiness in the recent daily price action of Ethereum continues; to start today, it came in at a price of 318.27 US dollars, up 0.25% ($0.80) since the previous day. The change in price came alongside a change in volume that was down 20.46% from the previous day, but up 47.08% from the Wednesday of last week. Relative to other instruments in the Top Cryptos asset class, Ethereum ranked 3rd since the previous day in terms of percentage price change. Let's take a look at the daily price chart of Ethereum.

The clearest trend exists on the 14-day timeframe, which shows price moving up over that time. For additional context, note that price has gone up 10 out of the past 14 days. As for those who trade off of candlesticks, we should note that we're seeing a pin bar pattern appearing here.
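For readers who want to reproduce the basic arithmetic behind figures like the daily percentage change or the count of up days, here is a short illustrative Python sketch. It is not the site's methodology, and the price series is made up, apart from being anchored to the 318.27 close and the roughly $0.80 daily move quoted above.

```python
def percent_change(previous_close, current_close):
    # Simple day-over-day percentage change.
    return (current_close - previous_close) / previous_close * 100.0

def count_up_days(closes):
    # Number of days that closed higher than the previous day.
    return sum(1 for prev, cur in zip(closes, closes[1:]) if cur > prev)

# Hypothetical closes spanning a 14-day window (15 data points, 14 daily changes).
closes = [289.10, 292.40, 290.80, 295.00, 298.30, 297.10, 301.60, 305.20,
          303.90, 308.40, 310.00, 307.50, 312.20, 317.47, 318.27]

print(f"latest daily change: {percent_change(closes[-2], closes[-1]):+.2f}%")
print(f"up days in window: {count_up_days(closes)} of {len(closes) - 1}")
```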

For laughs, fights, or genuinely useful information, let's see what the most popular tweets pertaining to Ethereum for the past day were:

This month I have:
- Bought an S&P 500 option
- Earned interest at 750%
- Borrowed money for free
- Made markets
- Provided liquidity
- Traded on margin
- Made a USD business loan payment
- Joined a new farming DAO community
And much more! All on Ethereum, only on Ethereum

If you work full time on Ethereum or Ethereum-based apps, and are still overweight BTC vs ETH, I apologize in advance for the FOMO you're going to feel over the next 1-2 years. Those of us using this thing every day can see the value of what's being created here.

Burning ETH (on top of the EIP-1559 fee burn) is a simple incentive-aligned way for dApps to rally the Ethereum marketing machine for extra growth. Total Value Burnt (TVB) may become a prime DeFi metric alongside Total Value Locked (TVL).

For a longer news piece related to ETH that's been generating discussion, check out:

Ethereum's First ICO Blazes Trail To A World Without Bosses

If it weren't for horses, Joey Krug might not have ever gotten into ethereum. "They said, 'No, but if you go there and you muck stalls every day for a year, we'll get you a horse,'" he says. Now 26, Krug is the co-chief investment officer at Pantera Capital and a cofounder of Augur, an open-source, no-limits betting platform built on the ethereum blockchain that lets anyone build any kind of betting market, without a bookie. Today, Krug and a team of open-source developers scattered around the world launched Version 2 of that platform, which amounts to a significant leap forward in the world of decentralized applications that function similar to the internet but without the need for trusted third parties. "It's sort of like public infrastructure." The privately held Forecast Foundation, based in Estonia, sold or distributed 11,000 REP tokens to be used on Augur, 80% of which went to the crowd, or people interested in participating in the prediction market, 16% of which went to the Augur founding team, including Buterin, and 4% of which went to support the foundation itself. "Someday the foundation will run out of money and basically, kind of, disappear, and this becomes an ongoing community-developed open source software project," he says. "At which point, we could maybe create a for-profit entity on top that does actually try to aggressively make money."


Ask Hackaday: Why Did GitHub Ship All Our Software Off To The Arctic? – Hackaday

If you've logged onto GitHub recently and you're an active user, you might have noticed a new badge on your profile: "Arctic Code Vault Contributor". Sounds pretty awesome, right? But whose code got archived in this vault, how is it being stored, and what's the point?

On February 2nd, GitHub took a snapshot of every public repository that met one of the following criteria:

Then they traveled to Svalbard, found a decommissioned coal mine, and archived the code in deep storage underground, but not before they made a very cinematic video about it.

For the combination of longevity, price and density, GitHub chose film storage, provided by piql.

There's nothing too remarkable about the storage medium: the tarball of each repository is encoded on standard silver halide film as a 2D barcode, which is distributed across frames of 8.8 million pixels each (roughly 4K). Whilst officially rated for 500 years, the film should last at least 1,000 years.
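piql's actual format isn't documented in detail here, so the following Python sketch is only a rough illustration of the general idea: a repository tarball's bytes are split into frame-sized 2D bit matrices. The 4096 x 2160 frame geometry (about 8.8 million pixels) and the one-bit-per-pixel assumption are ours for illustration; the real encoding adds error correction, headers and human-readable leader frames that are not modelled.

```python
# Illustrative only: split a tarball's bytes into frame-sized 2D bit matrices.
# Assumes 1 bit per pixel and a hypothetical 4096 x 2160 frame (~8.8 million pixels).

FRAME_W, FRAME_H = 4096, 2160           # assumed frame geometry
BITS_PER_FRAME = FRAME_W * FRAME_H      # one bit per pixel assumed
BYTES_PER_FRAME = BITS_PER_FRAME // 8

def frames_from_bytes(data: bytes):
    """Yield one frame at a time as a list of rows of 0/1 pixel values."""
    for offset in range(0, len(data), BYTES_PER_FRAME):
        chunk = data[offset:offset + BYTES_PER_FRAME]
        bits = []
        for byte in chunk:
            bits.extend((byte >> shift) & 1 for shift in range(7, -1, -1))
        bits.extend([0] * (BITS_PER_FRAME - len(bits)))  # pad the final frame
        yield [bits[row * FRAME_W:(row + 1) * FRAME_W] for row in range(FRAME_H)]

if __name__ == "__main__":
    payload = b"example repository tarball contents " * 1000
    n_frames = sum(1 for _ in frames_from_bytes(payload))
    print(f"{len(payload)} bytes -> {n_frames} frame(s)")
```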

You might imagine that all of GitHub's public repositories would take up a lot of space when stored on film, but the data turns out to be only 21TB when compressed. This means the whole archive fits comfortably in a shipping container.

Each reel starts with slides containing an un-encoded, human-readable text guide in multiple languages, explaining to future humanity how the archive works. If you have five minutes, reading the guide and how GitHub explains the archive to whoever discovers it is good fun. It's interesting to see the range of future knowledge the guide caters to: it starts by explaining in very basic terms what computers and software are, despite the fact that de-compression software would be required to use any of the archive. To bridge this gap, they are also providing a Tech Tree, a comprehensive guide to modern software, compilation, encoding, compression, etc. Interestingly, whilst the introductory guide is open source, the Tech Tree does not appear to be.

But the question bigger than how GitHub did it is why did they do it?

The mission of the GitHub Archive Program is to preserve open source software for future generations.

GitHub talks about two reasons for preserving software like this: historical curiosity and disaster. Let's talk about historical curiosity first.

There is an argument that preserving software is essential to preserving our cultural heritage. This is an easily bought argument, as even if you're in the camp that believes there's nothing artistic about a bunch of ones and zeros, it can't be denied that software is a platform and medium for an incredibly diverse amount of modern culture.

GitHub also cites past examples of important technical information being lost to history, such as the search for the blueprints of the Saturn V, or the discovery of the Roman mortar which built the Pantheon. But data storage, backup, and networks have evolved significantly since the Saturn V's blueprints were produced. Today people frequently quip, "once it's on the internet, it's there forever." What do you reckon? Do you think the argument that software (or rather, the subset of software which lives in public GitHub repos) could be easily lost in 2020+ is valid?

Whatever your opinion, simply preserving open source software on long timescales is already being done by many other organisations, and it doesn't require an arctic bunker. For that we have to consider GitHub's second motive: a large-scale disaster.

We can't predict what apocalyptic disasters the future may bring; that's sort of the point. But if humanity gets into a fix, would a code vault be useful?

Firstly, let's get something straight: in order for us to need to use a code archive buried deep in Svalbard, something needs to have gone really, really wrong. Wrong enough that things like softwareheritage.org, the Wayback Machine, and countless other conventional backups aren't working. So this would be a disaster that has wiped out the majority of our digital infrastructure, including worldwide redundancy backups and networks, requiring us to rebuild things from the ground up.

This begs the question: if we were to rebuild our digital world, would we make a carbon copy of what already exists, or would we rebuild from scratch? There are two sides to this coin: could we rebuild our existing systems, and would we want to rebuild our existing systems.

Tackling the former first: modern software is built upon many, many layers of abstraction. In a post-apocalyptic world, would we even be able to use much of the software with our infrastructure and lower-level services wiped out? To take a random, perhaps tenuous example, say we had to rebuild our networks, DNS, ISPs, etc. from scratch. Inevitably behavior would be different, nodes and information missing, and so software built on layers above this might be unstable or insecure. To take more concrete examples, this problem is greatest where open-source software relies on closed-source infrastructure: AWS, 3rd party APIs, and even low-level chip designs that might not have survived the disaster. Could we reimplement existing software stably on top of re-hashed solutions?

The latter point, whether we would want to rebuild our software as it is now, is more subjective. I have no doubt every Hackaday reader has one or two things they might change about, well, almost everything, but can't due to existing infrastructure and legacy systems. Would the opportunity to rebuild modern systems be able to win out over the time cost of doing so?

Finally, you may have noticed that software is evolving rather quickly. Being a web developer today who is familiar with all the major technologies in use looks pretty different from the same role 5 years ago. So does archiving a static snapshot of code make sense given how quickly it would be out of date? Some would argue that throwing around numbers like 500 to 1000 years is pretty meaningless for reuse if the software landscape has completely changed within 50. If an apocalypse were to occur today, would we want to rebuild our world using code from the 80s?

Even if we weren't to directly reuse the archived code to rebuild our world, there are still plenty of reasons it might be handy when doing so, such as referring to the logic implemented within it, or the architecture, data structures and so on. But these are just my thoughts, and I want to hear yours.

The thought that there is a vault in the Arctic directly containing code you wrote is undeniably fun to think about. What's more, your code will now almost certainly outlive you! But do you, dear Hackaday reader, think this project is a fun exercise in sci-fi, or does it hold real value for humanity?


CRM Startup vcita Opens Its Platform to Developers For Shared Innovation – Yahoo Finance

vcita, a customer relationship management (CRM) startup that develops cloud solutions for small businesses, has opened its platform to developers. The move enables SMBs to create custom applications that tap into vcita's suite of business management tools, driving greater efficiencies and fueling innovation.

vcita has been on a roll over the past quarter, partnering with Mastercard (NYSE:MA) to launch a platform for businesses embracing digital and integrating with Square's (NYSE:SQ) payment infrastructure. The decision to effectively decentralize vcita's platform, freeing small businesses and developers to build upon it, will bring vcita's CRM solution to a broader audience, while empowering SMBs to embrace digital tools that can unlock new capabilities and allow them to better serve their customers.

A newly launched developer hub will grant access to vcita's APIs, with detailed documentation and tutorials guiding devs through the process of building upon the CRM startup's framework.

While open source software is a staple of the computing industry, it's uncommon for startups to open their proprietary technology to third parties. vcita seems confident, however, that any drawbacks of doing so will be more than countered by the benefits of spurring a wave of innovation that will increase adoption of vcita's CRM tools while placing advanced cloud software in the hands of SMBs.

As vcita CBDO Adi Engel explains, "We firmly believe that forming strategic alliances is the key to realizing our mission to include SME owners in the global shift towards a more digital economy." The sorts of tools that small businesses will be able to access via vcita's APIs include solutions for scheduling, payment collection, and marketing campaigns. Not only will this help small businesses to go fully digital, but it will free them from expensive SaaS subscriptions that can quickly stack up, eroding profits.
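The article does not document the actual endpoints, so the Python sketch below is purely hypothetical: it only shows the general shape of calling a REST-style scheduling API with a bearer token. The base URL, path, payload fields and token are invented placeholders, not vcita's real API; the developer hub remains the authoritative reference.

```python
# Purely hypothetical sketch of calling a REST-style scheduling API.
# The base URL, path, fields and token below are invented placeholders.

import json
import urllib.request

API_BASE = "https://api.example-crm.invalid/v1"   # placeholder base URL, not a real endpoint
API_TOKEN = "REPLACE_WITH_REAL_TOKEN"             # placeholder credential

def create_appointment(client_id: str, start_iso: str, duration_min: int):
    """POST a new appointment and return the decoded JSON response."""
    payload = json.dumps({
        "client_id": client_id,
        "start": start_iso,
        "duration_minutes": duration_min,
    }).encode("utf-8")
    request = urllib.request.Request(
        f"{API_BASE}/appointments",               # hypothetical path
        data=payload,
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))
```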

vcita's decision to open up its technology arrives at a time when businesses the world over are embracing remote working after the effects of the coronavirus forced their hand. Many have found the arrangement favorable, for employers and employees alike. Google (NASDAQ:GOOG), for example, has just announced that its remote working program has been extended until the middle of 2021. The transition to remote working has increased demand for scheduling software and similar CRM tools developed by startups like vcita.

Research shows that employees who work remotely are happier and more productive. While that may not have been paramount in vcita's move to an open framework, it's evidence of the inexorable shift to digital, and the need for purpose-built tools that can keep teams in the loop and help businesses adapt to a rapidly changing environment. Having thrown down the gauntlet to developers eager to build solutions that align with this narrative, vcita is aiming to accelerate adoption of software suited to a distributed workforce and global customer base.

Disclosure: No positions.
