Under pressure: Managing the competing demands of development velocity and application security – Security Boulevard

Nearly 50% of development teams knowingly release vulnerable code. Learn why vulnerabilities are overlooked and how you can improve application security.

The first software development team I worked on operated on the following mantra:

Make it work. Then, make it fast. Then, make it elegant (maybe).

Meaning, don't worry about performance optimizations until your code actually does what it's supposed to do, and don't worry about code maintainability until after you know it both works and performs well. Users generally have no idea how maintainable the code is, but they do know if the application is broken or slow. So more often than not, we'd never get around to refactoring the code, at least not until the code debt started to impact application reliability and performance.

Today that developer mantra has two additional lines:

Ship it sooner. And while you're at it, make it secure.

As with application performance and reliability, delivering an application on time is easily quantified and observed. Everybody knows when you miss a deadline, something that's easy to do when your release cycles are measured in weeks, days, or even hours. But the security of an application isn't so easily observed or quantified, at least not until there's a security breach.

It should come as no surprise, then, that nearly half of the respondents to the modern application development security survey, conducted by Enterprise Strategy Group (ESG), state that their organizations regularly push vulnerable code to production. It's also not surprising that for over half of those teams, tight delivery schedules and critical deadlines are the main contributing factor. In the presence of a deadline, what can be measured is what's going to get done, and what can't be (or at least isn't) measured often doesn't.

However, "we don't have time to do it" doesn't really cut it when it comes to application security. This is demonstrated by the 60% of respondents who reported that their applications have suffered OWASP Top 10 exploits during the past 12 months. The competing demands of short release cycles and improved application security are a real challenge for development and security teams.

It doesn't have to be this way, and other findings in the survey report point to opportunities that teams have to both maintain development velocity and improve application security. Here are just a few:

Reject silver bullets. Gone are the days of security teams simply running DAST and penetration tests at the end of development. A consistent trend shown in the report is that teams are leveraging multiple types of security testing tools across the SDLC to address different forms of risk in both proprietary and open source code.

Integrate and automate. Software development is increasingly automated, and application security testing needs to be as well. Over half the respondents indicated that their security controls are highly integrated into their DevOps processes, with another 38% saying they are heading down that same path.

Train the team. Most developers lack sufficient application security knowledge to ensure their code isn't vulnerable. Survey respondents indicated that developer knowledge is a challenge, as is consistent training. Without sufficient software security training, developers struggle to address the findings of application security tests. An effective way to remedy this is to provide just-in-time security training delivered through the IDE with a solution like Code Sight.

Keep score. If what gets measured gets done, then it's important to measure the progress of both your AppSec testing and security training programs. This includes tracking the introduction and mitigation of security bugs as well as improvements to both of these metrics over time, i.e., who is writing secure code and who isn't, and are they improving?
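To make the scorecard idea concrete, here is a minimal sketch of tracking net security bugs per developer over time. The findings data and the simple net-bugs metric are hypothetical, invented for illustration; they are not a methodology prescribed by ESG or by any particular AppSec tool.

```python
from collections import defaultdict

# Hypothetical findings from scan reports: (author, sprint, bugs_introduced, bugs_fixed).
findings = [
    ("alice", 1, 4, 1), ("alice", 2, 2, 3),
    ("bob",   1, 6, 0), ("bob",   2, 5, 1),
]

scorecard = defaultdict(list)
for author, sprint, introduced, fixed in sorted(findings, key=lambda f: f[1]):
    scorecard[author].append(introduced - fixed)  # net new security bugs per sprint

for author, trend in scorecard.items():
    direction = "improving" if trend[-1] < trend[0] else "needs attention"
    print(f"{author}: net security bugs per sprint {trend} -> {direction}")
```

Even a crude metric like this answers the two questions the report poses: who is writing secure code, and are they improving?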

There are a number of other interesting findings and recommendations in the survey report, and they can help your team manage the competing pressures of release schedules and application security. You can check it out here, and you can also learn more by joining our upcoming webinar, Under Pressure: Building Security Into Application Development, where I'll be interviewing the survey report's author, Dave Gruber, senior analyst at Enterprise Strategy Group.


W3C Drops WordPress from Consideration for Redesign, Narrows CMS Shortlist to Statamic and Craft – WP Tavern

The World Wide Web Consortium (W3C), the international standards organization for the web, is redesigning its website and will soon be selecting a new CMS. Although WordPress is already used to manage W3C's blog and news sections of the website, the organization is open to adopting a new CMS to meet its list of preferences and requirements.

Studio 24, the digital agency selected for the redesign project, narrowed their consideration to three CMS candidates: WordPress, Statamic, and Craft CMS.

Studio 24 was aiming to finalize their recommendations in July but found that none of the candidates complied with W3C's authoring tool accessibility guidelines. The CMSs that complied better with the guidelines were not as well suited to the other project requirements.

In the most recent project update posted to the site, Studio 24 reported they have shortlisted two CMS platforms. Coralie Mercier, Head of Marketing and Communications at W3C, confirmed that these include Statamic and Craft CMS.

WordPress was not submitted to the same review process, as the Studio 24 team claims to have extensive experience working with it. In the summary of their concerns, Studio 24 cited Gutenberg, accessibility issues, and the fact that the Classic Editor plugin will stop being officially maintained on December 31, 2021:

First of all, we have concerns about the longevity of WordPress as we use it. WordPress released a new version of their editor in 2018: Gutenberg. We have already rejected the use of Gutenberg in the context of this project due to accessibility issues.

If we choose to do away with Gutenberg now, we cannot go back to it at a later date. This would amount to starting from scratch with the whole CMS setup and theming.

Gutenberg is the future of WordPress. The WordPress core development team keeps pushing it forward and wants to roll it out to all areas of the content management system (navigation, sidebar, options etc.) as opposed to limiting its use to the main content editor as is currently the case.

This means that if we want to use WordPress long term, we will need to circumvent Gutenberg and keep circumventing it for a long time and in more areas of the CMS as time goes by.

Another major factor in the decision to remove WordPress from consideration was that they found no elegant solution to content localization and translation.

Studio 24 also expressed concerns that tools like ACF, Fewbricks, and other plugins might not be maintained for the Classic Editor experience in the context of a widespread adoption of Gutenberg by users and developers.

More generally, we think this push to expand Gutenberg is an indication of WordPress focusing on the requirements of their non-technical user base as opposed to their audience of web developers building custom solutions for their clients.

It seems that the digital agency W3C selected for the project is less optimistic about the future of Gutenberg and may not have reviewed recent improvements to the overall editing experience since 2018, including those related to accessibility.

Accessibility consultant and WordPress contributor Joe Dolson recently gave an update on the Gutenberg accessibility audit at WPCampus 2020 Online. He reported that while there are still challenges remaining, many issues raised in the audit have been addressed across the whole interface, and two-thirds of them have been solved. "Overall accessibility of Gutenberg is vastly improved today over what it was at release," Dolson said.

Unfortunately, Studio 24 didn't put WordPress through the same content creation and accessibility tests that it used for Statamic and Craft CMS. This may be because they had already planned to use a Classic Editor implementation and didn't see the necessity of putting Gutenberg through its paces.

These tests involved creating pages with flexible components, which they referred to as "blocks of layout," for things like titles, WYSIWYG text input, and videos. It also involved creating a template for news items where all the content input by the user would be displayed (without formatting).

Gutenberg would lend itself well to these use cases but was not formally tested alongside the other candidates, due to the team citing their extensive experience with WordPress. I would like to see the W3C team revisit Gutenberg for a fair shake against the proprietary CMSs.

The document outlining the CMS requirements for the project states that W3C has a strong preference for an open-source license for the CMS platform as well as a CMS that is long-lived and easy to maintain. This preference may be due to the economic benefits of using a stable, widely adopted CMS, or it may be inspired by the undeniable symbiosis between open source and open standards.

The industry has learned by experience that the only software-related standards to fully achieve [their] goals are those which not only permit but encourage open source implementations. Open source implementations are a quality and honesty check for any open standard that might be implemented in software

WordPress is the only one of the three original candidates to be distributed under an OSD-compliant license. (CMS code available on GitHub isn't the same.)

Using proprietary software to publish the open standards that underpin the web isn't a good look. While proprietary software makers are certainly capable of implementing open standards, regardless of licensing, there are a myriad of benefits for open standards in the context of open source usage:

The community of participants working with OSS may promote open debate resulting in an increased recognition of the benefits of various solutions and such debate may accelerate the adoption of solutions that are popular among the OSS participants. These characteristics of OSS support evolution of robust solutions are often a significant boost to the market adoption of open standards, in addition to the customer-driven incentives for interoperability and open standards.

Although both Craft CMS and Statamic have their code bases available on GitHub, they share similarly restrictive licensing models. The Craft CMS contributing document states:

Craft isn't FOSS. Let's get one thing out of the way: Craft CMS is proprietary software. Everything in this repo, including community-contributed code, is the property of Pixel & Tonic.

That comes with some limitations on what you can do with the code:

You can't change anything related to licensing, purchasing, edition/feature-targeting, or anything else that could mess with our alcohol budget. You can't publicly maintain a long-term fork of Craft. There is only One True Craft.

Statamic's contributing docs have similar restrictions:

Statamic is not Free Open Source Software. It is proprietary. Everything in this and our other repos on Github including community-contributed code is the property of Wilderborn. For that reason there are a few limitations on how you can use the code:

Projects with this kind of restrictive licensing often fail to attract much contribution or adoption, because the freedoms are not clear.

In a GitHub issue requesting Craft CMS go open source, Craft founder and CEO Brandon Kelly said, "Craft isn't closed source; all the source code is right here on GitHub," and claims the license is "relatively unrestrictive as far as proprietary software goes" and that contributing functions in a similar way to FOSS projects. This rationale is not convincing enough for some developers commenting on the thread.

"I am a little hesitant to recommend Craft with a custom open source license," Frank Anderson said. "Even if this was an MIT+ license that added the license and payment, much like React used to have. I am hesitant because the standard open source licenses have been tested."

When asked about the licensing concerns of Studio 24 narrowing its candidates to two proprietary software options, Coralie Mercier told me, "we are prioritizing accessibility." A recent project update also reports that both CMS suppliers W3C is reviewing have engaged positively with authoring tool accessibility needs and have made progress in this area.

Even if you have cooperative teams at proprietary CMSs working on accessibility improvements as the result of this high-profile client, it cannot compare to the massive community of contributors that OSD-compliant licensing enables.

It's unfortunate that the state of open source CMS accessibility has forced the organization to narrow its selections to proprietary software options for its first redesign in more than a decade.

Open standards go hand in hand with open source. There is a mutually beneficial connection between the two that has caused the web to flourish. I don't see using a proprietary CMS as an extension of W3C values, and it's not clear how much more benefit to accessibility the proprietary options offer in comparison. W3C may be neutral on licensing debates, but in the spirit of openness, I think the organization should adopt an open source CMS, even if it is not WordPress.


We now know more about Fall Guys’ anatomy, and it turns out they’re abominations – Windows Central


Fall Guys: Ultimate Knockout has taken the gaming world by storm, already crossing over 2 million copies sold on Steam alone. Its intense and addictive gameplay, where players tackle a wide variety of solo and team-based mini-games to eventually declare a single victor, has led to gamers and streamers alike playing the game endlessly. However, due to popular request (apparently), the team behind Fall Guys: Ultimate Knockout has decided to reveal a little bit more about the mysterious little guys that make up the game's cast: the Fall Guys themselves.

When I said "mysterious little guys" before, I meant "not mysterious enough after this" and "oh wow they're bigger than I imagined" and "I cannot erase this from my mind." According to the "official Fall Guys lore," Fall Guys stand at 6ft tall, with a rather unique skeleton snaking its way through their amorphous bodies. The thing that stands out above everything else, however, is the way their eyes (unassuming from the outside) are precariously attached to a skull recessed far into their heads.

But hey, at least they're smiling.

If this elicited a reaction out of you, or changed your opinion on Fall Guys, be sure to let us know in the comments below! I'm certainly never going to see a swarm of these rushing for a goal in the same manner ever again.

So Fall Guys. Much scary.

Fall Guys: Ultimate Knockout is a wacky battle royale that is inspired by gameshows like Wipeout. Currently available on the PC and PS4, Fall Guys: Ultimate Knockout is sure to keep the attention of both casual and hardcore players alike with its fun and addictive gameplay, as long as you don't think about what Fall Guys are.


State of the software supply chain: Machines will make software faster – TechBeacon

How far away are we from machines making safer software, faster? We might be closer than you think.

Other than ensuring that your people are happy and engaged, digital innovation is the best source of competitiveness and value creation for almost every type of business. As a result, three things are increasingly common among corporate software engineering teams and the 20 million software developers who work for them:

They seek faster innovation.

They seek improved security.

They utilize a massive volume of open-source libraries.

The universal desire for faster innovation demands efficient reuse of code, which in turn has led to a growing dependence on open-source and third-party software libraries. These artifacts serve as reusable building blocks, which are fed into public repositories where they are freely borrowed by millions of developers in the pursuit of faster innovation.

This is the definition of the modern technology supply chain, and more specifically, a software supply chain.

Organizations that invest in securing the best parts, from the fewest and best suppliers, and keeping those components updated, are widening the gap against their competitors. The best-performing organizations are applying automation to help them manage their open-source component choices and updates.

As these practices evolve, machines will become better at guiding developers to the best-quality and most secure component versions. And in the not-too-distant future, machines may be compiling the best components into application code based on functional requirements defined upfront.

Here's why automation should be a key strategy for helping you select open-source components, and other lessons from my team's research.

The 2019 State of DevOps Report found that elite organizations are deploying 200 times more frequently than their peers, and their change failure rates are seven times lower. They're also much faster in mean time to recover from failure than other organizations.

But these kinds of metrics are really focused on how you're doing internally as a development team and don't take into account many external factors.

This reminds me of what Jeff Bezos said in his 2017 letter to shareholders: "Beware of the proxies." You can get so focused on a process, and doing that process well, that it becomes the thing that you're trying to achieve.

You might be trying to achieve faster deployments, faster mean times to recovery, or more secure code releases. They can represent your proxies for success, while not necessarily contributing to the outcome your business is attempting to achieve.

Consider adversaries who attack your code. If you can release new security updates in your codebase within two weeks, but your adversaries can find and exploit the new vulnerabilities in two days, your organization's data is at risk.

In this situation, it does not matter as much that you've already reduced your time to implement security updates fivefold if your adversaries are still faster.

Consider this real-world scenario. On Wednesday, April 29, 2020, the creators and maintainers of SaltStack, an open-source application, announced that the app had a critical vulnerability. On the very same day, they released the safe version of the application. If you had automatic updates turned on from SaltStack, you got the newer version. If you didn't, then you needed to get the newer version, update your infrastructure, and do so before the adversaries found it.

One of the researchers at F-Secure said that "the vulnerability was so critical, this was a patch-by-Friday-or-be-breached-by-Monday kind of situation."

And that's exactly what happened. By Saturday morning, May 2, some 18 people on GitHub reported that breaches were actively happening. They had lost control of their servers. SaltStack had been taken over, rogue code was executing on their systems, and their firewalls were being disabled. Throughout May, 27 breaches were recorded.

But not all of the news is bad: We know that developers are getting faster, too, because they're not writing all of their code themselves.

Figure 1: Number of download requests for Java component releases, 2012 to 2020, from the Central Repository. Source: 2020 State of the Software Supply Chain Report

We're assembling more and more code from open-source components and packages. As one example, it's amazing to look at download volumes for the npm package manager. There were 95 billion npm package downloads in July 2020. If you annualize that download volume, we would see over 1.1 trillion npm package downloads this year.
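The annualized figure is straightforward arithmetic; as a quick sanity check, using the July 2020 number cited above:

```python
monthly_npm_downloads = 95e9            # July 2020 npm downloads cited above
annual = monthly_npm_downloads * 12     # naive annualization of one month's volume
print(f"{annual:.2e}")                  # -> 1.14e+12, i.e. over 1.1 trillion
```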

In Java, similar things are happening. In 2019, Maven Central had 226 billion download requests. In 2020, download request volumes are expected to hit 376 billion.

How do these monstrous numbers translate to your own developers and applications? After analyzing 1,500 unique applications, we can see that 90% of their code footprint is built from open-source software components.

As I started thinking about all of the above, I wanted to understand not just how these parts are being used, but where they are coming from and who the open-source software suppliers are. So, in a two-year-long collaboration, Gene Kim, Stephen Magill, and I examined software release patterns and cybersecurity hygiene practices across 30,000 commercial development teams and open-source projects.

We set out to understand what attributes we could use to identify the best open-source project performance and practices. If development teams were going to assemble applications from these building blocks, we wanted to understand who the best suppliers were.

We wanted to know who released most often, who were the most popular suppliers, who prioritized features over security or security over features, who enlisted automated build tools, which projects were consistently well staffed, and more. All of these variables played a role in identifying suppliers with the best track records, because they would be the ones to help developers build the best applications.

Additionally, the more you could teach machines to identify the attributes of the best open-source software suppliers for developers, the faster development could become.

The top-performing projects released 1.5 times more frequently than the rest of the teams we studied, were 2.5 times more popular by download count, had 1.4 times larger development teams, and managed 2.9 times fewer dependencies.

We also saw a strong correlation between open source projects that updated dependencies more frequently and their ability to maintain more secure code. High-performing projects demonstrated a median time to update (MTTU) their dependencies that was 530 times faster than other projects. By moving to the latest dependencies, they purposely or consequently remediated known vulnerabilities discovered in older dependencies.
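MTTU itself is easy to compute once you have upstream release dates and the dates a project adopted each release. A minimal sketch, with made-up dates for illustration (the report's exact methodology is not reproduced here):

```python
from datetime import date
from statistics import median

# Hypothetical (upstream release date, date the project adopted it) pairs.
updates = [
    (date(2019, 3, 1),  date(2019, 3, 15)),
    (date(2019, 9, 10), date(2019, 11, 2)),
    (date(2020, 2, 20), date(2020, 3, 1)),
]

def mttu_days(pairs):
    """Median time to update: days between an upstream release and its adoption."""
    return median((adopted - released).days for released, adopted in pairs)

print(mttu_days(updates))  # -> 14
```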

Figure 2: Open-source project cluster analysis of popularity and release speed. Source: 2019 State of the Software Supply Chain Report

To better understand all this, we performed a cluster analysis of these different open-source projects based on several attributes. We were able to see what development teams should focus on when choosing components.

Choosing open-source projects should be considered an important strategic decision for enterprise software development organizations. Different components demonstrate healthy or poor performance that affects the overall quality of their releases.

Therefore, MTTU should be an important metric when deciding which components to use within your software supply chains. Rapid MTTU is associated with lower security risk, and it's accessible from public sources.

Just as traditional manufacturing supply chains intentionally select parts from approved suppliers and rely upon formalized procurement practices, enterprise development teams should adopt similar criteria for their selection of open-source software components.

This practice ensures that the highest-quality parts are selected from the best and fewest suppliers. Implementing selection criteria and updated practices will not only improve code quality, but can accelerate mean time to repair when suppliers discover new defects or vulnerabilities.

Ideally, dependencies should be updated simply, safely, and painlessly, and as part of the routine development process. But reality shows that this ideal is rarely met.

An astonishing story of how far an organization can stray from ideal update practices comes from Eileen M. Uchitelle, staff engineer at GitHub, who said it took seven years to successfully migrate GitHub from a forked version of Rails 2 to Rails 5.2.

Even with new tools available to developers that automatically create pull requests with updated dependencies, changes in APIs and potential breakage can still hold back many developers from updating. We suspect this change-induced breakage is a primary driver of poor updating practices.

Taking a deeper dive into the vast data available to us from the Central Repository, the world's largest collection of open-source components, you can better visualize open-source project releases and their adoption by enterprise application development teams that migrate from one version to a newer one. We believe this data shows how open-source component selection can play a major role in allowing for easier and more frequent updates.

Figure 3: Migration patterns between component releases for the joda-time library. Source: 2020 State of the Software Supply Chain Report

Consider the widely used joda-time library, which shows that developers using this open-source component update fairly uniformly between all pairs of versions. This suggests that updates are easy, presenting a seemingly homogeneous set of versions to migrate to and from.

Figure 4: Migration patterns between component releases for the hibernate-validator library. Source: 2020 State of the Software Supply Chain Report

On the opposite extreme, consider the graph for the hibernate-validator library, where there are two sets of communities using it: one favoring version 5 and another preferring version 6. The two communities very rarely intersect. This suggests either that updating to version 6 from version 5 is too difficult, or that the value is not worth the effort.

Figure 5: Migration patterns between component releases for the spring-core library. Source: 2020 State of the Software Supply Chain Report

Finally, we take a look at the pattern for spring-core, which suggests that updating is sufficiently difficult that the effort must be planned and some version ranges end up being avoided.

If you are a developer, don't worry; your job is secure. No machine out there will take your place. Having said that, an increased reliance on automation to help you select better, higher-quality, and more secure components can serve you and your teams well today.

You can use automation, through advanced software composition analysis and open-source-governance tools, to point to better suppliers with a better track record; for instance, they release often, update vulnerabilities quickly, are well staffed, and are popular.

Using these tools to set policies around components can help you determine when to upgrade your dependencies, and they can quickly inform you of newly discovered vulnerabilities in need of remediation. Additionally, these tools can lead developers to the best versions of components, indicating which newer versions will introduce the fewest breaking changes or introduce troublesome dependencies.
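As an illustration of the kind of policy such tools automate, here is a toy component check. The metadata fields and thresholds are hypothetical, invented for this sketch, and do not reflect any specific tool's schema:

```python
# Hypothetical component metadata of the sort composition-analysis tools
# aggregate from public sources.
component = {
    "name": "example-lib",
    "known_vulnerabilities": 0,
    "median_time_to_update_days": 21,   # the supplier's MTTU
    "active_maintainers": 5,
}

# Toy selection policy: no known vulnerabilities, a responsive supplier,
# and a project that is not a one-person effort.
POLICY = {"max_vulns": 0, "max_mttu_days": 90, "min_maintainers": 2}

def passes_policy(c, p):
    return (c["known_vulnerabilities"] <= p["max_vulns"]
            and c["median_time_to_update_days"] <= p["max_mttu_days"]
            and c["active_maintainers"] >= p["min_maintainers"])

print(passes_policy(component, POLICY))  # -> True
```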

To learn more about our research into high-performance, open-source component-based development, read the 2020 State of the Software Supply Chain Report or attend my upcoming session on this topic at the DevOps World virtual conference, which runs from September 22-24, 2020.


Keep an Eye on This Cohort of Open Source Developer Interns – CoinDesk

When Christopher Allen received applications for the 2020 Blockchain Commons internship, he had a problem: He had more applications than he had ever received in the internship's history, and all from stellar applicants.

This was a good problem to have, of course, and Allen tackled it head-on by expanding the internship program. He typically takes only one intern under his tutelage, but this year he took on seven.

With so many extra hands, each intern had the opportunity to work on a project of his or her preference. Each of these projects went toward improving software in the Blockchain Commons repositories.

As the internship draws to a close, the interns' contributions to free and open-source software (FOSS) are nearing completion and will soon be open to the public to use.

The Blockchain Commons: a hub for open-source software

Allen founded the Blockchain Commons in 2018 in a bid to keep Bitcoin's development open and distributed.

In a past life, he helped pioneer the SSL/TLS protocol, an encryption standard for securing data transmitted over the internet. Come 2014, the Heartbleed Bug compromised the OpenSSL implementation of the encryption standard, which handled 60% of the internet's traffic at the time (and with it, trillions of dollars of online commerce).

The flaw was promptly patched. But Allen took that tribulation to heart and vowed to not allow a single point of failure to threaten the security of other software projects he works on.

Cue Allen's discovery of Bitcoin and the founding of the Blockchain Commons. After a brief tenure at Blockstream, Allen founded his not-for-profit benefit organization to do his part to keep Bitcoin's development distributed.

Now, after a summer of tinkering, his newest interns have enriched the codebase and GitHub libraries of some of the Blockchain Commons' principal projects, including the addition of a project of their own design.

What these budding Bitcoin developers created

Spotbit

For their new group project, the interns began building Spotbit, software for curating Tor-supported bitcoin (BTC) price feeds.

Led by Dartmouth senior Christian Murray with assistance from Nishit Shah, the modular, self-hosted feed draws pricing data from 100 cryptocurrency exchanges across various stablecoin and fiat trading pairs. Users can choose which exchanges they want their feed to tap into, which trading pairs to support and what data they want to store. If a user doesn't want to host a Spotbit node, they can connect to others.
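To illustrate the modularity being described, here is a purely hypothetical configuration sketch; the field names are invented for this article and are not Spotbit's actual schema:

```python
# Hypothetical self-hosted price-feed configuration, illustrating the choices
# the article describes: which exchanges to poll, which pairs to serve,
# whether to keep history, and Tor-only exposure.
feed_config = {
    "exchanges": ["kraken", "bitstamp"],   # subset of the ~100 supported venues
    "pairs": ["BTC/USD", "BTC/EUR"],       # stablecoin and fiat pairs as desired
    "keep_history": True,                  # store price candles locally
    "onion_only": True,                    # expose the API over Tor only
}

print(f"polling {len(feed_config['exchanges'])} exchanges "
      f"for {len(feed_config['pairs'])} pairs")
```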

Lethe Kit

Besides Spotbit, each intern has an individual project which they work on alongside Allen to improve.

Gorazd Kovacic from Slovenia, for example, has been working on the Blockchain Commons code for the Lethe Kit. The DIY hardware wallet, so named after the river of Greek mythology that cleansed the underworld's denizens with amnesia of their past lives, is air-gapped, meaning it cannot come in direct contact with an internet-connected device.

The Lethe Kit can generate seeds and addresses to receive transactions, but it cannot send bitcoin through partially-signed Bitcoin transactions (a previous version of this article indicated otherwise).

Kovacic has been working on integrating animated QR codes and Shamir secret shares (a cryptographic technique for dividing a private key into multiple parts) into the Lethe Kit.
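For a sense of how Shamir's scheme works, here is a toy sketch over a prime field: a random polynomial hides the secret in its constant term, and any threshold number of points recovers it by Lagrange interpolation. This is illustrative only (Python 3.8+), not the Lethe Kit's production code, which is built for constrained, air-gapped hardware:

```python
import secrets

PRIME = 2**127 - 1  # a Mersenne prime large enough for a demo secret

def split(secret, threshold, shares):
    """Split `secret` into `shares` points; any `threshold` of them recover it."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    f = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, shares + 1)]

def recover(points):
    """Lagrange interpolation at x = 0 yields the constant term (the secret)."""
    secret = 0
    for xi, yi in points:
        num = den = 1
        for xj, _ in points:
            if xj != xi:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = split(secret=123456789, threshold=3, shares=5)
assert recover(shares[:3]) == 123456789  # any 3 of the 5 shares suffice
```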

Gordian Wallet and Gordian Server

Another intern, Gautham Ganesh Elango, is working on Gordian, a two-part project that includes a Bitcoin full-node implementation running over Tor and an iOS mobile wallet.

The Gordian Server operates similarly to Bitcoin node dashboards like My Node by offering its users a graphical user interface (GUI) for interacting with Bitcoin Core.

A GUI (the interface type we use every day when commanding our Macs and PCs with macOS or Windows, to give one example) is the user-friendly, layman's version of the command-line interface, the raw coding terminal that developers use to speak to their devices.

The project's other working part, Gordian Wallet, is a mobile Bitcoin wallet for iOS which can connect to the Gordian Server.

Elango, a freshman from Australia, is also building out an accounting tool which will allow Gordian users to import transaction and price data to Microsoft Excel for tax purposes.

For another project, Elango and fellow intern Javier Vargas are stepping into the role of instructor by fleshing out the Blockchain Commons documentation of RPC codes for managing a Bitcoin node from the command-line interface.

Internship takeaways

Almost all the tools the interns have been working on contribute to each other's tech stacks (Spotbit, for example, provides price data for the Gordian Wallet). Showing that there's more to open-source development than coding, cross-project collaboration is one of the internship's key instructional points.

For Murray, this was indeed one of the internship's primary lessons: that open-source development means creating sustainable tools that go beyond a solitary use case.

"This was my first introduction to open-source development, and definitely one of the big learning curves is learning to collaborate effectively and developing processes for yourself. A lot of the stuff I wrote before I got here was something I needed to work one time, but this is a lot more about something that is going to work all the time," he told CoinDesk.

Murray said that he plans to continue to work on Bitcoin open-source software after the internship, whether professionally or otherwise. This was a common thread for the soon-to-be alumni of the Blockchain Commons.

Kovacic, who is already diving into other open-source repositories like Blockstream's c-lightning, said the internship "reaffirmed my position that I want to work in the Bitcoin space."

For his part, Elango agreed, saying the internship shook off his apprehension about approaching the seemingly daunting task of maintaining open-source projects.

"It's definitely got me interested in Bitcoin open-source development. At first I was kind of intimidated by these large open-source projects. After the internship, I've become more comfortable with doing large contributions to these projects. Once I learn the basics of C++ I may start contributing to Bitcoin Core. And if not Bitcoin Core specifically, then some other open-source project," he told CoinDesk.

Looking ahead to the next cohort of interns

With this internship coming to a close, Allen is offering another one that will begin in October and end in December. He stressed that the latest internship hopes to pull in more talent from Bitcoin-adjacent fields, not just the realm of computer science. This could mean students studying law, library science or other disciplines to help improve aspects of Blockchain Commons documentation.

When Allen asked his students what they would say to incoming interns, Murray answered in the spirit of what may be considered the internship's core ethos: ask plenty of questions and cooperate with others whenever possible.

"If I could give advice to anyone coming in it would be: don't be afraid to ask for help when you need it. We have one group chat and I wanted to be professional and not spam the chat with questions. One time, I had spent several hours trying to fix this GitHub commit and couldn't figure it out. But then Gorazd ended up giving me this one-line solution. If I had asked the question early, I would have saved a lot of time."

This article has been updated to correct a description of the Lethe Kit and to clarify how the Gordian Server and Gordian Wallet operate.


The Government Digital Service truly was once world-beating. What happened? – The Guardian

No 10 adviser Dominic Cummings and his Silicon Valley ambitions for the civil service have put digital, data and technology in the spotlight, but where does this leave the former bright light of UK tech, the Government Digital Service (GDS)?

For many years, government IT was the punchline to a joke that wasn't funny. People trying to deal with government departments picked up the phone or sent letters rather than experience the grief of going online.

But by using the tools of the open web (simple words, clear design, open source code, agile ways of working), one team in government managed to build some public services fit for the internet era. They didn't seek to amaze citizens, just make their experience simpler, clearer and faster.

That team, the GDS, was set up nine years ago with a brief from the then Cabinet Office minister, Francis Maude, to haul the civil service into the digital age. It started well. The UK government's new website, gov.uk, was many times cheaper than its predecessors and even won design of the year in 2013. New online services for setting up a power of attorney, taxing a vehicle and booking prison visits, among others, made a mark. Entrepreneurs described GDS as "the best startup in Europe." David Cameron lauded the team as one of the great unsung triumphs of the coalition government. Five years after the team started, the UK led the world in digital government, according to the UN. Other countries took note, and copied.

The trick GDS pulled off was to realise that the game wasn't about changing websites. It was about changing government. The digital team saw that parts of public services, such as sending lots of text messages or taking payments, were being developed separately by scores of public organisations, at great cost to the public purse and making systems harder to use. By 2015, the GDS team had rebuilt some of these common components to be used again and again across the public sector. The service also published patterns and code, and enforced standards, to give everyone an incentive to raise their game.

This paid dividends, in better public services and money saved: a whopping £1.7bn by 2014, according to the Cabinet Office. As a result, in the November 2015 budget and spending review, GDS was handed a £450m bounty in what then cabinet secretary Sir Jeremy Heywood described as "a vote of confidence."

Even in its pomp, GDS was not universally loved; senior civil servants described the "kids in jeans" as an insurgency. But the real problem was the challenge it presented to the sovereign power of Whitehall departments. Changing government was not on their agenda, nor in their interests. Common components took away control. So for that £450m, there was a tacit quid pro quo: GDS would support departments, not lead them. That shift, demanded by the chief executive of the civil service, John Manzoni, and encouraged by permanent secretaries who had been embarrassed by GDS, was a tipping point.

While GDS has retained some of the country's smartest technology talent, its purpose has drifted. From once receiving grudging respect from departments for its rallying cries, it is now peripheral. A top-level post for government chief digital officer has gone unfilled for more than a year. This July, the UN announced that the UK had slipped to seventh in its world e-government rankings, falling six places in four years.

This leaves some awkward questions. Aside from the world-class platforms and patterns that were already taking shape five years ago, where did the £450m go? For better or for worse, it hasn't gone into the data foundations so desired by the present administration. Whatever Cummings is looking for, he hasn't found it in GDS yet.

Some of GDS's legacy is in plain sight. Some digital successes have been the dogs that didn't bark during the pandemic. HMRC, universal credit and parts of the NHS have delivered online services that have just about stood up to extraordinary new demands. Without GDS starting out by showing departments how to deliver, rather than telling, this would not have happened.

And GDS did something else that no other team had done before. It led everyone using public services to expect a half-decent experience of their government online. It did this by worrying more about user needs than mandarin egos. For Britain to be a leading digital government, it needs a digital team that leads.

Andrew Greenway is a co-founder of Public Digital and former staffer at the Government Digital Service.


Winux – Windows/Linux Convergence In 2020 – iProgrammer

It is a strange time when old enemies not only bury the hatchet but start to merge into a single entity. Windows and Linux, Microsoft and Open Source seem not only to be friendly but in the case of Windows and Linux merging into an undifferentiated whole - Winux anyone?

It all started with the move to .NET Core. Well it probably did, but it is too recent for a final history to be written. The .NET system was aggressively Windows- and Microsoft-only and, apart from some heroic open-source efforts on the part of the Mono team, it only worked under Windows. Then Microsoft threw away everything it had done and started over with an open-source project to reinvent .NET as a cross-platform development system and so .NET Core was born, along with much confusion and some developer suffering.

Why was .NET widened to support non-Windows environments?

Only Microsoft really knows, but it seems reasonable that it was to serve the greater good of Azure. When Azure started out it mostly provided Windows-based virtual machines, but it didn't take long for it to become quite clear that its users wanted Linux and, if it was to be competitive with AWS, it needed to shift from being Windows-oriented to Linux-supporting - and it has.

Given Azure is potentially the cash cow that is to replace Windows in the future, it now becomes clear that supporting Linux is a good idea. So .NET becomes cross-platform, and with .NET Core 5, or perhaps more fully with 6, this task is more or less completed. There is only one version of the .NET platform and it is cross-platform.

Of course, there are still problems - aren't there always?

In particular, there is no .NET cross-platform UI and .NET Core programs tended to be command line or web-based where the UI issue doesn't arise. Eventually Microsoft realized that trying to pretend that .NET Core didn't need a UI was silly and some Windows-specific modules were rolled out to allow Win32/Forms and WPF to be used to create a UI.

As this all was coming to a conclusion, Microsoft suddenly seems to have had another realization - if Azure runs Linux, why not Windows? The Windows Subsystem for Linux (WSL) was born and you could work with Linux on a machine that primarily ran Windows. Not a virtual machine, but a hosted operating system within another operating system. Future historians might well look back on this first step as the start of the fusion between Windows and Linux and indeed Microsoft software in general and open source.

For example, why would Microsoft spend money developing an HTML renderer for its own browser when there is an open source browser, used by Google, just sitting around waiting to be used? The Edge browser is an example of a development strategy that I think we are going to see more of as time goes on - open source + proprietary code and services.

Now we have news that Edge is going cross-platform. And why not? Chromium is cross-platform so what is surprising? What is surprising is that Microsoft is taking another step towards Linux. Of course, it all comes with some added Microsoft flavoring:

"For developers, WebView2 will be generally available for C/C++ and .NET by the end of 2020. Once available, any Windows app will be able to embed web content with the power of Microsoft Edge and Chromium. WebView2 provides full web functionality across the spectrum of Windows apps, and its decoupled from the OS, so youre no longer locked to a particular version of Windows.

Also, the new Microsoft Edge DevTools extension for Visual Studio Code is now generally available, enabling seamless workflow for developers as they switch contexts."

At the moment WebView2 only seems to support Windows, but Linux support in the near future would seem logical. Also notice the way that Microsoft is building a web of dependencies - Edge supports Visual Studio Code, which in turn favors Microsoft GitHub and of course Azure. It all fits together so tightly that you really wouldn't want to go to the trouble of pulling it apart.

"Starting in October, Microsoft Edge on Linux will be available to download on the Dev preview channel. When its available, Linux users can go to theMicrosoft Edge Insiders siteto download the preview channel, or they can download it from the native Linux package manager."

And while all this is going on, WSL is being expanded. Linux GUI apps are being supported in the next few weeks. If you were determined enough, you could already get GUI apps to work, but now it's official. So I can sit down at my machine, boot Windows and run Windows and Linux GUI apps.

Things have gone a long way. There was a time when I had to worry about which operating system I was using. I now routinely use ls in PowerShell and I've almost forgotten what the Windows dir command did. Which slash to use in pathnames isn't much of a problem any more and I am increasingly surprised when I find that a Linux command doesn't work under Windows.

Our current desktop hardware has enough memory and disk storage to support a mind meld of Windows and Linux - something that until relatively recently would have seemed wasteful. We are in an age of operating system bloat - get used to it and take advantage of it.

Winux here we go...


GitHub aims to make India the largest market from the third largest – Economic Times

BENGALURU: GitHub, the code-repository service used by many developers, startups and companies the world over, aims to make India its largest market, up from third largest at present, said Maneesh Sharma, India head of GitHub.

Sharma, who was appointed India head in February when GitHub opened its first office in the country, is doubling down on working with Indian startups and has built a sales team to target large startups, corporates and financial institutions in the country.

"Covid has accelerated digital transformation. Startups are helping disrupt the status quo, which is getting every company to look at how they use digital. Globally we have every large industry using digital. In India, the biggest segments using GitHub are IT-enabled services, internet commerce companies and software product companies," he said.

India's software-as-a-service companies are also a potential customer base for GitHub, he added.

GitHub, which is popular among developers, particularly those who work on open source projects, was acquired by Microsoft in 2018. Since then, the company has stepped up expanding its presence in newer markets such as India, where there are millions of developers, who work for both large and small companies in India as well as globally. GitHub is also looking at engineering students to contribute to the repository.

"We have been participating in global projects and are a great consumer of open source. We need to start thinking about how we can build software that can contribute to global communities as well," said Sharma. "We are getting offshoots from the startup ecosystem as well who are starting to open source their libraries. There is a lot more to do."

"They can start looking at how they can build software on GitHub. Think of it as credits. We will be doubling down on the startup ecosystem," said Sharma.


Matillion Partner Ecosystem Identifies Trends Driving Data Transformation Market – The Grand Junction Daily Sentinel

DENVER and MANCHESTER, England, Sept. 23, 2020 /PRNewswire/ -- Matillion, the leading provider of data transformation for cloud data warehouses (CDWs), brought together data management consulting services leaders for a Matillion partner advisory roundtable to discuss how enterprise data transformation needs are impacted by current market trends. The event, held virtually in Q3, revealed the existing challenges and trends that are accelerated by the global pandemic and the pressing enterprise needs to access and leverage data for decision making.

- There is increasing demand for low-code and open source solutions among different data personas. Businesses look to enable diverse roles within their organization to use data tools that can help them take control of their projects. There is demand among data engineers who want to use solutions with both low code and open source options. There is still a need for open source, which allows engineers to innovate with data. However, an emphasis on time-to-value and scalability within a complex, enterprise IT environment, and the need to access data across parts of a business, is driving the low code/no code market.

- Enterprises are balancing the need for speed with cost optimization. Before the pandemic, many businesses were looking to increase time to value without increasing costs. But now, enterprises need to reduce infrastructure costs in preparation for a potential recession, but they also desire the quick implementation of solutions that enable them to leverage their data and reduce data latency to make timely, fact-based decisions.

- Enterprises need proven tech stacks and solutions from data consultants. In an effort to help companies optimize cost and scale strategies, consultants see a need to deliver off-the-shelf solutions that will work for diverse business use cases. Data management, integration, and transformation solutions need to work well with one another to allow enterprises easier onboarding, quicker proof of concepts to demonstrate results, and faster time to value. Offering ready-made technology stacks delivers value for clients faster as data projects are scaled down to align with pressured budgets and internal competition for available resources.

- Data volumes are driving data infrastructure modernization. The mean number of data sources per organization is 400 sources, and data volumes are growing by 63 percent per month. This has large enterprises progressing on their "cloud journey," by ditching legacy systems for new approaches in data management and data integration, to avoid additional technical debt and to position them for economic and business recovery. Cloud-native tools are easier to use and to scale, enabling enterprises to begin work on smaller proof-of-concepts to get the frameworks ready for when the pace of business picks up again.

- Talent acquisition is more critical than ever. It is easier to find the right technology solutions than it is to find employees with the right skill sets. Enterprises need to attract data engineers that will implement a modern tech stack to help them derive value from the data they have spent years amassing and aggregating.

"The latest advancements in data technologies addressed enterprise needs prior to the pandemic, but there is added pressure to modernize almost overnight to cope with new and increasing challenges," said Robert Griswold, Senior Manager, Data Foundations Practice Lead at Capgemini.

"Enterprises continue to adjust to the new ways of working, and face increasing pressure to uncover data insights," said Brian Bickell, Data Practice Director at Interworks. "There is a growing need for flexible solutions that serve a remote, distributed team. Companies are doing all they can to ensure business continuity and the ability to scale to keep them moving forward during these uncertain times."

"Current market conditions present yet-unseen pressure on enterprises to mitigate costs while becoming as competitive as possible, said Matthew Scullion, CEO of Matillion. "The trends identified by global leaders in manufacturing, finance, healthcare and more underscore demand for the power of the cloud, which organically solves for modern requirements while better positioning businesses to recover from the impact of a global pandemic."

To learn more about how Matillion and its partner ecosystem support faster time to insights within the enterprise, visit: https://partners.matillion.com/. For further data transformation industry updates and perspectives, follow Matillion on Twitter @Matillion and LinkedIn at https://www.linkedin.com/company/matillion-limited/.

About Matillion

Matillion is data transformation for cloud data warehouses. Only Matillion is purpose-built for Amazon Redshift, Snowflake, Microsoft Azure Synapse, and Google BigQuery, enabling businesses to achieve new levels of simplicity, speed, scale, and savings. Trusted by companies of all sizes to meet their data integration and transformation needs, Matillion products are highly rated across the AWS, GCP and Microsoft Azure Marketplaces. Dual-headquartered in Manchester, UK and Denver, Colorado, Matillion also has a presence in New York City and Seattle. Learn more about how you can unlock the potential of your data with Matillion's cloud-based approach to data transformation. Visit us at www.matillion.com.

Media contact
Nonfiction Agency for Matillion
Shermineh Rohanizadeh
Srohanizadeh@nonfictionagency.com
+1 949 378 6469
