How Bitcoin Became More Valuable Than The US Dollar In Cuba – bitcoinist.com

For the first time in Cuba's history, the U.S. dollar (USD) has lost value against another currency, one that lives entirely on the internet with no army to back it: Bitcoin. The cryptocurrency has been trading at a premium against the U.S. currency in cash.

Related Reading | Turkish Lira Crashes: Bitcoin Freedom Vs. Fiat Currency Monopoly

As reported by Alex Gladstein, Chief Strategy Officer for the Human Rights Foundation, and Enrique Yecier, a Cuban citizen, via their Twitter accounts, the historic shift in currency values extends to the main cryptocurrencies: Bitcoin, Ethereum, and the stablecoin Tether, a digital currency pegged to the U.S. dollar.

Due to the political tensions between the U.S. and Cuba, the island has been cut off from the international financial system and its payment rails. At the same time, the U.S. has received one of the biggest waves of migration of Cuban citizens.

This has created a situation where many Cubans rely on remittances sent by their families abroad, but the official channels for sending or receiving U.S.-dollar transactions always cost the citizen. Crypto payments with Bitcoin and other coins, made without the intervention of a third party, are therefore more efficient in terms of fees and time.

In addition, Cubans use digital assets to protect themselves against inflation and the depreciation of their national currency. As a result, according to local reports, businesses and merchants have begun accepting Bitcoin and other cryptocurrencies as payment methods.

The Cuban citizen who made the viral report on Bitcoin's value surpassing the U.S. dollar on the island, Enrique Yecier, said the following about this historic phenomenon:

Complaining that in Cuba a dollar in crypto has surpassed the value of 1 dollar in cash is like complaining that Bitcoin today costs more than 46 thousand dollars when a year ago it cost 23 thousand. If someone is doing things wrong, it is not BTC.

Cuba seems to be slowly but surely moving toward greater Bitcoin and crypto adoption. Per an AFP report, the national government granted merchants a license to operate with these digital assets back in August.

The measures fall under Resolution 215, the report claims, issued by the Central Bank of Cuba. The new rules have been in effect since September 15 and are aimed at regulating the use of cryptocurrencies for commercial transactions.

The country is still far from occupying a relevant spot in terms of adoption, as noted by Chainalysis in its 2021 report. The top ten countries with the most Bitcoin and crypto adoption have one thing in common: most are facing a highly inflationary economy or a military conflict, as demonstrated by the inclusion of Argentina, Ukraine, Venezuela, Afghanistan, and others.

According to Yecier, Bitcoin is not the most valuable digital asset on the tropical island. That position goes to TRX (around $0.081162), which can go for as much as 8 CUP per token when a fair price would be around 6.15 CUP, the Cuban citizen said. Apparently, many Cubans begin their crypto journey on the TRON ecosystem, hence the premium.

Related Reading | Bitcoin Bearish Signal: Hashrate Drops Over 20% In Last 24 Hours

As of press time, Bitcoin (BTC) trades at $46,797 with sideways movement in the past 24 hours.

Read the rest here:
How Bitcoin Became More Valuable Than The US Dollar In Cuba - bitcoinist.com

Bitcoin And The Commodity Of Time – Bitcoin Magazine

"I'm having a hard time, living the good life, well I know I was losing time" - "High Time," Grateful Dead

I had a long drive ahead of me. Nine hours down the coast, with a few stretches of time I knew I'd be out of streaming service range, and perhaps even out of AM/FM. Knowing this, I took a moment and dropped by my local Goodwill to take a gander at their CD collection before making my way. Lo and behold, I found a 3-CD boxed set of a Grateful Dead show from the 70s I had loved as a teenager, and figured that was the best bang for my couple of bucks: nearly three hours of guitars, feedback, percussive clomping and pitchy-sometimes-but-earnest-at-all-times singing. I bet the overlap of Bitcoiners and Dead Heads is slim, and I am not nearly as evangelical about the Dead's underlying fundamentals, but alas, as I drove further into the cosmic slop, a parallel between the two movements formed. As the hours flew by, I couldn't help but think how Bitcoin not only will bring about a global and free energy market, but with it, the re-commodification of time.

There's a lot of talk about time in the Bitcoin space, and for good reason. A fundamental mechanic of the solution to the Byzantine Generals problem is the immutability of the timestamp that orders all Bitcoin transactions. Without this component, the capped supply issuance that ensures digital scarcity would be meaningless; it doesn't matter how little bitcoin exists if one can just double-spend the same UTXO at a whim.

Proof-of-work creates immutable, decentralized truth by requiring not just energy, but also the time spent searching for nonces that produce output hashes with enough leading zeroes to bring the next block header below the current difficulty target. Bitcoin does a brilliant job of using the standardized local clocks of processors on the network to hold the average time spent per block near the 600-second target across the whole network, without relying on a centralized clock source to validate transaction order. People like to scoff at Bitcoin transactions as inefficient, or slow, without realizing the implications of a global, digitally scarce, permissionless bearer asset with immutable, final settlement in under 30 minutes. The blockchain is simply a database of transactional signatures crystallized in perpetuity via a continuous chain of block headers, each hashed with candidate block transactions to find the next block header. By stacking the previous block's output hash on top of a one-way function to find the next nonce, the serpentine chain of ledger becomes ever more immutable; as every block stacks alongside the upward difficulty adjustment continuing its decade-long ascent, it becomes harder, and importantly, more wasteful for a bad-faith actor to try to reorganize spends.

The act of spending electrical energy is not enough to find value in the digital space; it must be applied directly to spending time effectively, accurately computing within the consensus and thus securing the Bitcoin blockchain.
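The nonce search described above can be sketched in a few lines. This is a toy illustration, not Bitcoin's actual implementation: the header is a placeholder string, and the target is set absurdly easy so the loop finishes almost instantly, whereas the real network's target makes valid hashes astronomically rare.

```python
import hashlib

def mine(header: bytes, target: int) -> int:
    """Increment a nonce until the double-SHA256 of header+nonce falls below target."""
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # proof that work (time and energy) was spent
        nonce += 1

# An easy target: roughly 1 in 256 hashes qualifies. Bitcoin's real
# difficulty target is smaller by dozens of orders of magnitude.
easy_target = 2 ** 248
nonce = mine(b"example block header", easy_target)
```

Because the hash output is unpredictable, the only way to find a qualifying nonce is brute-force trial, which is exactly why a valid block is evidence of elapsed time and spent energy.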

The Grateful Dead weren't always known by that name; in fact, they got their first public audiences performing under the name The Warlocks. Unbeknownst to each other at the time, they shared that name with a young, upstart art band out of New York City. When the slow and lossy communications reached their respective coasts via the new, blossoming independent music scene, both decided to change their names. The Dead went on to see fairly immediate success throughout the following decades, whilst their counterparts changed their name to The Velvet Underground and enjoyed somewhat critical but overall muted popularity until a resurgence of their canon in the late 70s. Neither wanted to spend the time to lay claim to the brand, and thus both moved on without so much as any litigation or litigators. The Dead did a lot of things that are unheard of today in the hyper-commodification era of art, but perhaps none more important than allowing technologically savvy fans to bring their own recording equipment into their venues and tape the improvisation-heavy performances for their own, non-commercial use. A devoted taping community grew out of this allowance, and an entirely new way for hungry fans to engage with the product gave rise to a peer-to-peer market of tapes of shows highlighted for a spattering of personal reasons: someone's one-hundredth show, a birthday, New Year's Eve, the debut of new material, a particularly good version of a beloved song, etc. Over time, this strictly-non-commercially-incentivized community drove each other to greater heights of recording quality, as new masters, new techniques, better microphones and better gear led to a powerful, decentralized taper community ready to offer their tape of choice for yours in a free market of experiences.

The particular boxed set I bought is from a series called Dick's Picks, named after the band's longtime soundboard engineer, who used the data harvested from decades of trade and discussion among diehards to find the overwhelmingly favored shows of interest. Pulling from the archives of recordings taken directly from the band's own board, he released commercial products of high quality targeted directly at the fans who had made those shows famous in the first place by taping and trading their experience. I was a couple hours into my drive, stopping to feel the pain of filling my car with gas, when I decided to humor myself and see what these things sell for. It seemed strange, knowing that nearly every show the band ever played has at this point been recorded, located, and tagged online, legally and for free, that these box sets could ever retain their value; stranger still given that I had found one for only a few dollars.

Imagine my initial confusion when I found that not only did they retain their value, the three discs were listed on eBay anywhere from $65 to $150, having appreciated at least three times in value since their February 1996 release. So not only did they compete with the slightly lower-quality but freely distributed tapes, but by printing a limited run, they created a supply smaller than ultimate demand. Had the person who donated this set to Goodwill taken the time to see its value on the open market, they might have made a different decision. Had an employee of Goodwill taken the time to search reseller markets, they might have priced the set at a higher premium.

The point isn't whether these discs kept pace with gold from 1996 to 2021, but how they leveraged an open market of experiences to be commodified without innate commercial intent. The band put their advertising and commercial outreach into the hands of those who understood the product best, resulting in a deeper bond between the perceived value of the experience across the network of the audience and the value of a high-fidelity commodity for re-experience. Certain nights, the group consciousness of the tuned-in-but-definitely-dropped-out masses, on the stage and off, came together in just the right way; those were the nights you wanted to play in your van on your four-and-a-half-hour drive to the next night's show.

A whole community of tape trading grew alongside the formidable touring empire of the band's now-ubiquitous pop-culture presence. They always sold plenty of tickets, plenty of albums, plenty of t-shirts, and whatever loss of property they seemingly endured by allowing their fans this freedom was more than made up for in other revenue streams. But beyond the obvious free marketing, production, and distribution, the band got something far more meaningful: a large following of humans who spent their lives listening to the group's recordings. I don't care to convince you of their merits, that is not the point here, but I think anyone would agree there is simply something different about the way fans of this group live. This open network, completely symbiotic with the band's own commercial success, allowed a mutually perceived experience to be commodified and thus socially valued. The audience grew itself, and soon enough the market demanded less tape hiss and the more balanced highs of the eventually-released official discs.

One of the reasons some fans could even afford to drop out of the working class and abscond upon the concrete ribbons was sounder money. In fact, this particular night I happened upon was recorded in the winter of 1970, before Nixon even took us off the gold standard. It took a minimum-wage worker less than a shift to afford the $4 ticket, and gas had not yet begun its rise from half a dollar in 1972 to $1.35 in 1981. It didn't take a lot of time to earn enough for a three-show run, and hundreds of fans modeled lifestyle-supporting revenue streams around the nomadic culture; large craft bazaars would pop up in the parking lots with kitchens, arts and, of course, tape exchanges for those who missed a show to get up to date.

For many, the Grateful Dead were more than just a hobby; they were a life and livelihood. The lifestyle required such minimal overhead, and the dollar was strong enough, that the momentum of the Summer of Love spilled out into the hopeful halls and amphitheaters across the nation. The purchasing power of your time spent in labor was strong against the cheap price of goods and services; your time was worth something. As we find ourselves in an inflating goods and services market (in part) due to an expanding monetary supply, we find our time being devalued below our ability to keep pace with rising prices. We work more and more and get less and less for it. This is a problem that can be solved (in part) with a technological upgrade to our monetary network. By imbuing our time laboring into a disinflationary and decentralized economic protocol, instead of fighting a compounding, hopeless struggle against the leaking entropy of an inflating dollar system, humans can spend more time making beautiful things for themselves and others. Since the third halving algorithmically brought Bitcoin's issuance relative to total supply below 1.8% per year, Bitcoin's dollar-denominated purchasing power does not even rely on the dollar inflating beyond its 2% yearly target. Imagine a free, global market represented by a disinflationary supply, backed by geographically independent, universally permissionless energy sources spending their time carving a hash chain of blocks to communicate immutable transactional history through a network of peer-to-peer participants. There are very, very few use cases for a blockchain that would not be better served by a faster, more centralized database, but the historic ledger of volatility between human energy and capital certainly rises to the level of demanding such necessity.
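The sub-1.8% issuance figure is simple arithmetic. A back-of-the-envelope check, using round numbers for the May 2020 (third) halving: a 6.25 BTC block subsidy, roughly 144 blocks per day, and a circulating supply of about 18.4 million BTC at the time (the supply figure is an approximation, not an exact chain value).

```python
# Back-of-the-envelope annualized issuance after the third halving.
subsidy_btc = 6.25                  # block reward after May 2020
blocks_per_year = 144 * 365         # ~one block every 600 seconds
supply_btc = 18_400_000             # approximate circulating supply, mid-2020

new_btc_per_year = subsidy_btc * blocks_per_year  # ~328,500 new BTC per year
issuance_rate = new_btc_per_year / supply_btc     # ~0.0179, i.e. just under 1.8%
```

Each subsequent halving cuts the subsidy, and thus this rate, roughly in half again.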

The history of humanity deserves a decentralized, open and yet immutable level of trust. Proof-of-work is not just the answer to the Byzantine Generals problem; it is also the first empirically sound answer to communally experienced time, and with it brings the assurance and ability for users to trust the commodity of time that is Bitcoin in the future. Bitcoin changes time preference through a literal time mechanism, for proof-of-work is a proof of the history of spent computational power. It is a clock, just not a predictive clock: mostly rubbish for planning future events, but an immutably true and decentralized standard of history and time; a decentralized timestamp server solving the digital double-spend problem. Every payment a Bitcoin user receives of this digitally scarce bearer asset gains purchasing power over time, and thus your economic incentive is to conserve your satoshis to maximize economic yield.

In these use cases, you can see the time preference variable changing directly alongside the economic incentive of the protocol. But when did this new standard of history go from being simply a shared database amongst cypherpunks to the immutable ledger of truth we all know today? I would argue it happened just before December 2012, as the nodes enforced the first halving upon the miners, just a few weeks before the astrological calendar of the Mayans ended. The implications that a new standard of time could have on human experience are vast. The path of society was profoundly modified by the mechanisms and technological advancements that allowed us to reach group consensus on months, days, hours and minutes. Through so-called quantum experiments such as the double-slit experiment, humans have been able to see the modulation of wave forms of propelled atoms depending on the standard of time selected in the data harvesting. Perhaps we could recreate the experiment by taking snapshots each time a block is mined to look for demonstrative effects of a new standard of passing time in the observable universe. But regardless of what unknown implications of empirical, decentralized truth may come in the physics world, the way humans interact with time on a Bitcoin standard is quite different from how we did on a fiat standard. You could make enough for a Dead ticket, the gas to get there, and a place to stay with a day of minimum-wage work in 1970. This allowed more resources to be spent on capturing the shows in higher fidelity, and an abundance of human time to create a prolific culture around the group. The open-source community around Bitcoin makes it better, stronger and more available to serve more humans, but this social construct would not have coalesced around the protocol without the deflationary effects of the commodification of time via Bitcoin. You can save yourself a lot of time by using Bitcoin to save yourself a lot of time.

This is a guest post by Mark Goodwin. Opinions expressed are entirely their own and do not necessarily reflect those of BTC, Inc. or Bitcoin Magazine.

Go here to see the original:
Bitcoin And The Commodity Of Time - Bitcoin Magazine

This Under-the-Radar Stock Outpaced Bitcoin in 2021 — Is It a Smart Buy Right Now? – Motley Fool

Even with its otherworldly volatility, Bitcoin (CRYPTO: BTC) has been one of the best investments to own over the past several years. And in 2021, that theme continued, as the most valuable cryptocurrency has soared nearly 70% this year.

Risk-averse investors who avoid this burgeoning asset class might instead want to own actual businesses that give them the potential for outsized returns. In that case, look no further than The Joint Corp. (NASDAQ: JYNT). In fact, this nationwide franchisor of chiropractic clinics has even outperformed Bitcoin, up a remarkable 150% in 2021.

Does The Joint Corp. stock look like an attractive opportunity today? Let's find out.


As of Sept. 30, the business had 666 total locations, of which 583 were franchised and 83 were corporate-owned. What separates The Joint Corp. from traditional chiropractors is that the former only provides basic back adjustments. Sessions require no appointments and take just a few minutes to complete. There's no expensive equipment, and because patients don't need insurance, there's also no need for administrative staff.

While revenue won't soar going forward like it has historically (systemwide sales skyrocketed 70% annually from 2010 through 2020), investors can still expect serious gains as the company continues to gain scale. The gross margin is just shy of 90%, as operating a capital-light franchise model is extremely lucrative.

There are some clear positive indicators that bode well for The Joint Corp.'s long-term prospects. Google Trends data shows that searches for "chiropractor near me" have trended higher over the past five years. Additionally, a 2020 Centers for Disease Control and Prevention study revealed that 25% of U.S. adults had back pain within the prior three months.

As the country's vaccination rate ticks up and people feel comfortable seeing a chiropractor for their back pain, The Joint Corp. will be there to treat them. Not only does the chiropractic care market generate $18 billion in annual revenue, but 50% of Americans don't even know what the word "chiropractic" means. Powerful momentum, supported by what I believe is the general public's rising interest in health and wellness, will propel this business in the coming year and beyond.

By 2023, management expects to have 1,000 clinics open. And they see the potential for 1,800 locations in the U.S. one day. This means the company's profitability, which has been accelerating in recent years, could be multitudes higher in the not-too-distant future. That's a key ingredient when it comes to achieving market-crushing returns.

Since reporting third-quarter financial results on Nov. 4, the stock has fallen nearly 32% (as of Dec. 15). The Russell 2000, a small-cap index, is down just 9% during the same time period. Although The Joint Corp. posted a year-over-year revenue increase of 36% for Q3, it was down meaningfully from the 61% jump in the prior quarter. I think this sequential deceleration spooked investors.

And uncertainty regarding the ongoing pandemic and the omicron variant, mixed with the often-discussed topics of inflation and the Fed's next move, results in high-growth names getting unusually hammered. The Joint Corp. is not immune to the market's latest whims.

Will The Joint Corp. outperform Bitcoin again in 2022? Your guess is as good as mine. But I think investors would be smart to take advantage of the recent price decline and consider buying shares in this fast-growing business. I know I will be.

This article represents the opinion of the writer, who may disagree with the official recommendation position of a Motley Fool premium advisory service. We're motley! Questioning an investing thesis -- even one of our own -- helps us all think critically about investing and make decisions that help us become smarter, happier, and richer.

Excerpt from:
This Under-the-Radar Stock Outpaced Bitcoin in 2021 -- Is It a Smart Buy Right Now? - Motley Fool

Ethereum Classic Continues To Follow Bitcoin, Holds Above This Key Level: Is A Bullish Weekend Ahead? – B – Benzinga

Ethereum Classic (CRYPTO: ETC) gave the bulls a gift this week, holding above a key level at the $32 mark and beginning to trade sideways, just as Benzinga wrote on Monday might happen.

A sell-off in the general markets on Thursday and Friday morning, leading into Friday's monthly options expiry, spilled over into the cryptocurrency market. By Friday afternoon, the SPDR S&P 500 ETF Trust (NYSE: SPY) was bouncing up slightly off the open and pulling the crypto sector up with it.

Ethereum Classic has been trading more in tandem with Bitcoin (CRYPTO: BTC) than Ethereum (CRYPTO: ETH) recently, in a consistent downtrend off the November highs. That may be set to change, because Ethereum Classic not only held above the key level, but on Friday printed a bullish pattern on the daily chart.

See Also:How to Buy Ethereum Classic

The Ethereum Classic Chart: Ethereum Classic's downtrend dragged the crypto down almost 50% from the Nov. 9 high of $65.33 to a low of $33.31 on Wednesday. The crypto then bounced up slightly from the level but on Friday retested it as support.

The retest of the support level and subsequent bounce up has caused the crypto to print a bullish double bottom pattern at the level. If the pattern is recognized, Ethereum Classic could be in for a bullish weekend ahead. It is worth noting that Bitcoin also printed a double bottom pattern near the $45,500 level on Friday.
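A double bottom is just two troughs at roughly the same price with a bounce between them. The pattern check can be sketched programmatically; this is a crude illustration, and the 2% similarity tolerance and sample prices are arbitrary choices, not Benzinga's methodology:

```python
def is_double_bottom(lows, tolerance=0.02):
    """Return True if the two lowest local minima are within `tolerance`
    of each other and separated by a higher bounce between them."""
    # Interior points lower than both neighbors are candidate troughs.
    minima = [
        (i, lows[i])
        for i in range(1, len(lows) - 1)
        if lows[i] < lows[i - 1] and lows[i] < lows[i + 1]
    ]
    if len(minima) < 2:
        return False
    minima.sort(key=lambda m: m[1])        # take the two deepest troughs
    (i1, p1), (i2, p2) = minima[0], minima[1]
    lo, hi = sorted((i1, i2))
    if hi - lo < 2:                        # need at least one bar between them
        return False
    bounce = max(lows[lo + 1:hi])          # the peak of the rally between troughs
    return abs(p1 - p2) / p1 <= tolerance and bounce > max(p1, p2)
```

Real chart-pattern recognition also weighs volume and the depth of the middle bounce; this sketch only captures the geometric core of the pattern.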

By Friday afternoon, Ethereum Classic was working to print a bullish hammer candlestick on the daily chart, which can often signal a reversal to the upside is in the cards. If Ethereum Classic is able to trade up above the most recent lower high, the $36.82 level printed on Thursday, it will negate the downtrend and could set the crypto into a new uptrend.

Ethereum Classic's relative strength index (RSI) has been hovering near or below the 30 level since Dec. 4. When a stock or crypto's RSI reaches or falls below that level, it becomes oversold, which can be a buy signal for technical traders.
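For reference, RSI compares smoothed average gains against smoothed average losses over a lookback window (14 periods by default) and maps the ratio onto a 0-100 scale, with readings at or below 30 conventionally read as oversold. A minimal sketch of Wilder's smoothing (assumes more closes than the period length):

```python
def rsi(closes, period=14):
    """Wilder's relative strength index on a list of closing prices."""
    gains, losses = [], []
    for prev, cur in zip(closes, closes[1:]):
        change = cur - prev
        gains.append(max(change, 0.0))
        losses.append(max(-change, 0.0))
    # Seed with simple averages, then apply Wilder's smoothing.
    avg_gain = sum(gains[:period]) / period
    avg_loss = sum(losses[:period]) / period
    for g, l in zip(gains[period:], losses[period:]):
        avg_gain = (avg_gain * (period - 1) + g) / period
        avg_loss = (avg_loss * (period - 1) + l) / period
    if avg_loss == 0:
        return 100.0            # no losses at all: maximally overbought
    rs = avg_gain / avg_loss
    return 100 - 100 / (1 + rs)
```

A steady decline drives the reading toward 0 and a steady rise toward 100, which is why a value pinned near 30 reflects the persistent selling described above.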

To make a meaningful move to the upside over the coming days, Ethereum Classic will need to see increasing bullish volume. On Friday, the crypto's lower-than-average volume signaled continued consolidation: about 63,540 compared to the average 10-day volume of 147,257.

Ethereum Classic is trading below the eight-day and 21-day exponential moving averages (EMAs), with the eight-day EMA trending below the 21-day, both of which are bearish indicators. The crypto is also trading below the 50-day simple moving average, which indicates longer-term sentiment is bearish.
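An EMA weights recent prices more heavily than older ones via a smoothing factor of 2/(period+1), which is why a sustained downtrend pulls the 8-day EMA below the 21-day and leaves price below both. A minimal sketch, seeding the average with the first close (one common convention; charting platforms may seed with an SMA instead), using hypothetical prices:

```python
def ema(closes, period):
    """Exponential moving average over a list of closes, seeded with the first close."""
    k = 2 / (period + 1)               # smoothing factor: larger for shorter periods
    value = closes[0]
    for price in closes[1:]:
        value = price * k + value * (1 - k)
    return value

# In a steady decline, the shorter EMA tracks price more closely,
# producing the bearish alignment described above.
prices = [float(100 - i) for i in range(60)]  # hypothetical declining closes
fast, slow = ema(prices, 8), ema(prices, 21)
```

Here `fast` sits below `slow`, and the latest price sits below both, mirroring the bearish EMA stack on the Ethereum Classic chart.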



Read the original post:
Ethereum Classic Continues To Follow Bitcoin, Holds Above This Key Level: Is A Bullish Weekend Ahead? - B - Benzinga

The internet runs on free open-source software. Who pays to fix it? – MIT Technology Review


For something so important, you might expect that the world's biggest tech firms and governments would have contracted hundreds of highly paid experts to quickly patch the flaw.

The truth is different: Log4J, which has long been a critical piece of core internet infrastructure, was founded as a volunteer project and is still run largely for free, even though many million- and billion-dollar companies rely on it and profit from it every single day. Yazici and his team are trying to fix it for next to nothing.

This strange situation is routine in the world of open-source software, programs that allow anyone to inspect, modify, and use their code. It's a decades-old idea that has become critical to the functioning of the internet. When it goes right, open source is a collaborative triumph. When it goes wrong, it's a far-reaching danger.

Open-source runs the internet and, by extension, the economy, says Filippo Valsorda, a developer who works on open-source projects at Google. And yet, he explains, it is extremely common even for core infrastructure projects to have a small team of maintainers, or even a single maintainer that is not paid to work on that project.

The team is working around the clock, Yazici told me by email when I first reached out to him. And my 6 a.m. to 4 a.m. (no, there is no typo in time) shift has just ended.

In the middle of his long days, Yazici took time to point a finger at critics, tweeting that Log4j maintainers have been working sleeplessly on mitigation measures; fixes, docs, CVE, replies to inquiries, etc. Yet nothing is stopping people to bash us, for work we aren't paid for, for a feature we all dislike yet needed to keep due to backward compatibility concerns.

Before the Log4J vulnerability made this obscure but ubiquitous software into headline news, project lead Ralph Goers had a grand total of three minor sponsors backing his work. Goers, who works on Log4J on top of a full-time job, is in charge of fixing the flawed code and extinguishing the fire thats causing millions of dollars in damage. Its an enormous workload for a spare-time pursuit.

The underfunding of open-source software is a systemic risk to the United States, to critical infrastructure, to banking, to finance, says Chris Wysopal, chief technology officer at the security firm Veracode. The open-source ecosystem is up there in importance to critical infrastructure with Linux, Windows, and the fundamental internet protocols. These are the top systemic risks to the internet.

Visit link:

The internet runs on free open-source software. Who pays to fix it? - MIT Technology Review

NAB to ‘innersource’ some of its business platforms – iTnews

NAB is set to innersource some of its key business platforms, after successfully applying the model to the development and maintenance of more internally-focused code libraries and tools.

Innersource is an increasingly popular set of software engineering practices that are used to create an open source-like culture inside of an organisation.

NAB has been innersourcing code for about two-and-a-half years, with the model forming part of a broader set of practices known as the NAB engineering foundation or NEF, which is designed to help development teams get code into the cloud and into production faster.

Another big four proponent of innersource is CBA, which revealed its own work to iTnews earlier this year.

NAB engineering manager Matt Cobby told the Innersource Summit 2021 last month that the bank adopted innersource initially to remove duplication of coding effort and costs as different teams worked to make their products cloud-ready.

We migrated Australia's first highly confidential banking workload into public cloud in 2016, and we enabled teams to move rapidly and take control of their own outcomes, but this led to duplication of tooling and there was a need to reduce the cost per workload to scale faster, Cobby said.

It was in this situation that I found myself looking for a tool to automate AWS credentials setup, and I found 20 different versions of the same tool across Github. Some were supported and some weren't, and some were fully functioning and others less than perfect.

I felt that this was definitely one of these places where we were not being as efficient as we could be, and I felt that the techniques of open source development could help us improve.

Cobby said that the bank also wanted to reduce that cost of experimentation in order to help teams develop new ideas and test out new business solutions quickly and efficiently.

That led NAB to adopt innersource, which it defines as the sharing of knowledge, skills and code across teams in NAB using open-source collaboration techniques.

By creating formal ways to share the work of different teams and to collaborate on further development, the bank hoped to remove undifferentiated heavy lifting of multiple teams reinventing code libraries and tools, and in doing so, refocus the efforts of teams to reach a business outcome much faster.

Innersource setup

With hundreds of development teams across the bank that each had their own way of working, the bank focused its innersourcing efforts on the interfaces between teams.

This allowed us to create a safe environment for engineers to talk to other engineers across the bank: to reach out, to understand their codebases, and to share their work, Cobby said.

NAB said its adoption of innersource had to balance the needs of the bank in terms of architectural endorsement, security endorsement, ownership, accountability and auditability, with the needs of an open source community to be creative.

To do so, it appointed community champions that act as on-the-ground evangelists.

They make sure that their peers know about innersource, Cobby said.

They're running community showcases for new products where they do peer review on the products. They check if the product meets certain criteria: do we know what problem it solves, that it isn't the duplication of an existing product, that it meets our minimum standards and that it has strong ownership. It's at this point as well that security and architecture both have a voice and can endorse or query any individual product.

Where we have multiple solutions to the same problem, we'll build a small community around that problem and we'll work with all the interested parties to reduce that duplication and come up with a better solution for everyone.

On the other side of the model, Cobby said a strong culture of product ownership has been established.

This is where we make sure that each product within innersource has a distinct product owner, he said.

The owner's responsibilities are around making sure that the product meets the minimum standards, that it has a workflow, that there's somebody there to read and evaluate the pull requests, and to make sure that these pull requests meet certain SLAs.

They're there to provide technical support for the products and to take questions from people when they're asking about contributions to the products.

We also provide these product owners with a playbook in order to help them innersource their own platforms and their own products.

Products that are to be innersourced are classified as either curated or community.

"The purpose of this is to show that when consuming teams are looking at what they can use [from elsewhere in the bank], they have the confidence that the code they are using is endorsed and has production-level support - but we don't set the bar so high that the community projects can't get started," Cobby said.

"Typically, a curated product is proven in production. We know that it's gone through all our normal existing operational processes, that it's running in production with customer workloads, that it's been security tested, that it encapsulates many years of learning and experience across the organisation, and that there's often significant investment behind it.

"This means that there are very few curated products, but they are very high quality.

"On the community side we embrace our open source origins and this is more of an incubator for new ideas. We make sure there's a very low barrier to entry.

"We tend to use a more open-source-style support model where it's often by best endeavours, and the typical products we see in this space are around tooling or individual pipeline components which are used in the delivery of applications."

While maintaining a light touch, Cobby said there are some minimum standards that all code repositories have to meet to create a safe space for teams to reach out and work on other teams repositories in a clear and consistent way.

"We make sure that every innersource repository and every innersource product has a README [file] that makes it very clear what the product is doing and what problem it solves," Cobby said.

"We make sure the CODEOWNERS [file] is maintained and up to date so that external developers know who to talk to when they have a question.

"There's a contributing guide so that when you want to make a change there's a very clear path for you to do so, and a code of conduct to make sure that you know the acceptable behaviours for the team [that created the product or tool]."
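The four minimum standards just listed (README, CODEOWNERS, contributing guide, code of conduct) lend themselves to mechanical checks. A minimal Python sketch of such an audit — hypothetical tooling, not NAB's actual checks — against a repository checkout:

```python
from pathlib import Path

# The four files the minimum standards call for in every innersource repository.
# File names are assumptions based on common GitHub conventions.
REQUIRED_FILES = ["README.md", "CODEOWNERS", "CONTRIBUTING.md", "CODE_OF_CONDUCT.md"]

def audit_repo(repo_path):
    """Return the list of required files missing from a repository checkout."""
    repo = Path(repo_path)
    return [name for name in REQUIRED_FILES if not (repo / name).is_file()]

if __name__ == "__main__":
    missing = audit_repo(".")
    if missing:
        print("Repository is missing:", ", ".join(missing))
    else:
        print("Repository meets the minimum innersource standards.")
```

A check like this could run in CI so that a repository cannot be classified as curated or community until it passes.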

Benefits so far

NAB said that code quality, collaboration and learning opportunities had all increased under innersource.

"When we write code in the open, we tend to write better code," Cobby said.

"We're improving discoverability and the ease of finding the source of truth for a piece of information, and we're reusing intellectual property across the different domains."

Cobby said that the openness made it easier to understand why certain architectural decisions were made.

"We peer review each other's work and our discussions are in the open, so that we can always find out why a certain architectural decision was made, or why a decision was made not to use a particular technology," he said.

Teams can also move faster by making changes to existing code libraries directly, where required.

"When we have the ability to read another team's repository, we have the ability to remove bottlenecks," Cobby said.

"If you're dependent upon a team and they can't implement your change, you have the ability to make the change yourself and get it accepted into the core product.

"We're then also breaking down the silos of the organisation and helping learnings from one area be applied in different areas."

The bank saw some unanticipated benefits around mentorship, cross-skilling and learning.

"We found through some of the innersource hackathons that we ran that we had senior engineers mentoring junior engineers, we found frontend developers learning how to be API developers, and we've had backend business service operators learning how to be frontend React developers," Cobby said.

"This is one of the real unexpected benefits from innersource and is something which is giving us probably far more return on investment than we ever expected.

"So one of the main benefits for us is this cross-skilling of people across the organisation."

Expansion opportunities

Still, Cobby indicated there are opportunities for the bank to strengthen its innersource adoption as well as to broaden its use.

"We've been looking at how we can innersource our business platforms," he said.

"With some of the benefits that we've seen before about decoupling teams and removing the blockers from coupled backlogs, there's real business potential here for understanding how we can ease delivery through the organisation and across multiple platforms."

He also saw further opportunities to automate some of the metrics NAB used around innersource; the bank is working with GitHub on this particular area of improvement.

"We're looking at the number of people collaborating across teams, and we're looking at things such as product reuse," he said.

"We have automation that scans GitHub for dependency management and tells us how many reuses of an individual library we're seeing. We can then quantify that library reuse in financial terms: how much it cost to develop, and how many times it's been reused.

"We're also using the metrics for operational health of the innersource products, because it's very important to check that products don't end up in some wasteland. We're using some of these metrics in reporting to find products which need some help, or that need an owner, and then we step in and get them some help."
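The financial quantification described above reduces to simple arithmetic. A toy Python sketch with illustrative figures (the function and the numbers are assumptions, not NAB's actual model):

```python
def reuse_value(dev_cost, reuse_count):
    """Estimate cost avoided by reusing a library instead of rebuilding it.

    dev_cost:    what the library originally cost to develop
    reuse_count: how many other teams consume it (from dependency scans)
    """
    # Rough model: each consuming team avoids the original development cost.
    return dev_cost * reuse_count

# Illustrative figures only: a $50k library reused by 12 teams.
print(reuse_value(50_000, 12))  # 600000
```

In practice a model like this would discount for integration effort, but even the crude version turns dependency-scan counts into a dollar figure leadership can act on.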

He continued: "We've just scratched the surface in terms of what we can do [here].

"I believe that the source code of an organisation is often an untapped source of intelligence, and there's a lot of information there we could look at to help us understand the flows of information across the organisation."

Read the original post:

NAB to 'innersource' some of its business platforms - iTnews

Why The IAB Tech Lab Still Hasn't Taken On The Administrator Role For Unified ID 2.0 – AdExchanger

Nothing in ad tech is ever easy.

Despite signaling interest earlier this year in serving as administrator for Unified ID 2.0, the IAB Tech Lab is still on the fence about taking on this role for the open-source initiative to replace third-party cookies with email-based IDs.

During a Tech Lab board meeting last Thursday, a vote on the matter was tabled for further discussion.

"Tech Lab assuming the administrator role for Unified ID 2.0 is being actively explored, but no decision has been made," a spokesperson told AdExchanger. "We will provide an update when we have something to share with the industry."

So, what's the holdup?

One issue has to do with the fact that the Tech Lab doesn't feel comfortable taking on the role of admin as currently defined in The Trade Desk's technical specs for Unified ID 2.0, according to someone with knowledge of the matter who asked to remain anonymous.

The administrator's main job is to be in charge of a centralized database of sorts and manage access to the UID2 partner ecosystem. That means distributing encryption keys to UID2 operators, distributing decryption keys to compliant members, sending UID2 opt-out requests to operators and DSPs, and auditing participants for compliance. The administrator must also shut off bad actors that abuse the ID.

It's that last bit the IAB Tech Lab board isn't comfortable with: pulling the plug if a partner violates UID2's code of conduct.

The Trade Desk, however, is pushing for an industry entity to take on the responsibility of controlling the kill switch for UID2.

Hence, the impasse.
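The disputed role — distributing keys, honoring opt-outs, and controlling the kill switch — can be pictured as a registry with a revocation list. A toy Python sketch, purely illustrative and not based on The Trade Desk's actual UID2 specifications or APIs:

```python
import secrets

class Uid2AdminRegistry:
    """Toy model of the administrator role: key distribution plus a kill switch."""

    def __init__(self):
        self._keys = {}       # participant name -> issued key
        self._revoked = set() # participants cut off for non-compliance

    def issue_key(self, participant):
        """Distribute a fresh key to a compliant operator or member."""
        if participant in self._revoked:
            raise PermissionError(f"{participant} has been cut off for non-compliance")
        key = secrets.token_hex(16)
        self._keys[participant] = key
        return key

    def revoke(self, participant):
        """The 'kill switch': stop a participant that abuses the ID."""
        self._keys.pop(participant, None)
        self._revoked.add(participant)

    def is_active(self, participant):
        return participant in self._keys
```

The mechanics are trivial; the controversy is entirely over who gets to call `revoke` — exactly the responsibility the Tech Lab board is balking at.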

Back in February, though, the IAB Tech Lab seemed close to sealing the deal. In a blog post, Jordan Mitchell, then the IAB Tech Lab's SVP of privacy, identity and data (he left in April), noted that the Tech Lab is well suited to serve the technical role of UID2 admin and manage the open-source software powering UID2 in collaboration with other industry players.

But the devil clearly lives in the details on this one.

Beyond the current structure of the role, the Tech Lab is also concerned about policing the use of UID2 in jurisdictions with strict privacy laws, such as Europe under the General Data Protection Regulation (GDPR).

If the IAB Tech Lab does eventually take on a modified version of the admin role, Europe and potentially other countries, such as Brazil and India, will likely be carved out, at least to start.

GDPR violations carry hefty fines, and no one wants to be left holding a bag full of potential liability.

But all that said, the IAB Tech Lab board appears willing to move forward as an admin as long as it's not on the hook for shutting down the baddies.

And although the admin role is still TBD, the Tech Lab has already taken on a few other functions related to the initiative, including hosting the open source code repositories for UID2 on GitHub.

The next step will be to set up a follow-up board meeting dedicated to the topic of UID2, likely sometime in the new year. This meeting will include a vote.

One of the reasons a vote didn't happen at last week's board meeting is that the UID2 item appeared rather far down a long agenda, and by the time it was addressed, many members had already left.

The IAB Tech Lab's board, chaired by Neal Richter, Amazon DSP's director of advertising science, is made up of nearly 40 product and business leaders across a broad range of advertising and media companies, including Google, CafeMedia, Facebook, TikTok, LiveRamp, News Corp, ViacomCBS, Criteo, The Trade Desk, PubMatic and Neustar.

See the article here:

Why The IAB Tech Lab Still Hasn't Taken On The Administrator Role For Unified ID 2.0 AdExchanger - AdExchanger

ShiftLeft Expands Attackability Detection Coverage to JavaScript and TypeScript – StreetInsider.com


The new feature release adds the most popular programming language to the scanning arsenal of NG-SAST and I-SCA users, empowering shift left security practices by analyzing full attack data paths and prioritizing attackable vulnerabilities.

SANTA CLARA, Calif.--(BUSINESS WIRE)--ShiftLeft, Inc., an innovator in automated application security testing, today announced that its Intelligent-SCA product has added scanning and attackability analysis for JavaScript (JS) and TypeScript (TS) to the ShiftLeft CORE platform. JavaScript is the most widely used programming language and is also a frequent attack target for cybercriminals seeking to exploit vulnerabilities in open source code and the software supply chain.

Development teams using JavaScript frequently add functionality by quickly writing new code, borrowing from open source registries like npm, or reusing existing libraries and code modules on GitHub. Because JavaScript is a dynamic language and something of a Swiss Army knife, working on both the front end and the server side, developers often move quickly to write quick fixes or hacks that create longer-term vulnerabilities. Equally challenging, open source JavaScript libraries frequently contain vulnerabilities that create unknown risk for the application. When the introduced risks are serious, it can take months of remediation work to identify and address all the ramifications.

By adding JavaScript coverage, ShiftLeft dramatically expanded the ability of Application Security (AppSec) teams to shift security left by providing detailed and accurate guidance to development teams on which vulnerabilities in web applications and JavaScript-driven frameworks can be proven to result in damaging attacks. "With the addition of JavaScript coverage, ShiftLeft is one of the most comprehensive solutions in the marketplace and allows us to test all our web application code before we ever go into production," says Adam Fletcher, Chief Security Officer at Blackstone. "This means we see security flaws sooner and can focus our efforts on the most attackable vulnerabilities, letting us safely ship code faster."

"By adding JavaScript coverage, ShiftLeft can dramatically expand the percentage of application code covered with attackability insights," says Alok Shukla, VP Products, ShiftLeft. "As the most popular language, playing a critical role in the global web and application infrastructure, JavaScript security will become even more important as the pace and severity of attacks on applications and the open source supply chain - much of which is written in JavaScript - increase over the course of 2022."

The addition of JS/TS coverage further cements ShiftLeft as the most comprehensive and authoritative provider of Application Security testing and attackability analysis on the market today. Application security teams and developers using ShiftLeft are able to close more security gaps at a faster pace and spend more time focusing on the issues that matter thanks to the unique ability of ShiftLeft to spotlight attackable vulnerabilities and clearly identify low-risk theoretical vulnerabilities.

About ShiftLeft

ShiftLeft enables software developers and application security teams to radically reduce the attackability of their applications by providing near-instantaneous security feedback on software code during every pull request. By analyzing application context and data flows in near real-time with industry-leading accuracy, ShiftLeft empowers developers and AppSec teams to find and fix the most serious vulnerabilities faster. Using its unique graph database that combines code attributes and analyzes actual attack paths based on real application architecture, ShiftLeft's platform scans for attack context and pathways typical of modern applications, across APIs, OSS, internal microservices, and first-party business logic code, and then provides detailed guidance on risk remediation within existing development workflows and tooling. ShiftLeft CORE, a unified code security platform, combines the company's flagship NextGen Static Analysis (NG SAST), Intelligent Software Composition Analysis (SCA), and contextual security training through ShiftLeft Educate to provide developers and application security teams the fastest, most accurate, most relevant, and easiest to use automated application security and code analysis platform.

Backed by Bain Capital Ventures, Mayfield, Thomvest Ventures, and SineWave Ventures, ShiftLeft is based in Santa Clara, CA. To learn how ShiftLeft keeps AppSec in sync with the rapid pace of DevOps, see https://www.shiftleft.io/.

View source version on businesswire.com: https://www.businesswire.com/news/home/20211216005188/en/

PR Contact: Corinna Krueger, ShiftLeft, ckrueger@shiftleft.io

Source: ShiftLeft, Inc.

See more here:

ShiftLeft Expands Attackability Detection Coverage to JavaScript and TypeScript - StreetInsider.com

Command Prompt in Windows 11 to be replaced by Windows Terminal as the default experience – Ghacks Technology News

Windows Terminal was unveiled in 2019, and after a year in preview phase, it was released as an open source tool in 2020. Microsoft has announced that the Command Prompt in Windows 11 will be replaced by Windows Terminal.

The Redmond-based company has been making changes to its operating system, replacing legacy components with modern ones. The most notable change is, of course, Control Panel, which has slowly but surely been superseded by the Settings app. Notepad recently got an overhaul, a much-needed one in my opinion. So, it's not surprising that Microsoft wants to shift away from CMD to a modern equivalent with richer options.

The move towards making Windows Terminal the default command line tool will begin with the Windows Insider Program. That makes sense, as feedback from users will be crucial, and testing will probably involve the use-case scenarios where CMD is normally used.

The announcement made by the company, first spotted by The Verge, states that Microsoft will enforce the change for all Windows 11 users in 2022.

While Windows Terminal will primarily be useful for programmers, its functions are not necessarily limited to developers. All commands that are supported in Command Prompt are also supported in Windows Terminal. So, if you're familiar with the legacy tool, you'll feel at home with its replacement. In addition, the tool supports PowerShell, Azure Cloud Shell, and Windows Subsystem for Linux (WSL), meaning it is quite versatile.

Interface-wise, Windows Terminal has significant advantages. It supports tabs and panes: you can work in multiple tabs or panes and switch between them as easily as in a web browser. The command line shell also lets you rename tabs, duplicate them, set a color for a tab's title bar, and more. You can also customize its appearance and color schemes for a more personalized experience. I wish File Explorer supported these features.
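Much of that customization is driven by Windows Terminal's settings.json file. An illustrative fragment — the property names follow the documented settings schema, but the specific profiles and values here are examples only:

```json
{
    "profiles": {
        "list": [
            {
                "name": "PowerShell",
                "commandline": "pwsh.exe",
                "colorScheme": "Campbell",
                "tabColor": "#0078D4"
            },
            {
                "name": "Ubuntu (WSL)",
                "commandline": "wsl.exe -d Ubuntu",
                "colorScheme": "One Half Dark"
            }
        ]
    }
}
```

Each entry in the profile list becomes a selectable shell in the tab dropdown, with its own appearance settings.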

Windows Terminal has a GPU-accelerated text rendering engine, and the command line shell supports Unicode and UTF-8 characters, along with HTML, RTF and plain-text formatting. The tool can be used with special characters and emojis. Keyboard shortcuts are always nice to have.

Because it is open source, anyone can contribute to the source code and track issues on GitHub. The utility is available in the Microsoft Store, which means it will get updates and new features faster than if it were patched via Windows Update. It is also compatible with Windows 10.

Will CMD be removed from Windows 11?

The announcement's wording that Windows Terminal will be "the default experience" suggests that Command Prompt will continue to exist alongside PowerShell; it just won't be the recommended option anymore. Maybe Microsoft will nag you to use Windows Terminal, like it does with Edge. If you don't get the reference, you may want to read this article for context.

It's a little sad to wave goodbye to CMD, I'll miss it. Have you used Windows Terminal?

Author: Ashwin

Publisher: Ghacks Technology News

The rest is here:

Command Prompt in Windows 11 to be replaced by Windows Terminal as the default experience - Ghacks Technology News

Crypto Wealth Manager Vaneck Launches Polygon and Avalanche Investment Offerings Bitcoin News – Bitcoin News

The wealth manager Vaneck has announced it has expanded its exchange-traded note (ETN) offerings to support the tokens polygon and avalanche. The two ETNs follow five previously launched funds in Europe that allow investors to gain exposure to leading digital assets.

Vaneck has announced the launch of two ETNs that leverage the crypto assets polygon (MATIC) and avalanche (AVAX). The ETNs represent shares of either AVAX or MATIC and the funds are fully collateralized. "Vaneck expands its crypto investment offering with two new ETNs on crypto platforms Avalanche and Polygon," the wealth manager tweeted on December 16.

Avalanche and polygon have seen significant demand this year and have gathered massive gains year-to-date. The token avalanche (AVAX) has seen its market capitalization join the top ten digital assets in the world, in terms of overall valuation. Today, AVAX holds the 9th position after climbing 3,509% since this time last year.

Polygon (MATIC) has also risen in value a great deal in 2021, with year-to-date gains of around 11,393%. MATIC is the 14th largest crypto asset by market capitalization today, at around $15 billion. Both MATIC and AVAX are compatible with Ethereum but are also considered Ethereum competitors.

The ETNs offered by Vaneck are like exchange-traded funds (ETFs), but ETNs are considered unsecured debt securities. Vaneck had tried to get its spot market bitcoin (BTC) ETF approved by the U.S. Securities and Exchange Commission this year, but the ETF was denied in mid-November.

The Polygon and Avalanche ETNs use Crypto Compare's MVIS data to replicate the value and yield performance of each asset. The underlying crypto assets in Vaneck's ETNs are held in custody by Bank Frick & Co. AG. The AVAX ETN ticker will be VAVA, and the MATIC ETN ticker will be VPOL.
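Mechanically, a fully collateralized ETN tracks its underlying asset: each note carries a fixed token entitlement, reduced over time by fees. A toy Python sketch with illustrative numbers (the fee and entitlement are assumptions, not Vaneck's actual terms):

```python
def etn_note_value(tokens_per_note, spot_price, annual_fee=0.015, years_held=1.0):
    """Approximate value of one fully collateralized ETN note.

    tokens_per_note: fixed entitlement to the underlying token
    spot_price:      current market price of the token
    annual_fee:      management fee accrued against the entitlement
    """
    # The fee is taken out of the token entitlement itself, compounding yearly.
    entitlement = tokens_per_note * (1 - annual_fee) ** years_held
    return entitlement * spot_price

# Illustrative: a note entitled to 0.5 AVAX at a $100 spot price, held one year.
print(round(etn_note_value(0.5, 100.0), 2))
```

Because the notes are debt securities rather than fund shares, this value is a claim on the issuer, which is why ETNs carry issuer credit risk that ETFs do not.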

What do you think about Vaneck introducing Polygon and Avalanche ETNs? Let us know what you think about this subject in the comments section below.

Jamie Redman is the News Lead at Bitcoin.com News and a financial tech journalist living in Florida. Redman has been an active member of the cryptocurrency community since 2011. He has a passion for Bitcoin, open-source code, and decentralized applications. Since September 2015, Redman has written more than 4,900 articles for Bitcoin.com News about the disruptive protocols emerging today.

Image Credits: Shutterstock, Pixabay, Wiki Commons

Disclaimer: This article is for informational purposes only. It is not a direct offer or solicitation of an offer to buy or sell, or a recommendation or endorsement of any products, services, or companies. Bitcoin.com does not provide investment, tax, legal, or accounting advice. Neither the company nor the author is responsible, directly or indirectly, for any damage or loss caused or alleged to be caused by or in connection with the use of or reliance on any content, goods or services mentioned in this article.

View post:

Crypto Wealth Manager Vaneck Launches Polygon and Avalanche Investment Offerings Bitcoin News - Bitcoin News