Covid-19 Exposes the Myth of Bitcoin as a Safe Haven – hackernoon.com

It's fair to characterise Bitcoin as a reaction to the last global economic meltdown. At the time, this cryptocurrency offered great promise as a safe haven for investors against exactly the type of financial crash we saw in 2007/2008 and, with appalling (and predictable) inevitability, are seeing again now.

Let's face it, we all knew this crash was coming (yes, you did, even if you didn't want to admit it); we just didn't know when, or how bad it would be.

Now we know "when"... and "how bad"? Well, by any market metric it's not looking good, not by a long way.

Getting it Right this Time Round

The difference is, Dec will be sustainable in the face of a crisis in the way that Bitcoin was supposed to have been, but has (so far) proved not to be.

We now urgently need technology and digital assets that won't simply be a rinse and repeat of the last ten years; if not, financial "stability" will continue to be measured in the time it takes the (now demonstrably) fragile mainstream economy to cycle from boom to bust.

So the question begging to be asked (and answered) is...

What the Heck Happened to Bitcoin?

Any and every Bitcoin enthusiast you've ever been trapped in an elevator with will have told you (undoubtedly many times between floors) that socioeconomic crises like the one we are now encountering are precisely what this (and most current) cryptocurrency was designed to insure us against. Some of crypto's most ardent supporters are (or perhaps "were", in the light of recent events) adamant that because these digital assets are uncorrelated with traditional assets like stocks, they are safe havens against the type of economic crash Covid-19 is wreaking on the markets.

Maybe now it will sink in: the digital-token smoke-and-mirrors charade represented by current cryptocurrencies (and the ropy technology that supports them) is a terrible way to secure your financial future and a disastrous attempt at establishing some kind of reputedly stable alt economy or sustainable payments system to challenge mainstream alternatives.

It is now time to think differently. And laterally.

But we need to do it quickly, and as a community.

What About Stablecoins?

For the purposes of this article, and its explanatory analogies vis-à-vis our tech and aims, stablecoins are fringe products: they function to create connections between the legacy world and blockchain. Thus the raison d'être of stablecoins, to mitigate and hence solve the price volatility which has so pervasively characterised cryptocurrencies while attempting to retain the other characteristics of Bitcoin, is interesting but not game-changing in the way we propose.

The fact that stablecoins, most notably Tether (USDT), are also popular assets among crypto traders who want to place their funds in dollars during market downturns to avoid crypto price volatility really just reveals them to be a type of cryptocurrency that wants to have its proverbial cake and eat it too.

The problem with the stablecoin paradigm, even though it seeks to maintain the free flow of capital and censorship resistance, which is laudable, is that it is exclusively reliant on third-party factors and commodities, whether through value-pegging or collateralisation. As a result, the value of stablecoins is dependent on external factors that users cannot control, and in this regard they are very similar to a stock or a bond rather than a truly decentralised alternative.
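To make that reliance on external factors concrete, here is a toy sketch (purely illustrative; the 150% ratio and the numbers are assumptions, not any real stablecoin protocol) of how an over-collateralised stablecoin position hangs on a price feed its users cannot control:

```python
# Toy over-collateralised stablecoin position (illustrative only).
# The peg rests entirely on an external collateral price feed,
# i.e. the third-party factor that users cannot control.

COLLATERAL_RATIO = 1.5  # assumed: $1.50 of collateral per $1.00 minted

def max_mintable(collateral_units: float, price_usd: float) -> float:
    """Stablecoins mintable against a collateral deposit."""
    return collateral_units * price_usd / COLLATERAL_RATIO

def is_undercollateralised(debt_usd: float, collateral_units: float,
                           price_usd: float) -> bool:
    """A fall in the external price can push a position into liquidation."""
    return collateral_units * price_usd < debt_usd * COLLATERAL_RATIO

# Deposit 10 units of collateral at $200 each: about 1,333 coins mintable.
minted = max_mintable(10, 200.0)
# If the feed drops to $150, a 1,200-coin debt is no longer safely backed.
unsafe = is_undercollateralised(1200.0, 10, 150.0)
```

The point of the sketch: both functions take `price_usd` as an input the user does not set, which is exactly the external dependency criticised above.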

Certainly, the concept of stablecoins is a part-solution but not the whole solution to the real problem that needs solving: a step in the right direction, yes; the answer to a truly stable, user-centric cryptocurrency that will achieve wide-scale public adoption, generating wealth for all (not just an already wealthy elite), no.

Cryptocurrency: Inflated Financial Assets

In practice, Bitcoin is too slow and inefficient to act like electronic cash and hence support any sort of alt economy to rival its mainstream counterpart. (Proof-of-Work and other expensive and unnecessary protocols have effectively hobbled it.) Instead, many enthusiasts today view it as a form of digital gold. Real gold has long been considered a reliable store of value, and investors tend to see it as a form of insurance against an economic downturn.

Many Bitcoin advocates have claimed that the digital asset belongs in this league too.

Or at least they did. Until the proverbial odorous excrement hit the fan two weeks ago.

And that was before the unmitigated carnage of March 12, when Bitcoin lost more than 40% of its value...

As traders continue to rush for the door, dumping Bitcoin to raise much-needed cash, cryptocurrency in its current underdeveloped and over-hyped form is revealing itself to be little more than another financial asset.

But is This Really any Surprise?

It shouldn't be. Not to you and me. Not to anyone. Beyond the hype, Bitcoin (or any cryptocurrency) was never really anything else but an inflated financial asset; the issue is, and always was, that the promise of a safe haven implied by Bitcoin (and other cryptos) was always the clue to its most egregious failing. Sure, Bitcoin is not correlated to the financial markets, but it is not correlated to anything (a feature that, as discussed, stablecoins are designed to overcome), hence leaving it open to its infamous volatility.

Then last month happened. Though Bitcoin's price has jumped since, the 40% dip of March 12 was enough to reveal the crypto's instability in the face of the very type of crises it was supposed to be stabilising HODLers against.

So is Bitcoin not actually a safe haven after all? Maybe. Maybe not. (At least if you can't sell corn or livestock, you and your family can eat 'em.) Though it appears to have failed the biggest test of the idea yet, the debate will undoubtedly rage on, serving as a reminder that we are still figuring out exactly what Bitcoin is and is not.

Dec is What it is Not

It is the "is not" that our token, Dec, is specifically and unambiguously built as an antidote to.

Unlike stablecoins (which are really only a reflection of legacy commodities and the aggregate collateralisation of other cryptos), Dec is pegged to the value of user data, a commodity that is controlled by every user at the level of every user. Our platform and web browser ensure the 100% security and immutability of this data, hence ensuring its stability and value.

As a result, Dec places every user's financial future in their own hands through proactive online activity and participation, while specifically not leaving asset acquisition and valuation to the vagaries of external factors over which users have zero control.



BTCPay Looks to Anonymize Bitcoin Transactions With PayJoin Integration – CoinDesk

BTCPay, a popular open source tool for accepting bitcoin payments, is turning to PayJoin for preserving the privacy of those transactions.

PayJoin (also called P2EP) is a relatively new way to send private transactions in bitcoin and may offer better privacy than current popular alternatives such as CoinJoin. Having BTCPay on board gives PayJoin a major boost in recognition that could translate into broader use of the privacy technology by other firms.

BTCPay developer Andrew Camilleri told CoinDesk the company plans to release an "initial" version of the P2EP privacy feature built into BTCPay on Thursday. He and BTCPay lead developer Nicholas Dorier have been the main contributors to the code.

Open source BTCPay is used by a range of merchants as a way of accepting bitcoin and lightning payments.

"Our mission is financial sovereignty for everyone and PayJoin is a great tool to help break blockchain analysis heuristics and achieve that. Since BTCPay is so widely used, it should help jumpstart usage," Camilleri told CoinDesk.

The work has been sponsored by Blockstream for the past several months to help Camilleri focus on the PayJoin changes.

"We're hoping to improve the privacy and fungibility of bitcoin by accelerating the adoption of P2EP. If enough wallets and businesses support P2EP, it could provide the critical mass needed to achieve widespread financial privacy," said Blockstream Chief Strategy Officer Samson Mow.

Not as private

CoinJoin is the main privacy tool used these days, in part because it is used by the wallets Wasabi and Samourai, making it much easier for people to use.

CoinJoin allows multiple people to mix their bitcoin transactions together, making it less obvious who owns which bitcoin. While it helps users to maintain their privacy, one of the main issues is it's easy to see when a bunch of users have done a CoinJoin simply by looking at the blockchain.

Bitcoin researcher Paul Sztorc likened the technology to "wearing a ski mask to an indoor mall."
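Sztorc's point, that the mixing itself is visible even when ownership inside it is hidden, can be pictured with a toy model (illustrative Python only; real CoinJoins build actual Bitcoin transactions, not dictionaries):

```python
# Toy CoinJoin: several users' inputs combined into one transaction
# whose equal-denomination outputs hide who owns which coin.
# Illustrative only; not real Bitcoin transaction serialization.

def coinjoin(inputs_by_user: dict[str, float], denom: float) -> dict:
    """Merge users' funds into one tx with equal 'mixed' outputs."""
    outputs = []
    for user, amount in inputs_by_user.items():
        n = int(amount // denom)
        outputs += [denom] * n        # indistinguishable mixed outputs
        change = amount - n * denom   # change outputs still leak some linkage
        if change > 0:
            outputs.append(change)
    return {"inputs": list(inputs_by_user.values()), "outputs": outputs}

tx = coinjoin({"alice": 0.25, "bob": 0.31, "carol": 0.10}, denom=0.1)
# The run of identical 0.1 outputs is precisely what makes a CoinJoin
# easy to spot on-chain, even though ownership within it is obscured.
mixed_outputs = tx["outputs"].count(0.1)
```

The uniform denominations are the "ski mask": they protect the wearer while announcing that something is being hidden.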

The main benefit of PayJoin's CoinJoin implementation, on the other hand, is that, once done, the transactions look the same as other transactions on the Bitcoin blockchain.

So instead of many senders mixing their transactions, only the sender and receiver mix a transaction.


It "breaks blockchain analysis heuristics," Camilleri said. Blockchain analytics companies are able to glean certain transaction criteria to guess (often correctly) if bitcoins belong to the same owner, or to see if the transaction was a part of a CoinJoin.
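One such heuristic, the common-input-ownership assumption (addresses spent together in one transaction are presumed to share an owner), can be sketched as follows; this is an illustrative simplification with hypothetical address labels, not any firm's actual tooling:

```python
# Sketch of the common-input-ownership heuristic used by chain-analysis
# firms: inputs spent together are assumed to belong to the same owner,
# and clusters are merged transitively across transactions.

def cluster_owners(transactions: list[list[str]]) -> list[set[str]]:
    """Merge address sets that ever co-spend in the same transaction."""
    clusters: list[set[str]] = []
    for tx_inputs in transactions:
        merged = set(tx_inputs)
        remaining = []
        for c in clusters:
            if c & merged:
                merged |= c          # transitive merging across txs
            else:
                remaining.append(c)
        remaining.append(merged)
        clusters = remaining
    return clusters

# Ordinary spends: A1+A2 together, then A2+A3 -> one presumed owner.
plain = cluster_owners([["A1", "A2"], ["A2", "A3"]])
# A PayJoin spend pairs the sender's A1 with the merchant's M1 in one
# transaction, so the heuristic wrongly fuses two distinct owners.
payjoin = cluster_owners([["A1", "A2"], ["A1", "M1"]])
```

This is the sense in which PayJoin "breaks" the heuristic: the assumption itself stops being reliable once receivers routinely contribute inputs.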

"Bitcoin's our chance for a logical and fair form of money. Companies that offer services that enable others to discriminate are essentially destroying that chance," Camilleri said.

One disadvantage, however, is both the sender and receiver have to support PayJoin.

"Merchant payment processor support for P2EP made perfect sense. P2EP requires the sender and receiver to both be online. If you're sending, you're naturally online, and merchants have to be online all the time," Mow said.

What's next

PayJoin has been around since 2018, but not a lot of services have added support for it yet. Both the sender and receiver need to support the standard, but most wallets don't support it right now.

"The current active implementations only allow you to do PayJoins between the same wallets, which is a bit too restrictive for widespread usage. There's nothing stopping any wallet or service from adding support for a universal PayJoin protocol now," Camilleri said.

This is one problem the project Snowball is trying to solve by creating code allowing for PayJoin transactions that can be easily added to any bitcoin wallet. The developers behind it plan to eventually open "pull requests" with suggested code to popular bitcoin wallets, to help get the ball rolling by encouraging them to adopt the privacy feature, and making it as easy as possible to do so.

Blockstream plans to further spur adoption of PayJoin. For now, it is working on adding PayJoin support to the bitcoin wallet Blockstream Green.

"The next interesting step would be for an exchange to support P2EP. Ultimately we need to make a choice on what kind of world we want to live in, one where there is financial privacy or one where there isn't," Mow said.

"Money needs to be private and fungible in order for it to be a 'good' money," he added. "With bitcoin, every transaction is open for anyone to see, so we still have a lot of work to do to get it there. Without privacy and fungibility, money can be used as a tool for oppression or financial surveillance. Bitcoin is the future of money and the future of money shouldn't be Orwellian."

The leader in blockchain news, CoinDesk is a media outlet that strives for the highest journalistic standards and abides by a strict set of editorial policies. CoinDesk is an independent operating subsidiary of Digital Currency Group, which invests in cryptocurrencies and blockchain startups.


YouTube Bans Bitcoin (BTC) and Crypto Analyst Tone Vays, Removes All Videos and Terminates Account – The Daily Hodl

Prominent Bitcoin and crypto analyst Tone Vays says his YouTube account has been terminated.

Vays, a former Bear Stearns risk analyst and vice president at JP Morgan Chase, says his account was taken down shortly after he received a warning on one of his videos.

The popular trader alerted his 201,000 followers on Twitter on Thursday, and the former home of his YouTube channel now displays a 404 Not Found error.

So that escalated quickly. From a single video warning to the ENTIRE CHANNEL being taken down by YouTube. If @TeamYouTube can't resolve this, might be a career change in the near future. Will just trade during the day and have a nutrition @ytcreators channel at night.

I highly doubt any review by @TeamYouTube took place, as the YouTube channel got taken down an hour after the initial warning and 30 minutes after my appeal. But if this is it for my @ytcreators channel, then it is what it is; was fun while it lasted.

Cryptocurrency supporters on YouTube have reported an increasing number of issues on the platform in recent months, with videos being removed and channels receiving strikes without warning.

In December, hundreds of Bitcoin (BTC) and crypto-related videos were suddenly removed in a sweeping crackdown across dozens of channels.

However, those videos were eventually restored, and a YouTube spokesperson said the world's leading video-sharing platform had made the wrong call by flagging content from crypto YouTube creators as harmful.

So far, YouTube has not commented on exactly why it terminated Vays's account, saying only that he violated the platform's terms of service.



This Nobel-Winning Economist Predicted Bitcoin’s Formidable Rise in 1991 – Bitcoinist

Almost 30 years ago, Nobel Prize-winning American economist Milton Friedman said he would like to have money controlled by a computer. He also said it would be a better world without the Federal Reserve. One of his two desires is already happening in the form of Bitcoin. In fact, he seems to have predicted its formidable rise in 1991. And with the Fed's incessant money printing drawing growing criticism, is it only a question of time before the other one comes true as well?

It's almost eerie watching this short clip in which Friedman appears to talk about Bitcoin, an invention that would come some 18 years later.

As the main advocate opposing the Keynesian government policies in place today, Friedman promoted a macroeconomic viewpoint known as monetarism. Rather than the Fed stepping in and printing money as it sees fit, he argued that there should be a slow and steady expansion of the money supply.

With the historic bailouts and QE we see going on today in response to the coronavirus pandemic, Friedman would probably be turning in his grave. He expressed his desire back in 1991 to have money controlled by a computer, so that no one could interfere and adjust the policy at will.

The video clip was posted on the Bitcoin Twitter channel and naturally garnered scores of likes, retweets, and applause. Some of the comments said:

He would prolly be all in with Bitcoin if he was still alive

And others stated:

We'll make it happen, Milton

Of course, with Bitcoin being the most successful experiment yet in tamper-proof decentralized money running across computers (nodes), it's easy to forget that there were precursors to Bitcoin.

David Chaum released DigiCash in 1989 which made use of cryptography for private payments and introduced the concept of public and private keys. The project garnered support from libertarians and small groups in favor of a digital currency that could be transferred internationally free from government control.

While DigiCash and other pre-Bitcoin projects failed to gain traction, Friedman was no stranger to the fact that there was a need for electronic money. He believed it would happen in the future. In fact, in that same year, he said:

One thing we are still lacking, and will soon develop, is reliable e-cash: a method by which money can be transferred from A to B on the Internet without A knowing B and vice versa.

With Bitcoin proposing a viable alternative to fiat, entirely free from central actors, will Friedman's second desire come true as well? Will the Fed be removed completely? It's going to be interesting to see how things unfold.


Bitcoin and the Folly of the Safe-Haven Trade – Traders Magazine

By Philippe Bekhazi, Co-founder and CEO of XBTO Group.

While we are by no means back to normalcy, and the timetable is still murky as to when we will resume business as usual, enough time has passed to hold a mirror up to how bitcoin responded to the initial coronavirus shock and the most pronounced market turmoil.

Firstly, any crisis, whether induced by biological, geopolitical or financial shocks, always results in a crisis of liquidity. Period. Peter sells some stock to pay for a loss on crypto, or vice versa. Paul sells high-yield bonds to buffer up MBS positions. Or investors simply sell winners and go to cash and short-term Treasury instruments.

The vicious cycle spins and the same horror movie plays, as over-levered investors (mainly denominated in USD) need to sell indiscriminately and across asset classes. The rush to liquidity and the comfort of cash drive all correlations to one, despite promises and proselytization that some instruments are immune to this inevitable selling tsunami.

Therefore, in the most recent installment, when the VIX went to 80 and global uncertainty and fear on steroids reigned supreme, those over-levered pools of capital got flushed out again and red sprayed across our trading screens, regardless of the fundamental constructs and theses underpinning any asset classes or individual names. So, to hammer home the point once again: in any risk-off environment there is no safe haven, especially not one traded as thinly and with as much speculation as bitcoin.

Holding the x-ray up to the light

Let's take an objective look at the state of bitcoin, how it fared during the peak of the crisis, and its role in a portfolio anchored by a long-term view and mandate.

Firstly, bitcoin is not a deep market at all; with a market cap of around $120 billion, it equates to that of Tesla (TSLA). It is also currently driven, in the short term, more by over-levered speculators than by long-term holders, hence its volatility, which (while gravely exacerbated by the crisis) had already vacillated with large peak-to-trough moves from $10,000 to $3,500 (on some exchanges) in 2020 alone, before coronavirus was even a driving factor. In contrast to a safe-haven cushion that would zig while the markets zagged, bitcoin dropped over 50% while the S&P dropped 30%.
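The peak-to-trough language here corresponds to a standard maximum-drawdown calculation; the price series below uses round illustrative numbers matching the figures cited, not actual 2020 tick data:

```python
# Maximum drawdown: the largest fractional fall from any running peak.

def max_drawdown(prices: list[float]) -> float:
    """Worst peak-to-trough decline over the series, as a fraction."""
    peak, worst = prices[0], 0.0
    for p in prices:
        peak = max(peak, p)                      # track the running high
        worst = max(worst, (peak - p) / peak)    # decline from that high
    return worst

# Illustrative 2020-style swing: $10,000 high down to a $3,500 trough.
btc = [10000, 9000, 10000, 7900, 3500, 5000]
dd = max_drawdown(btc)   # a 65% drawdown, far deeper than the S&P's ~30%
```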

While many speculators who employed reckless amounts of leverage have indeed been flushed out, savvy trading outfits and long-term investors continue to hold. They do so for the same fundamental reasons they built exposure in the first place, a thesis that has been starkly reinforced by unprecedented central bank action worldwide, including the US Fed's QE-to-infinity stance.

So, for the true believers in bitcoin, not only have the fundamentals not changed, but they have become more pronounced and engraved into the investment psyche. Moreover, in contrast to erratic and impossible-to-predict monetary policies, there are more knowns within the bitcoin protocol, and no ability to fire up the printing presses at will. On the contrary, a halving in mid-May will mean reduced availability and greater scarcity as the block reward falls from 12.5 BTC per block to 6.25 BTC (a block is mined roughly every 10 minutes), so a greater worth is placed on each unit, or coin, similar to how reduced mining equates to higher demand for existing reserves of gold, also touted as an asset to hold in unhinged times.
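The halving arithmetic above is easy to check: at roughly one block every 10 minutes, the daily flow of new coins halves along with the block reward:

```python
# Daily bitcoin issuance before and after the block-reward halving.
# One block is mined roughly every 10 minutes.

BLOCKS_PER_DAY = 24 * 60 // 10   # about 144 blocks per day

def daily_issuance(block_reward_btc: float) -> float:
    """New BTC entering circulation per day at a given block reward."""
    return BLOCKS_PER_DAY * block_reward_btc

before = daily_issuance(12.5)   # BTC/day before the halving
after = daily_issuance(6.25)    # BTC/day after: supply growth halves
```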

Bitcoin versus Gold

Now that we have morphed from an initial liquidity crisis into the potential next phase of a crisis (usually a credit crisis, but it could take other forms), we can talk more rationally about safe-haven assets and their long-term role in providing diversification and a non-correlating return stream to traditional equity and bond portfolios.

Many talking heads myopically preach either gold or bitcoin as the sole answer in dire times, often taking to public forums to attack each other and create a schism between the camps. As noted, neither acts as promised (at least initially), and there is no need to play a zero-sum game here. One could argue the merits of holding both as diversification tools and alternate stores of value, each with its own idiosyncratic benefits and use cases.

While gold certainly has a deep history narrating its utility and potential worth, bitcoin has more attractive, contemporary features, such as no physical delivery, no storage, and greater immutability and real-life payment applications: all valid reasons why many have made the case for it as a replacement for gold.

My view is that institutional allocators and stewards of capital should have exposure to both, placing at least 50% of their current gold holdings (usually 1-2% of an overall portfolio) into bitcoin. Those taking a 10-year outlook will see the non-correlation benefits to their portfolios and their participation (alongside the continued weeding out of weak players) will also smooth out volatility as larger tickets and block trades counterbalance shorter-term trading strategies.
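Worked through in rough numbers (a sketch of the suggestion above, not investment advice; the 2% starting weight is an assumed example from the stated 1-2% range):

```python
# The suggested reallocation: shift at least half of an existing gold
# sleeve (typically 1-2% of a portfolio) into bitcoin.

def suggested_btc_allocation(gold_weight: float,
                             shift: float = 0.5) -> tuple[float, float]:
    """Return (new_gold_weight, btc_weight) after moving `shift` of gold."""
    btc = gold_weight * shift
    return gold_weight - btc, btc

# A 2% gold sleeve becomes 1% gold + 1% bitcoin under the 50% shift.
gold, btc = suggested_btc_allocation(0.02)
```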

While perhaps more evident to a newer generation embracing a more futuristic mindset, bitcoin also provides exposure to an underlying network effect, the value of which is not currently priced in. That network effect is built on the premise of greater decentralization, and on the utility and staying power of blockchain technologies and emerging tokens of value (stablecoins, security tokens, etc.) that will coalesce to eventually make the asset truly reflect, and catch up to, the sum of its parts.

What does not kill you makes you stronger

While the current crisis engulfing our daily lives and the global economy is eerily unique, many of the lessons learned from prior market disruptions and dislocations ring true.

Sophisticated traders and long-term institutional investors alike should be easing their way in and taking a nibble at this digital diversifier, especially against the macro backdrop of irreversible currency debasement.

Moreover, structural positives did arise from this latest stress test, with the crypto infrastructure holding up and proving its mettle across custody, trading and execution which all came together to function in a global 24/7 environment that is very different from traditional market machinations.

The ecosystem is evolving and getting stronger and with that we need to advance the mindset, rationale for investing, and pools of incoming capital to strengthen the asset class.

The views represented in this commentary are those of its author and do not reflect the opinion of Traders Magazine, Markets Media Group or its staff. Traders Magazine welcomes reader feedback on this column and on all issues relevant to the institutional trading community.


10 developer skills on the rise and five on the decline – TechCentral.ie


Here's how to ensure your programming chops stay sharp



Technology is constantly evolving and so, too, are the developer skills employers look for to make the most of what is emerging and what is solidifying its place in the enterprise.

As companies dive deeper into digital transformations and pivot to data-driven cultures, tech disciplines such as AI, machine learning, internet of things (IoT) and IT automation are driving organisations' technology strategies and boosting demand for skills with tools, such as Docker, Ansible and Azure, that will help companies innovate and stay competitive in rapidly changing markets.

"What we're seeing is companies developing internal skill maps within their developer organisations so they can see what skills they have, and where they need to grow," says Vivek Ravisankar, CEO and co-founder of HackerRank. "They're building these competency frameworks to find their skill gaps and then put in place training and education to close those."

Understanding which disciplines and skills are up-and-coming and which are fading can help both companies and developers ensure they have the right skills and knowledge to succeed. And what better way to find that out than to mine developer job postings.

Indeed.com analysed job postings using a list of 500 key technology skill terms to see which ones employers are looking for more these days and which are falling out of favour. Such research has helped identify cutting-edge skills over the past five years, with some previous years' risers now well established, thanks to explosive growth.

Docker, for one, has risen more than 4,000% in the past five years and was listed in more than 5% of all US tech jobs in 2019. IoT, as well, has shot up nearly 2,000% in the past half-decade, with Ansible (an IT automation, configuration management and deployment tool) and Kafka (a tool for building real-time data pipelines and streaming apps) showing similarly strong growth. And, of course, the rise of data science has also cemented high demand for a range of skills, including artificial intelligence, machine learning, and data analysis.

Developers looking to add new skills to their repertoire should pay close attention to the most recent upticks in skills demand that Indeed identified from September 2018 to September 2019, and to those falling out of favour, as outlined below. Each skill is accompanied by average annual salary information for developers who possess it, according to PayScale.com.

PyTorch is an open-source machine-learning library written in Python, C++ and CUDA. It is used for applications such as computer vision and natural language processing. While primarily developed by Facebook's AI Research Lab, it is offered free under the modified BSD license.

Rate of growth, 2018-2019: +138%

Average salary: $118,000

GraphQL is an open-source data query and manipulation language for APIs, and a runtime for fulfilling queries against existing data sets. GraphQL was originally developed for internal use by Facebook but was released for public use in 2015 under the GraphQL Foundation, hosted by the non-profit Linux Foundation. GraphQL supports reading, writing and subscribing to changes in data, and servers are available for multiple languages, including Haskell, JavaScript, Perl, Python, Ruby, Java, C#, Scala, Go, Elixir, Erlang, PHP, R and Clojure.

Rate of growth, 2018-2019: +80%

Average salary: $97,000

Kotlin is a cross-platform, statically typed, general-purpose programming language designed to interoperate with Java. The Java Virtual Machine (JVM) version of its standard library, in fact, depends on the Java Class Library, though Kotlin's syntax is more concise than Java's. In May 2019, Google announced that Kotlin is now its preferred language for Android developers; it has been included as an alternative to the standard Java compiler since the release of Android Studio 3.0 in 2017.

Rate of growth, 2018-2019: +76%

Average salary: $99,000

Vue is a progressive, incrementally adoptable JavaScript framework for building user interfaces on the Web. It allows users to extend HTML with attributes (called directives) that offer increased functionality to HTML applications through either built-in or user-defined directives.

Rate of growth, 2018-2019: +72%

Average salary: $116,000

.NET Core is a free, open-source, managed software framework for Windows, Linux and macOS. It is a cross-platform successor to Microsoft's proprietary .NET Framework and is released for use under the MIT License. It is primarily used in the development of desktop application software, AI/machine learning and IoT applications.

Rate of growth, 2018-2019: +71%

Average salary: $87,000

Formerly Looker Data Sciences, Looker is a data exploration and discovery business intelligence platform that was acquired by Google Cloud Platform in 2019. Looker's modelling language, LookML, enables data teams to define relationships in their database so business users can explore, save and download data without needing to know SQL. Looker was the first commercially available BI platform built for and aimed at massively parallel relational database management systems such as Amazon's Redshift, Google BigQuery, HP Vertica, Netezza and Teradata.

Rate of growth, 2018-2019: +68%

Average salary: $68,000

HashiCorp's Terraform is open-source infrastructure-as-code software that allows users to define and provision a data centre using the high-level HashiCorp Configuration Language (HCL) or JSON. Terraform supports a number of cloud infrastructure providers, including Amazon AWS, IBM Cloud, Google Cloud Platform, DigitalOcean, Microsoft Azure, and more.

Rate of growth, 2018-2019: +66%

Average salary: $104,000

Google's suite of cloud computing services runs on the same infrastructure used for Google's end-user products, and includes a set of management tools and modular cloud services such as computing, data storage, data analytics and machine learning. The platform provides infrastructure-as-a-service, platform-as-a-service and serverless computing environments to customers, as well as Google's App Engine, which allows for developing and hosting web applications in Google-managed data centres.

Rate of growth, 2018-2019: +62%

Average salary: $191,000

Originally designed by Google, Kubernetes (sometimes abbreviated as K8s) is an open-source container orchestration system for automating application deployment, scaling and management. Kubernetes provides a platform for application container automation, deployment, scaling and operation across clusters of hosts.

Rate of growth, 2018-2019: +61%

Average salary: $115,000

Spring Boot is an open-source, Java-based framework used to create microservices and to build stand-alone, production-ready Spring applications. Spring Boot is built on the Spring framework and gives developers a platform on which to jumpstart development of Spring applications. It uses pre-configured, injectable dependencies to speed up development and save developers time.

Rate of growth, 2018-2019: +58%

Average salary: $78,000

As fast as some tech skills rise, others fall. Five skills that dropped off significantly in the year between 2018 and 2019 are:

Firefox, the free, open-source web browser from the Mozilla Foundation, has seen its popularity wane in recent years; developers with these skills may also find they are not in demand.

Rate of growth, 2018-2019: -47%

Open source software from HashiCorp for building and maintaining portable virtual software development environments, Vagrant tries to simplify configuration management of virtual environments.

Rate of growth, 2018-2019: -41%

Skills related to Chrome, Google's web browser, have also decreased in popularity between 2018 and 2019.

Rate of growth, 2018-2019: -33%

Production of optics systems has seen a steep decline of late.

Rate of growth, 2018-2019: -33%

The Global System for Mobile Communications (GSM) is an older telecommunications standard for mobile phones, which could explain why it has decreased in popularity.

Rate of growth, 2018-2019: -26%

IDG News Service


View post:
10 developer skills on the rise and five on the decline - TechCentral.ie

Yes, Section 215 Expired. Now What? – EFF

On March 15, 2020, Section 215 of the PATRIOT Act, a surveillance law with a rich history of government overreach and abuse, expired. Along with two other PATRIOT Act provisions, Section 215 lapsed after lawmakers failed to reach an agreement on a broader set of reforms to the Foreign Intelligence Surveillance Act (FISA).

In the week before the law expired, the House of Representatives passed the USA FREEDOM Reauthorization Act without committee markup or floor amendments. The bill would have extended Section 215 for three more years, along with some modest reforms.

In order for any bill to become law, the House and Senate must pass an identical bill, and the President must sign it. That didn't happen with the USA FREEDOM Reauthorization Act. Instead, knowing that the vote to proceed with the House's bill in the Senate without debating amendments was going to fail, Senator McConnell brought a bill to the floor that would extend all the expiring provisions for another 77 days, without any reforms at all. Senator McConnell's extension passed the Senate without debate.

But the House of Representatives left town without passing Senator McConnell's bill, at least until May 12, 2020, and possibly longer. That means that Section 215 of the USA PATRIOT Act, along with the so-called lone wolf and roving wiretap provisions, has expired, at least for a few weeks.

EFF has argued that if Congress can't agree on real reforms to these problematic laws, they should be allowed to expire. While we are pleased that Congress didn't mechanically reauthorize Section 215, it is only one of a number of largely overlapping surveillance authorities. The loss of the current version of the law will still leave the government with a range of incredibly powerful tools. These include other provisions of FISA as well as surveillance authorities used in criminal investigations, many of which can include gag orders to protect sensitive information.

In addition, the New York Times and others have noted that Section 215's expiration clause contains an exception permitting the intelligence community to use the law for investigations that were ongoing at the time of expiration, or to investigate offenses or potential offenses that occurred before the sunset. Broad reliance on this exception would subvert Congress's intent to have Section 215 truly expire, and the Foreign Intelligence Surveillance Court should carefully, and publicly, circumscribe any attempt to rely on it.

Although Section 215 and the two other provisions have expired, that doesn't mean they're gone forever. For example, in 2015, during the debate over the USA FREEDOM Act, these same provisions were also allowed to expire for a short period of time, and then Congress reauthorized them for another four years. While transparency is still lacking in how these programs operate, the intelligence community did not report a disruption in any of these critical programs at that time. If Congress chooses to reauthorize these programs in the next couple of months, it's unlikely that this disruption will have a lasting impact.

The Senate plans to vote on a series of amendments to the House-passed USA FREEDOM Reauthorization Act in the near future. Any changes made to the bill would then have to be approved by the House and signed by the President. This means that Congress has the opportunity to discuss whether these authorities are actually needed, without the pressure of a ticking clock.

As a result, the House and the Senate should take this unique opportunity to learn more about these provisions and create additional oversight into the surveillance programs that rely on them. The expired provisions should remain expired until Congress enacts the additional, meaningful reforms we've been seeking.

You can read more about what EFF is calling for when it comes to reining in NSA spying, reforming FISA, and restoring Americans privacy here.

Link:
Yes, Section 215 Expired. Now What? - EFF

Si2 Launches Survey on Artificial Intelligence and Machine Learning in EDA – AiThority

Silicon Integration Initiative has launched an industry-wide survey to identify planned usage and structural gaps for prioritizing and implementing artificial intelligence and machine learning in semiconductor electronic design automation.


The survey is organized by a recently formed Si2 Special Interest Group chaired by Joydip Das, senior engineer, Samsung Electronics, and co-chaired by Kerim Kalafala, senior technical staff member, EDA, and master inventor, IBM. The 18-member group will identify where industry collaboration will help eliminate deficiencies caused by a lack of common languages, data models, labels, and access to robust and categorized training data.


This SIG is open to all Si2 members. Current members include:

Advanced Micro Devices, ANSYS, Cadence Design Systems, Hewlett Packard Enterprise, IBM, Intel, Intento Design, Keysight Technologies, Mentor (a Siemens business), NC State University, PFD Solutions, Qualcomm, Samsung Electronics, Sandia National Laboratories, Silvaco, Synopsys, Thrace Systems and Texas Instruments.

The survey is open from April 15 to May 15.

Leigh Anne Clevenger, Si2 senior data scientist, said that the survey results would help prioritize SIG activities and timelines. "The SIG will identify and develop requirements for standards that ensure data and software interoperability, enabling the most efficient design flows for production," Clevenger said. "The ultimate goal is to remove duplicative work and the need for data model translators, and focus on opening avenues for breakthroughs from suppliers and users alike."


"High manufacturing costs and the growing complexity of chip development are spurring disruptive technologies such as AI and ML," Clevenger explained. "The Si2 platform provides a unique opportunity for semiconductor companies, EDA suppliers and IP providers to voice their needs and focus resources on common solutions, including enabling and leveraging university research."

View original post here:
Si2 Launches Survey on Artificial Intelligence and Machine Learning in EDA - AiThority

New AI improves itself through Darwinian-style evolution – Big Think

Machine learning has fundamentally changed how we engage with technology. Today, it's able to curate social media feeds, recognize complex images, drive cars down the interstate, and even diagnose medical conditions, to name a few tasks.

But while machine learning technology can do some things automatically, it still requires a lot of input from human engineers to set it up, and point it in the right direction. Inevitably, that means human biases and limitations are baked into the technology.

So, what if scientists could minimize their influence on the process by creating a system that generates its own machine-learning algorithms? Could it discover new solutions that humans never considered?

To answer these questions, a team of computer scientists at Google developed a project called AutoML-Zero, which is described in a preprint paper published on arXiv.

"Human-designed components bias the search results in favor of human-designed algorithms, possibly reducing the innovation potential of AutoML," the paper states. "Innovation is also limited by having fewer options: you cannot discover what you cannot search for."

Automatic machine learning (AutoML) is a fast-growing area of deep learning. In simple terms, AutoML seeks to automate the end-to-end process of applying machine learning to real-world problems. Unlike other machine-learning techniques, AutoML requires relatively little human effort, which means companies might soon be able to utilize it without having to hire a team of data scientists.

AutoML-Zero is unique because it uses simple mathematical concepts to generate algorithms "from scratch," as the paper states. Then, it selects the best ones, and mutates them through a process that's similar to Darwinian evolution.

AutoML-Zero first randomly generates 100 candidate algorithms, each of which then performs a task, like recognizing an image. The performance of these algorithms is compared to hand-designed algorithms. AutoML-Zero then selects the top-performing algorithm to be the "parent."

"This parent is then copied and mutated to produce a child algorithm that is added to the population, while the oldest algorithm in the population is removed," the paper states.

The system can create thousands of populations at once, which are mutated through random procedures. Over enough cycles, these self-generated algorithms get better at performing tasks.
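Stripped of the program-search machinery, the loop described above is a standard regularized-evolution scheme and can be sketched in a few lines of Python. This toy version evolves plain numbers rather than algorithms; the population size, tournament size, mutation scale and fitness function below are invented for illustration and are not the paper's settings.

```python
import random

# Toy regularized-evolution loop in the spirit of the paper's description
# (illustrative only: AutoML-Zero evolves whole programs, while this sketch
# evolves a single number toward a hidden target).
TARGET = 3.14159

def fitness(candidate):
    return -abs(candidate - TARGET)  # higher is better

def mutate(candidate):
    return candidate + random.gauss(0.0, 0.1)

random.seed(0)
population = [random.uniform(-10.0, 10.0) for _ in range(100)]  # 100 random candidates

for _ in range(5000):
    sample = random.sample(population, 10)  # evaluate a random subset
    parent = max(sample, key=fitness)       # best performer becomes the parent
    population.append(mutate(parent))       # copied-and-mutated child joins
    population.pop(0)                       # the oldest member is removed

best = max(population, key=fitness)
print(f"best candidate after evolution: {best:.2f}")
```

The key detail, as in the paper, is that the oldest member is removed rather than the worst, which keeps the search from stagnating on early winners.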

"The nice thing about this kind of AI is that it can be left to its own devices without any pre-defined parameters, and is able to plug away 24/7 working on developing new algorithms," Ray Walsh, a computer expert and digital researcher at ProPrivacy, told Newsweek.

If computer scientists can scale up this kind of automated machine-learning to complete more complex tasks, it could usher in a new era of machine learning where systems are designed by machines instead of humans. This would likely make it much cheaper to reap the benefits of deep learning, while also leading to novel solutions to real-world problems.

Still, the recent paper was a small-scale proof of concept, and the researchers note that much more research is needed.

"Starting from empty component functions and using only basic mathematical operations, we evolved linear regressors, neural networks, gradient descent... multiplicative interactions. These results are promising, but there is still much work to be done," the scientists' preprint paper noted.


Continued here:
New AI improves itself through Darwinian-style evolution - Big Think

Research Team Uses Machine Learning to Track Covid-19 Spread in Communities and Predict Patient Outcomes – The Ritz Herald

The COVID-19 pandemic is raising critical questions regarding the dynamics of the disease, its risk factors, and the best approach to address it in healthcare systems. MIT Sloan School of Management Prof. Dimitris Bertsimas and nearly two dozen doctoral students are using machine learning and optimization to find answers. Their effort is summarized on the COVIDanalytics platform, where their models generate accurate real-time insight into the pandemic. The group is focusing on four main directions: predicting disease progression, optimizing resource allocation, uncovering clinically important insights, and assisting in the development of COVID-19 testing.

"The backbone for each of these analytics projects is data, which we've extracted from public registries, clinical Electronic Health Records, as well as over 120 research papers that we compiled in a new database. We're testing our models against incoming data to determine if they make good predictions, and we continue to add new data and use machine learning to make the models more accurate," says Bertsimas.

The first project addresses dilemmas at the front line, such as the need for more supplies and equipment. Protective gear must go to healthcare workers and ventilators to critically ill patients. The researchers developed an epidemiological model to track the progression of COVID-19 in a community, so hospitals can predict surges and determine how to allocate resources.
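The article doesn't give the team's equations, but models that track a disease's progression through a community are typically compartmental. A minimal SEIR-style sketch in Python, with made-up parameters (beta, sigma, gamma are not the MIT group's values), looks like this:

```python
# Minimal SEIR compartmental model (generic illustration; the article does
# not describe the MIT team's actual formulation or parameters).
# S, E, I, R are population fractions: Susceptible, Exposed (incubating),
# Infectious, and Recovered.
def simulate_seir(beta=0.5, sigma=0.2, gamma=0.1, days=160, i0=1e-4):
    s, e, i, r = 1.0 - i0, 0.0, i0, 0.0
    infectious = []
    for _ in range(days):           # one Euler step per day
        new_exposed = beta * s * i  # transmission to susceptibles
        new_infectious = sigma * e  # incubation period ends
        new_recovered = gamma * i   # recovery or removal
        s -= new_exposed
        e += new_exposed - new_infectious
        i += new_infectious - new_recovered
        r += new_recovered
        infectious.append(i)
    return infectious

curve = simulate_seir()
peak_day = max(range(len(curve)), key=lambda d: curve[d])
print(f"infectious fraction peaks at {max(curve):.2f} on day {peak_day}")
```

The peak of the infectious curve is what a hospital system would read off to anticipate a surge and plan its resource allocation.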

The team quickly realized that the dynamics of the pandemic differ from one state to another, creating opportunities to mitigate shortages by pooling some of the ventilator supply across states. Thus, they employed optimization to see how ventilators could be shared among the states and created an interactive application that can help both the federal and state governments.
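The article doesn't specify the formulation, but pooling a scarce resource across regions is naturally expressed as a linear program. A minimal sketch with SciPy, using invented supply and demand figures for three states (not real data):

```python
import numpy as np
from scipy.optimize import linprog

# Toy version of pooling ventilators across states as a linear program
# (illustrative; the supply/demand numbers and the formulation are invented).
supply = np.array([120.0, 40.0, 80.0])   # ventilators available per state
demand = np.array([60.0, 100.0, 70.0])   # ventilators needed per state
n = len(supply)

# Variables: one transfer t[src->dst] per ordered pair, then one shortage
# variable per state. Objective: minimise total unmet demand.
pairs = [(i, j) for i in range(n) for j in range(n) if i != j]
nt = len(pairs)
c = np.concatenate([np.zeros(nt), np.ones(n)])

A_ub, b_ub = [], []
# Coverage: supply[j] - outflow[j] + inflow[j] + shortage[j] >= demand[j]
for j in range(n):
    row = np.zeros(nt + n)
    for k, (src, dst) in enumerate(pairs):
        if dst == j:
            row[k] = -1.0   # inflow helps meet demand
        if src == j:
            row[k] = 1.0    # outflow reduces local stock
    row[nt + j] = -1.0
    A_ub.append(row)
    b_ub.append(supply[j] - demand[j])
# A state cannot ship out more ventilators than it has.
for i in range(n):
    row = np.zeros(nt + n)
    for k, (src, dst) in enumerate(pairs):
        if src == i:
            row[k] = 1.0
    A_ub.append(row)
    b_ub.append(supply[i])

res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub))
print(f"total unmet demand after pooling: {res.fun:.0f}")
```

With these toy numbers the pooled system covers all demand, whereas the second state on its own would be 60 ventilators short.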

"Different regions will hit their peak number of cases at different times, meaning their need for supplies will fluctuate over the course of weeks. This model could be helpful in shaping future public policy," notes Bertsimas.

Recently, the researchers connected with long-time collaborators at Hartford HealthCare to deploy the model, helping the network of seven campuses to assess their needs. Coupling county level data with the patient records, they are rethinking the way resources are allocated across the different clinics to minimize potential shortages.

The third project focuses on building a mortality and disease progression calculator to predict whether someone has the virus, and whether they need hospitalization or even more intensive care. Bertsimas points out that current advice for patients is at best based on age and perhaps some symptoms. As data about individual patients is limited, the model uses machine learning based on symptoms, demographics, comorbidities and lab test results, as well as a simulation model to generate patient data. Data from new studies is continually added to the model as it becomes available.
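A risk calculator of the kind described is, at its core, a supervised classifier over patient features. The sketch below trains one on entirely synthetic records; the features, coefficients and outcome rule are invented for illustration and are not clinical findings from the MIT study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Generic sketch of a clinical risk calculator on SYNTHETIC data
# (the relationships below are made up, not medical fact).
rng = np.random.default_rng(0)
n = 2000
age = rng.uniform(20, 90, n)
comorbidities = rng.integers(0, 4, n)  # 0-3 pre-existing conditions
lab_value = rng.normal(10.0, 5.0, n)   # a stand-in lab test result
X = np.column_stack([age, comorbidities, lab_value])

# Invented ground truth: risk rises with age, comorbidities, and the lab value.
logit = 0.05 * (age - 60) + 0.8 * comorbidities + 0.05 * (lab_value - 10) - 1.0
y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
```

In the team's setting, the predicted probability (rather than a hard label) is what would guide a decision about hospitalization or intensive care.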

"We started with data published in Wuhan, Italy, and the U.S., including infection and death rates as well as data coming from patients in the ICU and the effects of social isolation. We enriched them with clinical records from a major hospital in Lombardy, which was severely impacted by the spread of the virus. Through that process, we created a new model that is quite accurate. Its power comes from its ability to learn from the data," says Bertsimas.

"By probing the severity of the disease in a patient, it can actually guide clinicians in congested areas in a much better way," says Bertsimas.

Their fourth project involves creating a convenient test for COVID-19. Using data from about 100 samples from Morocco, the group is using machine-learning to augment a test previously designed at the Mohammed VI Polytechnic University to come up with more precise results. The model can accurately detect the virus in patients around 90% of the time, while false positives are low.

The team is currently working on expanding the epidemiological model to a global scale, creating more accurate and informed clinical risk calculators, and identifying potential paths back to normality.

"We have released all our source code and made the public database available for other people too. We will continue to do our own analysis, but if other people have better ideas, we welcome them," says Bertsimas.

Original post:
Research Team Uses Machine Learning to Track Covid-19 Spread in Communities and Predict Patient Outcomes - The Ritz Herald