Assange would be held in darkest corner of the prison system if extradited to the US – World Socialist Web Site

By Oscar Grenfell, 1 February 2020

Representatives of the Don't Extradite Assange (DEA) organisation revealed last week that if Julian Assange is extradited from Britain to the US, he will be held in almost total isolation and subjected to draconian conditions usually inflicted on those convicted of terror offences.

The information, which was sourced from official US court filings, was relayed in a statement by John Rees outside Westminster Magistrates' Court after Assange's most recent case management hearing on January 22. Rees said that Assange would be placed under Special Administrative Measures as soon as he arrives in the US, and prior to any trial.

WikiLeaks editor-in-chief Kristinn Hrafnsson said the US State Department had indicated that the First Amendment, the central US constitutional protection for free speech and freedom of the press, would not apply to Assange, despite the fact that he has been charged under domestic American law.

Taken together, the revelations damn the attempt to extradite Assange to the US, where he faces espionage charges and the prospect of life imprisonment, or even the death penalty, for exposing US war crimes, as an extraordinary rendition operation. In violation of fundamental precepts of international law, Assange, a journalist and publisher, will be treated as a national security threat and deprived of his fundamental democratic rights.

Special Administrative Measures (SAMs) were introduced by the Democratic Party administration of Bill Clinton in 1996. They were enacted with bipartisan support in the wake of the right-wing terrorist bombing in Oklahoma City.

SAMs provide for the intensive monitoring and isolation of prisoners already in solitary confinement, on the pretext of preventing any threats to national security, including violent acts and disclosures of classified information. The already draconian measures were expanded in the aftermath of the 9/11 terrorist attacks in 2001, including granting authorities the right to spy on prisoners' privileged attorney-client conversations.

A 2017 report by the Allard K. Lowenstein Human Rights Clinic at Yale Law School and the Center for Constitutional Rights (CCR) described SAMs as "the darkest corner of the US federal prison system."

The report explained that SAMs combine "the brutality and isolation of maximum security units with additional restrictions that deny individuals almost any connection to the human world." They prohibit prisoners who live under them "from contact or communication with all but a handful of approved individuals, and impose a second gag on even those few individuals. The net effect is to shield this form of torture in our prisons from any real public scrutiny."

Underscoring the intensity of the US vendetta against Assange, there were just 51 SAMs prisoners in 2017, out of a federal prison population of more than 183,000. Most had been convicted of terror-related offences and were held at ADX Florence, a supermax prison in Colorado. The facility has been described as "a clean version of hell" by one of its former wardens, Robert Hood.

Prisoners held under SAMs are denied even the narrow avenues of indirect communication (through sink drains or air vents) available to prisoners in solitary confinement. They are generally held in single cells for all but 10 hours a week. Their recreation hours are spent alone in a confined space with few or no amenities.

SAMs detainees are only allowed to communicate with lawyers and relatives who have been screened by authorities, including the intelligence agencies. All outbound and incoming mail is read by the Federal Bureau of Investigation (FBI).

The Yale-CCR report presented case studies of prisoners at ADX Florence who had to wait months before their letters to relatives were cleared for sending. Visitation rights are also extremely curtailed, and visits are monitored by the FBI.

The report's authors bluntly stated that pretrial prisoners were placed under SAMs with the aim of compelling them to plead guilty, fundamentally undermining the presumption of innocence.

"The coercive nature and harsh conditions placed on pretrial SAMs detainees was no accident: experience shows that the DOJ uses total isolation as a tool to break people, just as the CIA did during its foray into detention," the report states.

Because SAMs prisoners are barred from communicating with the outside world, and are denied any information, they are effectively prevented from participating in their own defence.

One attorney cited in the report stated that SAMs dehumanise defendants and create a situation where they cannot exist in a defiant posture to fight the case, serving to eliminate them as participants in their defence. Another noted that SAMs prisoners are expected to give testimony before a jury, after having been prevented from speaking to anyone for months, or even years.

SAMs prisoners have no access to the internet and when they receive newspapers, weeks after publication, they arrive with substantial redactions. In many cases, they are arbitrarily prevented from receiving reading materials. In one incident recounted in the report, the authorities prevented a prisoner at ADX Florence from getting books by former President Barack Obama, on the grounds that it would threaten national security.

SAMs prisoners are also barred from speaking to reporters, or to anyone other than their attorneys and FBI-approved family visitors. Lawyers are also gagged from relaying anything said by their clients, or even from talking about the conditions they face. If they violate these draconian conditions, which are aimed at suppressing any discussion of their clients' plight and any attempt to win public support, they face criminal prosecution.

In 2005, the noted civil rights attorney Lynne Stewart and her Arabic interpreter were convicted of conspiracy and of providing material support to terrorists, after publicly releasing statements from her client, Sheikh Omar Abdel-Rahman. Stewart was sentenced to a decade in prison and was released early only on compassionate grounds, in the late stages of her terminal cancer.

The authorities can also spy on private communications between lawyers and their SAMs clients. Under official regulations, this material supposedly cannot be provided to prosecuting authorities. However, the ability of the state to monitor defence strategies effectively erodes the Fifth Amendment right to due process and the Sixth Amendment right to counsel.

Lawyers quoted by the Yale-CCR report, moreover, disclosed they had been placed under pervasive government surveillance while representing SAMs prisoners, including being placed on airport watch lists. Such measures are aimed at intimidating attorneys and preventing SAMs inmates from receiving legal counsel.

The report documents the harrowing conditions faced by convicted inmates, who are afflicted with psychological disorders after years of isolation under SAMs. In a number of cases, prisoners had entered an almost catatonic state, which prevented them from communicating, or carrying out any activities, including reading.

The report continues: "Physical conditions are similarly inhumane at pre-trial facilities where SAMs detainees are held, that is, facilities designed to hold individuals who have been charged with, but not convicted of, a crime. Conditions at the Metropolitan Correctional Center (MCC) in Manhattan, where defendants charged with terrorism-related offenses are often held pre-trial, are particularly harsh. Detainees in the MCC's 10 South, where high-level defendants, including those under SAMs, are held, have little natural light and no possibility for outdoor recreation. Recreational time is provided in a closed room identical to the detainee's cell. Unable to open windows or spend time outdoors, detainees in 10 South have no access to fresh air."

The SAMs measures would be compounded by the fact that Assange would appear before the US District Court for the Eastern District of Virginia, the government's preferred venue for national security cases because it is located close to the Pentagon and the CIA, in the area with the largest concentration of intelligence agency employees in the US. It has registered a conviction rate of more than 98 percent in such trials.

Assange has already endured almost a decade of US-led persecution. He was arbitrarily detained in Ecuador's London embassy for almost seven years, as a result of British threats to arrest him if he set foot outside the tiny building. Since being dragged out of the embassy by British police on April 11, 2019, Assange has been held in the maximum-security Belmarsh Prison, where his health has continued to deteriorate, to the point that the UN Special Rapporteur on Torture has warned that he might die.

The revelation that Assange would be placed under SAMs makes clear that his extradition to the US would be nothing less than a death sentence. In a 2015 interview, Assange himself warned that if extradited, he would likely be subjected to SAMs, which he described as "a sort of living death."

The lawless character of the US attempt to prosecute Assange underscores the necessity for workers, students, young people and all defenders of democratic rights to prevent his extradition. The Socialist Equality Parties in Britain and Australia have announced meetings and rallies next month, coinciding with the beginning of the extradition hearing, to galvanise the widespread support for Assange into a political movement to secure his freedom.



What Is Quantum Computing and How Does it Work? – Built In

Accustomed to imagining worst-case scenarios, many cryptography experts are more concerned than usual these days: one of the most widely used schemes for safely transmitting data is poised to become obsolete once quantum computing reaches a sufficiently advanced state.

The cryptosystem known as RSA provides the safety structure for a host of privacy and communication protocols, from email to internet retail transactions. Current standards rely on the fact that no one has the computing power to test every possible way to de-scramble your data once encrypted, but a mature quantum computer could try every option within a matter of hours.
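To make the stakes concrete, here is a toy RSA round trip in Python with deliberately tiny primes; this is a textbook sketch, not production cryptography, and the variable names are my own. The point it illustrates: the private exponent falls out immediately once n is factored, and factoring n is exactly the step Shor's algorithm would make tractable.

```python
# Toy RSA with deliberately tiny primes, for illustration only.
# Real RSA uses moduli thousands of bits long; its security rests on
# the difficulty of factoring n = p * q.

def egcd(a, b):
    # Extended Euclid: returns (g, x, y) with a*x + b*y = g = gcd(a, b).
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def make_keys(p, q, e=17):
    n = p * q
    phi = (p - 1) * (q - 1)
    g, d, _ = egcd(e, phi)
    assert g == 1, "e must be coprime to phi(n)"
    return (n, e), (n, d % phi)   # public key, private key

public, private = make_keys(61, 53)   # n = 3233, the classic textbook example
n, e = public
_, d = private

message = 42
cipher = pow(message, e, n)   # encrypt: m^e mod n
plain = pow(cipher, d, n)     # decrypt: c^d mod n
print(plain)                  # -> 42

# An attacker who factors n recovers p and q, hence phi(n) and d.
# Classically that means brute-force search; a large, fault-tolerant
# quantum computer could do it in polynomial time via Shor's algorithm.
```

Brute-forcing the 2048-bit moduli used in practice is infeasible for classical machines, which is the whole basis of the "safety structure" described above.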

It should be stressed that quantum computers haven't yet hit that level of maturity, and won't for some time, but when a large, stable device is built (or if it's built, as a diminishing minority argue), its unprecedented ability to factor large numbers would essentially leave the RSA cryptosystem in tatters. Thankfully, the technology is still a ways away, and the experts are on it.

"Don't panic." That's what Mike Brown, CTO and co-founder of quantum-focused cryptography company ISARA Corporation, advises anxious prospective clients. The threat is far from imminent. "What we hear from the academic community and from companies like IBM and Microsoft is that a 2026-to-2030 timeframe is what we typically use from a planning perspective in terms of getting systems ready," he said.

Cryptographers from ISARA are among several contingents currently taking part in the Post-Quantum Cryptography Standardization project, a contest of quantum-resistant encryption schemes. The aim is to standardize algorithms that can resist attacks levied by large-scale quantum computers. The competition was launched in 2016 by the National Institute of Standards and Technology (NIST), a federal agency that helps establish tech and science guidelines, and is now gearing up for its third round.
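Hash-based signatures are one of the quantum-resistant families represented in the NIST project. As an illustration of the underlying idea (not any contest entrant's actual scheme), here is a minimal Lamport one-time signature in Python. Its security rests only on the preimage resistance of SHA-256, which a quantum computer would weaken merely quadratically via Grover's algorithm rather than break outright.

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # Private key: 256 pairs of random 32-byte secrets (one pair per digest bit).
    # Public key: the hashes of those secrets.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def digest_bits(msg: bytes):
    d = H(msg)
    return [(d[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(msg: bytes, sk):
    # Reveal one secret per bit of the message digest. One-time use only:
    # signing two messages leaks enough secrets to allow forgery.
    return [sk[i][bit] for i, bit in enumerate(digest_bits(msg))]

def verify(msg: bytes, sig, pk) -> bool:
    return all(H(s) == pk[i][bit]
               for i, (s, bit) in enumerate(zip(sig, digest_bits(msg))))

sk, pk = keygen()
sig = sign(b"hello", sk)
print(verify(b"hello", sig, pk))   # -> True
print(verify(b"hacked", sig, pk))  # -> False
```

The scheme trades large keys and one-time use for reliance on nothing but a hash function, which is why hash-based designs are considered conservative post-quantum candidates.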

Indeed, the level of complexity and stability required of a quantum computer to launch the much-discussed RSA attack is very extreme, according to John Donohue, scientific outreach manager at the University of Waterloo's Institute for Quantum Computing. Even granting that timelines in quantum computing, particularly in terms of scalability, are points of contention, "the community is pretty comfortable saying that's not something that's going to happen in the next five to 10 years," he said.

When Google announced that it had achieved quantum supremacy, that is, that it had used a quantum computer to run, in minutes, an operation that would take thousands of years to complete on a classical supercomputer, that machine operated on 54 qubits, the computational bedrocks of quantum computing. While IBM's 53-qubit Q system operates at a similar level, many current prototypes operate on as few as 20 or even five qubits.

But how many qubits would be needed to crack RSA? "Probably on the scale of millions of error-tolerant qubits," Donohue told Built In.

Scott Aaronson, a computer scientist at the University of Texas at Austin, underscored the same point last year on his popular blog, after presidential candidate Andrew Yang tweeted that "no code is uncrackable" in the wake of Google's proof-of-concept milestone.

That's the good news. The bad news is that, while cryptography experts gain more time to keep our data secure from quantum computers, the technology's numerous potential upsides, ranging from drug discovery to materials science to financial modeling, are also largely forestalled. And that question of error tolerance continues to stand as quantum computing's central, Herculean challenge. But before we wrestle with that, let's get a better elemental sense of the technology.

Quantum computers process information in a fundamentally different way than classical computers. Traditional computers operate on binary bits: information processed in the form of ones or zeroes. But quantum computers transmit information via quantum bits, or qubits, which can exist as one, as zero, or as both simultaneously. That's a simplification, and we'll explore some nuances below, but that capacity, known as superposition, lies at the heart of quantum's potential for exponentially greater computational power.

Such fundamental complexity both cries out for and resists succinct laymanization. When the New York Times asked 10 experts to explain quantum computing in the length of a tweet, some responses raised more questions than they answered:

Microsoft researcher David Reilly:

A quantum machine is a kind of analog calculator that computes by encoding information in the ephemeral waves that comprise light and matter at the nanoscale.

D-Wave Systems executive vice president Alan Baratz:

If were honest, everything we currently know about quantum mechanics cant fully describe how a quantum computer works.

Quantum computing also cries out for a digestible metaphor. Quantum physicist Shohini Ghose, of Wilfrid Laurier University, has likened the difference between quantum and classical computing to light bulbs and candles: "The light bulb isn't just a better candle; it's something completely different."

Rebecca Krauthamer, CEO of quantum computing consultancy Quantum Thought, compares quantum computing to a crossroads that allows a traveler to take both paths. "If you're trying to solve a maze, you'd come to your first gate, and you can go either right or left," she said. "We have to choose one, but a quantum computer doesn't have to choose one. It can go right and left at the same time."

"It can, in a sense, look at these different options simultaneously and then instantly find the most optimal path," she said. "That's really powerful."

The most commonly used example of quantum superposition is Schrödinger's cat.

Despite its ubiquity, many in the QC field aren't so taken with Schrödinger's cat. "The more interesting fact about superposition, rather than the two-things-at-once point of focus, is the ability to look at quantum states in multiple ways, and ask it different questions," said Donohue. That is, rather than having to perform tasks sequentially, like a traditional computer, quantum computers can run vast numbers of parallel computations.

Part of Donohue's professional charge is clarifying quantum's nuances, so it's worth quoting him here at length:

"In superposition I can have state A and state B. I can ask my quantum state, are you A or B? And it will tell me, I'm A or I'm B. But I might have a superposition of A + B, in which case, when I ask it, are you A or B? It'll tell me A or B randomly.

"But the key of superposition is that I can also ask the question, are you in the superposition state of A + B? And then in that case, it'll tell me, yes, I am the superposition state A + B.

"But there's always going to be an opposite superposition. So if it's A + B, the opposite superposition is A - B."

That's about as simplified as we can get before trotting out equations. But the top-line takeaway is that superposition is what lets a quantum computer try all paths at once.
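Donohue's A-or-B framing can be sketched numerically. Below is a small Python toy model of my own (real amplitudes only, no complex numbers) representing a state as a vector of amplitudes: measuring the superposition A + B in the A/B basis gives a 50/50 coin flip, while asking the "A + B or A - B?" question gives a certain answer.

```python
from math import sqrt

# A qubit as two real amplitudes over basis states A and B.
# (General qubits use complex amplitudes; reals suffice for this sketch.)
A = (1.0, 0.0)
B = (0.0, 1.0)

def superpose(q1, q2):
    # Equal superposition (q1 + q2) / sqrt(2), keeping the state normalized.
    return tuple((x + y) / sqrt(2) for x, y in zip(q1, q2))

def prob(state, outcome):
    # Born rule: probability of an outcome = (inner product)^2.
    return sum(x * y for x, y in zip(state, outcome)) ** 2

plus = superpose(A, B)                        # the state A + B
minus = superpose(A, tuple(-x for x in B))    # the state A - B

# Ask "are you A or B?": the answer is random, 50/50.
print(round(prob(plus, A), 2), round(prob(plus, B), 2))         # -> 0.5 0.5

# Ask "are you A + B or A - B?": the answer is certain.
print(round(prob(plus, plus), 2), round(prob(plus, minus), 2))  # -> 1.0 0.0
```

The same state yields random answers to one question and a definite answer to another, which is the nuance the cat metaphor misses.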

That's not to say that such unprecedented computational heft will displace or render moot classical computers. "One thing that we can really agree on in the community is that it won't solve every type of problem that we run into," said Krauthamer.

But quantum computing is particularly well suited for certain kinds of challenges. Those include probability problems, optimization (what is, say, the best possible travel route?) and the incredible challenge of molecular simulation for use cases like drug development and materials discovery.

The cocktail of hype and complexity has a way of fuzzing outsiders' conception of quantum computing, which makes this point worth underlining: quantum computers exist, and they are being used right now.

They are not, however, presently solving climate change, turbocharging financial forecasting probabilities or performing other similarly lofty tasks that get bandied about in reference to quantum computing's potential. QC may have commercial applications related to those challenges, which we'll explore further below, but that's well down the road.

Today, we're still in what's known as the NISQ era: Noisy, Intermediate-Scale Quantum. In a nutshell, quantum noise makes such computers incredibly difficult to stabilize. As such, NISQ computers can't be trusted to make decisions of major commercial consequence, which means they're currently used primarily for research and education.

"The technology just isn't quite there yet to provide a computational advantage over what could be done with other methods of computation at the moment," said Donohue. Most [commercial] interest is from a long-term perspective: companies are getting used to the technology so that when it does catch up (and that timeline is a subject of fierce debate) they're ready for it.

Also, it's fun to sit next to the cool kids. "Let's be frank. It's good PR for them, too," said Donohue.

But NISQ computers' R&D practicality is demonstrable, if decidedly small-scale. Donohue cites the molecular modeling of lithium hydride. That's a small enough molecule that it can also be simulated using a supercomputer, but the quantum simulation provides an important opportunity to check our answers after a classical-computer simulation. NISQs have also delivered some results for problems in high-energy particle physics, Donohue noted.

One breakthrough came in 2017, when researchers at IBM modeled beryllium hydride, the largest molecule simulated on a quantum computer to date. Another key step arrived in 2019, when IonQ researchers used quantum computing to go bigger still, by simulating a water molecule.

"These are generally still small problems that can be checked using classical simulation methods. But it's building toward things that will be difficult to check without actually building a large particle physics experiment, which can get very expensive," Donohue said.

And curious minds can get their hands dirty right now. Users can operate small-scale quantum processors via the cloud through IBM's online Q Experience and its open-source software Qiskit. Late last year, Microsoft and Amazon both announced similar platforms, dubbed Azure Quantum and Braket. "That's one of the cool things about quantum computing today," said Krauthamer. "We can all get on and play with it."


Quantum computing may still be in its fussy, uncooperative stage, but that hasn't stopped commercial interests from diving in.

IBM announced at the recent Consumer Electronics Show that its so-called Q Network had expanded to more than 100 companies and organizations. Partners now range from Delta Air Lines to the health insurer Anthem to Daimler AG, which owns Mercedes-Benz.

Some of those partnerships hinge on quantum computings aforementioned promise in terms of molecular simulation. Daimler, for instance, is hoping the technology will one day yield a way to produce better batteries for electric vehicles.

Elsewhere, partnerships between quantum computing startups and leading companies in the pharmaceutical industry, like those established between 1QBit and Biogen, and between ProteinQure and AstraZeneca, point to quantum molecular modeling's drug-discovery promise, distant though it remains. (Today, drug development is done through expensive, relatively low-yield trial and error.)

Researchers would need millions of qubits to compute the chemical properties of a novel substance, noted theoretical physicist Sabine Hossenfelder in the Guardian last year. But the conceptual underpinning, at least, is there. "A quantum computer knows quantum mechanics already, so I can essentially program in how another quantum system would work and use that to echo the other one," explained Donohue.

There's also hope that large-scale quantum computers will help accelerate AI, and vice versa, although experts disagree on this point. "The reason there's controversy is, things have to be redesigned in a quantum world," said Krauthamer, who considers herself an AI-quantum optimist. "We can't just translate algorithms from regular computers to quantum computers, because the rules are completely different, at the most elemental level."

Some believe quantum computers can help combat climate change by improving carbon capture. Jeremy O'Brien, CEO of Palo Alto-based PsiQuantum, wrote last year that quantum simulation of larger molecules, if achieved, could help build a catalyst for scrubbing carbon dioxide directly from the atmosphere.

Long-term applications tend to dominate headlines, but they also lead us back to quantum computing's defining hurdle, and the reason coverage remains littered with terms like "potential" and "promise": error correction.

Qubits, it turns out, are higher maintenance than even the most meltdown-prone rock star. Any number of simple actions or variables can send error-prone qubits falling into decoherence, or the loss of a quantum state (namely that all-important superposition). Things that can cause a quantum computer to crash include measuring qubits and running operations, in other words: using it. Even small vibrations and temperature shifts will cause qubits to decohere, too.

That's why quantum computers are kept isolated, and the ones that run on superconducting circuits, the most prominent method, favored by Google and IBM, have to be kept at near-absolute zero (a cool -460 degrees Fahrenheit).

The challenge is two-fold, according to Jonathan Carter, a scientist at Berkeley Quantum. First, individual physical qubits need to have better fidelity. That would conceivably happen through better engineering, optimal circuit layouts and the right combination of components. Second, "we have to arrange them to form logical qubits."

Estimates range from hundreds to thousands to tens of thousands of physical qubits required to form one fault-tolerant logical qubit. "I think it's safe to say that none of the technology we have at the moment could scale out to those levels," Carter said.
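The physical-to-logical trade-off can be felt with a classical analogy. The Python sketch below is my own illustration: a plain three-bit repetition code, far simpler than the quantum codes real hardware needs (it cannot handle quantum phase errors at all), but it shows how redundancy plus majority voting pushes an error rate of p down to roughly 3p².

```python
import random

def encode(bit):
    # One logical bit stored as three physical copies.
    return [bit, bit, bit]

def add_noise(copies, flip_prob):
    # Each physical copy independently flips with probability flip_prob.
    return [b ^ (random.random() < flip_prob) for b in copies]

def decode(copies):
    # Majority vote: recovers the logical bit if at most one copy flipped.
    return int(sum(copies) >= 2)

random.seed(0)
trials = 100_000
p = 0.05  # per-copy error rate

# Unprotected: a single bit fails with probability p.
raw_errors = sum(random.random() < p for _ in range(trials))

# Protected: the logical bit fails only when 2+ copies flip.
coded_errors = sum(decode(add_noise(encode(0), p)) != 0 for _ in range(trials))

print(raw_errors / trials)    # ~0.05
print(coded_errors / trials)  # ~0.007, close to 3p^2 - 2p^3
```

Quantum error correction follows the same logic of spreading one logical unit across many noisy physical ones, but must do so without directly measuring (and thereby collapsing) the encoded state, which is part of why the overheads run to thousands of physical qubits per logical qubit.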

From there, researchers would also have to build ever-more complex systems to handle the increase in qubit fidelity and numbers. So how long will it take until hardware-makers actually achieve the necessary error correction to make quantum computers commercially viable?

"Some of these other barriers make it hard to say yes to a five- or 10-year timeline," Carter said.

Donohue invokes, and rejects, the same figure. "Even the optimist wouldn't say it's going to happen in the next five to 10 years," he said. At the same time, some small optimization problems, specifically in terms of random number generation, could happen very soon.

"We've already seen some useful things in that regard," he said.

For people like Michael Biercuk, founder of quantum-engineering software company Q-CTRL, the only technical commercial milestone that matters now is quantum advantage, or, as he uses the term, the point when a quantum computer provides some time or cost advantage over a classical computer. Count him among the optimists: he foresees a five-to-eight-year timescale to achieve such a goal.

Another open question: Which method of quantum computing will become standard? While superconducting has borne the most fruit so far, researchers are exploring alternative methods that involve trapped ions, quantum annealing or so-called topological qubits. In Donohue's view, it's not necessarily a question of which technology is better so much as one of finding the best approach for different applications. For instance, superconducting chips naturally dovetail with the magnetic field technology that underpins neuroimaging.

The challenges that quantum computing faces, however, aren't strictly hardware-related. The magic of quantum computing resides in algorithmic advances, not speed, Greg Kuperberg, a mathematician at the University of California at Davis, is quick to underscore.

"If you come up with a new algorithm, for a question that it fits, things can be exponentially faster," he said, using "exponential" literally, not metaphorically. (There are currently 63 algorithms listed and 420 papers cited at the Quantum Algorithm Zoo, an online catalog of quantum algorithms compiled by Microsoft quantum researcher Stephen Jordan.)

Another roadblock, according to Krauthamer, is a general lack of expertise. "There's just not enough people working at the software level or at the algorithmic level in the field," she said. Tech entrepreneur Jack Hidary's team set out to count the number of people working in quantum computing and found only about 800 to 850 people, according to Krauthamer. "That's a bigger problem to focus on, even more than the hardware," she said. "Because the people will bring that innovation."

While the community underscores the importance of outreach, the term "quantum supremacy" has itself come under fire. "In our view, supremacy has overtones of violence, neocolonialism and racism through its association with white supremacy," 13 researchers wrote in Nature late last year. The letter has kick-started an ongoing conversation among researchers and academics.

But the field's attempt to attract and expand also comes at a time of uncertainty in terms of broader information-sharing.

Quantum computing research is sometimes framed in the same adversarial terms as conversations about trade and other emerging tech, that is, U.S. versus China. An oft-cited statistic from patent analytics consultancy Patinformatics states that, in 2018, China filed 492 patents related to quantum technology, compared to just 248 in the United States. That same year, the think tank Center for a New American Security published a paper that warned, "China is positioning itself as a powerhouse in quantum science." At the end of 2018, the U.S. passed and signed into law the National Quantum Initiative Act. Many in the field believe legislators were compelled to act by China's perceived growing advantage.

The initiative has spurred domestic research (the Department of Energy recently announced up to $625 million in funding to establish up to five quantum information research centers), but the geopolitical tensions give some in the quantum computing community pause, namely for fear of collaboration-chilling regulation. "As quantum technology has become prominent in the media, among other places, there has been a desire suddenly among governments to clamp down," said Biercuk, who has warned of poorly crafted and nationalistic export controls in the past.

"What they don't understand, often, is that quantum technology and quantum information in particular really are deep research activities where open transfer of scientific knowledge is essential," he added.

The National Science Foundation, one of the government agencies given additional funding and directives under the act, generally has a positive track record in terms of avoiding draconian security controls, Kuperberg said. Even still, the antagonistic framing tends to obscure the on-the-ground facts. "The truth behind the scenes is that, yes, China would like to be doing good research in quantum computing, but a lot of what they're doing is just scrambling for any kind of output," he said.

Indeed, the majority of the aforementioned Chinese patents are for quantum tech, but not quantum computing tech, which is where the real promise lies.

The Department of Energy has an internal list of sensitive technologies that it could potentially restrict DOE researchers from sharing with counterparts in China, Russia, Iran and North Korea. It has not yet implemented that curtailment, however, DOE Office of Science director Chris Fall told the House Committee on Science, Space and Technology, and clarified to Science, in January.

Along with such multi-agency government spending, there's been a tsunami of venture capital directed toward commercial quantum-computing interests in recent years. A Nature analysis found that, in 2017 and 2018, private funding in the industry hit at least $450 million.

Still, funding concerns linger in some corners. Even as Google's quantum supremacy proof of concept has helped heighten excitement among enterprise investors, Biercuk has also flagged the beginnings of a contraction in investment in the sector.

Even as exceptional cases dominate headlines (he points to PsiQuantum's recent $230 million venture windfall), there are lesser-reported signs of struggle. "I know of probably four or five smaller shops that started and closed within about 24 months; others were absorbed by larger organizations because they struggled to raise," he said.

At the same time, signs of at least moderate investor agitation and internal turmoil have emerged. The Wall Street Journal reported in January that much-buzzed quantum computing startup Rigetti Computing saw its CTO and COO, among other staff, depart amid concerns that the company's tech wouldn't be commercially viable in a reasonable time frame.

Investor expectations had become inflated in some instances, according to experts. "Some very good teams have faced more investor skepticism than I think has been justified... This is not six months to mobile application development," Biercuk said.

In Kuperberg's view, part of the problem is that venture capital and quantum computing operate on completely different timelines. "Putting venture capital into this in the hope that some profitable thing would arise quickly, that doesn't seem very natural to me in the first place," he said, adding the caveat that he considers the majority of QC money "prestige investment" rather than strictly ROI-focused.

But some startups themselves may have had some hand in driving financiers' over-optimism. "I won't name names, but there definitely were some people giving investors outsize expectations, especially when people started coming up with some pieces of hardware, saying that advantages were right around the corner," said Donohue. "That very much rubbed the academic community the wrong way."

Scott Aaronson recently called out two prominent startups for what he described as a sort of calculated equivocation. He wrote of a pattern in which a party will speak of a quantum algorithms promise, without asking whether there are any indications that your approach will ever be able to exploit interference of amplitudes to outperform the best classical algorithm.

And, mea culpa, some blame for the hype surely lies with tech media. Trying to crack an area for a lay audience means you inevitably sacrifice some scientific precision, said Biercuk. (Thanks for understanding.)

Its all led to a willingness to serve up a glass of cold water now and again. As Juani Bermejo-Vega, a physicist and researcher at University of Granada in Spain, recently told Wired, the machine on which Google ran its milestone proof of concept is mostly still a useless quantum computer for practical purposes.

Bermejo-Vegas quote came in a story about the emergence of a Twitter account called Quantum Bullshit Detector, which decrees, @artdecider-like, a bullshit or not bullshit quote tweet of various quantum claims. The fact that leading quantum researchers are among the accounts 9,000-plus base of followers would seem to indicate that some weariness exists among the ranks.

But even with the various challenges, cautious optimism seems to characterize much of the industry. For good and ill, Im vocal about maintaining scientific and technical integrity while also being a true optimist about the field and sharing the excitement that I have and to excite others about whats coming, Biercuk said.

This year could prove to be formative in the quest to use quantum computers to solve real-world problems, said Krauthamer. Whenever I talk to people about quantum computing, without fail, they come away really excited. Even the biggest skeptics who say, Oh no, theyre not real. Its not going to happen for a long time.



Quantum computing has arrived, but we still don’t really know what to do with it – ZDNet

As of 2019, the UK is halfway through a ten-year national programme designed to boost quantum technologies, which has so far benefited from a combined £1 billion investment from government and industry. The verdict? Quantum has a lot of potential, but we're not sure what for.

Speaking at a conference in London, Claire Cramer, from the US Department of Energy, said: "There is a lot of promise in quantum, but we don't have a transformative solution yet. In reality, we don't know what impact the technology will have."

That is not to say, of course, that the past five years have been a failure. Quite the opposite: researchers around the world can now effectively trial and test quantum technology, because the hardware has been developed. In other words, quantum computers are no longer a feat of the imagination. The devices exist, and that in itself is a milestone.


Earlier this month at the Consumer Electronics Show, in fact, IBM went to great lengths to remind the public that the IBM Q System One (a 20-qubit quantum computer that the company says is capable of performing reliable quantum computations) is gaining more momentum among researchers.

The Q System One has been deployed to 15 companies and laboratories so far, as a prototype that research teams can run to work out how quantum computers may be used to solve problems in the future.

Finding out what those problems might be is quantum's next challenge. Liam Blackwell, deputy director at the Engineering and Physical Sciences Research Council, said: "A lot of money has been invested, and we need to start seeing actual outcomes that will benefit the UK. The challenge now, really, is that we have to deliver."

Research teams are not leaping into the unknown: there are already a few potential applications of quantum technology that have been put forward, ranging from enhancing security with quantum cryptography to improving the accuracy of GPS.

Pharmaceuticals and drug discovery have been identified as fields that could hugely benefit from the new technology as well. Last year, for example, neuroscience firm Biogen partnered with quantum computing research firm 1QBit to better tackle diseases like Alzheimer's and multiple sclerosis.

For Cramer, though, this is only scratching the surface. "Look at laser technology, for example," she said. "Seventy years ago, people didn't think lasers could even exist, and now you wouldn't think twice about holding a laser pointer in your hand.

"It's the same thing with quantum. We can't imagine what the transformative applications will be yet; so we need to maintain a culture of discovery."

There is only one secret to achieve a successful "culture of discovery", she continued: research, research, and more research. In the US, for example, the Department of Commerce recently created the Quantum Economic Development Consortium (QEDC). Its objectives? To "identify technology solutions" and "highlight use cases and grand challenges to accelerate development efforts".

It is not enough, however, to pump money into labs. Once blue-sky researchers have come up with an unexplored application of quantum, they still have to be able to commercialise their idea, and bridging between labs and industry might be easier said than done.

In the UK, the issue is not confined to quantum technology. A recent report by VC company Octopus Ventures showed that trillions of pounds are lost every year because of the difficulty of bringing new ideas from university labs to the stock exchange.

In contrast, in the US, over 26,000 companies have been started by research teams from the Massachusetts Institute of Technology (MIT). Combined, these businesses have an annual turnover of over $2 trillion (£1.5 trillion).

"The UK has a very strong lead on research in quantum, but we have lessons to learn from the US," said Elham Kashefi, professor of computer science at the University of Edinburgh. "We need to push research to the next level, to connect it to industry."


UKRI, an organisation that directs innovation funding through the budget of the Department for Business, Energy and Industrial Strategy, recently stressed that commercialising quantum technology would be a priority.

The UK organisation invested £20 million in "pioneer funding" for start-ups leveraging quantum technology to develop "products of the future". Four projects benefited from the award to develop prototypes ranging from quantum sensors that can detect objects underground, to encryption tools that keep data safe.

UKRI is now investing another £153 million in new projects, alongside a £205 million investment from industry. Presenting the organisation's plans for the future, UKRI's director for quantum technologies, Roger McKinlay, said: "I don't know what's coming next, but I hope that we can continue to support what I believe is by far the most interesting emerging technology at the moment."

It doesn't seem, therefore, that quantum uncertainty will be resolved anytime soon, but it is certainly worth watching this space.


IBM Just Called Out Google Over Their "Quantum Computer" – The National Interest Online

On Oct. 23, 2019, Google published a paper in the journal Nature entitled "Quantum supremacy using a programmable superconducting processor." The tech giant announced its achievement of a much-vaunted goal: quantum supremacy.

This perhaps ill-chosen term (coined by physicist John Preskill) is meant to convey the huge speedup that processors based on quantum-mechanical systems are predicted to exhibit, relative to even the fastest classical computers.

Google's benchmark was achieved on a new type of quantum processor, code-named Sycamore, consisting of 54 independently addressable superconducting junction devices (of which only 53 were working for the demonstration).

Each of these devices allows the storage of one bit of quantum information. In contrast to the bits in a classical computer, which can only store one of two states (0 or 1 in the digital language of binary code), a quantum bit ("qbit") can store information in a coherent superposition state, which can be considered to contain fractional amounts of both 0 and 1.
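The amplitude picture above can be made concrete with a few lines of NumPy. This is a minimal illustrative sketch (nothing to do with Sycamore's actual control software): one qbit is represented as a two-component complex vector, and the squared magnitudes of its amplitudes give the probabilities of reading 0 or 1.

```python
import numpy as np

# A qbit is a 2-component complex vector of amplitudes (a0, a1) with
# |a0|^2 + |a1|^2 = 1; |a0|^2 is the probability of measuring 0.
zero = np.array([1, 0], dtype=complex)  # classical-like state |0>

# Hadamard gate: rotates |0> into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ zero
probs = np.abs(superposed) ** 2  # Born rule: amplitudes -> probabilities
print(probs)                     # [0.5 0.5]: "fractional amounts of both 0 and 1"
```

Until it is measured, the state genuinely carries both amplitudes at once, which is what quantum algorithms exploit.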

Sycamore uses technology developed by the superconductivity research group of physicist John Martinis at the University of California, Santa Barbara. The entire Sycamore system must be kept cold at cryogenic temperatures using special helium dilution refrigeration technology. Because of the immense challenge involved in keeping such a large system near the absolute zero of temperature, it is a technological tour de force.

Contentious findings

The Google researchers demonstrated that the performance of their quantum processor in sampling the output of a pseudo-random quantum circuit was vastly better than a classical computer chip like the kind in our laptops could achieve. Just how vastly became a point of contention, and the story was not without intrigue.

An inadvertent leak of the Google group's paper on the NASA Technical Reports Server (NTRS) occurred a month prior to publication, during the blackout period when Nature prohibits discussion by the authors regarding as-yet-unpublished papers. The lapse was momentary, but long enough that The Financial Times, The Verge and other outlets picked up the story.

A well-known quantum computing blog by computer scientist Scott Aaronson contained some oblique references to the leak. The reason for this obliqueness became clear when the paper was finally published online and Aaronson could at last reveal himself to be one of the reviewers.

Challenges to Googles story

The story had a further controversial twist when the Google group's claims were immediately countered by IBM's quantum computing group. IBM shared a preprint posted on the ArXiv (an online repository for academic papers that have yet to go through peer review) and a blog post dated Oct. 21, 2019 (note the date!).

While the Google group had claimed that a classical (super)computer would require 10,000 years to simulate the same 53-qbit random quantum circuit sampling task that their Sycamore processor could do in 200 seconds, the IBM researchers showed a method that could reduce the classical computation time to a mere matter of days.

However, the IBM classical computation would have to be carried out on the world's fastest supercomputer (the IBM-developed Summit OLCF-4 at Oak Ridge National Laboratory in Tennessee), with clever use of secondary storage, to achieve this benchmark.

While of great interest to researchers like myself working on hardware technologies related to quantum information, and important in terms of establishing academic bragging rights, the IBM-versus-Google aspect of the story is probably less relevant to the general public interested in all things quantum.

For the average citizen, the mere fact that a 53-qbit device could beat the world's fastest supercomputer (containing more than 10,000 multi-core processors) is undoubtedly impressive. Now we must try to imagine what may come next.

Quantum futures

The reality of quantum computing today is that very impressive strides have been made on the hardware front. A wide array of credible quantum computing hardware platforms now exists, including ion traps, superconducting device arrays similar to those in Google's Sycamore system, and isolated electrons trapped in NV-centres in diamond.

These and other systems are all now in play, each with benefits and drawbacks. So far researchers and engineers have been making steady technological progress in developing these different hardware platforms for quantum computing.

What has lagged quite a bit behind are custom-designed algorithms (computer programs) designed to run on quantum computers and able to take full advantage of possible quantum speed-ups. While several notable quantum algorithms exist (Shor's algorithm for factorization, for example, which has applications in cryptography, and Grover's algorithm, which might prove useful in database search applications), the total set of quantum algorithms remains rather small.
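The speed-up Grover's algorithm offers can be seen by simulating its state vector classically. The sketch below is a hedged illustration, not a quantum implementation: it tracks the amplitudes of a 64-item search directly, showing that roughly √N oracle queries concentrate nearly all probability on the answer, where a classical unstructured search needs about N lookups.

```python
import numpy as np

N = 64            # unstructured search space of 64 items
marked = 42       # the index the "oracle" recognizes as the answer

state = np.full(N, 1 / np.sqrt(N))  # uniform superposition over all items

# Each Grover iteration: phase-flip the marked amplitude, then reflect
# every amplitude about the mean ("inversion about the mean").
iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))  # ~O(sqrt(N)) steps
for _ in range(iterations):
    state[marked] *= -1                  # oracle step
    state = 2 * state.mean() - state     # diffusion step

# After only 6 iterations the marked item dominates: measuring the
# register would return index 42 with probability well above 0.99,
# versus a 1/64 chance from a single random classical guess.
print(iterations, state[marked] ** 2)
```

Of course, a classical simulation like this scales exponentially in qbit count; the point of real quantum hardware is that the amplitudes evolve physically rather than in a 2^n-sized array.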

Much of the early interest (and funding) in quantum computing was spurred by the possibility of quantum-enabled advances in cryptography and code-breaking. A huge number of online interactions, ranging from confidential communications to financial transactions, require secure and encrypted messages, and modern cryptography relies on the difficulty of factoring large numbers to achieve this encryption.

Quantum computing could be very disruptive in this space, as Shors algorithm could make code-breaking much faster, while quantum-based encryption methods would allow detection of any eavesdroppers.
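To see concretely why factoring difficulty matters, here is a toy RSA example in Python (textbook-sized numbers, purely illustrative): anyone who can factor the public modulus recovers the private key outright, which is exactly the capability Shor's algorithm would provide at scale.

```python
# Toy RSA with a tiny modulus. Real keys use ~2048-bit moduli, which
# classical trial division cannot factor in any feasible time; Shor's
# algorithm on a large quantum computer could, which is why it
# threatens this style of encryption.
p, q = 61, 53
n, e = p * q, 17                 # public key (n, e); n = 3233

phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)              # private exponent: requires knowing p and q

msg = 1234
cipher = pow(msg, e, n)          # anyone can encrypt with the public key
assert pow(cipher, d, n) == msg  # only the factor-holder can decrypt

# An attacker who factors n rebuilds the private key from scratch:
p_found = next(f for f in range(2, n) if n % f == 0)
q_found = n // p_found
d_cracked = pow(e, -1, (p_found - 1) * (q_found - 1))
print(pow(cipher, d_cracked, n))  # 1234: the "secret" message, recovered
```

The brute-force `next(...)` line is the step that blows up classically as n grows, and the step a quantum factoring algorithm would make fast.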

The interest various agencies have in unbreakable codes for secure military and financial communications has been a major driver of research in quantum computing. It is worth noting that all these code-making and code-breaking applications of quantum computing ignore to some extent the fact that no system is perfectly secure; there will always be a backdoor, because there will always be a non-quantum human element that can be compromised.

Quantum applications

More appealing for the non-espionage and non-hacker communities (in other words, the rest of us) are the possible applications of quantum computation to solve very difficult problems that are effectively unsolvable using classical computers.

Ironically, many of these problems emerge when we try to use classical computers to solve quantum-mechanical problems, such as quantum chemistry problems that could be relevant for drug design, and various challenges in condensed matter physics, including a number related to high-temperature superconductivity.

So where are we in the wonderful and wild world of quantum computation?

In recent years, we have had many convincing demonstrations that qbits can be created, stored, manipulated and read using a number of futuristic-sounding quantum hardware platforms. But the algorithms lag. So while the prospect of quantum computing is fascinating, it will likely be a long time before we have quantum equivalents of the silicon chips that power our versatile modern computing devices.

Michael Bradley, Professor of Physics & Engineering Physics, University of Saskatchewan. This article is republished from The Conversation under a Creative Commons license. Read the original article.



Saving salmon and coronavirus outbreak: News from the College | Imperial News – Imperial College London

Here's a batch of fresh news and announcements from across Imperial.

From a new project to preserve safe havens for salmon, to Imperial researchers analysing the extent of the coronavirus outbreak, here is some quick-read news from across the College.

The population of wild North Atlantic salmon is now at its lowest level ever recorded, inspiring the new Six Rivers Project, led by the Marine and Freshwater Research Institute (MFRI) Iceland and Imperial College London, and funded by Sir Jim Ratcliffe and Ineos.

The project, which had its inaugural conference this month, is focused on preserving both the land and river ecosystems across six rivers in northeast Iceland, supporting one of the last safe havens where salmon populations still thrive.

Imperial's Professor Guy Woodward said: "The North Atlantic salmon is a keystone species in the ecosystem. Iceland's rivers have simple ecosystems providing ideal research conditions. Their latitude also brings with it a potential sensitivity to the effects of climate change, more so than in other parts of the world."

Read more about the inaugural conference of the Six Rivers Project.

Provost Ian Walmsley discussed the future of quantum computing at the Digital-Life-Design (DLD) conference in Munich.

He made the case for global collaboration in the race to develop a viable quantum computer, and spoke to German media, including BR24.

Other participants in DLD, one of the world's most important technology events, included Nick Clegg, Ursula von der Leyen and Garry Kasparov.

The first Photonics Online Meetup (a free, online-only global conference for photonics researchers) went ahead with great success this month.

The five-hour-long conference on 13 January 2020 brought together 1,100 researchers in 37 countries across six continents in real time. More than 635 of these researchers gathered at 66 local hubs in 27 countries to join in together.

There was also a Twitter-based poster session, with 59 virtual posters averaging 3,000 views each. Videos of the event are now available online, with around 150 people downloading the videos in the first 24 hours.

Read more at the Photonics Online Meetup.

The Early Years Centre (EYC) has reopened after an extensive refurbishment. Staff, parents and children attended the opening event on Thursday 16 January.

Tracy Halsey, Early Years Centre Manager, thanked staff for their efforts, and Professor Emma McCoy, Early Years Committee chair, declared the new centre open with a ribbon-cutting ceremony.

This £8m investment has expanded the EYC's capacity, creating an additional 56 places and refurbishing the existing indoor and outdoor space. The extra places are being introduced in response to the growing demand for affordable childcare onsite. The EYC can offer places to over 200 children and will reduce the average waiting time for a place. The EYC will celebrate its fiftieth anniversary this year.

Read more about the project on our news site.

Pembridge Hall has become one of the top halls in the UK to complete the Student Switch Off campaign's climate change quiz. Over 1,000 Imperial students took the quiz, with over 500 pledging to save energy, save water and recycle. Pembridge Hall will receive 50 tubs of Ben & Jerry's ice cream as their reward.

The Student Switch Off campaign, aimed at encouraging sustainability, also includes a microgrant scheme which gives students funding to organise their own pro-environmental activities. Imperial undergraduate Lauren Wheeler has become the first student in the UK to receive a microgrant to run an event: she will raise funds to help those affected by the recent wildfires in Australia.

The hall that gets the most student engagement over the year will receive £250 for their hall committee. The campaign will continue this term.

Imperial researchers are helping with the global response to the spread of coronavirus. They are also leading voices on the matter in the media worldwide, appearing in over a thousand media articles and broadcast news packages about the outbreak.

An ongoing series of reports from the MRC Centre for Global Infectious Disease Analysis and J-IDEA at Imperial is looking at the number of cases and understanding the transmissibility of the disease. Other researchers at the College are working on areas including vaccine development and helping the UK to respond.

Commentary from eleven Imperial experts has featured in global outlets including the BBC World Service, CNN, and the New York Times.

For further updates, visit the Centre's website.

Want to be kept up to date on news at Imperial?

Sign up for our free quick-read daily e-newsletter, Imperial Today.


Will 2020 Be The Year Cryptocurrency And Blockchain Becomes Operational? – Forbes


There is no doubt that 2019 was the year of enterprise blockchain adoption. The buzz around blockchain and cryptocurrency was humming as tech giants like Microsoft, IBM, AWS, Oracle and many, many more started testing the waters. Even in the cryptocurrency space, banking giants and payment companies like JPMorgan, Wells Fargo, Square, Circle, and Skrill all saw growth in deciding to offer cryptocurrency services.

However, in the three years since blockchain and cryptocurrency managed to emerge into the mainstream light, there has yet to be a problem entirely solved by this emerging technology. As the saying goes: "Blockchain is a solution looking for a problem."

The problem is that, as this hunt to become operational and usable enters its fourth year in earnest, the lustre of the technology, and of its financial offshoot, cryptocurrency, starts to wear off.

As an article from Adrienne Jeffries at the Verge titled "Blockchain is meaningless" explained: "The idea of a blockchain, the cryptographically enhanced digital ledger that underpins Bitcoin and most cryptocurrencies, is now being used to describe everything from a system for inter-bank transactions to a new supply chain database for Walmart. The term has become so widespread that it's quickly losing meaning."

Part of the issue is that blockchain is still a very young technology, despite being over 10 years old. It managed to bubble under and meet the needs of a fraction of the global population before being thrust forward and expected to handle the world's problems. Operational problems still persist with blockchain: scalability, speed and cost, interoperability, and the decentralized/centralized battle between the private and public chains.

Still, 2020 could be - or needs to be - a turning point for the industry and there are signs that companies are looking to make the technology work for them.

Square, the fintech payment company headed up by the affable Jack Dorsey, has long had an interest in cryptocurrency. The company included Bitcoin in its platform and has seen niche usage, pinning it as something for the future. However, the irony of championing Bitcoin from a payment service is the same as a store proclaiming it only accepts solid gold for its goods.

Bitcoin has settled into the role of a store of value for a few reasons. Firstly, it has, for the most part, been on an upward trend in value and thus is not something people want to part with. Secondly, it is simply not a good payment tool, as it is not instant and it has variable transaction fees.

But, it was announced by Dorsey that his payments company was taking a leap to make Bitcoin more usable for payments by looking to build on the Lightning Network. The Lightning Network is a "Layer 2" payment protocol that operates on top of a blockchain-based cryptocurrency (like Bitcoin). It enables fast transactions among participating nodes and has been touted as a solution to the Bitcoin scalability problem.
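The Layer 2 idea can be sketched in a few lines. The class below is a deliberately simplified, hypothetical model, not Lightning itself (real channels use multisig funding transactions and signed, revocable commitment states with penalty mechanics): two parties update balances instantly off-chain, and only the final state would ever touch the blockchain.

```python
# Toy two-party payment channel illustrating the Layer 2 concept behind
# Lightning. All names and numbers here are illustrative assumptions.
class PaymentChannel:
    def __init__(self, alice_deposit, bob_deposit):
        # The deposits correspond to an on-chain "open channel" transaction.
        self.balances = {"alice": alice_deposit, "bob": bob_deposit}
        self.updates = 0  # off-chain state version number

    def pay(self, sender, receiver, amount):
        if self.balances[sender] < amount:
            raise ValueError("insufficient channel balance")
        self.balances[sender] -= amount   # instant: no block confirmation,
        self.balances[receiver] += amount # no per-payment on-chain fee
        self.updates += 1

    def settle(self):
        # Only this final state would be broadcast to the blockchain.
        return dict(self.balances)

channel = PaymentChannel(alice_deposit=50_000, bob_deposit=10_000)  # satoshis
for _ in range(100):                  # 100 coffees, zero on-chain transactions
    channel.pay("alice", "bob", 450)
print(channel.settle())               # {'alice': 5000, 'bob': 55000}
```

The point for payments is in the loop: one hundred transfers cost nothing on-chain and confirm instantly, while the base chain records only the opening and closing states.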

This would indeed help cryptocurrency become more operational, and for a company like Square, it could open up some big doors for its users, who can take advantage of Bitcoin's decentralized global network.

The issue is that the Lightning Network is probably more rough and embryonic than Bitcoin itself; it is still being refined and developed and is far from a polished product. So, in terms of making a solution for the use of cryptocurrency in an operational sense, there need to be other quick-fix solutions.

Using Bitcoin at a point of sale is not that uncommon; in fact, it can be spent at places like Starbucks, Whole Foods, Nordstrom and other major retailers thanks to another company providing a cryptocurrency point of sale: Flexa.

But again, this solution relies on a lot more than just making Bitcoin spendable: there needs to be buy-in from merchants at critical mass, and this is a problem, as outlined by Forbes. It is not only about Bitcoin being a spendable asset; it is also about normalizing and legitimizing it to the point where companies are not ashamed of it. The problem is that Starbucks, along with every single one of a huge group of giant enterprises now accepting cryptocurrency as payment, seems to be having trouble admitting what they're doing. As a photograph of the receipt for the transaction was taken, one member of the Winklevoss entourage recommended that Cameron [Winklevoss] cover up the Starbucks logo with his thumb. "They're not participating in the first announcement," she reminded Cameron.

A little further south, in Venezuela, there is another big name that is happy to accept Bitcoin, but again, it is not as simple as relying solely on the technology. Burger King restaurants in Venezuela are now accepting cryptocurrencies thanks to crypto company Cryptobuyer.

Burger King will trial the system at its premises in the Sambil Caracas shopping mall, with plans to roll out the system to all of the country's forty restaurants in 2020. While this sounds promising, it is still a case of a "digital island," as coined by the World Trade Organisation.

Being a digital island is a big factor to consider in the blockchain space, and leads onto another major problem with the technology that is spreading to lots of little islands across the globe - interoperability.

It could be argued that one of the big problems holding blockchain back from being globally operational is interoperability. As solutions are built, piloted, and worked on, they remain isolated and siloed. This is especially true with private blockchains such as IBM's Hyperledger, which makes up a big portion of major companies' blockchain solutions. In fact, more than 50 percent of the Forbes Blockchain 50 list use Hyperledger, but for the other half, there would be no way to link these solutions together.

Even the World Trade Organisation has highlighted the need for interoperability in the progress of blockchain technology, in its paper titled "Can blockchain revolutionize international trade?"

"The development of interoperability solutions is therefore critical to avoid conflicts between disparate approaches and ensure that blockchain networks talk to each other, thereby allowing the technology to be used to its full potential. The blockchain community is well aware of the stakes at play and is actively researching technical solutions," explains the WTO.

"While the idea of different blockchains interacting with one another still seemed a distant possibility just a year or two ago, concrete solutions are now starting to emerge."

The WTO goes on to talk about solutions that have been around since 2018, such as the Enterprise Ethereum Alliance, an open-source, cross-platform, standards-based framework for Ethereum-based permissioned blockchains that would allow interoperability between permissioned blockchains built on the Ethereum public blockchain.

Moreover, the WTO also mentions that for a truly global blockchain system, interoperability will have to be achieved (which seems unlikely in the coming year), or at least bridges across the blockchains. One of the more recent bridging solutions mentioned relates to a bridge protocol launched by Syscoin that links to Ethereum, one of the biggest usable blockchains around.

By forming an interoperable bridge to Ethereum, Syscoin demonstrates the potential of having just two chains operating together. Enterprises that look to make blockchain more usable by choosing platforms like Syscoin's Z-DAG (Zero-Confirmation Directed Acyclic Graph) network for faster transactions then also benefit from holding onto the power of Ethereum's smart contracts.
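As a rough illustration of how such bridges generally work, here is a toy version of the common "lock-and-mint" pattern. This is a generic, hypothetical sketch, not Syscoin's actual protocol: tokens locked in escrow on one chain back an equal number of wrapped tokens minted on the other, so total circulating supply is conserved in both directions.

```python
# Minimal lock-and-mint bridge model (illustrative only): chain A holds
# the native tokens, chain B circulates 1:1 wrapped representations.
class Bridge:
    def __init__(self):
        self.locked_on_a = 0   # escrowed native tokens on chain A
        self.minted_on_b = 0   # wrapped tokens circulating on chain B

    def cross_to_b(self, amount):
        self.locked_on_a += amount
        self.minted_on_b += amount   # mint a 1:1 wrapped representation

    def cross_back_to_a(self, amount):
        if amount > self.minted_on_b:
            raise ValueError("cannot burn more than was minted")
        self.minted_on_b -= amount   # burn wrapped tokens on B...
        self.locked_on_a -= amount   # ...and release the originals on A

bridge = Bridge()
bridge.cross_to_b(1000)
bridge.cross_back_to_a(400)
print(bridge.locked_on_a, bridge.minted_on_b)  # 600 600: supply conserved
```

The invariant that locked and minted amounts stay equal is the whole design; real bridges enforce it with smart contracts and validators rather than a trusted Python object, and failures of that enforcement are where bridge hacks happen.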

As an example, prevalent in the cryptocurrency space today: USDT, a stablecoin pegged to the US dollar with asset reserves managed by a company called Tether, sees its transactions comprise around 50 percent of the Ethereum network's transactions. If this load could be 'outsourced' to a different chain while keeping its characteristics, the benefits would help enable both the USDT and Ethereum networks to be more usable.

Yet again, the concern is that full interoperability solutions are still a few years away: this issue was being addressed as far back as 2017, when the companies behind three blockchain platforms (Aion, ICON and Wanchain) announced the creation of a new advocacy group, the Blockchain Interoperability Alliance. This alliance was aimed at developing globally accepted standards to promote greater connectivity and interoperability between the disparate blockchain networks.

Do or die

It may be a little early to be hitting the panic button on blockchain not reaching its potential and then falling off the radar, only to be a wasted opportunity, but cracks are showing.

In 2018, it took Cisco only 18 months of blockchain research to realise that there was no immediate future for the company in the space, and it shut down its whole division.

"It will take a while for the many players in the complex markets to get up to speed," Anoop Nanra, head of the company's blockchain initiative, told CNBC.

What needs to be achieved may not be a full level adoption and operational relevance of blockchain this year, but without a big breakthrough stride there could well be questions asked at the end of the year as to where next for the technology.


Fewer Pronouncements of BTC’s Death in 2019, but Here Are the Top 5 – Cointelegraph

Bitcoin has long been dismissed by mainstream media outlets around the world as a speculative asset class that is doomed to fail.

The apathy toward the world's preeminent cryptocurrency has been embodied by countless articles that have either hailed the death of Bitcoin or predicted its impending demise.

For the past three years, Cointelegraph has reviewed the top annual Bitcoin obituaries, courtesy of 99Bitcoins' list of news articles that have unwittingly foretold the end of the cryptocurrency.

In 2017, Bitcoin and the overall cryptocurrency market saw the biggest surge in history, when BTC's value soared to over $20,000 by December of that year. The lofty gains were eventually curtailed by a humbling price correction and stagnation that ensued for much of 2018.

Unsurprisingly, the number of Bitcoin obituaries skyrocketed during those years. In 2017, Bitcoin's death was predicted 124 times, while 2018 saw that number reduced to 93. At the end of 2019, a full decade since Bitcoin's inception, the number of articles proclaiming the end of the cryptocurrency had fallen to just 40.

In this listicle, Cointelegraph looks at the top five Bitcoin burials of the last year.

The annual World Economic Forum in Davos is a major event in the global financial and banking sphere, and cryptocurrencies have been part of its debates and forums over the past few years.

During a TV debate hosted by CNBC in Davos last January, Bitcoin's underlying protocol became a major point of contention in debates over its potential future as a valued cryptocurrency.

BCG Digital Ventures founder Jeff Schumacher highlighted concerns regarding the way in which Bitcoin derives value from its proof-of-work protocol, suggesting that its value could plummet to $0.

"I do believe it will go to zero. I think it's a great technology but I don't believe it's a currency. It's not based on anything."

The debate itself was not centered around an attack on Bitcoin but a robust debate around the future of cryptocurrencies and the power of the protocols that underpin the blockchain technology used to keep them running.

As Bitcoin's price began to rebound from a lowly start in 2019, an article in Gizmodo slammed the cryptocurrency and the people who have invested in the space.

The author of the piece was scathing in his take on Bitcoin as the cryptocurrency bounced back to a four-month high following a difficult market climate in 2018.

"It's fake money that's about as practical to use in the real world as Monopoly bills."

The writer also took aim at Bitcoins proof-of-work consensus algorithm, criticising the energy-intensive demands of recording transactions and maintaining the blockchain.

In May 2019, renowned investor Kevin O'Leary of Shark Tank fame likened Bitcoin to playing a digital game in an interview on CNBC's Squawk Box.

O'Leary's comments came as Bitcoin was sharply appreciating in value midyear. His major criticism was that investors could not take large sums of value in and out of Bitcoin, because buyers demand a guarantee on the value of the BTC they're receiving.

O'Leary made reference to his own efforts to buy real estate in Switzerland using Bitcoin, the difficulty being that the receiver of such a large amount of Bitcoin is not ready to trade other assets due to the risk associated with BTC's price volatility.

To me, it's garbage, because you can't get in and out of it in large amounts.

Globally respected investor Warren Buffett, chairman of the Berkshire Hathaway investment group, has long been a dissenting voice on cryptocurrencies, and Bitcoin in particular.

Earlier this year, Buffett cast fresh aspersions on the space while speaking to the press ahead of his company's annual meeting.

Buffett went as far as saying that Bitcoin had become a gambling device and that the cryptocurrency hadn't produced anything.

I'll tear off a button here. What I have here is a little token. I'll offer it to you for $1000, and I'll see if I can get the price up to $2000 by the end of the day... But the button has one use and it's a very limited use.

While slamming Bitcoin, Buffett gave credit to the power of blockchain technology, pointing to JP Morgan's recently developed blockchain products.

An article in the New York Post speculated in June that Bitcoin would not survive in the future, despite another price surge occurring that same month.

The cryptocurrency had jumped up to $13,000, marking a massive increase in value from the beginning of the year, when it was hovering around $3,000.

The author of the article suggested that the surge could have come down to Facebook's reveal of its Libra cryptocurrency plans, which may have boosted market sentiment in the cryptocurrency sector.

The argument suggested that Facebook's product is something that provides real value and could be the catalyst that leads to the demise of Bitcoin.

In fact, it might spell the beginning of the end for bitcoin. There are almost no major Wall Street investors of substance with meaningful track records who have invested in bitcoin. To me, it's fool's gold. There are no financial statements, no balance sheets, no revenues or assets.

A decline in the number of sweeping statements made about the future lifespan or looming demise of Bitcoin and cryptocurrencies is an interesting development in the space.

The dawn of 2020 brings down the curtain on the first decade of the crypto era. Bitcoin led the way with its inception in 2009, and the space has subsequently exploded over the past 10 years.

Coinbase CEO Brian Armstrong wrapped up his take on the last decade in a blog post early in the new year, in which he derided the multitude of articles that had inaccurately written off Bitcoin.

There were over 379 articles written, prematurely declaring the end of Bitcoin. Not only did Bitcoin survive, it thrived, becoming the top performing asset of the decade. The naysayers were proved wrong and we learned an important lesson about human nature: most big breakthroughs are contrarian ideas that people dismiss and ridicule at the start.

As Armstrong suggests, the success and adoption of Bitcoin birthed an era of innovation that spawned a plethora of cryptocurrency and blockchain projects that have driven the development of the space.

Toward the end of the decade, interest in the space spilled into mainstream industries. The world's best investment firms started gaining exposure to crypto, while some of the biggest companies began using blockchain technology or developing their own enterprise software and tokens.

A combination of all of these factors may be the reason for the significantly fewer Bitcoin obituaries being published. The focus seems to have shifted to major developments in the space, like Facebooks Libra cryptocurrency plans and moves by regulators around the world.

View original post here:
Fewer Pronouncements of BTC's Death in 2019, but Here Are the Top 5 - Cointelegraph

Two types of cryptocurrency are now dominating the market – Decrypt

Two types of cryptocurrency tokens are outperforming the rest of the market, according to data from LongHash. Over the last year, native exchange tokens and tokens used for cryptocurrency lending in DeFi platforms had the greatest returns on investment (ROI).

The data, which LongHash sourced from blockchain research platform Messari, shows that only two of the 19 different token classes analyzed produced a positive ROI in the last year when measured against the US dollar.
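The ROI figures in the chart are simple price returns measured against the US dollar. As a minimal sketch of how such a figure is computed (the exact methodology Messari uses is not spelled out in the article, and the prices below are illustrative):

```python
def roi(start_price: float, end_price: float) -> float:
    """Return on investment, expressed as a fraction of the starting price."""
    return (end_price - start_price) / start_price

# A token that rises from $1.00 to $1.75 over the year has a 75% ROI:
print(f"{roi(1.00, 1.75):.0%}")  # prints "75%"
```

A negative result simply means the token lost value against the dollar, which is what 17 of the 19 token classes analyzed showed.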

Lending and exchange tokens had ROIs above 70% over the last year. Image: LongHash/Messari

Lending tokens were at the top, with an average ROI of 75%. These tokens are typically used by DeFi lending platforms, which allow users to lend and borrow money with other people around the world without going through a bank or other third party. This suggests a rise in interest in DeFi products and services.

Examples of lending tokens include the stablecoin DAI, which is pegged to the US dollar, and Nexo (NEXO), which provides passive income to token holders.

Exchange tokens came second over the last year. These are cryptocurrencies native to crypto exchanges. They are typically used to pay trading fees or for other services on the exchanges. Some exchanges, like Binance, have built entire ecosystems around their exchange coins (in this case, BNB) and even pay their staff with them.

In the last 90 days, a slightly different picture emerges. Exchange tokens have only just produced positive ROIs, while currencies such as Bitcoin performed well. However, lending tokens remain in the lead.

Earlier this month, Decrypt found that proof-of-stake coins, in particular Tezos (XTZ) and Cosmos (ATOM), managed to rack up impressive gains against Bitcoin towards the end of last year. Crypto exchanges adding support for staking rewards, and the news this generated, likely helped to boost their bottom line.

Read more from the original source:
Two types of cryptocurrency are now dominating the market - Decrypt

Top 5 Performing Cryptocurrency In Jan 2020: Ethereum Classic, Litecoin, Bitcoin SV are Among them – Coingape

The cryptocurrency market has been on a roll over the past month, as altcoins showed signs of starting an alt-season, with every top crypto gaining handsomely over the last seven days. According to Coingecko data, a number of altcoins outperformed Bitcoin in this period, as bulls look forward to a sustained uptrend heading into the Bitcoin halving. As January comes to an end, closing out the first month of 2020, we look back at the best-performing cryptocurrencies.

As liquidity is a key metric to look at when trading cryptocurrencies, we narrowed the list to include only the top 30 coins by liquidity. The five best-performing coins of the past week are Ethereum Classic (ETC), Bitcoin SV (BSV), Dash (DASH), ZCash (ZEC) and Litecoin (LTC), which saw a 20% gain in the past 48 hours.

Across the top 10 coins by market cap, Litecoin (LTC) is the best performer of the past 48 hours, as the seventh-largest coin touched key resistance at $70 after a sharp 20% increase in price in a day. The price of LTC has since dipped slightly to $67.00 as at the time of writing, easing the bulls' pressure as near-term indicators point to a reversal before buying continues.

The sharp spike in LTC's price in the past 24 hours has seen the Bitcoin-lite token record a 62.5% surge in the past month.

The 25th-largest coin by market capitalization, ZCash (ZEC), a privacy-enabled cryptocurrency, enters the list in fourth position after a 132% spike in price over the past month. The price of ZEC stands at $63.54, representing a 6.7% drop in the past 24 hours, as the community voted to distribute 20% of the mining reward to an infrastructure development fund.

The privacy coin is ending its two-year downturn, similar to many altcoins, and a spike to the $80 resistance area looks very much alive.

What was once considered a dead fork of Ethereum is alive and kicking once again after a successful January 2020. The month started off with a successful Agharta hard fork on the blockchain, boosting the original Ethereum chain by 145% over the past month.

ETC currently trades at $10.99, representing an 8% decrease over the past 24 hours. The crypto's hashrate recently hit an all-time high as the coin hit $12.

In second place comes Digital Cash, aka Dash, which has broken the charts in the past fortnight, with its price spiking over 180% in the past month. Over the past month, Dash has witnessed increased adoption across Latin America; in Mexico, ATMs in over 11,000 locations across the country now accept Dash withdrawals. Demand for Dash is set to grow further as the privacy-enabled crypto continues to draw positive reviews in economically strained countries.

As at the time of writing, Dash trades at $115.35, representing a 6.2% drop over the last 24 hours.

The most impressive altcoin of January 2020 has to be Bitcoin SV (BSV), which hit an all-time high of $438 around 17 days ago. The Bitcoin hard fork is currently trading at $271 across the major exchanges, representing a 181% spike in the past month.

The spike in Bitcoin SV's price was heavily linked to Craig Wright's Tulip III paper, which was expected to provide proof that he is Satoshi, the creator of Bitcoin.

With the number of altcoins registering triple-digit gains at a high, an altcoin season looks very possible by the end of 2020.

Article: Top 5 Performing Cryptocurrency In Jan 2020: Ethereum Classic (ETC), Litecoin (LTC), Bitcoin SV (BSV), And More
Author: Lujan Odera
Publisher: Coingape


The rest is here:
Top 5 Performing Cryptocurrency In Jan 2020: Ethereum Classic, Litecoin, Bitcoin SV are Among them - Coingape

How does risk and reward work in cryptocurrency? – Coin Rivet

Regardless of the type of investment, there will always be some risk involved. Otherwise, it would be hard to get a hefty reward, right?

Today I aim to look at strategies, issues and solutions for some risk/reward conundrums. Understanding the relationship between risk and reward is a crucial piece in building your investment philosophy. Investments such as flipping cryptocurrencies, staking or mining each have their own risk profile. Understanding the differences can help you more effectively diversify and protect your investment portfolio.

As I have said many times before, diversifying is key to spreading your risk. Without the right strategy for diversification and an understanding of which risks are associated with each type of investment, you're likely to lose ROI.

Hence, I take my altcoin price and fundamental analysis so seriously.

Usually people say the higher the risk, the higher the potential return. Perhaps a more accurate statement is: the higher the risk, the higher the potential return, and the less likely it is that the higher return will actually be achieved.

To understand this relationship completely, you must know what your risk tolerance is and be able to gauge the relative risk of a particular investment correctly. When you choose to put your money into altcoins that are riskier than Bitcoin or Ethereum, you run the possibility of experiencing any or all of the following to some degree:

To fully comprehend how to measure risk, let's discuss the risk/reward ratio and why it is useful.

According to Investopedia, the risk/reward ratio marks the prospective reward an investor can earn for every dollar he or she risks on an investment.

Many investors use risk/reward ratios to compare the expected returns of an investment with the amount of risk they must undertake to earn those returns. Consider the following example: an investment with a risk/reward ratio of 1:7 suggests that an investor is willing to risk $1 for the prospect of earning $7. Alternatively, a risk/reward ratio of 1:3 signals that an investor should expect to invest $1 for the prospect of earning $3 on the investment.

Traders often use this approach to plan which trades to take, and the ratio is calculated by dividing the amount a trader stands to lose if the price of an asset moves in an unexpected direction (the risk) by the amount of profit the trader expects to have made when the position is closed (the reward).
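The calculation described above can be sketched in a few lines of Python. The entry, stop-loss and target prices below are illustrative numbers, not figures from the article:

```python
def risk_reward_ratio(entry: float, stop_loss: float, target: float) -> float:
    """Risk/reward ratio for a long trade: the amount lost if the price
    hits the stop, divided by the amount gained if it hits the target."""
    risk = entry - stop_loss   # loss if the price moves against the trade
    reward = target - entry    # profit if the position closes at the target
    return risk / reward

# Buy at $100 with a stop-loss at $95 and a profit target at $135:
ratio = risk_reward_ratio(entry=100, stop_loss=95, target=135)
print(ratio)  # 5 risked for every 35 of profit, i.e. roughly 1:7
```

A lower ratio means less money is put at risk per dollar of expected profit; many traders reject setups whose ratio exceeds some fixed threshold.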

Hence, the risk/reward ratio is a key indicator of how well you've allocated your funds.

In the case of Bitcoin and some other key altcoins, such as Ethereum or XRP, this ratio is absolutely insane.

Before I conclude this piece, I should underline that there is a difference between dirty risk and clean risk. To explain how these concepts differ, I'll compare two lending mechanics: Salt and Lendroid.

Dirty risk is hidden risk. Essentially, it's risk not fully understood by the borrower. In Salt, you could take out a loan and pay a fixed monthly interest. What isn't properly explained is the hidden risk, in this case what happens if your collateral's price falls to a certain threshold. In short, you would simply get liquidated if the price of ETH fell below a certain level.

That represents dirty risk, because borrowers may not be aware of it, as it is not clearly explained on the website.
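The liquidation mechanic described above can be sketched as follows. The 80% loan-to-value threshold and the dollar figures are illustrative assumptions for the example, not Salt's actual parameters:

```python
def liquidation_price(loan_usd: float, collateral_eth: float,
                      liquidation_ltv: float = 0.8) -> float:
    """ETH price below which the position is liquidated.

    Liquidation fires when loan / (collateral * price) exceeds the
    platform's maximum loan-to-value ratio; solving for price gives
    the threshold below.
    """
    return loan_usd / (collateral_eth * liquidation_ltv)

# Borrow $5,000 against 50 ETH with an assumed 80% liquidation LTV:
print(liquidation_price(5_000, 50))  # an ETH price of $125 triggers liquidation
```

This is exactly the number a borrower needs to know in advance; when it is not disclosed, the risk stays "dirty" in the article's terminology.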

Clean risk, on the other hand, is transparent risk. Essentially, borrowers are made fully aware of any fees or liquidation prices.

To better understand the difference, let me use the case of Lendroid, a lending platform. Lendroid created an alternative mechanism to prevent borrowers from getting liquidated.

In Lendroid, there are two kinds of liquidity pools you can get involved in. One is the Harbour Pool, which is risk-free by design: worst case scenario, you get back the money you put in. It works much like the brand-new smart contract lotteries built on Ethereum: you pool funds with other users, stake those funds, and use the interest to bet. This way, you may never lose (or keep losses to less than 1%).

The other kind of pool in Lendroid is the High Water Pool, where the risk is made very clear: it advertises the currency and collateral combinations it supports, so the user can make an informed decision. This way, the chances of getting liquidated are greatly diminished.

With clear risk and a differentiation between risk pools, it's possible for borrowers to avoid hidden risks.

Safe trades!

Follow this link:
How does risk and reward work in cryptocurrency? - Coin Rivet