Trail of Bits and Prysm Group Launch Mainnet360 – Business Wire

NEW YORK--(BUSINESS WIRE)--Cybersecurity research and consulting firm Trail of Bits recently announced the launch of Mainnet360, the first service to provide a comprehensive assessment of both the security and economic elements of blockchain software. Trail of Bits developed Mainnet360 over several months this year in partnership with Prysm Group, an economic consulting firm specializing in supporting teams that create incentive-compatible blockchain systems.

A new kind of security

One of the most overlooked concepts underlying the blockchain boom is economic security. While the security of Bitcoin depends on traditional notions of cryptography and code correctness, it also depends on humans making economically rational decisions at scale. This is the core idea behind the Proof-of-Work algorithm that backs Bitcoin's consensus.

Building software that relies on economic security allows for engineering feats that were never previously possible, such as permissionless electronic money. However, understanding precisely when these systems are secure is extremely challenging, and little research currently exists. To remedy this problem, DARPA held a workshop entitled "Applications and Barriers to Consensus Protocols" at its headquarters earlier this year.

Thanks to an introduction by DARPA at this workshop, employees at Trail of Bits and Prysm Group realized there were almost no resources for blockchain companies trying to build systems resilient against failures in code correctness or economic design. Existing offerings were limited in scope, while attackers worked much more holistically.

Solving correlated problems with a joint offering: Mainnet360

Blockchain networks require both an economic review and a code review to ensure security. Mainnet360 is the first service to provide both.

It takes a complex interaction of economics and computer science to secure blockchain systems; implementation errors in either area allow value to be stolen, destroyed, or not fully captured. Mainnet360 clients will receive a comprehensive review of both the economic framework that drives their system and the code with which it is implemented. In this way, Mainnet360 confirms both that a system's deployed code is correct and that it incentivizes users to add value to the system.

Delivering all-in-one expertise

Building stable decentralized systems requires a broad set of experts cooperating closely, which Mainnet360 provides in one convenient package. The Mainnet360 team will work closely with developers to identify and remove risks, architect future work, and find the ideal technical solutions with the economic constraints in mind.

Offering new benefits for blockchain platforms

In addition to system design review, Trail of Bits specializes in creating testing and verification tools. Now with the support of Prysm Group, Trail of Bits is extending this tooling further to verify economic properties.

Interested teams can learn more about using Mainnet360 by emailing contact@mainnet360.com or by visiting http://www.mainnet360.com.

About Trail of Bits

Since 2012, Trail of Bits has helped secure some of the world's most targeted organizations and products. They combine high-end security research with a real-world attacker mentality to reduce risk and fortify code.

About Prysm Group

Prysm Group is a blockchain economics and governance design firm led by Harvard-trained PhD-level economists with areas of expertise in consortium governance, consensus governance, token economics, incentive design, and market structure. Taking a first-principles approach, Prysm Group uses the tools of contract theory, game theory, market design, social choice theory, and monetary economics to design customized solutions for distributed ledger technology and blockchain-based projects.

Original post:
Trail of Bits and Prysm Group Launch Mainnet360 - Business Wire

Edward Snowden Says Why He Will Never Return to the United States – Free Speech TV

The Right Livelihood Awards celebrated their 40th anniversary Wednesday at the historic Cirkus Arena in Stockholm, Sweden, where more than a thousand people gathered to honor this year's four laureates:

Swedish climate activist Greta Thunberg; Chinese women's rights lawyer Guo Jianmei; Brazilian indigenous leader Davi Kopenawa and the organization he co-founded, the Yanomami Hutukara Association; and Sahrawi human rights leader Aminatou Haidar, who has challenged the Moroccan occupation of Western Sahara for decades. The Right Livelihood Award is known as the "Alternative Nobel Prize."

Over the past four decades, it's been given to grassroots leaders and activists around the globe, among them the world-famous NSA whistleblower Edward Snowden.

At Wednesday's gala, Amy Goodman interviewed Snowden in front of the award ceremony's live audience via video link from Moscow, where he has lived in exile since leaking a trove of secret documents revealing that the U.S. government had built an unprecedented mass surveillance system to spy on Americans and people around the world.

After sharing the documents with reporters in 2013, Snowden was charged in the U.S. with violating the Espionage Act and other laws.

While attempting to travel from Hong Kong to Latin America, Snowden was stranded in Russia after the U.S. revoked his passport, and he has lived there ever since.

Edward Snowden won the Right Livelihood Award in 2014. He accepted the award from Moscow, Russia.


View post:
Edward Snowden Says Why He Will Never Return to the United States - Free Speech TV

Edward Snowden & Twitter The Verdict (2019-12-08) – Global Real News

Welcome! Today we did a very comprehensive analysis of Edward Snowden's Twitter activity. Let's dive in. These are the main things: as of 2019-12-08, Edward Snowden (@Snowden) has 4,185,310 Twitter followers, is following one account, has tweeted 4,548 times, has liked 445 tweets, has uploaded 373 photos and videos, and has been on Twitter since December 2014.

Going from the top of the page to the bottom, their latest tweet, at the time of writing, has 5 replies, 92 retweets and 251 likes; their second latest tweet has 64 replies, 1,754 retweets and 3,952 likes; their third latest tweet has 18 replies, 386 retweets and 1,315 likes; their fourth latest tweet has 2 replies, 190 retweets and 382 likes; and their fifth latest tweet has 58 replies, 556 retweets and 1,180 likes. But that's enough numbers for now.

MOST POPULAR:

Going through Edward Snowden's last couple of pages of tweets (and retweets), the one we consider the most popular, having led to a very nice 215 direct replies at the time of writing, is this:

That seems to have caused quite a lot of discussion, having also had 1,285 retweets and 3,323 likes.

LEAST POPULAR:

And what about Edward Snowden's least popular tweet in the recent past (again, including retweets)? We believe it's this one:

That only had 2 direct replies, 190 retweets and 382 likes.

THE VERDICT:

We did a ton of research into Edward Snowden's Twitter activity, looking through what people are saying in response to them, their likes/retweet numbers compared to the past, the amount of positive/negative responses and so on. We won't bore you with the details, so our verdict is this: we believe the online sentiment for Edward Snowden on Twitter right now is great.

We'll leave it there for today. Thanks for coming, and drop a comment if you disagree with me. Don't be afraid to speak your mind.

View post:
Edward Snowden & Twitter The Verdict (2019-12-08) - Global Real News

America’s torturers and their co-conspirators must be prosecuted – World Socialist Web Site

America's torturers and their co-conspirators must be prosecuted
9 December 2019

"I was in such an indescribable state of pain… I could hear sounds coming from the brothers, not only one but more than one brother; one was moaning, another one vomiting and another one screaming: my back, my back!"

"He started banging my head against the wall with both his hands. The banging was so strong that I felt at some point my skull was in pieces… Then he dragged me to another very tiny squared box. With the help of the guards he shoved me inside the box…"

Denbeaux, Mark, et al., "How America Tortures" (2019), Appendix I: Abu Zubaydah's Notes


Last month, the Seton Hall University School of Law's Center for Policy and Research published a paper titled "How America Tortures," which contains eight significant drawings by torture victim Abu Zubaydah.

The drawings by themselves are a powerful indictment of the entire political establishment in the United States, which has failed to hold anyone accountable for the crimes that are depicted.

The paper represents the work of a team led by Professor Mark Denbeaux, who is serving as an attorney for a number of Guantanamo Bay detainees, including Abu Zubaydah. The paper brings together material from numerous sources, including Central Intelligence Agency cables and other government documents, and Abu Zubaydah's own account of what occurred, to provide a chronology not only from the CIA's perspective, but also from the perspective of the tortured. The result is damning.

The CIA's torture techniques are cataloged in comprehensive detail in the report. They include cramped confinement in small boxes, in some cases adding insects to the dark box as another way to scare the detainee locked inside. The paper documents the use of female soldiers to sexually abuse and humiliate detainees, with female military personnel going shirtless during interrogations, giving forced lap dances, and rubbing red liquids, which they identified as menstrual blood, on the detainees.

One FBI agent described finding detainees "chained hand and foot in a fetal position on the floor, with no chair, food, or water. Most times they had urinated or defacated [sic] on themselves, and had been left there for [eighteen, twenty-four] hours or more."

Loud rap music was played around the clock. The now-infamous practice of involuntary rectal feeding involved pumping pureed food into the victim's rectum for no medical reason.

How have the perpetrators of these bestial crimes managed to avoid prosecution? It is not for lack of evidence.

Today is the fifth anniversary of the release of the US Senate Select Committee on Intelligence's executive summary of its findings regarding the CIA torture program. This executive summary, which itself runs to hundreds of pages, is merely an outline of the full 6,700-page report, including 38,000 footnotes, which has been suppressed.

The World Socialist Web Site wrote at the time of the release of the summary: "From a legal standpoint, the war crimes and crimes against humanity that are documented in the report warrant the immediate arrest, indictment, and prosecution of every individual involved in the program, from the torturers themselves and their outside contractors all the way up to senior officials in the Bush and Obama administrations who presided over the program and subsequently attempted to cover it up."

The crimes perpetrated by the American military and intelligence agencies in the course of the so-called war on terror were heinous, premeditated, and involved extreme depravity. These crimes were further aggravated by protracted efforts to cover them up, destroy evidence and obstruct investigations.

The crimes cannot be written off as the overzealous conduct of low-level rogue agents. On the contrary, they were organized in cold blood and at the highest levels. The Seton Hall Law School paper states as a matter of fact that top officials in the West Wing of the White House and the Office of Legal Counsel of the Department of Justice "orchestrated and poorly oversaw a horrific torture program that was responsible for the detention and interrogation of countless detainees."

A New York Times editorial dated December 5, titled "Don't Look Away," is an attempt at damage control following the release of the Abu Zubaydah illustrations. While denouncing torture as "barbaric and illegal," the article seeks to blame the torture program on the Republicans, denouncing President Trump and "those who think like him."

The Times concludes: "The United States has by far the greatest security establishment on earth, with the greatest reach. When the United States commits or abets war crimes, it erodes the honor, effectiveness, and value of that force."

The Times does not attempt to explain how it came to pass that nobody was ever prosecuted for conduct that it admits was barbaric and illegal and constituted war crimes.

In reality, the CIA torture program was entirely bipartisan. Jay Rockefeller, the top Democrat on the Senate Intelligence Committee, as well as then-House Democratic Leader Nancy Pelosi were briefed on the program in 2002.

The Obama administration played a key role in legitimizing torture and shielding war criminals from prosecution. Under the slogan of "looking forward, not backward," the Democrats refused to prosecute anyone involved in the program or the cover-up. The only CIA employee ever prosecuted by the Obama administration in connection with torture was analyst John Kiriakou, who was jailed for publicly acknowledging that the CIA was engaged in waterboarding.

Obama refused for years to release the Senate torture report and assisted the CIA's efforts to suppress it. In 2015, the Obama administration successfully fought an American Civil Liberties Union lawsuit seeking the report's release under the Freedom of Information Act.

What did the New York Times have to say about these "barbaric and illegal" practices at the time? On April 6, 2002, a Times headline gloated, "A Master Terrorist Is Nabbed." Describing the abduction of Abu Zubaydah in Pakistan, without charges or legal proceedings of any kind, the Times wrote, "His seizure demonstrates that the painstaking international detective work of the current phase of the war on terror is paying off."

On June 12, 2002, in an article titled "Traces of Terror," the Times continued its role as a CIA stenographer: "After nearly 100 sessions with CIA and FBI interrogators at a heavily guarded, undisclosed location, the captured terrorist Abu Zubaydah has provided information that American officials say is central to the Bush administration's efforts to pre-empt a new wave of attacks against the United States."

This version of events was, as is now universally acknowledged, a pack of lies. Abu Zubaydah was not a high-level operative in Al Qaeda, and he may not have even been a member. He has never been charged with a crime, let alone tried and convicted. Yet to this day, he continues to rot in a cell in the Guantanamo Bay torture camp, with no prospect of being released.

Five years after the publication of the Senate report, where are the torturers and their co-conspirators now? Gina Haspel, who presided over a CIA torture compound in Thailand and was implicated in the destruction of tapes of Abu Zubaydah's torture in 2005, was promoted by Trump to become the new director of the agency.

The previous director, John Brennan, who was a high-level CIA official during the Bush administration and who under Obama ordered agents to break into Senate staffers' computers in an effort to search for incriminating information relating to torture, is now serving as a well-paid "senior national security and intelligence analyst" for NBC News and MSNBC. He makes regular appearances on news programs to agitate in favor of the Democrats' impeachment drive.

James Mitchell, whose company, Mitchell Jessen and Associates, received an $81 million contract from the CIA to develop and implement the "enhanced interrogation techniques" that were used on Abu Zubaydah and others, remains at large. According to a Bloomberg News article in 2014, he is now retired and spends his free time kayaking, rafting and climbing.

And what has been the fate of those who have exposed official criminality? Julian Assange is imprisoned in Belmarsh Prison in London, where his life is endangered by conditions amounting to torture. Chelsea Manning was imprisoned and tortured, released, and then imprisoned again for refusing to testify against Assange before a grand jury. Edward Snowden was forced to flee the country and seek refuge in Russia.

The torturers and their co-conspirators have not been prosecuted, not because of lack of evidence or insufficient legal grounds, but because the entire political establishment is implicated at the highest levels, including the Democrats, the Republicans, the military and intelligence agencies, the establishment media, and all of those who perpetrated the reactionary fraud of the war on terror.

The failure to prosecute the torturers has served to embolden the most fascistic layers in the state apparatus, opening the way for Trump to boast of his support for torture in broad daylight. Trump and his fascistic advisers, frightened by the growth of social opposition, believe that the Gestapo-style methods that have been implemented in the course of the war on terror are necessary to terrify and suppress opposition both abroad and at home. While Trump brags that he is in favor of implementing torture practices at Guantanamo Bay that are "a hell of a lot worse," he tells police officers within the US: "Don't be too nice."

The Democrats and their allies are concerned that public discussion of the crimes of the state would serve to fuel popular hostility towards the institutions the New York Times describes as "the greatest security establishment on earth." It would cut across the Democrats' ongoing efforts to ingratiate and align themselves with the CIA as part of the impeachment drive against Trump. Moreover, the revelations of CIA torture underscore the hypocrisy of their efforts to justify American imperialist aggression and subversion all over the world in the name of human rights.

For these reasons, the demand to bring the torturers to justice must be taken up by the international working class. Every individual who participated in the CIA torture program or the cover-up in any capacity, including those who failed to intervene when they had an opportunity to do so, should face arrest, indictment and prosecution.

The fight to end torture once and for all must be connected to the mounting struggles of the international working class to defend and expand its democratic and social rights and halt the drive of the ruling class toward dictatorship. The entire existing social order is implicated in torture and must be overthrown.

Tom Carter

2019 has been a year of mass social upheaval. We need you to help the WSWS and ICFI make 2020 the year of international socialist revival. We must expand our work and our influence in the international working class. If you agree, donate today. Thank you.

Read more from the original source:
America's torturers and their co-conspirators must be prosecuted - World Socialist Web Site

Quantum Computers Are the Ultimate Paper Tiger – The National Interest Online

Google announced this fall, to much fanfare, that it had demonstrated "quantum supremacy," that is, it performed a specific quantum computation far faster than the best classical computers could achieve. IBM promptly critiqued the claim, saying that its own classical supercomputer could perform the computation at nearly the same speed with far greater fidelity and that, therefore, the Google announcement should be taken with a large dose of skepticism.

This wasn't the first time someone cast doubt on quantum computing. Last year, in an article in IEEE Spectrum, the flagship journal of electrical and computer engineering, Michel Dyakonov, a theoretical physicist at the University of Montpellier in France, offered a slew of technical reasons why practical quantum supercomputers will never be built.

So how can you make sense of what is going on?

As someone who has worked on quantum computing for many years, I believe that due to the inevitability of random errors in the hardware, useful quantum computers are unlikely to ever be built.

What's a quantum computer?

To understand why, you need to understand how quantum computers work, since they're fundamentally different from classical computers.

A classical computer uses 0s and 1s to store data. These numbers could be voltages on different points in a circuit. But a quantum computer works on quantum bits, also known as qubits. You can picture them as waves that are associated with amplitude and phase.

Qubits have special properties: They can exist in superposition, where they are both 0 and 1 at the same time, and they may be entangled, so that they share physical properties even though they may be separated by large distances. It's a behavior that does not exist in the world of classical physics. The superposition vanishes when the experimenter interacts with the quantum state.

Due to superposition, a quantum computer with 100 qubits can represent 2^100 solutions simultaneously. For certain problems, this exponential parallelism can be harnessed to create a tremendous speed advantage. Some code-breaking problems could be solved exponentially faster on a quantum machine, for example.
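To make the scaling concrete, here is a small illustrative Python sketch (using NumPy; not from the article) of why an n-qubit register needs 2^n complex amplitudes to describe classically:

```python
import numpy as np

def uniform_superposition(n_qubits: int) -> np.ndarray:
    """State vector of n qubits in an equal superposition of all basis states."""
    dim = 2 ** n_qubits
    # Every one of the 2**n basis states carries the same amplitude.
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

state = uniform_superposition(10)  # already 1,024 amplitudes
print(len(state))
print(np.isclose(np.vdot(state, state).real, 1.0))  # total probability is 1
# At 100 qubits the vector would need 2**100 (about 1.3e30) amplitudes,
# which is why no classical machine can track such a state directly.
```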

There is another, narrower approach to quantum computing called quantum annealing, where qubits are used to speed up optimization problems. D-Wave Systems, based in Canada, has built optimization systems that use qubits for this purpose, but critics also claim that these systems are no better than classical computers.

Regardless, companies and countries are investing massive amounts of money in quantum computing. China has developed a new quantum research facility worth US$10 billion, while the European Union has developed a €1 billion ($1.1 billion) quantum master plan. The United States National Quantum Initiative Act provides $1.2 billion to promote quantum information science over a five-year period.

Breaking encryption algorithms is a powerful motivating factor for many countries if they could do it successfully, it would give them an enormous intelligence advantage. But these investments are also promoting fundamental research in physics.

Many companies are pushing to build quantum computers, including Intel and Microsoft in addition to Google and IBM. These companies are trying to build hardware that replicates the circuit model of classical computers. However, current experimental systems have less than 100 qubits. To achieve useful computational performance, you probably need machines with hundreds of thousands of qubits.

Noise and error correction

The mathematics that underpin quantum algorithms is well established, but there are daunting engineering challenges that remain.

For computers to function properly, they must correct all small random errors. In a quantum computer, such errors arise from non-ideal circuit elements and from the interaction of the qubits with the environment around them. For these reasons, qubits can lose coherence in a fraction of a second, and the computation must therefore be completed in even less time. If random errors, which are inevitable in any physical system, are not corrected, the computer's results will be worthless.

In classical computers, small noise is corrected by taking advantage of a concept known as thresholding. It works like the rounding of numbers. Thus, in the transmission of integers where it is known that the error is less than 0.5, if what is received is 3.45, the received value can be corrected to 3.

Further errors can be corrected by introducing redundancy. Thus, if 0 and 1 are transmitted as 000 and 111, then at most one bit-error during transmission can be corrected easily: A received 001 would be interpreted as 0, and a received 101 would be interpreted as 1.
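Both classical techniques fit in a few lines. The following Python sketch is purely illustrative and not from the article:

```python
def correct_by_thresholding(received: float) -> int:
    # If the channel error is known to be below 0.5, rounding recovers
    # the transmitted integer exactly: 3.45 -> 3.
    return round(received)

def correct_by_redundancy(bits: str) -> int:
    # 3-bit repetition code: 0 is sent as "000", 1 as "111".
    # A majority vote corrects any single flipped bit.
    return 1 if bits.count("1") >= 2 else 0

assert correct_by_thresholding(3.45) == 3
assert correct_by_redundancy("001") == 0  # one flip of "000"
assert correct_by_redundancy("101") == 1  # one flip of "111"
```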

Quantum error correction codes are a generalization of the classical ones, but there are crucial differences. For one, unknown qubits cannot be copied to incorporate redundancy as an error correction technique; this is forbidden by the no-cloning theorem. Furthermore, errors present within the incoming data before the error-correction coding is introduced cannot be corrected.

Quantum cryptography

While the problem of noise is a serious challenge in the implementation of quantum computers, it isn't so in quantum cryptography, where people are dealing with single qubits, which can remain isolated from the environment for a significant amount of time. Using quantum cryptography, two users can exchange the very large numbers known as keys, which secure data, without anyone being able to break the key exchange system. Such key exchange could help secure communications between satellites and naval ships. But the actual encryption algorithm used after the key is exchanged remains classical, and therefore the encryption is theoretically no stronger than classical methods.
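To underline that last point, here is a sketch, assuming the widely used Python `cryptography` package, of the kind of purely classical cipher (AES-GCM) that would protect the data once a QKD-derived key is in hand:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Stand-in for a key agreed via a quantum key-distribution protocol
# such as BB84; everything from here on is classical cryptography.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)  # AES-GCM uses a 96-bit nonce

aesgcm = AESGCM(key)
ciphertext = aesgcm.encrypt(nonce, b"ship-to-shore telemetry", None)
assert aesgcm.decrypt(nonce, ciphertext, None) == b"ship-to-shore telemetry"
# The strength of this step is exactly that of AES-GCM, no matter how
# securely the key itself was exchanged.
```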

Quantum cryptography is being commercially used in a limited sense for high-value banking transactions. But because the two parties must be authenticated using classical protocols, and since a chain is only as strong as its weakest link, it's not that different from existing systems. Banks are still using a classical-based authentication process, which itself could be used to exchange keys without loss of overall security.

Quantum cryptography technology must shift its focus to quantum transmission of information if it's going to become significantly more secure than existing cryptography techniques.

Commercial-scale quantum computing challenges

While quantum cryptography holds some promise if the problems of quantum transmission can be solved, I doubt the same holds true for generalized quantum computing. Error correction, which is fundamental to a multi-purpose computer, is such a significant challenge in quantum computers that I don't believe they'll ever be built at a commercial scale.


Subhash Kak, Regents Professor of Electrical and Computer Engineering, Oklahoma State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Image: Reuters

Read more:

Quantum Computers Are the Ultimate Paper Tiger - The National Interest Online

Quantum supremacy is here, but smart data will have the biggest impact – Quantaneo, the Quantum Computing Source

Making fast and powerful quantum computing available through the cloud can enable tasks to be processed millions of times faster, and could shape lives and businesses as we know them. For example, applications using quantum computing could reduce or prevent traffic congestion, cybercrimes, and cancer. However, reaching the quantum supremacy landmark doesn't mean that Google can take its foot off the gas. Rather, the company has thrown down the gauntlet, and the race to commercialize quantum computing is on. Delivering this killer technology is still an uphill battle: it requires harnessing the power of highly fickle machines and moving around quantum bits of information, a process that is inherently error-prone.

To deliver quantum cloud services, whether for commercial or academic research, Google must tie together units of quantum information (qubits) and wire data, which is part of every action and transaction across the entire IT infrastructure. If quantum cloud services reach the big league, they will still rely on traffic flows based on wire data to deliver value to users. This raises a conundrum for IT and security professionals who must assure services and deliver a flawless user experience. On one hand, the quantum cloud service solves a million computations in parallel and in real time. On the other hand, the results are delivered through wire data across a cloud, SD-WAN, or 5G network. It does not matter if a quantum computer today or tomorrow can crank out an answer 100 million times faster than a regular computer chip if an application that depends on it experiences performance problems, or if a threat actor is lurking in your on-premises data centre or has penetrated the IT infrastructure's first and last lines of defence.

No matter what the quantum computing world looks like in the future, IT teams such as NetOps and SecOps will still need to use wire data to gain end-to-end visibility into their on-premises data centres and cloud environments. Wire data is used to fill the visibility gap and see what others can't: to gain actionable intelligence to detect cyber-attacks or quickly solve service degradations. Quantum computing may increase speed, but it also adds a new dimension of infrastructure complexity and the potential for something breaking anywhere along the service delivery path. Reducing risk therefore requires removing service delivery blind spots. A proven way to do that is by turning wire data into smart data to cut through infrastructure complexity and gain visibility without borders. When that happens, the IT organization will understand with precise accuracy the issues impacting service performance and security.

In the rush to embrace quantum computing, wire data therefore cannot, and should not, be ignored. Wire data can be turned into contextual, useful smart data. With a smart data platform, the IT organization can help make quantum computing a success by protecting user experience across different industries, including automotive, manufacturing, and healthcare. Therefore, while Google is striving for high-quality qubits and blazing new quantum supremacy trails, success ultimately relies on using smart data for service assurance and security in an age of infinite devices, cloud applications, and exponential scalability.

Ron Lifton, Senior Enterprise Solutions Manager, NETSCOUT

View post:

Quantum supremacy is here, but smart data will have the biggest impact - Quantaneo, the Quantum Computing Source

Breakthrough in creation of gamma ray lasers that use antimatter – Big Think

Scientists are closer to taming the most powerful light in the Universe. A physicist at the University of California has figured out how to make stable positronium atoms, which may lead to the creation of gamma ray lasers.

Gamma rays are the product of electromagnetic radiation that is caused by the radioactive decay of atomic nuclei. Harnessing these extremely bright (and usually very brief) lights, which have the highest photon energy, could lead to next-generation technologies. The highly penetrating gamma rays are shorter in wavelength than x-rays, and can be utilized for spacecraft propulsion, advanced medical imaging and treating cancers.

Creating a gamma ray laser requires manipulating positronium, a hydrogen-like atom that is a mixture of matter and antimatter, in particular of electrons and their antiparticles, known as positrons. The collision of a positron with an electron results in the production of gamma ray photons.

To make gamma-ray laser beams, the positronium atoms need to be in the same quantum state, called a Bose-Einstein condensate. The new study from Professor Allen Mills of the UC Riverside Department of Physics and Astronomy shows that hollow spherical bubbles filled with a positronium atom gas can be kept stable in liquid helium.

"My calculations show that a bubble in liquid helium containing a million atoms of positronium would have a number density six times that of ordinary air and would exist as a matter-antimatter Bose-Einstein condensate," said Mills.

Mills thinks helium would work as the stabilizing container because at extremely low temperatures the gas turns to liquid and actually repels positronium. This repulsion results from helium's negative affinity for positronium and would cause bubbles to form, which would be the source of the necessary Bose-Einstein condensates.

Testing these ideas and actually configuring an antimatter beam to produce such bubbles in liquid helium is the next goal for the Positron laboratory at UC Riverside that Mills directs.

"Near term results of our experiments could be the observation of positronium tunneling through a graphene sheet, which is impervious to all ordinary matter atoms, including helium, as well as the formation of a positronium atom laser beam with possible quantum computing applications," explained the physicist.

Check out the new study in Physical Review A.

Professor Allen Mills of the UC Riverside Department of Physics and Astronomy.

Credit: I. Pittalwala, UC Riverside.


Original post:

Breakthrough in creation of gamma ray lasers that use antimatter - Big Think

InfoQ’s 2019, and Software Predictions for 2020 – InfoQ.com


Looking back, 2019 saw some significant announcements in Quantum computing. In May, IBM published a paper in Nature that suggested they may have found a path to dealing with decoherence in current quantum computers. Writing for InfoQ Sergio De Simone pointed out:

"The main issue with decoherence is the fast decay of a wave function, which has the undesirable effect of generating noise and errors after a very short time period. The paper proposes two approaches, one called probabilistic error correction and the other zero noise extrapolation, to keep decoherence under control."

In September, Google announced, also via a paper in Nature, that it had built a machine that achieved quantum supremacy - the point at which a quantum computer can solve problems which classical computers practically cannot. The claim was disputed by IBM, and the practical application of Google's achievement is still limited, but both these announcements demonstrate real progress in the field.

Also significant was the news that Microsoft open-sourced Q#, its language for quantum computing.

A surprise this year was the decline of interest in Virtual Reality, at least in the context of smartphone-based VR. Sergio notes:

"Google's decision to stop supporting its Daydream VR headset seemingly marks the end of phone-based virtual reality, a vision that attempted to combine the use of smartphones with 'dumb' VR headsets to bring VR experiences to the masses. Google's decision is accompanied by the BBC disbanding its VR content team after two years of successful experimentation."

JavaScript, Java, and C# remain the most popular languages we cover, but we're also seeing strong interest in Rust, Swift, and Go, and our podcast with Bryan Cantrill on "Rust and Why He Feels It's the Biggest Change in Systems Development in His Career" is one of the top-performing podcasts we've published this year. We've also seen a growing interest in Python this year, probably fuelled by its popularity for machine learning tasks.

After a rather turbulent 2018, Java seems to be settling into its six-month release cycle. According to our most recent reader survey, Java is the most used language amongst InfoQ readers, and there continues to be a huge amount of interest in the newer language features and how the language is evolving. We also continue to see strong and growing interest in Kotlin.

It has been interesting to see Microsoft's growing involvement in Java: joining the OpenJDK, acquiring jClarity, and hiring other well-known figures, including Monica Beckwith.

Our podcast with Rod Johnson in which he chats about the early days of the Spring Framework, Languages Post-Java, & Rethinking CI/CD was another of our top-performing podcasts this year.

Matt Raible's JHipster book, now in its 5th version, was one of our most-downloaded books of the year.

In the Java programming language trends report, we noted increased adoption of non-HotSpot JVMs, and we believe OpenJ9 is now within the early-adopter stage. At the time, we noted that:

"We believe that the increasing adoption of cloud technologies within all types of organisation is driving the requirements for JREs that embrace associated "cloud-native" principles such as fast start-up times and a low memory footprint. Graal in itself may not be overly interesting, but the ability to compile Java application to native binaries, in combination with the support of polyglot languages, is ensuring that we keep a close watch on this project."

Since the report came out, we feel that the GraalVM has demonstrated significant potential, and will continue to watch its progress with interest.

Our top performing content for Java this year included:

Although it didn't quite make it into the top five list, an honourable mention should go to Brian Goetz's fantastic "Java Feature Spotlight" article "Local Variable Type Inference."

The release of .NET Core 3 in September generated a huge buzz on InfoQ and produced some of our most-popular .NET content of the year. WebAssembly has been another area of intense interest, and we saw a corresponding surge in interest for Blazor, a new framework in ASP.NET Core that allows developers to create interactive web applications using C# and HTML. Blazor comes in multiple editions, including Blazor WebAssembly which allows single-page applications to run in the client's web browser using a WebAssembly-based .NET runtime.

According to our most-recent reader survey, C# is the second-most widely used language among InfoQ readers after Java, and interest in C#8 in particular was also strong.

Our top performing .NET content included:

Jonathan Allen's piece in the list is part of an excellent series of articles and news posts he wrote for InfoQ during 2019. Others included:

Unsurprisingly, the majority of InfoQ readers write at least some JavaScript (around 70%, according to the most recent reader survey), making it the most widely used language among our readers. The dominant JavaScript frameworks for InfoQ readers currently seem to be Vue and React. We also saw interest in using JavaScript for machine learning via TensorFlow.js. Away from JavaScript, we saw strong interest in some of the transpiler options. In addition to Blazor, mentioned above, we saw strong interest in WebAssembly, TypeScript, Elm, and Svelte.

Top-performing content included:

It's unsurprising that distributed computing, and in particular the microservices architecture style, remains a huge part of our news and feature content. We see strong interest in related topics, with our original "Domain Driven Design Quickly" book and our more recent eMag "Domain-Driven Design in Practice" continuing to perform particularly well, and interest in topics like observability and distributed tracing. We also saw interest in methods of testing distributed systems, including a strong performance from our Chaos Engineering eMag, and a resurgence in reader interest in some of the core architectural topics such as API design, diagrams, patterns, and models.

Our top performing architecture content was:

Our podcast with Grady Booch on today's artificial intelligence reality and what it means for developers was one of our most popular podcasts of the year, and revealed strong interest in the topic from InfoQ readers.

Key AI stories in 2019 were MIT introducing Gen, a Julia-based language for artificial intelligence, Google's ongoing work on ML Kit, and discussions around conversational interfaces, as well as more established topics such as streaming.

It's slightly orthogonal to the rest of the pieces listed here, but we should also mention "Postgres Handles More Than You Think" by Jason Skowronski, which performed amazingly well.

Our top-performing content for AI and ML was:

If there was an overarching theme to our culture and methods coverage this year, it might best be summed up as "agile done wrong": many of our items focused on issues with agile and/or on going back to the principles outlined in the Agile Manifesto.

We also saw continued interest in some of the big agile methodologies, notably Scrum, with both "Scrum and XP from the Trenches" and "Kanban and Scrum - Making the Most of Both" performing well in our books department.

We also saw strong reader interest in remote working, with Judy Rees' eMag on "Mastering Remote Meetings" and her corresponding podcast performing well, alongside my own talk on "Working Remotely and Managing Remote Teams" from Aginext this year.

Our most popular published content for Culture and Methods was:

Our top performing culture podcasts were:

In our DevOps and Cloud trends report, we noted that Kubernetes has effectively cornered the market for container orchestration and is arguably becoming the cloud-agnostic compute abstraction. The next "hot topics" in this space appear to be service meshes and developer experience/workflow tooling. We continue to see strong interest in all of these among InfoQ's readers.

A trend we're also starting to note is a number of languages which are either infrastructure- or cloud-oriented. In our Programming Languages trends report, we noted increased interest and innovation related to infrastructure-aware or cloud-specific languages, DSLs, and SDKs like Ballerina and Pulumi. In this context we should also mention Dark, a new language currently still in private beta, but already attracting a lot of interest. Somewhat related, we should also mention the Ecstasy language, co-created by Tangosol founders Cameron Purdy and Gene Gleyzer. Chris Swan, CTO for Global Delivery at DXC Technology, spoke to Cameron Purdy about the language and the problems it's designed to solve.

In the eMags department "Kubernetes: Past, Present and Future", and "DevSecOps in Practice" were among our top performers:

Making predictions in software is notoriously hard to do, but we expect enterprise development teams to consolidate their cloud-platform choices as Kubernetes adoption continues. Mostly this will be focussed on the "big five" cloud providers: Amazon, Google, IBM (plus Red Hat), Microsoft, and VMware (plus Pivotal). We think that, outside China, Alibaba will struggle to gain traction, as will Oracle, Salesforce, and SAP.

In the platform/operations space, we're expecting that service meshes will become more integrated with the underlying orchestration frameworks (e.g., Kubernetes). We're also hopeful that the developer workflow for interacting with service meshes will become more integrated with current workflows, technologies, and pipelines.

Ultimately, developers should be able to control deployment, release, and debugging via the same continuous/progressive delivery pipeline: for example, using a "GitOps" style pipeline to deploy a service by configuring k8s YAML (or some higher-level abstraction), controlling the release of the new functionality using techniques like canarying or shadowing via the configuration of some traffic management k8s custom resource definition (CRD) YAML, and enabling additional logging or debug tooling via some additional CRD config. A hypothetical sketch of the canary-weight step appears below.
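The following Python example, purely illustrative, patches traffic weights on an Istio VirtualService CRD using the official Kubernetes Python client; the service name, subsets, and 90/10 weights are invented, and in a true GitOps flow this change would be committed to Git as YAML and applied by a reconciliation agent rather than called imperatively:

```python
from kubernetes import client, config

# Hypothetical illustration only: "checkout", the subsets, and the
# weights below are invented for this sketch.
config.load_kube_config()
api = client.CustomObjectsApi()

canary_patch = {
    "spec": {
        "http": [{
            "route": [
                {"destination": {"host": "checkout", "subset": "stable"}, "weight": 90},
                {"destination": {"host": "checkout", "subset": "canary"}, "weight": 10},
            ]
        }]
    }
}

# Istio's VirtualService is a custom resource, so it is manipulated
# through the generic custom-objects API.
api.patch_namespaced_custom_object(
    group="networking.istio.io",
    version="v1alpha3",
    namespace="prod",
    plural="virtualservices",
    name="checkout",
    body=canary_patch,
)
```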

In regards to architecture, next year will hopefully be the year of "managing complexity". Architectural patterns such as microservices and functions-as-a-service have enabled developers to better separate concerns, implement variable rates of change via independent isolated deployments, and ultimately work more effectively at scale. However, our ability to comprehend the complex distributed systems we are now building, along with the availability of related tooling, has not kept pace with these developments. We're looking forward to seeing what the open source community and vendors are working on in the understandability, observability, and debuggability space.

We expect to see more developers experimenting with "low code" platforms. This is partly fueled by a renewed push from Microsoft for its PowerApps, Flow, Power BI, and Power Platform products.

In the .NET ecosystem, we believe that Blazor will keep gaining momentum among web developers. .NET 5 should also bring significant changes to the ecosystem with the promised interoperability with Java, Objective-C, and Swift. Although it is early to say, Microsoft's recent efforts on IoT and AI (with ML.NET) should also help raise interest in .NET development. Relatedly, we expect the interest in WebAssembly to continue and hope that the tooling here will start to mature.

Despite the negative news around VR this year, we still think that something in the AR/VR space, or some other form of alternative computer/human interaction, is likely to come on the market in the next few years and gain significant traction, though it does seem that the form factor for this hasn't really arrived.

Charles Humble took over as editor-in-chief at InfoQ.com in March 2014, guiding our content creation including news, articles, books, video presentations and interviews. Prior to taking on the full-time role at InfoQ, Charles led our Java coverage, and was CTO for PRPi Consulting, a remuneration research firm that was acquired by PwC in July 2012. For PRPi he had overall responsibility for the development of all the custom software used within the company. He has worked in enterprise software for around 20 years as a developer, architect and development manager. In his spare time he writes music as 1/3 of London-based ambient techno group Twofish, whose debut album came out in February 2014 after 14 years of messing about with expensive toys, and spends as much time as he can with his wife and young family.

Erik Costlow is a software security expert with extensive Java experience. He manages developer relations for Contrast Security and its public Community Edition. Contrast weaves sensors into applications, giving them the ability to detect security threats based on how the application uses its data. Erik was the principal product manager in Oracle focused on security of Java 8, joining at the height of hacks and departing after a two-year absence of zero-day vulnerabilities. During that time, he learned the details of Java at both a corporate/commercial and community level. He also assisted Turbonomic's product management team to achieve $100M annual revenue in data center/cloud performance automation. Erik also led product management for Fortify static code analyzer, a tool that helps developers find and fix vulnerabilities in custom source code. Erik has also published several developer courses through Packt Publishing on data analysis, statistics, and cryptography.

Arthur Casals is a Computer Science researcher working in the area of Artificial Intelligence / Multi-agent Systems. He has been developing software for 20+ years, in different markets/industries. Arthur has also assumed different roles in the past: startup founder, CTO, tech manager, software engineer. He holds a B.Sc. degree in Computer Engineering and an MBA degree.

Daniel Bryant works as an Independent Technical Consultant and Product Architect at Datawire. His technical expertise focuses on 'DevOps' tooling, cloud/container platforms, and microservice implementations. Daniel is a Java Champion, and contributes to several open source projects. He also writes for InfoQ, O'Reilly, and TheNewStack, and regularly presents at international conferences such as OSCON, QCon and JavaOne. In his copious amounts of free time he enjoys running, reading and traveling.

Bruno Couriol holds an MSc in Telecommunications, a BSc in Mathematics, and an MBA from INSEAD. Most of his career has been spent as a business consultant, helping large companies address their critical strategic, organizational, and technical issues. In the last few years, he has developed a focus on the intersection of business, technology, and entrepreneurship.

Ben Linders is an Independent Consultant in Agile, Lean, Quality and Continuous Improvement, based in The Netherlands. Author of Getting Value out of Agile Retrospectives, Waardevolle Agile Retrospectives, What Drives Quality, The Agile Self-assessment Game and Continuous Improvement. As an adviser, coach and trainer he helps organizations by deploying effective software development and management practices. He focuses on continuous improvement, collaboration and communication, and professional development, to deliver business value to customers. Ben is an active member of networks on Agile, Lean and Quality, and a frequent speaker and writer. He shares his experience in a bilingual blog (Dutch and English) and as an editor for Agile at InfoQ. Follow him on Twitter: @BenLinders.

Shane Hastie is the Director of Community Development for ICAgile, a global accreditation and certification body dedicated to improving the state of agile learning. Since first using XP in 2000 Shane's been passionate about helping organizations and teams adopt sustainable, humanistic ways of working irrespective of the brand or label they go by. Shane was a Director of the Agile Alliance from 2011 until 2016. Shane leads the Culture and Methods editorial team for InfoQ.com

Read more from the original source:

InfoQ's 2019, and Software Predictions for 2020 - InfoQ.com

ETH Istanbul Hard Fork : Here are the changes that will take… – TokenHell

Ethereum's Istanbul hard fork is scheduled for Saturday, December 7, 2019. The Istanbul update involves specific changes to the Ethereum network.

The main focus of this upgrade is improving Ethereum's privacy and scalability, along with better sidechain support. Here are the changes that will take place during this upgrade.

In this upgrade, Ethereum will update the zero-knowledge cryptographic technology that underpins the privacy of ERC-based tokens. This technology also improves Ethereum's scalability through off-chain solutions.

This zero-knowledge cryptographic technology has previously been regarded as a positive, forward-looking addition to the Ethereum protocol.

The upcoming Ethereum Istanbul hard fork aims to add six Ethereum improvement proposals (EIPs) out of the thirty that were put forward. These proposals will enable smart contract developers to introduce new features, including privacy protocols and side-chain scaling, to the Ethereum chain.

The first of the six proposals is EIP-1108, which optimizes routines for elliptic curve arithmetic. Its main purpose is to change the computational pricing for the elliptic curve algorithms.

These changes are considered important for several projects building on the platform. For instance, the AZTEC protocol and ZEther implement zero-knowledge proofs and confidential transactions. The gas reduction provided by the proposal is regarded as significant by the AZTEC team:

It currently costs 820,000 gas to validate the cryptography in a typical AZTEC confidential transaction. If the gas schedule for the precompiles correctly reflected their load on the Ethereum network, this cost would be 197,000 gas [about $0.23 at current average gas prices].
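The arithmetic behind the bracketed dollar figure is easy to reproduce. Here is a minimal Python sketch; the ~8 gwei gas price and ~$150 ETH price are assumptions reflecting late-2019 market conditions, not figures from the article:

```python
def gas_cost_usd(gas_used: int, gas_price_gwei: float, eth_price_usd: float) -> float:
    """Convert a gas amount into a dollar cost (1 ETH = 1e9 gwei)."""
    return gas_used * gas_price_gwei * 1e-9 * eth_price_usd

print(gas_cost_usd(820_000, 8, 150))  # ~0.98 USD: current precompile pricing
print(gas_cost_usd(197_000, 8, 150))  # ~0.24 USD: with the EIP-1108 repricing
```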

The second proposal, EIP-152, allows direct integrations with the Zcash privacy coin. This upgrade involves the BLAKE2b hash function.
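For context, BLAKE2b itself is nothing exotic; it already ships in Python's standard library. What EIP-152 adds is an EVM precompile for the BLAKE2 compression function, so that contracts can verify BLAKE2b-based data (such as Zcash block data) at a reasonable gas cost. A minimal illustration of the hash function in question:

```python
import hashlib

# BLAKE2b from the standard library; EIP-152 exposes the underlying
# compression function as an EVM precompile rather than the full hash.
digest = hashlib.blake2b(b"example block header", digest_size=32).hexdigest()
print(digest)
```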

The next optimization proposal, EIP-2028, benefits zero-knowledge based systems. It decreases the gas cost of calldata, allowing an increase in data-transfer bandwidth.

To improve security, EIP-1344 introduces a new method for returning the ID of the current chain.
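Off-chain clients can already query the chain ID over JSON-RPC; what EIP-1344 adds is a CHAINID opcode so that code running inside the EVM can read the same value and build fork-aware replay protection. A small sketch using web3.py (a v5-style API is assumed, and the endpoint URL is a placeholder):

```python
from web3 import Web3

# Placeholder endpoint; substitute a real JSON-RPC provider.
w3 = Web3(Web3.HTTPProvider("https://mainnet.example/rpc"))

# Prints 1 on Ethereum mainnet. EIP-1344 exposes this same value to
# smart contracts via the new CHAINID opcode.
print(w3.eth.chain_id)
```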

The last two Ethereum improvement proposals are EIP-2200 and EIP-1884, which are responsible for improving the structure of gas metering and for re-pricing certain operations, respectively.

The Ethereum Istanbul hard fork will improve the network's privacy, scalability, and side-chain support. For this purpose, Ethereum is upgrading six proposals and its zero-knowledge cryptography technology.

More:
ETH Istanbul Hard Fork : Here are the changes that will take... - TokenHell

China to Pilot Digital Yuan With Four Banks in Two Cities – Report – Cryptonews

Shenzhen. Source: iStock/Nikada

China's central bank, the People's Bank of China (PBoC), is preparing to test the digital yuan in Suzhou and Shenzhen, with a pilot launch now imminent, per a new report.

Media outlet Caijing Magazine says it has learned details about the pilot project, which it says is being jointly led by the PBoC and the so-called "big four" Chinese state-owned commercial banks, namely the Industrial and Commercial Bank of China, the Agricultural Bank of China, the Bank of China, and the China Construction Bank.

China's three largest telecoms providers are also set to take part in the pilot, with the state-owned China Mobile, China Telecom, and China Unicom all named in the report.

Caijing adds that the PBoC could seek to increase the scope of its pilots, with other, as yet unnamed mainland locations being considered.

The PBoC has previously spoken about the possibility of conducting pilots before issuing a nationwide rollout, and has stated that it is aiming to introduce the digital yuan gradually.

The same media outlet also quotes an unnamed senior technical expert as stating that the PBoC has been working on a range of issues, but there is still a long way to go on the technical front before a rollout can be approved.

The report's authors believe that the project will make use of cloud-based technology and possibly 5G networks, two of Huawei's core business areas. The Chinese tech giant has been repeatedly mentioned as a possible digital yuan partner for the PBoC, with Caijing also stating that the company may well lend its support.

As previously reported, Shenzhen has shown a great willingness to build up its blockchain sector. A state-owned PBoC subsidiary company went on a recruiting drive in summer this year.

And Caijing says a central bank-owned fintech company based in Suzhou has been rushing to recruit blockchain talent of late. The company's official remit is conducting digital currency and cryptography-related research.

View post:
China to Pilot Digital Yuan With Four Banks in Two Cities - Report - Cryptonews