Superconductor or Not? Exploring the Identity Crisis of This Weird Quantum Material – SciTechDaily

Northeastern researchers have used a powerful computer model to probe a puzzling class of copper-based materials that can be turned into superconductors. Their findings offer tantalizing clues for a decades-old mystery, and a step forward for quantum computing.

The ability of a material to let electricity flow comes from the way the electrons within its atoms are arranged. Depending on these arrangements, or configurations, every material out there is either an insulator or a conductor of electricity.

But cuprates, a class of mysterious materials that are made from copper oxides, are famous in the scientific community for having somewhat of an identity issue that can make them both insulators and conductors.

Under normal conditions, cuprates are insulators: materials that inhibit the flow of electrons. But with tweaks to their composition, they can transform into the world's best superconductors.

The discovery of this kind of superconductivity in 1986 won its discoverers a Nobel Prize in 1987, and fascinated the scientific community with a world of possibilities for improvements to supercomputing and other crucial technologies.

But with fascination came 30 years of bewilderment: scientists have not been able to fully decipher the arrangement of electrons that encodes superconductivity in cuprates.

Arun Bansil, University Distinguished Professor of physics and Robert Markiewicz, professor of physics, are part of a team of researchers who are describing the mechanism by which copper-oxide materials turn from insulators to superconductors. Credit: Matthew Modoono/Northeastern University

"Mapping the electronic configuration of these materials is arguably one of the toughest challenges in theoretical physics," says Arun Bansil, University Distinguished Professor of physics at Northeastern. And, he says, because superconductivity is a weird phenomenon that only happens at temperatures as low as -300°F (about as cold as it gets on Uranus), figuring out the mechanisms that make it possible in the first place could help researchers make superconductors that work at room temperature.
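
For reference, a quick conversion of that figure into the kelvin units physicists use puts it near the roughly 92 K transition temperature of the yttrium barium copper oxides discussed below:

```python
# Convert the article's -300 °F figure to Celsius and kelvin.
f = -300.0
c = (f - 32) * 5 / 9   # about -184.4 °C
k = c + 273.15         # about 88.7 K
print(f"{f} °F = {c:.1f} °C = {k:.1f} K")
```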

Now, a team of researchers that includes Bansil and Robert Markiewicz, a professor of physics at Northeastern, is presenting a new way to model these strange mechanisms that lead to superconductivity in cuprates.

In a study published in Proceedings of the National Academy of Sciences, the team accurately predicted the behavior of electrons as they move to enable superconductivity in a group of cuprates known as yttrium barium copper oxides.

In these cuprates, the study finds, superconductivity emerges from many types of electron configurations. A whopping 26 of them, to be specific.

"During this transition phase, the material will, in essence, become some kind of a soup of different phases," Bansil says. "The split personalities of these wonderful materials are now being revealed for the first time."

The physics within cuprate superconductors is intrinsically weird. Markiewicz likens that complexity to the classic Indian parable of the blind men and the elephant, which has been a running joke for decades among theoretical physicists who study cuprates.

In the parable, blind men meet an elephant for the first time and try to understand what the animal is by touching it. But because each of them touches only one part of its body (the trunk, tail, or legs, for example), they all have a different, and limited, concept of what an elephant is.

"In the beginning, we all looked [at cuprates] in different ways," Markiewicz says. "But we knew that, sooner or later, the right way was going to show up."

The mechanisms behind cuprates could also help explain the puzzling physics of other materials that turn into superconductors at extreme temperatures, Markiewicz says, and revolutionize the way they can be used to enable quantum computing and other technologies that process data at ultra-fast speeds.

"We're trying to understand how they come together in the real cuprates that are used in experiments," Markiewicz says.

The challenge of modeling cuprate superconductors comes down to the weird field of quantum mechanics, which studies the behavior and movement of the tiniest bits of matter, and the strange physical rules that govern everything at the scale of atoms.

In any given material (say, the metal in your smartphone), the electrons contained within just the space of a fingertip could number a one followed by 22 zeros, Bansil says. Modeling the physics of such a massive number of electrons has been extremely challenging ever since the field of quantum mechanics was born.

Bansil likes to think of this complexity as butterflies inside a jar flying fast and cleverly to avoid colliding with each other. In a conducting material, electrons also move around. And because of a combination of physical forces, they also avoid each other. Those characteristics are at the core of what makes it hard to model cuprate materials.

"The problem with the cuprates is that they are at the border between being a metal and an insulator, and you need a calculation that is so good that it can systematically capture that crossover," Markiewicz says. "Our new modeling can capture this behavior."

The team includes researchers from Tulane University, Lappeenranta University of Technology in Finland, and Temple University. The researchers are the first to model the electronic states in the cuprates without adding parameters by hand to their computations, which physicists have had to do in the past.

To do that, the researchers modeled the lowest energy levels of the atoms in yttrium barium copper oxides. Doing so lets researchers trace electrons as they are excited and move around, which in turn helps describe the mechanisms supporting the critical transition into superconductivity.

That transition, known as the pseudogap phase of the material, can be described simply as a door, Bansil says. In an insulator, the structure of the material is like a closed door that lets no one through. If the door is wide open (as it would be for a conductor), electrons pass through easily.

But in materials that experience this pseudogap phase, that door is slightly ajar. The dynamics of what transforms that door into a wide-open one (that is, a superconductor) remain a mystery, but the new model captures 26 electron configurations that could do it.

"With our ability to now do this first-principles, parameter-free type of modeling, we are in a position to actually go further, and hopefully begin to understand this pseudogap phase a bit better," Bansil says.

Reference: "Competing stripe and magnetic phases in the cuprates from first principles" by Yubo Zhang, Christopher Lane, James W. Furness, Bernardo Barbiellini, John P. Perdew, Robert S. Markiewicz, Arun Bansil, and Jianwei Sun, 8 November 2019, Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.1910411116

Where will technology take us in 2020? – Digital News Asia

FROM cognitive intelligence, in-memory-computing, fault-tolerant quantum computing, new materials-based semiconductor devices, to faster growth of industrial IoT, large-scale collaboration between machines, production-grade blockchain applications, modular chip design, and AI technologies, we can expect technology advancements and breakthroughs to gain momentum and generate a great impact on our daily lives in the year ahead.

Here are the top 10 technology trends for 2020, as seen by the Alibaba Damo Academy, Alibaba Groups global research initiative.

Artificial intelligence evolves from perceptual intelligence to cognitive intelligence

Artificial intelligence has reached or surpassed humans in areas of perceptual intelligence such as speech-to-text, natural language processing, and video understanding; but in the field of cognitive intelligence, which requires external knowledge, logical reasoning, or domain transfer, it is still in its infancy.

Cognitive intelligence will draw inspiration from cognitive psychology, brain science, and human social history, combined with techniques such as cross-domain knowledge graphs, causal inference, and continual learning, to establish effective mechanisms for the stable acquisition and expression of knowledge. These will let machines understand and utilise knowledge, achieving key breakthroughs from perceptual intelligence to cognitive intelligence.

In-Memory-Computing addresses the "memory wall" challenges in AI computing

In the Von Neumann architecture, memory and processor are separate, and computation requires data to be moved back and forth between them. With the rapid development of data-driven AI algorithms in recent years, hardware has become the bottleneck in the exploration of more advanced algorithms.

In the Processing-in-Memory (PIM) architecture, by contrast, memory and processor are fused together, and computations are performed where the data is stored, with minimal data movement. As such, computation parallelism and power efficiency can be significantly improved. We believe innovations in PIM architecture will be the ticket to next-generation AI.

Industrial IoT powers digital transformations

In 2020, 5G, the rapid development of IoT devices, cloud computing and edge computing will accelerate the fusion of information, communications, and industrial control systems. Through advanced Industrial IoT, manufacturing companies can achieve automation of machines, in-factory logistics, and production scheduling, as a way to realise C2B smart manufacturing.

In addition, interconnected industrial systems can adjust and coordinate the production capability of both upstream and downstream vendors. Ultimately, this will significantly increase manufacturers' productivity and profitability.

Large-scale collaboration between machines becomes possible

Traditional single-agent intelligence cannot meet the real-time perception and decision-making needs of large-scale intelligent devices. The development of collaborative sensing technology across the Internet of Things and 5G communication will enable collaboration among multiple agents: machines that cooperate and compete with each other to complete target tasks.

The group intelligence that emerges from the cooperation of multiple intelligent agents will further amplify the value of intelligent systems: large-scale intelligent traffic-light dispatching will allow dynamic, real-time adjustment, warehouse robots will work together to complete cargo sorting more efficiently, driverless cars will perceive overall traffic conditions on the road, and unmanned aerial vehicle (UAV) swarms will handle last-mile delivery more efficiently.

Modular design makes chip development easier and faster by stacking chiplets together

Traditional chip design cannot efficiently respond to the fast-evolving, fragmented, and customised needs of chip production. Open-source SoC design based on RISC-V, high-level hardware description languages, and IP-based modular chip design methods have accelerated the development of agile design methods and an ecosystem of open-source chips.

In addition, the modular design method based on chiplets uses advanced packaging to combine chiplets with different functions, making it possible to quickly customise and deliver chips that meet the specific requirements of different applications.

Large-scale production-grade blockchain applications will gain mass adoption

BaaS (Blockchain-as-a-Service) will further reduce the barriers to entry for enterprise blockchain applications. A variety of hardware chips embedded with core algorithms, deployed at the edge and in the cloud and designed specifically for blockchain, will also emerge, allowing assets in the physical world to be mapped to assets on the blockchain, further expanding the boundaries of the Internet of Value and realising "multi-chain interconnection".

In the future, a large number of innovative blockchain application scenarios with multi-dimensional collaboration across different industries and ecosystems will emerge, and large-scale production-grade blockchain applications with more than 10 million DAI (Daily Active Items) will gain mass adoption.

A critical period before large-scale quantum computing

In 2019, the race to quantum supremacy brought the focus back to quantum computing. The demonstration, using superconducting circuits, boosted overall confidence in superconducting quantum computing as a route to a large-scale quantum computer.

In 2020, the field of quantum computing will receive increasing investment, which comes with enhanced competition.

The field is also expected to experience a speed-up in industrialisation and the gradual formation of an ecosystem. In the coming years, the next milestones will be the realisation of fault-tolerant quantum computing and the demonstration of quantum advantages in real-world problems. Either is a great challenge given present knowledge. Quantum computing is entering a critical period.

New materials will revolutionise semiconductor devices

Under the pressure of both Moore's Law and the explosive demand for computing power and storage, it is difficult for classic Si-based transistors to sustain the development of the semiconductor industry.

Until now, major semiconductor manufacturers still have no clear answer or option for chips beyond 3nm. New materials will enable new logic, storage, and interconnection devices through new physical mechanisms, driving continuous innovation in the semiconductor industry.

For example, topological insulators and two-dimensional superconducting materials, which can achieve lossless transport of electrons and spin, could become the basis for new high-performance logic and interconnect devices, while new magnetic materials and new resistive-switching materials can realise high-performance memory such as SOT-MRAM and resistive RAM.

Growing adoption of AI technologies that protect data privacy

The compliance costs demanded by recent data-protection laws and regulations related to data transfer are increasing. In light of this, there has been growing interest in using AI technologies to protect data privacy.

The essence is to enable the data user to compute a function over input data from different data providers while keeping the data private. Such AI technologies promise to solve the problems of data silos and the lack of trust in today's data sharing practices, and will truly unleash the value of data in the foreseeable future.
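
One of the simplest building blocks behind such privacy-preserving computation is additive secret sharing. The toy sketch below is illustrative only (it is not any particular vendor's API): each provider splits its private input into random-looking shares, and the parties jointly compute a sum without any of them seeing another's raw value.

```python
import random

MODULUS = 2 ** 61 - 1  # arithmetic modulo a large prime keeps shares uniform

def share(value: int, n_parties: int) -> list[int]:
    """Split a private value into additive shares that sum to it mod MODULUS."""
    shares = [random.randrange(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

# Three data providers each split a private input among three parties.
inputs = [120, 45, 300]
all_shares = [share(v, 3) for v in inputs]

# Party i sums the i-th share of every input; no raw input is ever revealed.
partial_sums = [sum(col) % MODULUS for col in zip(*all_shares)]

# Combining only the partial sums yields the joint result.
print(sum(partial_sums) % MODULUS == sum(inputs))  # True
```

Production systems layer techniques such as homomorphic encryption, secure multi-party computation, and federated learning on top of ideas like this, but the principle is the same: compute the function without revealing the inputs.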

Cloud becomes the centre of IT technology innovation

With the ongoing development of cloud computing technology, the cloud has grown far beyond the scope of IT infrastructure and is gradually evolving into the centre of all IT technology innovation.

Cloud has a close relationship to almost all IT technologies, including new chips, new databases, self-driving adaptive networks, Big Data, AI, IoT, blockchain, quantum computing and so forth.

Meanwhile, it creates new technologies, such as serverless computing, cloud-native software architecture, software-hardware integrated design, as well as intelligent automated operation.

Cloud computing is redefining every aspect of IT, making new IT technologies more accessible for the public. Cloud has become the backbone of the entire digital economy.

January 9th: France will unveil its quantum strategy. What can we expect from this report? – Quantaneo, the Quantum Computing Source

It is eagerly awaited! The "Forteza" report, named after its rapporteur, Paula Forteza, Member of Parliament for La République en Marche (the party of current President Emmanuel Macron), should finally be officially revealed on January 9th. The three rapporteurs are Paula Forteza, Member of Parliament for French citizens of Latin America and the Caribbean; Jean-Paul Herteman, former CEO of Safran; and Iordanis Kerenidis, researcher at the CNRS. Announced last April, the report was initially due at the end of August, then in November, then... No doubt the schedule was shaken up by a complex agenda, between the social movements in France and the MP's active participation in the Paris election campaign of Cédric Villani, mathematician and dissident of La République en Marche. In any case, it is thus finally on January 9th that this report, entitled "Quantum: the technological shift that France will not miss", will be unveiled.

"Entrusted by the Prime Minister in April 2019, the mission on quantum technologies ends with the submission of the report by the three rapporteurs Paula Forteza, Jean-Paul Herteman, and Iordanis Kerenidis. Fifty proposals and recommendations are thus detailed in order to strengthen France's role and international position on these complex but highly strategic technologies. The in-depth work carried out over the last few months, fueled by numerous consultations with scientific experts in the field, has led the rapporteurs to the conclusion that France's success in this field will be achieved by making quantum technologies more accessible and more attractive. This is one of the sine qua non conditions for the success of the French strategy", explains the French National Congress in the invitation to the official presentation ceremony of the report.

The presentation by the three rapporteurs will be made in the presence of the ministers for the armed forces, the economy and finance, and higher education and research. The presence of the Minister of the Armed Forces, as well as the co-signature of the report by the former CEO of Safran, already indicates that military applications will be one of the main areas of proposals, and possibly of funding, just as is the case in the United States, China, or Russia.

Of course, the report will go into detail about the role of research, and of the CNRS, in advances in quantum computing and communication. Of course, the excellent work of French researchers, in collaboration with their European peers, will be highlighted. And of course, France's excellence in these fields will be explained. France is a pioneer in this field, but the important questions concern precisely what the next steps will be. The National Assembly indicates that this report will present 50 "proposals and recommendations". Are we to conclude that it will be just a list of proposals? Or will it show how to move from advice to action?

These are our pending questions:

- The United States is announcing an investment of USD 1.2 billion, China perhaps USD 10 billion, and Great Britain about 1 billion euros, while Amazon's R&D budget alone is USD 18 billion. How can a country like France position itself against investments of this scale? In short, is the amount of funding allocated to this research and development in line with the ambitions?

- Mastering quantum technologies is becoming a geopolitical issue between the United States and China. Should Europe master its own technologies so as not to depend on these two major powers? On the other hand, is this not the return of a quantum "Plan calcul" from the 60s? How can we avoid repeating the same mistakes?

- Cecilia Bonefeld-Dahl, Director-General of DigitalEurope, recently wrote that Europe risks being deprived of the use of quantum technologies if it does not develop them itself. Christophe Jurczak, the head of Quantonation, stated that it is not certain that France will have access to quantum technologies if it does not develop them itself. Is this realistic? Do we have the resources?

- French companies currently invest very little in quantum computing research. With the exception of Airbus, the main examples we know of are in Canada, Australia, Spain, Germany, and elsewhere. Should we also help companies embrace these technologies, or should we only finance research and development by universities and startup founders? Is there a support component for companies, so that technologies are not simply developed in France and sold elsewhere, but France becomes the leading market for local developments?

See you on January 9th on Decideo for more details and our objective analysis of the content of this document.

Goldman Sachs and QC Ware Join Forces to Develop Quantum Algorithms in Finance – Quantaneo, the Quantum Computing Source

"During the past year, researchers at QC Ware and Goldman Sachs have worked on analyzing the effect of noise on the accuracy of quantum algorithms for approximate counting," said Paul Burchard, lead researcher for R&D at Goldman Sachs. "The research confirmed that the current state-of-the-art quantum algorithms for Monte Carlo sampling and approximate counting will eventually lead to more efficient simulation, but that these algorithms are sensitive to noise in current quantum hardware. As a result, implementing these algorithms on near term quantum hardware will depend on techniques analogous to importance sampling that reduce the circuit depth of these algorithms."

"QC Ware's work with Goldman Sachs is essential to gaining a better understanding of how quantum computing algorithms can eventually be used in finance and how to make the practical use of quantum computing a reality faster," said Matt Johnson, CEO, QC Ware.

"QC Ware believes that quantum computing will significantly impact the future of finance," said Wim van Dam, Head of Quantum Algorithms, QC Ware. "Current quantum computers are limited in the number of qubits and the circuit depth that they support. We are focused on applying QC Ware's expertise in meeting this challenge by delivering access to QC Ware's Forge cloud service to test near-term quantum applications and help build in-house quantum computing skills."

The World Keeps Growing Smaller: The Reinvention Of Finance – Seeking Alpha

In the prominent headlines, we keep reading about the attempts to keep the world fragmented by imposing tariffs and constraining the exchange of ideas in many ways, but information keeps spreading, and with the continued spread of information, the world progresses. John Thornhill writes in the Financial Times about how China is completely redesigning finance...

Yes, the United States is working through the FinTech era, where efforts are being made to use evolving finance and technology to deliver familiar services more efficiently, but the Chinese effort, writes Mr. Thornhill, is trying to do something entirely different.

China wants to change the platform.

In the past, I have written about how the United States banking industry has lagged behind the rest of the world in moving toward a more electronic and integrated finance platform. Even in some less-developed countries, payment systems have been evolving at a faster pace than in the United States because of the need to reduce the impact of geographical distances.

Only in the past year or two have some of the larger US banks moved forward, trying to develop a more advanced system.

Commercial banks in the United States have been the biggest and most important banks in the world and have concentrated upon the more sophisticated areas of finance, rather than the basic payments systems that are the foundation of the whole financial system. And, although there have been efforts to advance the financial platforms of the American banks, it is somewhat ironic that several of the largest banks have moved toward quantum computers to revolutionize activities like risk management and trading.

Richard Waters writes about how JPMorgan Chase & Co., Goldman Sachs and Citigroup have entered this space in the last couple of years.

For example, Mr. Waters quotes Paul Burchard, a senior researcher at Goldman Sachs: "We think there's a possibility this becomes a critical technology."

And: "Despite the challenges, advances in quantum hardware have persuaded the banks the time has come to leap."

One can smile at this leap, but what about the basics of banking?

Here Mr. Thornhill writes that, "The speed at which China has moved from a cash to a digital-payments economy is staggering: some $17 trillion of transactions were conducted online in 2017. China's mobile payment volumes are more than 50 times those in the US."

The growth has come from two corporate sources, Alibaba (BABA) and Tencent (OTCPK:TCEHY). The number of users is staggering.

However, the biggest potential lies ahead. As Mr. Thornhill states, "the most enticing opportunities lie abroad. About 1.7 billion people in the world remain unbanked. When they come online they will be looking for cheap, convenient, integrated digital financial services, such as China has pioneered."

China has the chance to rewire 21st-century finance.

The implication here is that United States banks will have to adjust to this payment system that China is spreading to the rest of the world.

In other words, information spreads, and even though the spread of information may be constrained in certain parts of the world, it will expand in the areas where there are fewer constraints. This is the way it has always worked throughout history. Quantum computing is currently not the answer for the US banking system.

Oh, yes, it will be fun to design new types of algorithms for quantum computers. As Mr. Waters writes, "the first of these involves a class of optimization problems that take advantage of the probabilistic nature of quantum computing to analyze a large number of possible outcomes and pick the most desirable..."

But who is going to own the payments platform?

Mr. Thornhill believes that the trend in finance over the next decade will be led by the Chinese and the payments system that is being developed within China.

This has all sorts of implications for the US banking system, the US economy and the US political system. A question coming from this conclusion concerns whether or not the US dollar can maintain its position within the world financial system.

When we start trying to insulate ourselves from the world and try and control little pieces of it for ourselves, we tend to lose our place in the bigger picture. This is just another one of the unintended consequences we find in the field of economics.

But it has huge implications for American banks and the United States banking system. Consequently, it has huge implications for investors in the commercial banking industry. And it should be put within the context of what is happening in the United States right now.

I guess that banking in 2030 will not look at all like what is going on right now.

Disclosure: I/we have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.

Achieving Paperless Operations and Document Automation with AI and ML – ReadWrite

Paper is an essential commodity for office operations. Most conventional offices rely on paper to complete even the simplest tasks. Even after digitization, the dream of a completely paperless office is far from reality: humans are used to standard forms of note-taking and documentation. Here is how to achieve paperless operations and document automation with AI and ML.

Progressive technologies like artificial intelligence and machine learning help enterprises achieve their goal of paperless offices. Using these technologies, the issues associated with managing large volumes of data documented on paper can be efficiently solved.

Paperless enterprises run on digital devices with minimum paper consumption. In a digitally connected world, this gives businesses an unprecedented edge. All data is stored digitally, on the cloud or on-premise, which can be used in real-time to derive valuable insights about operational efficiency, marketing campaigns, employee engagement and a lot more.

Machine Learning (ML) is making it possible to achieve next-gen digital transformation by automating business operations that once required filling out loads of paper documents. Businesses are already making an effort to integrate machine learning and artificial intelligence to go digital and achieve higher efficiency.

Source: https://www.fingent.com/blog/machine-learning-to-accelerate-paperless-offices

Automation can offer several benefits to modern enterprises. Not only can the tedious task of filing and storing large numbers of documents be minimized, but organizations can also improve their data discovery and utilization capabilities. Here are some of the benefits of adopting paperless processes:

Digitization through artificial intelligence and machine learning allows companies to organize all information in easily accessible formats. This saves time, as employees don't have to waste hours searching for a document. It also promotes a remote-working culture and brings next-level authentication, as the origin of digital information can be identified.

One of the biggest drawbacks of paper-based data storage is the security and safety of data. Conventionally, office cultures were not serious about data protection and stored critical information in filing cabinets or by similar methods.

All these methods are prone to data theft or damage due to unavoidable circumstances. A paperless office enhances security because companies can back up data, protect it with passwords, and take steps to enforce security measures.

Storing data using paper-based techniques is a cumbersome and costly affair. Companies can save millions of dollars annually by eliminating the need for paper, copier equipment, and maintenance. They also don't have to waste valuable real estate on the storage of files and other documents.

Paperless digitization promotes easy accessibility from anywhere, which means less money is spent on the physical transmission of data using conventional methods.

Digitally stored data serves as a massive pool from which to derive real-time insights. This means that the information available to an enterprise can be put to better use for boosting efficiency. Marketing managers can utilize real-time data gathered from various campaigns; production teams can understand customer preferences.

Machine learning and artificial intelligence can enhance data analysis capabilities and make organizational processes closer to the customers' needs and preferences.

1. Legal firms

AI/ML-based paperless workflows will significantly improve the productivity of law firms. Traditionally, the legal profession is seen as labor-intensive: browsing through thousands of legal case files, reviewing past case studies, examining legal contracts, and more.

AI can reduce manual intervention in data analysis and processing, leaving advocates, lawyers, and legal firms more time to advise their clients and appear in court. Artificial intelligence (AI) can be leveraged to keep a record of legal contracts and provide real-time alerts on renewals, proofread legal documents, and locate valuable information in seconds. For the legal system, artificial intelligence is the key to paper-free litigation and trials in the future.

2. Automobile industries

The automobile industry is one of the biggest beneficiaries of the AI/ML innovation. Machine learning has allowed automobile factories to create autonomous systems for managing large volumes of data generated during the manufacturing process.

Moreover, AI is reducing the effort required for filing claims in case of shop-floor accidents as data is digitized and form filing can be automated. Also, ML algorithms allow customers to get real-time diagnostic support without needing to file paper-based forms as a vehicle can be directly connected to the manufacturer via cloud infrastructure. This means that repairs, service and general performance issues can be reported in real-time without the need for paper.

3. Insurance sector

The insurance sector can use machine learning to automate claims, which will prove delightful for customer service processes. Machine learning and artificial intelligence can be leveraged to create sophisticated rating systems for evaluating risks and predicting an efficient pricing structure for each policy. All of this can be automated, reducing the need for manual intervention by human agents to classify risks.

Also, artificial intelligence can streamline workflows by digitally managing a large volume of claims data, policy benefits, and medical/personal records. The data stored on the cloud can be used by an AI algorithm to derive real-time insights about policyholders and bring efficiency to the fraud detection process.

Wrapping Up

Artificial intelligence has the potential to revolutionize workspaces like never before. With the help of an AI development company, small, medium, and large-scale enterprises can make a substantial move toward a paperless future. Not only will it reduce the cost of operations, but it will also boost the overall efficiency of existing business processes. The industry use cases suggested above are just the tip of a massive iceberg.

The possibilities are limitless. An AI-driven product development company can understand your existing business processes and suggest custom solutions that are a suitable fit for your business operations.

I'm Namee, a digital marketer working for Azilen Technologies. I'm also passionate about exploring and writing about innovation and technology, including AI, IoT, Big Data, and HR Tech.

Growth strategies for Encryption Software Market for period of 2019 to 2026 explained in a new research – News Cast Report

Polaris Market Research published its latest findings in a new study on the Encryption Software Market. The analysts expect the market to reach USD 20.44 billion by 2026, growing at a CAGR of 15.9% from 2019 to 2026.
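
As a sanity check on those figures, working the CAGR formula backward from the 2026 target implies a base-year market of roughly USD 6 billion. The snippet below assumes compounding over the 8 years from a 2018 baseline, which the summary does not state:

```python
# Back out the implied base-year market size from the forecast figures.
# Assumption (not stated in the summary): 8 compounding years, 2018 to 2026.
target_2026 = 20.44  # USD billion
cagr = 0.159
implied_base = target_2026 / (1 + cagr) ** 8
print(f"Implied 2018 market size: ~USD {implied_base:.1f} billion")  # ~6.3
```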

The study provides in-depth analysis of factors such as industry growth potential, market drivers, restraints, and challenges. It also examines the market dynamics expected to affect the market. The value-chain analysis in the report helps in understanding the overall market from both the supply side and the demand side.

Request A Sample Report At: https://www.polarismarketresearch.com/industry-analysis/encryption-software-market/request-for-sample

The study includes major players in the Encryption Software Market such as Microsoft Corporation, Symantec Corporation, IBM Corporation, EMC Corporation, CISCO Systems Inc., Intel Security, Check Point Software Technologies Ltd., Oracle Corporation, Trend Micro, Inc.

The study evaluates the overall Encryption Software Market by the following segments:

Have Any Query Or Specific Requirement? Feel Free To Ask Our Industry Experts At: https://www.polarismarketresearch.com/industry-analysis/encryption-software-market/speak-to-analyst

Key Takeaways of the report

About Polaris Market Research

Polaris Market Research is a global market research and consulting company. The company specializes in providing exceptional market intelligence and in-depth business research services for a clientele spread across different enterprises. We at Polaris are obliged to serve our diverse customer base across the healthcare, technology, semiconductor, and chemical industries, among various others around the world.

Contact Us

Polaris Market Research

Phone: 1-646-568-9980

Email: sales@polarismarketresearch.com

Web: www.polarismarketresearch.com

Martin is a regular contributor to News Cast. His excellence in research and analytics makes him a well-known writer in business forecasting.

PGP keys, software security, and much more threatened by new SHA1 exploit – Ars Technica

Three years ago, Ars declared the SHA1 cryptographic hash algorithm officially dead after researchers performed the world's first known instance of a fatal exploit known as a "collision" on it. On Tuesday, the dead SHA1 horse got clobbered again as a different team of researchers unveiled a new attack that's significantly more powerful.

The new collision gives attackers more options and flexibility than were available with the previous technique. It makes it practical to create PGP encryption keys that, when digitally signed using the SHA1 algorithm, impersonate a chosen target. More generally, it produces the same hash for two or more attacker-chosen inputs by appending data to each of them. The attack unveiled on Tuesday also costs as little as $45,000 to carry out. The attack disclosed in 2017, by contrast, didn't allow forgeries on specific predetermined document prefixes and was evaluated to cost from $110,000 to $560,000 on Amazon's Web Services platform, depending on how quickly adversaries wanted to carry it out.

The new attack is significant. While SHA1 has been slowly phased out over the past five years, it remains far from fully deprecated. It's still the default hash function for certifying PGP keys in the legacy 1.4 version branch of GnuPG, the open-source successor to the PGP application for encrypting email and files. Those SHA1-generated signatures were accepted by the modern GnuPG branch until recently, and were only rejected after the researchers behind the new collision privately reported their results.

Git, the world's most widely used system for managing software development among multiple people, still relies on SHA1 to ensure data integrity. And many non-Web applications that rely on HTTPS encryption still accept SHA1 certificates. SHA1 is also still allowed for in-protocol signatures in the Transport Layer Security and Secure Shell protocols.

In a paper presented at this week's Real World Crypto Symposium in New York City, the researchers warned that even if SHA1 usage is low or used only for backward compatibility, it will leave users open to the threat of attacks that downgrade encrypted connections to the broken hash function. The researchers said their results underscore the importance of fully phasing out SHA1 across the board as soon as possible.

"This work shows once and for all that SHA1 should not be used in any security protocol where some kind of collision resistance is to be expected from the hash function," the researchers wrote. "Continued usage of SHA1 for certificates or for authentication of handshake messages in TLS or SSH is dangerous, and there is a concrete risk of abuse by a well-motivated adversary. SHA1 has been broken since 2004, but it is still used in many security systems; we strongly advise users to remove SHA1 support to avoid downgrade attacks."

To recap, a hash is a cryptographic fingerprint of a message, file, or other type of digital input that, like traditional fingerprints, looks unique. Also known as message digests, hashes play a vital role in ensuring that software updates, cryptographic keys, emails, and other types of messages are the authentic product of a specific person or entity, as opposed to a counterfeit input created by an adversary. These digital fingerprints come in the form of a fixed sequence of numbers and letters that are generated when the message is inputted into a hash algorithm or function.

The entire security of a hashing scheme rests on the infeasibility of finding two or more different inputs that produce the same fingerprint. A function with a bit length of n should require a brute-force attacker to test 2^(n/2) inputs before finding a collision (a mathematical result known as the birthday paradox significantly reduces the number of guesses required, accounting for the n/2 in the exponent). Hash functions with sufficient bit lengths and collision resistance are secure because they require an attacker to devote an infeasible amount of time and computing resources to generating a collision. Hash functions are considered broken when collisions can be found using fewer than 2^(n/2) tries.
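
In concrete terms, the birthday bound works out as follows for common hash functions; this short sketch simply evaluates the 2^(n/2) formula:

```python
# Generic birthday-bound cost of a brute-force collision search:
# roughly 2**(n/2) hash evaluations for an n-bit function.

def birthday_bound(n_bits: int) -> float:
    return 2.0 ** (n_bits / 2)

for name, n in [("MD5", 128), ("SHA1", 160), ("SHA-256", 256)]:
    print(f"{name:8} ({n}-bit): ~2^{n // 2} = {birthday_bound(n):.1e} tries")
```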

The 128-bit MD5 hash function was one of the earlier widely used entrants to fall to collision attacks. Although researchers warned as early as 1996 that flaws in MD5 made it prone to collisions, it remained a key part of software and Web authentication for more than two decades afterwards.

Then, in 2008, researchers used MD5 collisions to create an HTTPS certificate for any website of their choosing. The demonstration eventually convinced browser-trusted certificate authorities to drop MD5, but the function continued to be widely used for other purposes. The full deprecation of MD5 for authentication purposes didn't come until 2012, when the Flame espionage malware, which the US and Israel are reported to have used to spy on sensitive Iranian networks, wielded a collision attack to hijack Microsoft's Windows Update mechanism so Flame could spread from computer to computer inside an infected network.

The attack, which cost as little as $110,000 to carry out on Amazon's cloud computing platform, was what cryptographers call a classical collision attack. Also known as an identical-prefix collision, it results when two inputs have the same predetermined prefix, or beginning, followed by differing data. Even though the two inputs are distinctly different, they can hash to the same value if additional data is appended to the files. Stated another way, for a hash function H, two distinct messages M1 and M2 will lead to the same hash output: H(M1) = H(M2).

Identical-prefix collisions are powerful and a fatal blow to the security of a hash function, but their utility to attackers is limited. A far more powerful form of collision is the chosen-prefix attack, which is what allowed the MD5 attacks against the HTTPS certificate system in 2008 and against Microsoft's update mechanism in 2012. While harder to carry out than identical-prefix collisions, the chosen-prefix cousins are generally much more useful.

That's because chosen-prefix attacks allow attackers to take two or more different prefixes (as opposed to the same prefix in traditional collision attacks) and append data to each so they hash to the same value. Given two message prefixes P1 and P2, an attacker can compute two messages M1 and M2 such that H(P1 || M1) = H(P2 || M2), where || denotes concatenation, or the act of linking the two. A more detailed explanation of chosen-prefix collisions is available in this 2015 post from Nick Sullivan, head of research and cryptography at content delivery network Cloudflare.
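
The collision conditions themselves are easy to state in code. The sketch below uses placeholder byte strings; actually finding suffixes that make the check succeed for attacker-chosen prefixes is the enormously expensive computation described in this article:

```python
import hashlib

def sha1(data: bytes) -> str:
    return hashlib.sha1(data).hexdigest()

def is_chosen_prefix_collision(p1: bytes, m1: bytes,
                               p2: bytes, m2: bytes) -> bool:
    """True if two distinct concatenations hash to the same SHA1 value."""
    return (p1 + m1) != (p2 + m2) and sha1(p1 + m1) == sha1(p2 + m2)

# Placeholder inputs only: for real attacker-chosen prefixes p1 != p2,
# computing suffixes m1 and m2 that make this return True is the hard part.
print(is_chosen_prefix_collision(b"prefix-one", b"suffix-1",
                                 b"prefix-two", b"suffix-2"))  # False
```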

The attack demonstrated Tuesday is the first known chosen-prefix collision on SHA1. To demonstrate its potency, researchers Gaëtan Leurent and Thomas Peyrin, of Inria in France and Nanyang Technological University in Singapore, respectively, used the collision to perform a PGP/GnuPG impersonation attack. In their Real World Crypto paper the researchers explain:

The chosen prefixes correspond to headers of two PGP identity certificates with keys of different sizes, an RSA-8192 key and an RSA-6144 key. By exploiting properties of the OpenPGP and JPEG format, we can create two public keys: key A with the victim name, and key B with the attacker name and picture, such that the identity certificate containing the attacker key and picture has the same SHA-1 hash as the identity certificate containing the victim key and name. Therefore, the attacker can request a signature of his key and picture from a third party (from the Web of Trust or from a CA) and transfer the signature to key A. The signature will still be valid because of the collision, while the attacker controls key A with the name of the victim, and signed by the third party. Therefore, he can impersonate the victim and sign any document in her name.

In a post further demonstrating the attack, the researchers provided both messageA and messageB. Despite containing differing user-ID prefixes, both map to the same SHA1 hash value of 8ac60ba76f1999a1ab70223f225aefdc78d4ddc0.
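
Anyone can reproduce that check locally. A minimal verification sketch, assuming the two published files have been saved locally as messageA and messageB (hypothetical file names):

```python
import hashlib

# Digest quoted above for the two colliding PGP messages.
EXPECTED = "8ac60ba76f1999a1ab70223f225aefdc78d4ddc0"

for path in ("messageA", "messageB"):  # hypothetical local file names
    with open(path, "rb") as f:
        digest = hashlib.sha1(f.read()).hexdigest()
    print(path, digest, digest == EXPECTED)
```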

The researchers' results significantly improve the efficiency of SHA1 attacks, with a speedup factor of about 10. More precisely, the new attacks reduce the cost of an identical-prefix collision attack from 2^64.7 to 2^61.2, and the cost of a chosen-prefix collision attack from 2^67.1 to 2^63.4, when performed on a GTX 970 graphics processor.
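
Dividing the old costs by the new ones confirms the quoted "speedup factor of about 10":

```python
# Ratio of old to new attack costs, from the exponents quoted above.
print(2 ** (64.7 - 61.2))  # identical-prefix: ~11.3x cheaper
print(2 ** (67.1 - 63.4))  # chosen-prefix:   ~13.0x cheaper
```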

The researchers carried out the attack over a two-month period on a cluster of 900 Nvidia GTX 1060 GPUs they rented online. They said the rented cluster is a much more economical platform than Amazon Web Services and competing cloud services. The attack cost $74,000 when carried out a few months ago, but with an optimized implementation and computation costs that have continued to fall, the researchers say the same attack now costs $45,000. By 2025, they estimate, it will cost $10,000. The result: the same chosen-prefix attacks that have been possible against MD5 since 2009 are now practical against SHA1 as well, and will only become more affordable over time.

The researchers privately reported their results to developers of software that is most affected. They included developers for:

Given the number of applications and protocols that continue to rely on SHA1 for collision-resistant hashes, however, the researchers were unable to contact all affected developers. To prevent the attacks from being actively used in the wild, the researchers are withholding many of the collision details for the time being.

Matt Green, a Johns Hopkins University professor specializing in cryptography, said the results were impressive and underscored the oft-repeated observation that SHA1 can no longer be considered secure.

"For a secure hash function, a [speedup] factor of 10 shouldn't make much of a difference, but when you're down to something that's pretty close to broken, those kinds of efficiencies really make a difference, especially when there's lots of mining hardware out there," he said in an interview. "We knew that one shoe had dropped and this is the next shoe dropping."

How to cope with a FileVault recovery key disappearing while you write it down – Macworld

The key can't be re-displayed once it's dismissed.


FileVault is an extraordinary bit of macOS technology. Introduced years ago, it encrypts the entire contents of your startup volume so that when the data is at rest (when your Mac is powered down), the drive is effectively full of garbage nonsense to anyone who doesn't possess either the password to an account authorized to log in via FileVault or the special recovery key set when you turn FileVault on.

When you use the Security & Privacy preference pane's FileVault tab to enable this encryption, macOS prompts you with two choices: allow your iCloud account to unlock your disk, or create a recovery key and don't use your iCloud account.

In both cases, a recovery key is set. However, if you use iCloud to store your key, you never see it, and Apple manages the recovery process. All you need is your iCloud password and, if you turned on two-factor authentication, a trusted device or access to a trusted phone number. But this introduces risk, as someone who obtained your computer and discovered your password could potentially unlock the drive, too.

I prefer the second choice, as it provides entirely local control. No secret is stored remotely. You only face a problem if you forget the passwords to all macOS accounts approved for FileVault-based cold-start (from a shutdown state) logins and you lose your recovery key. (I have heard of cases in which account information becomes corrupted, though, and the recovery key is the only way to start up a Mac.)

What happens if, while you're trying to write down the recovery key, it disappears from the screen? While this seems unlikely, it happened to one reader, who doesn't believe they clicked a button or otherwise caused the key message to dismiss. They wrote in to ask how they could recover the recovery key.

Unfortunately, there's no method to retrieve the key once it's been displayed and dismissed. The recovery key is generated and passed through a strong one-way encryption process; only the result is used to further protect the keys used in FileVault encryption. The recovery key is displayed once. When you dismiss the dialog, macOS tosses this original version of it forever. (Entering the precise original recovery key, which is fed through the same one-way process, unlocks the data that it protects.)
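
That one-way pattern is the same one used for password storage in general: keep only a derived value that can verify the secret, never reveal it. The sketch below is schematic (the key string is made up, and the use of PBKDF2 is an assumption for illustration; Apple's actual key-wrapping scheme is more involved):

```python
import hashlib
import os

def derive_check_value(recovery_key: str, salt: bytes) -> bytes:
    """One-way derivation: the result can verify a key but not reveal it."""
    return hashlib.pbkdf2_hmac("sha256", recovery_key.encode(), salt, 100_000)

salt = os.urandom(16)
# The key is shown to the user once; only the derived value is retained.
stored = derive_check_value("ABCD-EFGH-IJKL-MNOP-QRST-UVWX", salt)

# Later, a typed-in candidate can be checked against the stored value...
print(derive_check_value("ABCD-EFGH-IJKL-MNOP-QRST-UVWX", salt) == stored)
# ...but nothing in `stored` or `salt` lets anyone recompute the key itself.
```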

If you weren't able to write the key down before it disappeared from view, you have to disable FileVault encryption and re-enable it to generate a new recovery key:

In the Security & Privacy system preference pane, click the FileVault tab.

Click the lock icon at the lower-left corner and enter an account name and password with administrative access.

Click the Turn Off FileVault button.

Confirm you want to disable FileVault by clicking Restart & Turn Off Encryption.

Your Mac now restarts. After you log back in using an account with FileVault permission, macOS begins decrypting the entire contents of the drive. This can take quite a while.

When decryption is complete, you can return to the FileVault tab and click Turn On FileVault.

At the Recovery Key prompt, choose the Create a recovery key option and write the key down. You might even quickly take a picture of it as a backup. (But be sure to delete that photo and then permanently delete it from the Recently Deleted album to avoid any chance of someone gaining access to it.)

Restart again and FileVault begins the slow process of encrypting the startup volume once more.

This Mac 911 article is in response to a question submitted by Macworld reader Michael.

We've compiled a list of the questions we get asked most frequently, along with answers and links to columns: read our super FAQ to see if your question is covered. If not, we're always looking for new problems to solve! Email yours to mac911@macworld.com, including screen captures as appropriate and whether you want your full name used. Not every question will be answered, we don't reply to email, and we cannot provide direct troubleshooting advice.

Why Have VPNs Become So Important To Corporations? – Forbes

Virtual private networks (VPNs) have become popular, particularly with businesses, because they provide advanced security without compromising convenience. They are easy to set up and use and are one of the most affordable cybersecurity options available today. A business VPN is part of almost every enterprise's IT infrastructure. But how did their rise to prominence occur?

How did VPNs become popular?

In recent years, there has been a significant increase in the number of cyberattacks on companies. And the cybercrime business is only getting larger -- it's expected to cost the world $6 trillion by 2021, according to the Herjavec Group.

A recent Verizon report claimed that nearly 43% of cyberattacks targeted small businesses. The report was based on research conducted in over 86 countries worldwide, and its findings have led to a surge in demand for stronger security solutions like VPN services.

How does a VPN work?

The purpose of a business VPN is to provide end-to-end encryption for every device in your company's network, which means no snoops, hackers, or even your internet service provider can see your location or data. This provides a private, secure connection to the internet no matter where you are.

Today, VPNs provide advanced security solutions to over 26% of internet users across the world, according to audience analytics firm GlobalWebIndex.

Why do businesses today use VPN services?

The way companies do business has changed considerably over the last 15 years. Desktop computers have given way to laptops, iPads and other mobile devices. There's more flexibility with remote working options and bring your own device (BYOD) policies -- all of which have resulted in the need for more digital security and, subsequently, the growth of the business VPN market.

The threat of cyberattacks alone has led many organizations to use secure and advanced solutions like VPNs to keep their data protected. VPNs help companies encrypt data and scan devices for malware to prevent hacking threats. Data shows the global VPN market was valued at $15 billion in 2016 and is projected to grow by 18% before 2022. Two of the most critical drivers of this growth are security and privacy.

Slow speeds have traditionally been a weakness of VPN services across the world, but advancements in VPN technology, such as the Layer 2 Tunneling Protocol (L2TP) and virtual private LAN service (VPLS), have helped overcome many restrictions, including internet speed. These new technologies use point-to-point topology and are building even more confidence in the VPN market by offering improved encryption and more security options.

Many tech giants have already started offering proprietary VPN services. McAfee recently acquired TunnelBear VPN, and Symantec started Norton Wi-Fi VPN. Facebook has also set its sights on reintroducing a VPN service, but the social media giant has been criticized for the way it handled data in its now-terminated Onavo VPN.

Are VPNs worth the money?

If you're looking to cut costs for your business, it may be tempting to avoid paying for a VPN. But if your security fails, you could end up paying a significantly higher price. Insurance carrier Hiscox says that businesses of all sizes lose up to $200,000 on average in the event of a data breach.

When it comes time to get a VPN, don't skimp out. Contrary to what you may think, VPNs are not expensive. In fact, they're some of the most cost-effective services around and don't require a big IT budget.

Protecting your data and keeping your servers safe is a huge priority and promotes confidence in your business. Plenty of cloud VPN providers are available without any major installations needed on your servers or from your team.

If you're ready to regain control of your online activities, it's important to find a business VPN that suits your needs. Now is not the time to claim ignorance; it's time to act. Secure your company for the future by protecting all of your crucial business data and servers with a strong business VPN service.
