Quantum Computing and the Cryptography Conundrum – CXOToday.com

By: Anand Patil

On October 23, 2019, researchers from Google made an official announcement of a major breakthrough, one that scientists compared to the Wright Brothers' first flight or even man's first moon landing. They claimed to have achieved Quantum Supremacy, meaning that they had created a quantum computer that could perform a calculation considered impossible for the classical computers of today. The announcement was a landmark, highlighting the possibilities of Quantum Computing.

The concept of Quantum Computing itself isn't new. It has been a point of interest for physicists and computer researchers since the 1980s. Google's announcement, however, has brought it into the mainstream and shone a spotlight on the promise this niche field of innovation holds. Of course, as someone once said, with great power comes great responsibility, so this field isn't without complexities.

The Possibilities of Quantum Computing

Quantum Computing is a branch of computer science focused on leveraging the principles of quantum physics to develop computer technology. Quantum Computers hold the promise of powering major advances in fields that require complex calculations, from materials science and pharmaceuticals to aerospace and artificial intelligence (AI).

So far, Quantum Computers have been little more than fancy laboratory experiments, large and expensive, but they have successfully demonstrated that the underlying principles are sound and have the potential to transform industries and accelerate innovation like never before. This has spurred scientific and industrial interest in the nascent field, giving rise to multiple projects across the world in pursuit of a viable, general-use Quantum Computer. That said, it may still be many years before Quantum Computers are commercially and generally available.

So Why Does It Matter Today?

The possibility of Quantum Computers poses a serious challenge to the cryptographic algorithms deployed widely today. Today's key-exchange algorithms, such as RSA and Diffie-Hellman, rely for their security on mathematical problems that are very hard for classical machines, such as prime factorization, which a quantum computer would be able to solve much faster.

For example, it would take a classical computer centuries, or even longer, to break modern algorithms such as DH or RSA-2048 by brute force. However, given the efficiency of quantum machines at calculations such as finding the prime factors of large numbers, a quantum computer may be able to break current asymmetric algorithms in a matter of days.
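To make that asymmetry concrete, here is a toy sketch (an illustration, not from the article) of the brute-force approach a classical attacker is reduced to: trial division. It factors a small semiprime instantly, but the loop length grows with the square root of the modulus, i.e., exponentially in the bit length, which is why a 2048-bit RSA modulus of roughly 617 decimal digits is out of classical reach, whereas Shor's algorithm on a sufficiently large quantum computer would factor it in polynomial time.

```python
# Toy illustration of classical brute-force factoring by trial division.
# Feasible for the small semiprime below; hopeless for a 2048-bit RSA
# modulus, whose smallest prime factor has roughly 308 decimal digits.
def smallest_factor(n: int) -> int:
    """Return the smallest prime factor of n by trial division."""
    if n % 2 == 0:
        return 2
    i = 3
    while i * i <= n:   # loop length ~ sqrt(n): exponential in bit length
        if n % i == 0:
            return i
        i += 2
    return n            # n itself is prime

p, q = 104729, 130003   # two small primes standing in for RSA primes
print(smallest_factor(p * q))   # recovers the smaller factor, 104729
```

Doubling the bit length of the modulus squares the work for this search, which is the intuition behind the "centuries or longer" estimate for RSA-2048.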

So, while the encrypted internet is not at risk at the moment, all a bad actor has to do is capture encrypted data today, including the initial key exchange, and then wait until a powerful enough quantum computer is available to decrypt it. This is a particular problem for organizations that must protect large amounts of sensitive data over the long term, such as banks, governments, and defense agencies.

What Can I Do Now?

For organizations that could be at risk in the future, this is the best time to start evaluating post-quantum cryptography. Simply put, this means moving to algorithms and/or keys that are robust enough to withstand a brute-force attack by a quantum computer, i.e., quantum-resistant.

The National Institute of Standards and Technology (NIST) in the US is leading the effort to standardize post-quantum secure algorithms. Given the lengthy process involved, however, this may take many years to bear fruit.

An alternative is to use Quantum Key Distribution (QKD) with existing algorithms that are considered quantum-safe. This involves using a dedicated optical channel to exchange keys via the quantum properties of photons. Any attempt to tap this channel changes the quantum state of the photons and can be detected immediately, so a key cannot be intercepted without revealing the intrusion. One limitation of this method is the need for a dedicated optical channel that cannot span more than about 50 km between the two terminals. It also means that the existing encryption devices or routers must be capable of ingesting such quantum-generated keys.
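The tamper-evidence property can be illustrated with a classical toy model of the BB84 QKD protocol (a sketch with ordinary random numbers, not any vendor's implementation): when an eavesdropper measures and re-sends each photon, she guesses the wrong basis about half the time, which corrupts roughly 25% of the sifted key bits, so the endpoints detect the tap simply by comparing a sample of their keys.

```python
import random

def bb84_error_rate(n_photons=2000, eavesdrop=False, seed=1):
    """Toy BB84 sketch: the sender encodes bits in random bases, the
    receiver measures in random bases, and rounds with mismatched bases
    are discarded ("sifting"). Returns the sifted-key error rate."""
    rng = random.Random(seed)
    sent, received = [], []
    for _ in range(n_photons):
        bit = rng.randint(0, 1)
        basis_a = rng.randint(0, 1)          # sender's basis
        value, basis = bit, basis_a
        if eavesdrop:
            basis_e = rng.randint(0, 1)      # eavesdropper's guessed basis
            if basis_e != basis:
                value = rng.randint(0, 1)    # wrong basis -> random outcome
            basis = basis_e                  # photon re-sent in Eve's basis
        basis_b = rng.randint(0, 1)          # receiver's basis
        measured = value if basis_b == basis else rng.randint(0, 1)
        if basis_b == basis_a:               # keep only matching-basis rounds
            sent.append(bit)
            received.append(measured)
    errors = sum(a != b for a, b in zip(sent, received))
    return errors / len(sent)

print(bb84_error_rate(eavesdrop=False))      # 0.0: clean channel
print(bb84_error_rate(eavesdrop=True))       # ~0.25: tap is detectable
```

The ~25% error rate is the statistical signature that makes the channel tamper-evident; real QKD systems abort and re-key when the observed error rate exceeds a threshold.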

Post-Quantum Cryptography and Cisco

Cisco is an active contributor to the efforts to standardize post-quantum algorithms. Recognizing that an implementable standard may be some years away, however, work is ongoing to ensure that organizations can implement quantum-resistant encryption techniques in the interim, leveraging existing network devices such as routers, which are most commonly used as encryptors.

To start with, a team of veteran technical leaders and cryptography experts from Cisco US (David McGrew, Scott Fluhrer, and Lionel Florit) and the engineering team in Cisco India (led by Amjad Inamdar and Ramas Rangaswamy) developed an API called the Secure Key Import Protocol (SKIP), through which Cisco routers can securely ingest keys from an external post-quantum key source. This allows existing Cisco routers to be quantum-ready with just the addition of an external QKD system. Going forward, the team is working on a way to deliver quantum-safe encryption keys without the need for short-range point-to-point connections.

The advantage of this method is that organizations can integrate post-quantum key sources with existing networking gear in a modular fashion without the need to replace anything already installed. In this manner, you could create a quantum-ready network for all traffic with minimal effort.

Getting Ready for the Post-Quantum World

Quantum Supremacy is a demonstration that a quantum machine can solve a problem no classical computer can solve in a feasible amount of time. This race has gathered momentum in the recent past, with several companies joining in and some even claiming to have achieved it.

There is an unprecedented amount of attention focused on making a commercially viable quantum computer. Many believe it is inevitable, and only a question of time. When it does happen, the cryptographic techniques used today will become vulnerable. The good news is that there are methods available to adopt strong encryption techniques that will remain secure even after quantum computers are generally available.

If you are an organization that wants to protect its sensitive data over the long term, you should start to evaluate post-quantum secure encryption techniques today. By leveraging existing networking infrastructure and adding suitable post-quantum key distribution techniques, it is possible to take a quantum leap in securing your data.

(The author is Director, Systems Engineering, Cisco India and SAARC and the views expressed in this article are his own)


Quantum Computing in Aerospace and Defense Market Trends and Forecast to 2028 – TechnoWeekly

Quantum Computing in Aerospace and Defense

COVID-19 Industry impact

The market research extensively explores the effect of the COVID-19 outbreak on the Quantum Computing in Aerospace and Defense Market. The lockdowns imposed to contain the spread of the virus disrupted operations and dampened sales across the sector. Demand is expected to recover as pandemic restrictions ease; however, some participants may be forced to leave the sector.

Sample Copy of This Report @ https://www.quincemarketinsights.com/request-sample-29723?utm_source=TW/LY

Features of Key Market Research

Overview of the Market Study:

The market research also applies methods such as PORTER analysis, PEST analysis, and SWOT analysis to provide companies with a quality evaluation. It helps organize and inform companies' investment strategies for a particular business segment in the near future. The review of market attributes, a market overview, the industry chain, historical and future data by category, application, and region, and the competitive landscape are included in this market research. The industry research analyzes the global environment in order to estimate the market's vulnerabilities, assets, opportunities, and risks.

Insights on the Market

The purpose of the market study is to provide evidence, estimates, statistics, historical data, and industry-verified market data, together with the appropriate methodology and evaluation for a full market assessment. The market research also helps in understanding the market structure by evaluating the dynamics of its segments. Market segmentation is based on content, form, end-user, and region.

Segmentation of the Market

This detailed analysis of the Quantum Computing in Aerospace and Defense Market also provides a thorough summary and description of every segment covered. The main segments are benchmarked on their market size, growth rate, and general attractiveness in terms of investment information and incremental value growth. Market segmentation divides a broad customer or business market into sub-groups based on significant common attributes.

Segmented By Component (Hardware, Software, Services), By Application (QKD, Quantum Cryptanalysis, Quantum Sensing, Naval)

Get ToC for the overview of the premium report @ https://www.quincemarketinsights.com/request-toc-29723?utm_source=TW/LY

Regional Estimation:

In terms of different geographies, the Quantum Computing in Aerospace and Defense Market report provides a comprehensive perspective on industry growth over the projected period, including revenue estimates for Asia Pacific (APAC), Europe (EU), North America (NA), Latin America (LATAM), and the Middle East & Africa (MEA).

Business Competitive Background:

The competitive landscape for Quantum Computing in Aerospace and Defense is measured by the number of domestic and foreign players participating in the market. The main focus is on company growth, mergers, acquisitions, and alliances, along with new product creation, as strategies implemented by influential corporations to improve their market presence. D-Wave Systems Inc, Qxbranch LLC, IBM Corporation, Cambridge Quantum Computing Ltd, 1qb Information Technologies Inc., QC Ware Corp., Magiq Technologies Inc., Station Q-Microsoft Corporation, and Rigetti Computing are the prominent market participants examined and profiled in this study.

Highlights of the Market

The market study presents information on key manufacturers in the Quantum Computing in Aerospace and Defense Market and the revenues, profits, recent growth, and market share of key players. In order to evaluate the global and key regions' advantages, potentials, opportunities, constraints, threats, and risks, the report breaks down the data by category, region, business, and application.

By covering all markets, offering quality analysis, and insights to help our customers make the right choices, the market study offers solutions. The latest trends, niche areas, and leading company profiles are included in the study. To provide reliable and useful information, the market research database consists of numerous reports updated on a regular basis.

If You Have Any Query, Ask Our Experts @ https://www.quincemarketinsights.com/enquiry-before-buying/enquiry-before-buying-29723?utm_source=TW/LY

About us:

QMI has the most varied products and services available on the internet for analysis. We provide research from nearly all major publications and periodically update our list to give you instant online access to the world's most extensive and up-to-date set of expert insights into the global economy.

Contact Us:
Quince Market Insights
Office No- A109, Pune, Maharashtra 411028
Phone: APAC +91 706 672 4848 / US +1 208 405 2835 / UK +44 1444 39 0986
Email: [emailprotected]
Web: http://www.quincemarketinsights.com


New York needs to be reimagined with technology and job training – Crain’s New York Business

Our response to Covid-19 offers a similar opportunity. Although there's no doubt we must focus on addressing immediate problems (schools, contact tracing, saving small businesses), we also should put thought into New York's future. Repairing is one thing, but designing a foundation is another. The new street grid, transit reforms and development policies that came out of 9/11 attest to the importance of the latter.

New York leaders should therefore take a few steps to chart the 21st century. In addition to controlling the virus and helping people in need, we must develop a grand strategy that recognizes the economic changes that were already happening before the pandemic, and leverage them in a way that benefits everyone.

Step one: capitalizing on emerging industries. Here the tech sector is a good starting point. Not only will tech companies continue to grow, but tech will also aid and fuel the growth of every other kind of business. The areas we should invest in include cybersecurity, quantum computing, artificial intelligence, transportation and smart manufacturing. Not only are they slated to create many jobs, but they also will increasingly undergird every other industry. A recent study on the projected impact of quantum computing on the New York economy, for instance, found that more than 57,000 new jobs will be generated in this area during the next five years, with that number expected to continue to grow as the technology advances. Policymakers and entrepreneurs need to work together to ensure that momentum keeps moving into the next decade, and create the right business conditions for New York to become an emerging tech hub.

Another way of putting this is reinvention by necessity. With more and more of our lives happening in a virtual world, the safety and efficiency challenges facing organizations have changed. Cyber threats, for example, are now a regular vulnerability for businesses and governments alike. Companies need rapid data processing like never before. Quantum computing and advanced malware detection are crucial for the economy. Not only will emerging tech generate growth, but it will also be a necessary component for the economy of tomorrow.

The next steps are doubling down on workforce development and ensuring that people can actually break into these sectors. Job openings in AI and cybersecurity don't mean much if New Yorkers aren't qualified for them. We therefore need to expand our roster of digital skills programming, which includes computer science in the classroom, boot camps for aspiring coders, and a bevy of private training classes for entrepreneurs and workers. If the tech economy is to be inclusive, we'll need to put as much emphasis on teaching people the requisite skills as we do teaching them arithmetic.

Closing the digital divide is another step. Before Covid-19, we were already spending a lot of time online. In the midst of the pandemic, that trend has been amplified. People now need speedy, affordable internet connections to do their job, go to school, pay bills and get through each day. The fact that there are disparities in internet access is an impediment to the economy and only exacerbates existing inequalities. A strong 5G network throughout the city and state would help solve that issue and ultimately allow workers to take the necessary steps to move into the tech sector.

The good news is we already have parts of the foundation. New York has nearly unlimited investment resources, and state and local leaders have shown their appreciation for what tech can do.

The key is tying all the parts together and creating a new economy that offers opportunities to all.

Lynn McMahon is the managing director of Accenture's metro New York office. Julie Samuels is the executive director of Tech:NYC.


Forum Teratec 2020 Gathered Experts in Simulation, HPC, Big Data and AI – HPCwire

Oct. 19, 2020 — Held in a digital format on October 13 and 14, 2020, given the circumstances of COVID-19, Forum Teratec gathered over 1,200 experts in Simulation, HPC, Big Data and Artificial Intelligence. It brought together industrialists, users, suppliers and political decision-makers around the essential issue of digital technology. As Teratec President Daniel Verwaerde said in his introduction: "This crisis demonstrates the fundamental importance of digital, and especially HPC and HPDA, in our lives and in our economy."

The Forum Teratec 2020 lived up to previous years' editions, welcoming more than 1,200 participants. It virtually brought together major European decision-makers, including European Commissioner Thierry Breton, French Minister of the Armed Forces Florence Parly and many industrialists. More than sixty companies and innovative projects presented their latest results, with the ability for participants to share experiences during business meetings. In addition, six thematic workshops attended by national and international experts provided an opportunity to review the latest technological advances in digital twins in medicine, quantum computing, satellite data and the environment, AI and scientific computing, Cloud computing and HPC, and Exascale.

One strategic stake, both political and economic

In all economic fields, these technologies will be essential, and companies able to master them will be the leaders of tomorrow. Thierry Breton, European Commissioner for the Internal Market, clearly stated: "High-Performance Computing represents a major strategic challenge for Europe, as much industrial and technological as, of course, scientific. It is also one of the pillars of our digital autonomy."

Digital autonomy for European States will require the implementation of a network of supercomputers on their territory for all users in industry, research and the public sector.

The European Commission has identified HPC as one of the key pillars of the digital decade and decided to invest, together with Member States and industry, more than €8 billion in new-generation supercomputers under the EuroHPC Joint Undertaking.

Beyond supercomputers, European sovereignty also depends on Europe's ability to produce processors of world-class caliber, in order to reduce its dependence in this strategic area. Europe is also in the process of bringing together all the players involved (research organizations, small and large enterprises, public authorities) within digital ecosystems capable of mastering the technologies that will guarantee Europe's competitiveness in the global economy.

Key technologies for all economic sectors

For Florence Parly, French Minister of the Armed Forces: "Artificial Intelligence, High-Performance Computing, Quantum Computing and, more generally, breakthrough innovations linked to data are subjects of prime importance for the Ministry of the Armed Forces. They are therefore at the heart of its innovation and investment strategies, with the aim of devoting €1 billion a year to them from 2022."

HPC in the COVID-19 era

During the roundtable discussion "How can digital technology serve health in the age of COVID-19?", major sponsors of the Forum Teratec discussed the contribution of HPC and HPDA to the health sector, with a particular focus, naturally, on the COVID-19 pandemic. They were thus able to demonstrate the value of these technologies in the management of the pandemic and in research for treatments and vaccines.

Innovation is core for the 6th Trophies for Simulation and AI 2020

The 6th Simulation and AI 2020 Trophies, organized with L'Usine Digitale in partnership with Ansys, the CEA, Inria and Smart 4D, rewarded innovative projects or companies that have carried out an outstanding operation in the fields of digital simulation, high-performance computing, Big Data or AI, or their application to healthcare. For each category, the winners are:

Closing the Forum Teratec, Daniel Verwaerde concluded: "The Forum Teratec 2020 has shown the major importance of HPC and HPDA for the management of the health crisis and for industrial recovery. I would like to thank the more than 1,200 participants who made it a remarkable success, and I look forward to seeing them again at Forum Teratec 2021, next June 22 and 23."

https://teratec.eu/forum

Source: Teratec


The Week of October 19, 2020 – FYI: Science Policy News

DOE Selects Reactor Projects for New Demonstration Program

On Oct. 13, the Department of Energy announced awards of $80 million each for two nuclear reactor development projects, funding the first year of new cost-sharing partnerships that aim to demonstrate working prototypes. One of the recipients is TerraPower, a venture backed by Microsoft founder Bill Gates that is developing a reactor design known as Natrium, a sodium-cooled reactor paired with molten salt energy storage that aims to be more economical than traditional nuclear power plants. The other recipient is X-energy, which is developing a reactor called Xe-100 that is cooled by helium gas and fueled by TRISO (TRi-structural ISOtropic) fuel pellets that are designed to make meltdowns impossible and enable refueling without a plant shutdown. Congress created the demonstration program through last year's appropriations legislation and, while the Trump administration has proposed discontinuing the awards, DOE anticipates it will spend a total of $3.2 billion on them over the next seven years if the funding is made available. The department also expects to make smaller awards in December to between two and five reactor development projects for reducing technical risks, and to at least two early-stage reactor concept development projects. Through its Project Pele, the Defense Department is also funding the development of three TRISO-based designs for mobile nuclear reactors, including one proposed by X-energy, and may eventually support one of the projects through to a prototype demonstration.

The Wall Street Journal reported on Oct. 17 that Chinese government representatives have privately warned U.S. officials that Americans in China may be detained in response to recent arrests of scientists with ties to China's military. This summer, the Department of Justice charged three visiting researchers and one graduate student with visa fraud, alleging they lied about their connections to the Chinese military on visa applications. It also charged a visiting researcher with destroying a hard drive, arguing the act interfered with an investigation into a possible transfer of sensitive software to China's National University of Defense Technology. The department did not confirm the threats to the Journal, but stated, "We are aware that the Chinese government has, in other instances, detained American, Canadian, and other individuals without legal basis to retaliate against lawful prosecutions and to exert pressure on their governments, with a callous disregard of the individuals involved." In 2018, China arrested two Canadian citizens shortly after Canada detained the chief financial officer of the telecommunications company Huawei, whom the U.S. had charged with evading sanctions against Iran.

The American Physical Society announced last week it has filed a Freedom of Information Act request with the State Department seeking details on the recent revocation of more than 1,000 visas held by Chinese students and researchers. A May 2020 proclamation by President Trump empowered the department to cancel visas for certain Chinese graduate students and researchers deemed to have current or past ties to an unnamed set of institutions affiliated with the Chinese military. APS states that no administration officials it met with could, or were willing to, provide any details, such as an example of a case of student espionage involving university basic research; the number of students the administration claims have engaged in or are charged with espionage; or an estimate of the impact to the U.S. of the alleged espionage that would form the basis for the proclamation. The FOIA request seeks all internal policy documents related to the proclamation, the names of institutions it applies to, and the names of the U.S. institutions the visa holders were planning to attend, among other details. The request argues, "Lacking any public explanation, the denial of visas will only contribute to the growing view that the United States is unwelcoming to foreigners and thereby diminish the ability of the United States to attract top talent," as APS has seen in its annual survey of international students. (APS is an AIP Member Society.)

The White House published a National Strategy for Critical and Emerging Technologies last week that outlines general steps the U.S. is taking to bolster the National Security Innovation Base and protect its technological advantage, such as fostering public-private partnerships and expanding export controls. The strategy also lists 20 broad types of critical and emerging technologies that are identified as priorities across the government. The list overlaps with the White House's Industries of the Future framework and includes additional items such as energy technologies and chemical, biological, radiological, and nuclear mitigation technologies. In a statement on the strategy, the Commerce Department highlighted its implementation of multilateral export controls on certain emerging technologies pursuant to the Export Control Reform Act of 2018. The latest set, published this month, applies to hybrid additive manufacturing/computer numerically controlled tools; computational lithography software designed for the fabrication of extreme ultraviolet masks; technology for finishing wafers for five nanometer integrated circuit production; digital forensics tools that circumvent authentication or authorization controls on a computer and extract raw data; software for monitoring and analysis of communications and metadata acquired from a telecommunications service provider via a handover interface; and sub-orbital spacecraft.

On Oct. 15, the National Academies announced that its newly established National Science, Technology, and Security Roundtable will be led by MIT Vice President for Research Maria Zuber, former National Intelligence Council Chair John Gannon, and former Nuclear Regulatory Commission Chair Richard Meserve. The roundtable will serve as a forum for representatives of the scientific community, federal science agencies, the intelligence community, and law enforcement officials to discuss concerns and activities related to securing research against exploitation by foreign governments. Congress mandated its creation through the Securing American Science and Technology Act, enacted as part of the National Defense Authorization Act for Fiscal Year 2020. The National Academies has long played a role in advising the government on research security matters, such as through the 1982 Corson report and the 2009 report Beyond Fortress America.

In its quarterly tranche of recommendations released last week, the National Security Commission on Artificial Intelligence proposes a set of broad STEM workforce development initiatives as well as more targeted efforts in microelectronics, quantum computing, and biotechnology. Among its 66 recommendations is a call for Congress to provide the National Science Foundation with $8 billion over five years to fund 25,000 STEM undergraduate scholarships, 5,000 STEM graduate fellowships, and 500 postdoctoral positions. It also proposes creating a National Microelectronics Scholar Program modeled on the Department of Defense's SMART scholarship-for-service program. For quantum computing, the commission recommends providing researchers with access to quantum computers through a national cloud computing infrastructure and incentivizing domestic manufacturing of component materials through tax credits and loan guarantees. The commission also calls for the White House to create a Technology Competitiveness Council chaired by the vice president to focus government attention on technological innovation.

Among the 97 recommendations released last week by the House Select Committee on the Modernization of Congress is a proposal to reconstitute the long-defunct Office of Technology Assessment as a Congressional Technology and Innovation Lab. The committee explains the new entity would go beyond the mandate of the original OTA by proactively studying and testing new technologies rather than waiting for directives to study them. It adds that the lab would employ nonpartisan experts, visiting professors, and graduate students to provide fresh perspectives to members of Congress and their staff. In recent years, there has been a renewed push within Congress to revive OTA, though House appropriators backed away from the idea this year, instead favoring continued expansion of the Government Accountability Office's Science, Technology Assessment, and Analytics team.

The United Kingdom-based scientific journal Nature officially endorsed Democratic presidential candidate Joe Biden on Oct. 14. Having previously published a news article reviewing ways President Trump has damaged science, the journal's editors further evaluate Trump's record on issues connected to science and criticize his divisive approach to politics more generally. They argue Biden would chart a starkly different course on matters such as the pandemic, climate change, environmental regulation, and immigration, and urge, "Joe Biden must be given an opportunity to restore trust in truth, in evidence, in science and in other institutions of democracy, heal a divided nation, and begin the urgent task of rebuilding the United States' reputation in the world." While some scientific publications have broken longstanding positions of neutrality to weigh in on this year's election, Nature previously backed Hillary Clinton in 2016, when it referred to Trump as a demagogue not fit for high office, and in 2008 it issued a more measured endorsement of Barack Obama.

More than 1,000 current and former officers of the Centers for Disease Control and Prevention's Epidemiology Intelligence Service fellowship program signed a letter published this month that protests the ominous politicization and silencing of the agency. Representing more than a quarter of the people who have participated in the program throughout its nearly 70-year history, the letter adds to the mounting criticism of how the Trump administration has sought control over CDC's pandemic-response efforts. This past week, the Associated Press reported that in June the Trump administration assigned two appointees to the agency's headquarters tasked with keeping an eye on CDC Director Robert Redfield, according to a half-dozen CDC and administration officials. The assignment was made during the same period that the chief spokesperson and a science adviser at the agency's parent department sought to exert control over CDC messaging and scientific products. Both individuals departed the department last month under a cloud of scandal.

During her nomination hearing last week to fill the Supreme Court vacancy left by the death of Justice Ruth Bader Ginsburg, Amy Coney Barrett declined to explain her personal views on climate change when pressed by Democratic senators. In one exchange, vice presidential candidate Sen. Kamala Harris (D-CA) asked Barrett if she believes smoking causes cancer and whether the coronavirus is infectious before asking if she believes climate change is occurring. Barrett agreed that the coronavirus is infectious and that smoking causes cancer, but declined to provide a direct response on climate change, stating, "I will not express a view on a matter of public policy, especially one that is politically controversial, because that's inconsistent with the judicial role." Harris observed that Barrett's appointment to the court could have implications for climate policy, noting Justice Ginsburg voted in favor of the landmark 5-to-4 Massachusetts v. EPA case, which enabled the government to regulate greenhouse gases under the Clean Air Act.

Physicists Propose New Field of Study Related to Coherent Ising Machine – Business Wire

PALO ALTO, Calif.--(BUSINESS WIRE)--NTT Research, Inc., a division of NTT (TYO:9432), today announced that Dr. Yoshihisa Yamamoto, the Director of its Physics and Informatics (PHI) Lab, along with colleagues at several academic institutions, has proposed an interdisciplinary research agenda that amounts to a new field of academic study. Their proposal, which arises in the course of addressing a fundamental research problem, appears in an article titled "Coherent Ising Machines: Quantum optics and neural network perspectives," published as a Perspectives cover article in Applied Physics Letters (APL) 117 (16) (2020). The collaborating authors from Stanford University are Drs. Surya Ganguli and Hideo Mabuchi, associate professor and professor, respectively, of applied physics in the School of Humanities and Sciences at Stanford University.

A Coherent Ising Machine (CIM) is a special-purpose processor designed to address particularly difficult types of problems that can be mapped to an Ising model, such as combinatorial optimization problems. The Ising model, named after the physicist Ernst Ising, consists of variables that represent interacting spins, i.e., forms of a fundamental particle's angular momentum. A CIM is actually a network of optical parametric oscillators (OPOs) and solves problems by finding the spin configuration that minimizes a problem's Ising energy function. (Here is a visualization from MIT's Lincoln Laboratory of how a CIM resolves the textbook combinatorial optimization problem of the traveling salesperson; potential current applications range from logistics to medicine to machine learning and beyond.) One condition for the optimal spin state is that it occur well above the lasing threshold, the point at which optical gain of the laser is balanced against its losses. A basic problem of the CIM, however, is that when the laser pump rate is increased from below to above threshold, the machine may be prevented from relaxing to the true ground state, for reasons related to the behavior of eigenvectors with minimum values. This article explores two approaches to that problem. The first involves coherent spreading over local minima via quantum noise correlation; the second, implementing real-time error correction feedback. In their discussion of these approaches, the authors offer various perspectives based on a range of interdisciplinary viewpoints that span quantum optics, neural networks and message passing.
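To make the Ising energy function concrete, here is a minimal Python sketch (the 4-spin coupling matrix is invented for illustration) that brute-forces the minimum-energy spin configuration, i.e. the same answer a CIM seeks by physical relaxation:

```python
from itertools import product

# Hypothetical 4-spin Ising instance: J[i][j] is the coupling between
# spins i and j (symmetric, zero diagonal). Spins take values +1 or -1.
J = [
    [0,  1, -1,  0],
    [1,  0,  1, -1],
    [-1, 1,  0,  1],
    [0, -1,  1,  0],
]

def ising_energy(spins):
    """Ising energy H = -sum over pairs i<j of J[i][j] * s_i * s_j."""
    n = len(spins)
    return -sum(J[i][j] * spins[i] * spins[j]
                for i in range(n) for j in range(i + 1, n))

# Brute-force search over all 2**n spin configurations -- the search a
# CIM performs physically by settling into a minimum-energy state.
best = min(product([-1, 1], repeat=4), key=ising_energy)
print(best, ising_energy(best))
```

The exhaustive loop is exponential in the number of spins, which is exactly why special-purpose hardware like the CIM is interesting for large instances.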

"Along the way," write the co-authors in the article, "we will touch upon connections between the CIM and foundational concepts spanning the fields of statistical physics, mathematics and computer science, including dynamical systems theory, bifurcation theory, chaos, spin glasses, belief propagation and survey propagation."

One reason for engaging in a cross-pollination of ideas across classical, quantum and neural approaches to combinatorial optimization is that, to date, CIM studies could be characterized as primarily experimentally driven. Large-scale measurement-feedback coupling coherent Ising machine (MFB-CIM) prototypes constructed by NTT Basic Research Laboratories "are reaching levels of computational performance that, in a fundamental sense, we do not really understand," write the authors. That situation stands in marked contrast to that of mainstream quantum computing, in which laboratory efforts have lagged behind theoretical analyses.

"We look forward to accelerated advancement of learning in both the theoretical and experimental studies of CIMs," said Dr. Yoshihisa Yamamoto, director of the PHI Lab at NTT Research and one of the article's co-authors. "Although there is no well-defined method for launching a new academic field of study, we see many rich possibilities for future interdisciplinary research, focused around a multifaceted theoretical and experimental approach to combinatorial optimization that unites perspectives from statistics, computer science, statistical physics and quantum optics, and we are grateful to the editors of APL for providing a forum from which to launch this proposal."

A publication of AIP Publishing, a wholly owned, not-for-profit subsidiary of the American Institute of Physics (AIP), APL features concise, up-to-date reports on significant new findings in applied physics. "Perspectives are a new invitation-only article type for the journal, seeking personal views and scientific directions from experts in the field," said APL Editor-in-Chief Lesley F. Cohen. "We are absolutely delighted that Dr. Yamamoto and his colleagues accepted our invitation to produce their fascinating and timely Perspective article on this emerging and important topic."

The NTT Research PHI Lab has itself already cast a wide net, as part of its long-range goal to radically redesign artificial computers, both classical and quantum. It has established joint research agreements with seven universities, one government agency and one quantum computing software company, covering a wide range of topics. Those universities are the California Institute of Technology (Caltech), Cornell University, Massachusetts Institute of Technology (MIT), Notre Dame University, Stanford University, Swinburne University of Technology and the University of Michigan. The government entity is NASA Ames Research Center in Silicon Valley, and the private company is 1QBit.

About NTT Research

NTT Research opened its Palo Alto offices in July 2019 as a new Silicon Valley startup to conduct basic research and advance technologies that promote positive change for humankind. Currently, three labs are housed at NTT Research: the Physics and Informatics (PHI) Lab, the Cryptography and Information Security (CIS) Lab, and the Medical and Health Informatics (MEI) Lab. The organization aims to upgrade reality in three areas: 1) quantum information, neuroscience and photonics; 2) cryptography and information security; and 3) medical and health informatics. NTT Research is part of NTT, a global technology and business solutions provider with an annual R&D budget of $3.6 billion.

NTT and the NTT logo are registered trademarks or trademarks of NIPPON TELEGRAPH AND TELEPHONE CORPORATION and/or its affiliates. All other referenced product names are trademarks of their respective owners. © 2020 NIPPON TELEGRAPH AND TELEPHONE CORPORATION

AutoML Alleviates the Process of Machine Learning Analysis – Analytics Insight

Machine Learning (ML) is being adopted by diverse organizations eager to acquire answers and analysis. As adoption increases, it is often forgotten that machine learning has flaws that need to be addressed to arrive at a sound solution.

Applications of artificial intelligence and machine learning are using new tools to find practical answers to difficult problems. Companies move forward with the emerging technologies to get a competitive edge in their working style and systems. Through the process, organizations are learning a very important lesson: one strategy doesn't fit all. Business organizations want machine learning to analyse large data, which is complex and difficult. They neglect the fact that machine learning can't perform well on diverse data storage, and even if it does, it may conclude with a wrong prediction.

Analysing unstructured and overwhelmingly large datasets with machine learning is risky. Machine learning might arrive at a wrong solution while performing predictive analysis on such data, and implementing that misconception in a company's working systems might drag down its performance. Many products that incorporate machine learning capabilities use predetermined algorithms and many diverse ways to handle data. However, each organization's data has different technical characteristics that might not go well with the existing machine learning configuration.

To address the problems where machine learning falls short, AutoML tackles a company's data analysis head-on. AutoML takes over the labour-intensive job of choosing and tuning machine learning models. The technology takes on many repetitive tasks where skilful problem definition and data preparation are needed, reducing the need to understand algorithm parameters and shortening the compute time needed to produce better models.

Machine learning is an application of artificial intelligence that provides systems with the ability to automatically learn and improve from experience without being explicitly programmed. The technology focuses on the development of computer programs that can access data and use it for themselves. A model is created and trained on a set of previously gathered data, often known as outcomes, and the model can then be used to make predictions from new data.
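The train-then-predict loop just described can be sketched in a few lines of Python; the data points and the single-feature least-squares model below are invented purely for illustration:

```python
# Toy training data (hypothetical past outcomes): hours studied -> exam score.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [52.0, 55.0, 61.0, 64.0, 68.0]

# "Training": fit y = a*x + b by ordinary least squares, computed directly.
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_y - a * mean_x

# "Prediction": apply the trained model to unseen input.
def predict(x):
    return a * x + b

print(round(predict(6.0), 1))
```

The "learning" here is nothing mysterious: the slope and intercept are simply chosen to fit the previously gathered data, and the resulting function is then reused on inputs it has never seen.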

However, machine learning can't get accurate results all the time. It depends on the data scientist handling the machine learning configurations and data inputs. A data scientist studies the input data and understands the desired output to solve business problems. They choose an apt mathematical algorithm from among dozens, tune its parameters, called hyperparameters, and evaluate the resulting models. The data scientist has the responsibility to adjust the algorithm's tuning parameters again and again until the machine learning model produces the desired result. If the results are not satisfactory, the data scientist might even start over from the very beginning.

Machine learning systems struggle to function when the data is too large or unorganised. Some other machine learning issues are:

Classification: The process of labeling data can be thought of as a discrimination problem, modeling the similarities between groups.

Regression: Machine learning struggles to predict the value of new, unseen data.

Clustering: Data can be divided into groups based on similarity and other measures of natural structure in the data, but human hands are needed to assign names to the groups.

As mentioned earlier, machine learning alone can't address an organisation's datasets to find predictions. Here are some reasons why tuning a machine learning algorithm is challenging, and how AutoML can prove useful in such instances.

Choosing the right algorithm: It is not always obvious which algorithm might work well for building real-value prediction, anomaly detection and classification models for a particular data set. Data scientists have to go through many well-known machine learning algorithms that could suit the real-world situation. It could take weeks or even months to come up with the right one.

Selecting relevant information: Data storage has diverse data variables or predictors. Hence, it is hard to tell which of those data points are significant for making a decision. This process of selecting relevant information to include in data models is called feature selection.

Training machine learning models: The most difficult process in machine learning is to choose a subset of data that can be used for training a machine learning model. In some cases, training against some data variables or predictors can increase training time while actually reducing the accuracy of the ML model.

Automated machine learning (AutoML) basically involves automating the end-to-end process of applying machine learning to real-world problems that are actually relevant in the industry. AutoML makes well-educated guesses to select a suitable ML algorithm and effective initial hyperparameters. The technology tests the accuracy of training the chosen algorithms with those parameters, makes tiny adjustments, and tests the results again. AutoML also automates the creation of small, accurate subsets of data to use for those iterative refinements, yielding excellent results in a fraction of the time.
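A drastically simplified sketch of that propose-evaluate-refine loop is shown below; the search space and the mock evaluate function are hypothetical stand-ins for real model training and cross-validation:

```python
import random

random.seed(0)

# Hypothetical search space: two candidate algorithms, each with one
# integer hyperparameter range. Real AutoML systems search far larger spaces.
search_space = {
    "knn": range(1, 20),    # e.g. number of neighbours
    "tree": range(2, 12),   # e.g. maximum tree depth
}

def evaluate(algorithm, value):
    # Stand-in for the expensive train + validate step: returns a mock
    # accuracy that peaks at an arbitrary "best" setting per algorithm.
    target = {"knn": 7, "tree": 5}[algorithm]
    return 1.0 - abs(value - target) / 20.0

# The AutoML loop in miniature: propose candidates, score them, keep the best.
candidates = [(algo, random.choice(list(rng)))
              for algo, rng in search_space.items() for _ in range(10)]
best_algo, best_value = max(candidates, key=lambda c: evaluate(*c))
print(best_algo, best_value, evaluate(best_algo, best_value))
```

In a real system the evaluate call trains a model and measures validation accuracy, which is why automating and economising these iterations saves so much data-scientist time.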

In a nutshell, AutoML acts as the right tool to quickly choose, build and deploy machine learning models that deliver accurate results.

Synopsys and SiMa.ai Collaborate to Bring Machine Learning Inference at Scale to the Embedded Edge – AiThority

Engagement Leverages Synopsys DesignWare IP, Verification Continuum, and Fusion Design Solutions to Accelerate Development of SiMa.ai MLSoC Platform

Synopsys, Inc. announced its collaboration with SiMa.ai to bring machine learning inference at scale to the embedded edge. Through this engagement, SiMa.ai has adopted key products from Synopsys DesignWare IP, the Verification Continuum Platform, and the Fusion Design Platform for the development of its MLSoC, a purpose-built machine-learning platform targeted at specialized computer vision applications such as autonomous driving, surveillance, and robotics.

SiMa.ai selected Synopsys due to its expertise in functional safety, complete set of proven solutions and models, and silicon-proven IP portfolio that will help SiMa.ai deliver high-performance computing at the lowest power. With Synopsys automotive-grade solutions, SiMa.ai can accelerate their SoC-level ISO 26262 functional safety assessments and qualification while achieving their target ASILs.

"Working closely with top-tier customers, we have developed a software-centric architecture that delivers high-performance machine learning at the lowest power. Our purpose-built, highly integrated MLSoC supports legacy compute along with industry-leading machine learning to deliver more than 30x better compute-power efficiency compared to industry alternatives," said Krishna Rangasayee, founder and CEO at SiMa.ai. "We are delighted to collaborate with Synopsys towards our common goal to bring high-performance machine learning to the embedded edge. Leveraging Synopsys' industry-leading portfolio of IP, verification, and design platforms enables us to reduce development risk and accelerate the design and verification process."

"We are pleased to support SiMa.ai as it brings its MLSoC chip to market," said Manoj Gandhi, general manager of the Verification Group at Synopsys. "Our collaboration aims to address SiMa.ai's mission to enable customers to build low-power, high-performance machine learning solutions at the embedded edge across a diverse set of industries."

Since SiMa.ai's inception, it has strategically collaborated with Synopsys to support all aspects of its MLSoC architecture design and verification.

Quantum Computing Market 2020 | Outlook, Growth By Top Companies, Regions, Types, Applications, Drivers, Trends & Forecasts by 2025 – PRnews…

Market Study Report, LLC, has added a research study on Quantum Computing market which delivers a concise outline of the market share, market size, revenue estimation, geographical outlook and SWOT analysis of the business. The report further offers key insights based on growth opportunities and challenges as experienced by leaders of this industry, while evaluating their present standing in the market and growth strategies.

The new Quantum Computing market research report presents a granular analysis of the business outlook and also covers the world market overview. It throws light on various market segmentations based on product type, application spectrum, well-established companies, and regions.

Request a sample report of the Quantum Computing Market at: https://www.marketstudyreport.com/request-a-sample/2855012?utm_source=prnewsleader.com&utm_medium=SK

Additionally, the document analyses the impact of COVID-19 on the market growth.

Key features of Quantum Computing market report:

Regional Analysis of Quantum Computing market:

Quantum Computing Market Segmentation: Americas, APAC, Europe, Middle East & Africa

Overview of the regional terrain of Quantum Computing market:

Product types and application scope of Quantum Computing market:

Product landscape:

Product types: Hardware, Software and Cloud Service

Key factors enclosed in the report:

Ask for a discount on the Quantum Computing Market report at: https://www.marketstudyreport.com/check-for-discount/2855012?utm_source=prnewsleader.com&utm_medium=SK

Application Landscape:

Application segmentation: Medical, Chemistry, Transportation, Manufacturing and Others

Details stated in the report:

Other details specified in the report:

Competitive spectrum of the Quantum Computing market:

Competitive landscape of Quantum Computing market: D-Wave Solutions, IBM, Microsoft, Rigetti Computing, Google, Anyon Systems Inc., Intel, Cambridge Quantum Computing Limited and Origin Quantum Computing Technology

Major features as per the report:

For More Details On this Report: https://www.marketstudyreport.com/reports/global-quantum-computing-market-growth-status-and-outlook-2020-2025

Contact Us: Corporate Sales, Market Study Report LLC. Phone: 1-302-273-0910. Toll Free: 1-866-764-2150. Email: [emailprotected]

What is an algorithm? How computers know what to do with data – The Conversation US

The world of computing is full of buzzwords: AI, supercomputers, machine learning, the cloud, quantum computing and more. One word in particular is used throughout computing: algorithm.

In the most general sense, an algorithm is a series of instructions telling a computer how to transform a set of facts about the world into useful information. The facts are data, and the useful information is knowledge for people, instructions for machines or input for yet another algorithm. There are many common examples of algorithms, from sorting sets of numbers to finding routes through maps to displaying information on a screen.

To get a feel for the concept of algorithms, think about getting dressed in the morning. Few people give it a second thought. But how would you write down your process or tell a 5-year-old your approach? Answering these questions in a detailed way yields an algorithm.

To a computer, input is the information needed to make decisions.

When you get dressed in the morning, what information do you need? First and foremost, you need to know what clothes are available to you in your closet. Then you might consider what the temperature is, what the weather forecast is for the day, what season it is and maybe some personal preferences.

All of this can be represented in data, which is essentially simple collections of numbers or words. For example, temperature is a number, and a weather forecast might be "rainy" or "sunshine."

Next comes the heart of an algorithm: computation. Computations involve arithmetic, decision-making and repetition.

So, how does this apply to getting dressed? You make decisions by doing some math on those input quantities. Whether you put on a jacket might depend on the temperature, and which jacket you choose might depend on the forecast. To a computer, part of our getting-dressed algorithm would look like "if it is below 50 degrees and it is raining, then pick the rain jacket and a long-sleeved shirt to wear underneath it."

After picking your clothes, you then need to put them on. This is a key part of our algorithm. To a computer, a repetition can be expressed like "for each piece of clothing, put it on."

Finally, the last step of an algorithm is output: expressing the answer. To a computer, output is usually more data, just like input. It allows computers to string algorithms together in complex fashions to produce more algorithms. However, output can also involve presenting information, for example putting words on a screen, producing auditory cues or some other form of communication.
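Put together, the getting-dressed example maps directly onto input, computation, repetition and output. A hypothetical Python rendering, with the wardrobe and weather data invented for illustration:

```python
# Input: hypothetical weather data for the morning.
temperature = 45        # degrees
forecast = "rainy"

# Computation: decide what to wear, mirroring the rule quoted in the text.
outfit = ["jeans"]
if temperature < 50 and forecast == "rainy":
    outfit += ["rain jacket", "long-sleeved shirt"]
else:
    outfit += ["t-shirt"]

# Repetition: "for each piece of clothing, put it on."
for item in outfit:
    print("putting on:", item)

# Output: the answer, expressed as data that could feed another algorithm.
print("dressed in:", outfit)
```

Change the inputs (say, a 70-degree sunny day) and the same algorithm produces a different outfit, which is exactly the input-computation-output pattern described above.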

So after getting dressed you step out into the world, ready for the elements and the gazes of the people around you. Maybe you even take a selfie and put it on Instagram to strut your stuff.

Sometimes it's too complicated to spell out a decision-making process. A special category of algorithms, machine learning algorithms, try to learn based on a set of past decision-making examples. Machine learning is commonplace for things like recommendations, predictions and looking up information.

For our getting-dressed example, a machine learning algorithm would be the equivalent of your remembering past decisions about what to wear, knowing how comfortable you feel wearing each item, and maybe which selfies got the most likes, and using that information to make better choices.

So, an algorithm is the process a computer uses to transform input data into output data. A simple concept, and yet every piece of technology that you touch involves many algorithms. Maybe the next time you grab your phone, see a Hollywood movie or check your email, you can ponder what sort of complex set of algorithms is behind the scenes.

Most Read articles – LED drivers, Foundry market, Arm staffing – Electronics Weekly

What areas are covered? There are Nexperia LED drivers, Fujitsu quantum computing, ST's acquisition of SOMOS Semiconductor, China's share of the foundry market, and the issue of Arm being legally required to hire more staff.

5. Nexperia launches LED drivers in compact package
Nexperia has brought out a range of LED drivers in the DFN2020D-6 (SOT1118D) package. This case style features side-wettable flanks (SWF), which facilitate the use of AOI (automated optical inspection) and improve reliability. This is the first time LED drivers have been available in this package. The leadless devices join Nexperia's wide range of LED drivers in leaded packages, offering equivalent performance while reducing PCB space by up to 90% compared to SOT223.

4. Fujitsu collaborates to make practical quantum computing a reality
Fujitsu has joined with Riken and the universities of Tokyo, Osaka and Delft to make practical quantum computing a reality. The collaboration aims to achieve comprehensive and efficient advances by applying quantum computing to various fields currently facing problems that are extremely difficult to solve. Currently, even using superconducting chips, which are leading the way in quantum computing, systems remain limited to about 50 qubits, making it hard to perform useful calculations.

Quantum computing | Ground-breaking commercial opportunities from solving the (as yet) unsolvable – Lexology

Quantum computers will support powerful commercial applications by solving currently intractable problems. We consider the pathway to bringing this technology to market, and some of the related legal issues.

Even with the biggest supercomputers in the world, some problems remain too complex to unravel.

This is not just an academic issue: many valuable commercial applications can be imagined but not delivered, or can currently only be delivered in a limited, constrained way. This includes:

As our guest speaker Michael Beverland of Microsoft Quantum explained at our recent webinar, these are the kinds of problem which quantum computers are expected to be able to resolve. The ability to solve "intractable" problems is expected to be transformative in ways we can currently only guess at. From a legal and policy perspective, quantum computing is already considered to be a strategic technology in many jurisdictions, potentially subject to export controls and with increasing scrutiny around investments and acquisitions concerning businesses in this field.

Mathematicians have already proved that quantum computing will be able to outperform classical computing in relation to cybersecurity. It will be possible to break currently secure forms of encryption such as RSA with a sufficiently powerful quantum computer. We'll be considering this issue and the resulting legal risks at a further webinar.

The power of qubits

Quantum computing is different to classical computing in every sense. It rethinks computer processing by structuring it around the utterly different physics that applies at the sub-atomic level: quantum mechanics. Classical computers rely on processing bits in two alternative states: one or zero. However large and sophisticated the machine, each bit represents a single state and processing happens in a linear fashion, one task at a time.

Put very simply, quantum bits, or "qubits", work differently: not only can an infinite number of positions be represented on a single qubit, but an individual qubit interacts with all other qubits in the system. This means that their states can be understood simultaneously, rather than through the sequential approach of the single-state bits of classical computing. With a sufficiently large quantum computer, all elements of a problem can be represented and processed at the same time.

As the number of qubits in a single system increases, its processing power increases exponentially. So a 20-qubit machine is loosely as powerful as a smartphone; a 30-qubit machine is comparable to a laptop; and a 50-qubit machine (roughly the point which quantum hardware development has currently reached) is equal to the world's most powerful supercomputers.
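The exponential claim is easy to check from the other direction: an n-qubit state is described by 2^n complex amplitudes, so each added qubit doubles what a classical machine must store to simulate it. A quick back-of-the-envelope in Python (the 16-bytes-per-amplitude figure assumes double-precision complex numbers):

```python
# 2**n complex amplitudes are needed to describe an n-qubit state exactly.
for n in (20, 30, 50):
    print(f"{n} qubits -> {2**n:,} amplitudes")

# At 16 bytes per double-precision complex amplitude, simulating 50 qubits
# exactly would already require roughly 18 petabytes of memory.
bytes_needed = 2**50 * 16
print(f"~{bytes_needed / 1e15:.0f} PB")
```

This doubling per qubit is why classical simulation runs out of road at around 50 qubits, the crossover point mentioned above.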

The aspiration of quantum computing researchers is to build far greater quantum computers which can unravel currently intractable problems. This is not just a matter of increasing the qubits: a further challenge is the currently high error rate in outputs. As this is addressed, the useable power of a quantum computer will increase.

The timeframe for change

Forecasting when quantum systems will reach the point where they can solve these complex problems depends on the problem. Not all intractable problems are equal. As Michael explained in our webinar, modelling certain molecules, for example, is likely to be achieved some time before RSA 2048 encryption will be broken. Views differ, but some predictions put the advent of game-changing machines as close as five years away.

Systems that harness quantum mechanics in less complex ways are also being developed and are already available commercially: quantum annealing machines are one such example, which can be used for addressing optimisation problems. Even if scalable quantum computing is not yet with us, many businesses are already involved or investing in pathway projects to expand ideas about what can be achieved, and to build understanding of how to deliver those ideas.

Bringing the technology to market

Developments in relation to hardware tend to get the most publicity, but extensive and difficult work is also needed to create the full software stack. This technology promises to be a whole new ecosystem within the tech sector. Michael commented that this multidisciplinary research requires physicists, mathematicians, computer scientists, and various engineering specialisms; and that significant problems remain in each field.

The complexity of the hardware is one factor driving the expectation that the commercial delivery of quantum computing will follow the cloud services model: quantum computing "as a Service". This expectation is reinforced by the ubiquitous shift into cloud-based delivery for processing of all types.

Liability frameworks

How will "Quantum-as-a-Service" develop? Cloud services contracts for full stack applications are well established. Granting "Beta" access to software and systems which are still in development is also common. Nevertheless, the particularities of quantum computing may require rethinking the contractual frameworks for suppliers and users of this technology.

For example, liability under English law requires foreseeability of harm. Exclusion clauses seek to deal with losses that flowed naturally from the breach of contract and were in the contemplation of the parties. The complex and "spooky" interactions of the qubits at the heart of a quantum system will remain considerably less stable, and potentially also less predictable, than the robust reliability of classical systems for some time to come. Such complexity may mean that it is much more difficult for the parties to anticipate the difficulties which could arise. As commercial access to these systems develops, it may well be that a new approach is needed to framing and allocating liability.

It may also be that there is greater scope for negotiating bespoke cloud-based access than would usually be the case with the major cloud service providers. Smaller scale access to the technology at lower contract values may well be on standard terms and conditions, with limited flexibility. But strategic partnerships may well be bespoke arrangements, particularly in the early years and particularly for projects which feed into greater understanding of how to harness the power of quantum systems, and which develop and expand understanding of commercial applications.

Service levels

Similarly, although overall performance is expected to be higher, the lower stability and higher error rate (compared to classical systems) of quantum computing may also mean that the parameters for measuring service levels need to be re-thought. The ways of proving the level of service actually delivered to the service user may equally require a fresh approach given how quantum computing solutions operate.

Quantum computing is currently in its early, pioneering phase, similar to classical computing in the middle of the last century, before silicon or miniaturisation and before anyone had any concept of how the internet, smartphones or machine learning would change how we operate. We are looking forward to working with our clients to realise and deliver this exciting new era of computing.

If you'd like to discuss the issues above further, please don't hesitate to contact the authors or your usual Osborne Clarke contact.

Inside Quantum Technology Europe Virtual Conference Looks to the Future of Quantum Computing, Networking, Sensors and Cryptography, October 26-30,…

NEW YORK, Oct. 14, 2020 /PRNewswire/ -- 3DR Holdings today announced details regarding the second annual edition of Inside Quantum Technology Europe, the premier conference dedicated to the business of quantum computing, quantum networking, quantum sensors, and quantum technology.

Following its 2019 European debut in The Hague, this year's Inside Quantum Technology event will be Europe's largest online quantum technology event, featuring five days of presentations that will run for four hours daily, with archived sessions available to all registrants through the end of November.

Launching on October 26 with exhibitor presentations and virtual networking, each day is vertically focused on Quantum Computing (10/27), Quantum Computing Software and Applications (10/28), Quantum Communications (10/29), and Quantum Sensors, Quantum Policy and Quantum Investments (10/30).

In addition to sessions led by leaders from the worlds of research, academia, finance, pharma and technology, Inside Quantum Technology Europe features presentations from top executives and technologists at the industry's pioneering companies, including QuTech, D-Wave, QC Ware, Cambridge Quantum Computing and Riverlane.

Further, attendees will have the opportunity to learn about quantum initiatives from innovators at the world's leading corporations, including:

"We've come a long way since we held the first ever quantum technology conference in Boston two years ago, and as quantum development continues at a rapid pace our event will focus on insight from end-users, technology firms and policy makers that are making quantum technology a reality," said Lawrence Gasman, President of Inside Quantum Technology. "And as the conference features speakers from European multinationals and European-based quantum startups, attendees will have the opportunity to learn about quantum development on a global scale."

Inside Quantum Technology Europe conference session topics include:

For additional details about Inside Quantum Technology Europe, including the complete agenda, registration information, sponsorship and exhibition options, please visit https://europe.iqtevent.com.

About 3DR Holdings

3DR Holdings is a technology media organization with website, research and international trade show interests in the fields of 3D Printing and Quantum Technology. For more information, please visit https://3drholdings.com.

About Inside Quantum Technology

Founded by Lawrence Gasman and Alan Meckler, Inside Quantum Technology is the first company entirely dedicated to meeting the strategic information and analysis needs of the emerging quantum technology sector. In addition to arranging conferences and publishing articles of critical importance to the quantum technology sector, the company's consulting group provides published reports on important revenue opportunities in quantum technology, including quantum computer markets and software, quantum key distribution, post-quantum cryptography, and quantum sensors, and on important verticals such as the military, the financial sector, big pharma, and more. For additional information, please visit https://www.insidequantumtechnology.com.

Media Contact: Barry Schwartz, Schwartz Public Relations, [emailprotected], 212-677-8700 ext. 118

SOURCE 3DR Holdings


Menlo Micro, a startup bringing semiconductor tech to the humble switch, is ready for its closeup – TechCrunch

Sixteen years ago a group of material scientists and engineers at General Electric banded together to reinvent the circuit breaker. Now, Menlo Microsystems, the spin-off commercializing that technology, is ready to bring its revolutionary new switches to market, with huge implications for everything from 5G technologies to quantum computing.

Based in Irvine, California, Menlo Micro takes its name from the Menlo Park, New Jersey, research lab where Thomas Edison patented the first light switch back in 1893, and the company's ties to GE run deep.

Researchers at GE spent more than a decade working internally on Menlo Micros core technology, a novel process that applies semiconductor manufacturing techniques to the production of micro electro-mechanical systems, before spinning it out into a new business five years ago.

Using a novel alloy, Menlo Micro is able to reduce the size of the switches it makes to 50 microns by 50 microns, or roughly the width of a human hair. This miniaturization can enable hardware manufacturers to come up with completely new designs for a host of products that used to require much larger components.

"The micro-electro-mechanical system that we use to make this, that's not new," said Russ Garcia, the company's chief executive. "The problem was the first-level innovation: how do you take a mechanical switch like the light switch or a relay and scale that down to a wafer?"

Many companies have tried to make MEMS contact switches, spending hundreds of millions of dollars, but Garcia said that the reliability and durability of the switches was always an issue. The material science behind Menlo's switches solves the problem, he said.

Menlo's switches pack lots and lots of MEMS relays onto a single chip that can function like a massive mechanical relay, reducing something that was the size of a fist to something the size of a microchip.

The company's founders think the potential uses are pretty limitless, thanks to the massive size reduction and increased durability that its switches offer.

Closeup of a Menlo Micro switch. Image Credit: Menlo Micro

"One way to look at this is in edge and IoT applications," said company co-founder Chris Giovanniello, a former vice president of business development at GE Ventures and Menlo Micro's current senior vice president of worldwide marketing. "What we tend to think about, and what most of the industry thinks about, is low-energy Bluetooth and Wi-Fi and low-power processing for decision making. Once you've sensed it, communicated it and made a decision, you have to do something about it."

Initially, Menlo Micro spun out from GE with Giovanniello and co-founders including chief technology officer Chris Keimel and Jeff Baloun, the senior vice president of operations. Garcia, who saw the company's initial pitch at a semiconductor conference where GE was touting the technology, was brought on board by one of Menlo Micro's early investors, Paladin Capital Group.

"Paul Conley of Paladin Capital sent me this deck and said, 'Wow, there might be something there.' We met Chris and then met up with the other Chris; they wanted me to help out with strategy," Garcia said. He wound up coming on board as a founding executive.

Current solid-state technologies tasked with making something happen based on the data use more power than the rest of the systems that they're tied to. Menlo Micro's chips would substantially reduce energy loss and improve the efficiency of entire systems, he said.

"If you think of the light switch in your house, it's two metal contacts that come together. If that contact is really good and clean, the electricity flows through very efficiently, and when you turn it off no electricity can flow through and [nothing] happens at all," said Garcia, a longtime executive in the MEMS industry. "In a semiconductor, there's loss in that contact. When you turn a transistor on, it allows the energy to flow through but loses some of that energy as heat, and when you turn it off it still allows some of that energy to flow through. When you take the billions of switches, all of that incremental energy is completely lost."

The benefits of the technology mean demand from the defense industry, which wants to put the new switches in radar, radio and satellite networks. Commercial applications include Wi-Fi connectivity, 5G cell networks, and radio-frequency and microwave switching. Consumers could see the switches in cell phones, meaning fewer dropped calls, higher speeds and capacity for data, and longer battery life.

Menlo has already sent samples from its production line to 30 lead customers in aerospace and defense, telecommunications, and test and measurement. And the company has raised $44 million in new funding from investors, including Nest founder Tony Fadell's Future Shape Group, to boost its production capacity to meet potential demand.

"The concept of an ideal switch was theoretical, something companies have been working to achieve for decades, until Menlo Micro," said Marianne Wu, the former head of GE Ventures and current managing director of 40 North Ventures, which led Menlo Micro's latest round. "We are incredibly excited to work with such a dynamic, experienced team on a core technology that is disrupting nearly every industry."

Series of Menlo Micro switches on a circuit board. Image Credit: Menlo Micro

Over the last 30 months, Menlo Micro said it has completed the transfer and qualification of its manufacturing process, moving from a four-inch research fab to a new eight-inch high-volume manufacturing line.

That means the company is able to increase production for its initial products and boost its capacity. With the qualification in hand, the company expects to bring production up to over 100,000 units per month by the end of 2020 and reach production capacity for millions of switches per month in 2021.

So beyond telecommunications and defense, there are target markets in energy storage, automotive and aerospace because of the miniaturization, while quantum computing companies are interested in the technology because of its durability.

"The relay is a large mechanical device that you can hold in your hand, used in many applications for turning on and off the power that goes to an industrial piece of equipment, to your car, to motors that need to be driven," said Giovanniello. "They're very hard to integrate because they're so big. We can take the electrical characteristics of having a true metal-to-metal, low-loss connection and then, when it's open, there's an air gap that no current can flow through... We can integrate [the switches] into completely different architectures."

Ultimately, Giovanniello said the go-to-market strategy is to focus on the rule of 99.

"We're able to reduce the size, the weight and the power of the box that [the switch] is going into by up to 99%. That's a huge improvement in infrastructure and cost," he said.

For companies developing quantum computers, the value proposition is not just about the size of the MEMS, but the durability of the alloy that Menlo Micro has developed. "For quantum, you have to have devices that operate at close to absolute zero... Semiconductors don't work down to those temperatures, so they use old-fashioned mechanical relays, [which] can take hours to get back to temperature," Giovanniello said. "Our materials are so robust they work [at temperatures] down to a few millikelvins."

It's this flexibility, and the potential redesign of old industrial technologies that haven't been updated for nearly a century, that has enabled the company to bring in $78 million in funding from investors, including Piva, Paladin Capital Group, Vertical Venture Partners, Future Shape and strategic investors like Corning and Microsemi.

"For 40+ years, the industry has been searching for a switch that has the perfect combination of the electromechanical relay and the silicon transistor," said Tony Fadell in a statement. "[This technology] is a tiny, efficient, reliable micro-mechanical switch with unmatched RF performance and, counterintuitively, high-power handling of 1,000s of watts. As our world moves to the electrification and wireless of everything, Menlo Micro's deep innovation is already triggering massive cross-industry upheaval."


Put Employees at the Center of Your Post-Pandemic Digital Strategy – Harvard Business Review

Executive Summary

It's time to rethink your digital strategy in the context of people. It's not just about adding new technologies like quantum computing, IoT, or AI, but how that tech will help your employees connect more effectively with their work. It's also time to shift from the here-and-now and look further out, revisiting your long-term strategies. To get the most out of your technology investments, you need to hit the pause button and think more about how you can connect your people to the goals you hope to achieve with that technology.

When the pandemic hit in March, many companies' long-term plans and strategies were thrown out the window, as everyone from the frontlines to the C-suite shifted into fire-fighting mode. Many worked around the clock by leveraging remote technology. It's often been exhausting, as each day seems to bring new challenges and obstacles to overcome. As a result, the past six months have felt more like six years to a lot of us.

This pace isn't sustainable. While you may have needed your organization to run at 200 miles per hour as you learned to adjust to the new realities of the pandemic, you're now risking serious burnout among your team. Research shows that employees are reporting alarming levels of stress and fatigue, and the risk of depression among U.S. workers has risen by 102% as a result of the Covid-19 pandemic.

This is becoming a serious threat to organizations, including those that have already been forced to lay off staff or downsize. The paradox is that while many organizations have gained new efficiencies from embracing digital transformation, using technologies such as Zoom to keep their workforce functioning remotely, they may now risk losing their best employees, many of whom feel disconnected and disengaged in this new digital workplace. A recent survey from the consultancy KPMG found that losing talent is now the number one risk organizations face.

That's why it's time to rethink your digital strategy in the context of people. It's not just about adding new technologies like quantum computing, IoT, or AI, but how that tech will help your employees connect more effectively with their work. It's also time to shift from the here-and-now and look further out, revisiting your long-term strategies. To get the most out of your technology investments, you need to hit the pause button and think more about how you can connect your people to the goals you hope to achieve with that technology.

Over the course of my career, I've studied more than 1,000 organizations and have coached more than 100 organizations that have undergone significant transformations. Over the past five years, I've been particularly interested in the impact of digital transformation and how organizations can leverage technology for growth. What I've learned is that most digital transformation efforts fail, often spectacularly, which leads to hundreds of billions of dollars in wasted investment and the deterioration of employee engagement.

My mission has been to help coach organizations to achieve more positive outcomes through their digital transformation efforts. More recently, I've been researching how the model I developed last year in partnership with the Project Management Institute (PMI), a transformation framework called the Brightline Transformation Framework, can be applied to Covid-19 and its impact on organizational efforts to embrace digital transformation.

Specifically, this approach aligns the inside-out, which means aligning every employee's most important personal aspiration, with the outside-in, where employees understand and embrace the company's strategic vision, so that everyone is working toward the same objectives.

Outside-In Approach. Employees must first understand and embrace the company's north star, including customer insights and megatrends, so everyone is working toward the same objectives.

Inside-Out Approach. Aligning every employee's purpose or personal north star with that of the company includes:

Taking this approach is more relevant than ever in the wake of the pandemic, as it emphasizes that employees' personal goals and engagement are the critical factors underpinning every successful transformation, much more so than other elements like technology or business processes.

For organizations to thrive in a post-Covid world, while simultaneously tackling the challenges of burnout and the threat to employee retention, there is an urgent need to rethink these two key areas:

1. Bring the Outside In

The pandemic has changed the landscape of many industries' ecosystems, leading to an existential crisis for many organizations. Consider Airbnb, whose business suffered a loss of a billion dollars due to guest cancellations, all while paying out some $250 million to compensate their hosts for their losses. The company now recognizes that nothing will ever be the same again. To help engage their team in adjusting to the new realities of the marketplace, the leadership team embarked on an outside-in transformation exercise that helped them identify their new north star: the transformational goal they wanted to achieve that could help propel the company forward for the long run.

As CEO Brian Chesky framed it, the company's new goal was to get "back to our roots, back to the basics, back to what is truly special about Airbnb: everyday people who host their homes and offer experiences." One of the trends Chesky and his team identified was that, as a result of the pandemic, there is a growing acceptance that people can now work from anywhere, which could open up new opportunities to serve customers interested in traveling and experiencing unique communities and cultures for an extended time. At the same time, the company has begun winding down activities that weren't core to the business, such as scaling back on investments in transportation, hotels, and luxury properties.

2. Align Your Inside-Out with the Outside-In

Once Airbnb had established where it wanted to go, the company embarked on an inside-out journey with its employees, helping them connect to the company's new north star by creating personal and team vision statements that aligned with the greater goal: to help create the human connections that so many people miss these days. The idea was to enlist employees' help in rebuilding the business, and to gather their feedback on how they could directly impact the company's efforts to scale and prosper again.

Another Outside-In/Inside-Out transformation effort has been occurring at Kasikornbank (KBank), one of the largest banks in Thailand. [Disclosure: They are a client of mine.] The company's north star was not only to save jobs (they kept all their workers during the pandemic) but also to save their customers: small and medium-sized businesses. KBank and its employees worked closely with thousands of their clients to help them weather the storm by offering to delay their loan payments, as long as those businesses also avoided layoffs, the kind of program usually only initiated by governments. It's estimated that KBank's efforts saved some 41,000 jobs, which gave their employees a sense of purpose, confidence, and loyalty as a result of their organization making such a positive difference to their country.

Covid-19 has taught us how connected and integrated we all are with each other and with the communities in which we operate. It's now time to give your employees the opportunity to understand how your organization's north star aligns with their desire to contribute to a meaningful cause. That's how you get them to re-engage while recharging their emotional energy stores. The longer you wait to make these connections, the more your organization is at risk of losing the human capital it requires to thrive into the future, regardless of how much you spend on technology.


The Future of Computing: Hype, Hope, and Reality – CIOReview

Bill Reichert, Partner, Pegasus Tech Ventures

For roughly 75 years, the fundamental architecture of computers has not changed much. Certainly, the hardware has changed tremendously, and software has evolved accordingly. But the basic idea of storing instructions and data in binary code, and using on/off digital hardware to execute mathematical and logical operations, has remained roughly the same for decades.

All that is changing.

The same advances in semiconductor fabrication technology that powered Moore's Law, the exponential increase in the power of computers over the last several decades, have enabled hardware engineers to develop new architectures that promise to transform the computing landscape over the coming decades.

At the same time, software engineering is also progressing. Marc Andreessen famously said, "Software is eating the world." What he did not make clear, though, is that virtually all the progress in computing over the past 30 years has been thanks to hardware, not software.

Heterogeneous Computing

New architectures, however, require that software engineers and hardware engineers work together. A new class of hardware is emerging that takes advantage of what is called heterogeneous computing: multi-core chips that incorporate multiple different co-processors, each optimized for specialized tasks. Writing software that takes full advantage of these new chips is extremely challenging, and so companies like SambaNova Systems are developing operating systems and software compilers that optimize application code automatically and allocate resources to compute tasks dynamically, in real time, as computing demands change.

AI Chips

With the emergence of deep neural network software, engineers realized that Graphics Processing Units (GPUs), an architecture commercialized by Nvidia, were well suited to the massive matrix calculations required by neural network models. But GPUs are not exactly optimized for AI, and so there has been an explosion of startups seeking to develop chips that offer 10x or 100x the performance and power efficiency of GPUs. On the server side, companies like Cerebras Systems and Graphcore, and more recently SambaNova, are promising order-of-magnitude improvements. And on the edge, companies like Gyrfalcon Technology, Syntiant, and Blaize are promising even greater improvements in performance and power efficiency.


Edge Computing

The second half of the 20th century was all about moving computing from centralized mainframe computers to distributed desktop and laptop computers. With the development of the high-speed Internet, the thinking shifted: an application could sit in the cloud and support thousands, even millions, of users. But as the Internet of Things took off and enabled data collection from literally billions of devices, moving all that data up to the cloud in order to crunch it has become a challenge. Now companies are looking to process data at the edge, at the point of collection, rather than sending it up to the cloud, thereby reducing latency and cutting bandwidth and storage costs. At its simplest level, edge computing filters out unimportant data and sends only the most important data to the cloud. For more complex tasks, such as autonomous driving, edge computing requires running massive AI models and making very accurate judgments in milliseconds. For these tasks, the special-purpose chips discussed above and below are fighting for design wins.
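The simplest form of edge filtering described above can be sketched in a few lines of Python. The sensor values, threshold, and running-average rule below are invented for illustration; real edge deployments use far more sophisticated models.

```python
# Illustrative sketch only: forward a sensor reading to the cloud
# only when it deviates noticeably from what the device has seen.

def edge_filter(readings, threshold=5.0):
    """Yield only readings that deviate from the running mean by more
    than `threshold`; everything else is discarded at the edge."""
    total, count = 0.0, 0
    for value in readings:
        # Only anomalous readings are "important" enough to transmit.
        if count and abs(value - total / count) > threshold:
            yield value
        total += value
        count += 1

sensor_stream = [20.1, 20.3, 19.9, 35.0, 20.2, 20.0, 41.7]
to_cloud = list(edge_filter(sensor_stream))
print(to_cloud)  # only the two outliers: [35.0, 41.7]
```

Even this toy version shows the bandwidth win: seven readings collected, two transmitted.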

Analog Computing

As brilliant as binary code is for enabling absolutely precise calculations, the real world is analog, not digital, and many compute tasks could be done more efficiently if we could operate on analog values rather than having to digitize them. But analog computing is imprecise, and most computing problems require exact values, not approximate ones. (How much money do you have in your bank account?) Some problems, like AI inference and monitoring sensor data, do not need six-sigma precision to get the right answer or make the right decision. Companies like Mythic, Analog Inference, and Aspinity are incorporating analog computing architectures into their chips to make them up to 100x more efficient at solving problems involving data from our analog world.

Photonic Computing

Light has been used for digital communications and computer networks for decades, but using photons to do the math and putting photonic processors on a chip are extremely challenging. That is what several startups are trying to do. Spinning technologies out of MIT and Princeton, three companies, Lightelligence, Lightmatter, and Luminous Computing, are racing to commercialize the first photonic chip for doing AI inference at the edge.

Neuromorphic Computing

In spite of what the media portray as an imminent cyber-apocalypse, in which robots rebel against their human masters and take over the world, we are a long way from the science fiction imagined in popular culture. The fact is that the human brain is still massively more powerful and efficient than the most powerful supercomputers on earth. But computer scientists think there is a path to creating an artificial brain. The branch of artificial intelligence that uses neural network mathematical frameworks to compute information in a manner similar to the human brain is sometimes referred to as neuromorphic, because it mimics human neurobiology. But researchers have been working on models that even more closely mimic the human brain in its design and efficiency. The brain sends signals as electrochemical spikes, not digital bytes, and the brain's roughly 86 billion neurons are interconnected in a way that is very different from transistors on a chip. Researchers at Stanford, Intel, IBM, and several startup companies, such as Rain Neuromorphics and BrainChip, are trying to develop hardware and software that uses neuromorphic principles to deliver very high-powered computing on very small semiconductor chips.

Quantum Computing

Almost certainly the most radical initiative in computing is the attempt to harness the potential of quantum computing. At the subatomic level, particles of matter behave in extraordinary and wonderful ways: they can exist in more than one state simultaneously, and they can entangle with one another across a distance without any apparent physical connection. It turns out that electronic devices like transistors and diodes wouldn't even work if the universe were strictly Newtonian. If we can figure out how to control the quantum properties of light and matter the way we figured out how to use gears to make adding machines and transistors to make computers, we will be able to make quantum computers that are as superior to current supercomputers as supercomputers are to adding machines.

Some people say we are still a long way away from quantum supremacy, when quantum computers can solve problems that no classical computer can solve. But recent advances indicate that we may not be that far away from quantum advantage, when quantum computers can solve certain specialized problems faster than classical computers.
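To make superposition and entanglement concrete, here is a toy simulation, invented for illustration rather than drawn from the article, that builds the classic two-qubit "Bell state" with ordinary matrix arithmetic:

```python
import numpy as np

# A qubit's state is a vector of amplitudes; gates are matrices.
zero = np.array([1.0, 0.0])                           # the |0> basis state
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)          # controlled-NOT gate

# Put qubit 0 into an equal superposition, then entangle it with qubit 1.
state = CNOT @ np.kron(H @ zero, zero)                # (|00> + |11>)/sqrt(2)
probs = state ** 2                                    # measurement probabilities

# Measuring yields 00 or 11 with equal probability, never 01 or 10:
# the two qubits' outcomes are perfectly correlated (entangled).
print(probs)  # probabilities for |00>, |01>, |10>, |11>
```

Real quantum hardware realizes these states physically rather than by multiplying matrices; the state vector doubles in size with each added qubit, which is why classical simulation of more than a few dozen qubits becomes intractable and why quantum advantage is possible at all.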

Already, big players like IBM, Google, Intel, Honeywell, and Microsoft are demonstrating machines that can execute quantum algorithms, and startups like Rigetti Computing, IonQ, and PsiQuantum are joining the race, along with quantum software companies like QC Ware, Cambridge Quantum Computing, and Zapata Computing. Big corporations and governments are investing in projects that will take advantage of the power of quantum computing in chemistry, pharmaceuticals, finance, logistics, failure analysis, and artificial intelligence.

Each of these emerging technologies promises to significantly advance computing, and with these advances will come new technology leaders. The evolution of computing has given rise to multiple generations of spectacular success stories like IBM, Intel, Microsoft, Nvidia, Google, and Amazon Web Services. Most of these companies are trying to reinvent themselves to catch the next wave of computing technology, but certainly new companies will emerge in these new sectors, and some famous names will founder and go the way of the dinosaurs, like Univac, Digital Equipment, MIPS, and Silicon Graphics. Meanwhile, corporate CIOs will have to decide where to place their bets and start investing in these new technologies, if they haven't already.


Semiconductor Industry Announces Research and Funding Priorities to Sustain U.S. Leadership in Chip Technology – goskagit.com

WASHINGTON, Oct. 15, 2020 /PRNewswire/ -- The Semiconductor Industry Association (SIA) and the Semiconductor Research Corporation (SRC) today released a preview of their upcoming "Decadal Plan for Semiconductors," a report outlining chip research and funding priorities over the next decade that will help strengthen U.S. semiconductor technology and spur growth in emerging technologies such as artificial intelligence, quantum computing, and advanced wireless communications. The Decadal Plan, developed with contributions from a broad cross-section of leaders in academia, government, and industry, identifies five "seismic shifts" shaping the future of chip technology and calls for an annual $3.4 billion federal investment over the next decade to fund semiconductor R&D across these five areas.

[DOWNLOAD THE INTERIM REPORT | ONE-PAGER]

"Federal government and private sector investments in semiconductor R&D have propelled the rapid pace of innovation in the U.S. semiconductor industry, spurring tremendous growth throughout the U.S. and global economies," said John Neuffer, SIA president and CEO. "As we enter a new era, however, a renewed focus on public-private research partnerships is necessary to address the seismic shifts facing chip technology. The federal government must invest ambitiously in semiconductor research to keep America on top in semiconductors and the game-changing future technologies they enable."

The Decadal Plan's proposed additional federal investment of $3.4 billion annually would strengthen the U.S. semiconductor industry's global leadership position, add $161 billion to U.S. GDP, and create half a million U.S. jobs in the next 10 years, according to findings from an earlier SIA study. The Decadal Plan makes specific recommendations on how this increased funding should be allocated, identifying the following seismic shifts that require a renewed focus on semiconductor research:

"The future holds unlimited potential for semiconductor technology, with emerging applications such as artificial intelligence, quantum computing, and advanced wireless technologies promising incalculable societal benefit," said Dr. Todd Younkin, SRC president and CEO. "The Decadal Plan provides a blueprint for how we can convert this potential into a reality. Working together, we can boost semiconductor technology to keep it strong, competitive, and at the tip of the spear of innovation."

The full Decadal Plan is scheduled to be published in December 2020. SIA and SRC will host a virtual workshop coinciding with the release of the full report. Learn more and download the interim report at http://www.src.org/decadalplan.

Media Contacts

Dan Rosso, Semiconductor Industry Association, 240-305-4738, drosso@semiconductors.org

David Henshall, Semiconductor Research Corporation, 919-941-9440, david.henshall@src.org

About SIA

The Semiconductor Industry Association (SIA) is the voice of the semiconductor industry, one of America's top export industries and a key driver of America's economic strength, national security, and global competitiveness. Semiconductors, the tiny chips that enable modern technologies, power incredible products and services that have transformed our lives and our economy. The semiconductor industry directly employs nearly a quarter of a million workers in the United States, and U.S. semiconductor company sales totaled $193 billion in 2019. SIA represents 95 percent of the U.S. semiconductor industry by revenue and nearly two-thirds of non-U.S. chip firms. Through this coalition, SIA seeks to strengthen leadership of semiconductor manufacturing, design, and research by working with Congress, the Administration, and key industry stakeholders around the world to encourage policies that fuel innovation, propel business, and drive international competition. Learn more at www.semiconductors.org.

About SRC

Semiconductor Research Corporation (SRC), a world-renowned, high-technology-based consortium, serves as a crossroads of collaboration between technology companies, academia, government agencies, and SRC's highly regarded engineers and scientists. Through its interdisciplinary research programs, SRC plays an indispensable part in addressing global challenges, using research and development strategies, advanced tools, and technologies. Members of SRC work synergistically together and gain access to research results, fundamental IP, and highly experienced students to compete in the global marketplace and build the workforce of tomorrow. Learn more at http://www.src.org.

See the original post here:
Semiconductor Industry Announces Research and Funding Priorities to Sustain U.S. Leadership in Chip Technology - goskagit.com

IBM: Five ways technology will shape our lives | Technology & AI | Business Chief North America – Business Chief

Capturing carbon dioxide to slow climate change and repurposing existing drugs to produce a vaccine for COVID-19 are two of the predictions from IBM's annual "5 in 5" technology report.

"Five ways technology will change our lives within five years" is the IBM Research paper that outlines how accelerating the process of discovery will result in a sustainable future.

Each year, IBM showcases how it believes technology will reshape business and society, informed by work occurring within IBM Research's global labs and by industry trends.

"Today, the convergence of emerging technologies including Artificial Intelligence (AI) and quantum computing is enabling us to consider a wider range of questions once thought out of reach," states the report.

"We urgently need to design new materials to tackle pressing societal challenges addressed in the UN Sustainable Development Goals, from fostering good health and clean energy to bolstering sustainability, climate action and responsible production."

Top five predictions by IBM Research include:

Carbon dioxide conversion - Slow climate change by the capture and reuse of CO2 in the atmosphere

Antivirals - Repurpose drugs to reduce time spent on drug discovery to beat COVID-19 and future pandemics

Energy storage - Accelerated discovery of new materials for better batteries to meet global demand for electricity without raising the temperature of the Earth

Nitrogen fixation - AI and quantum computing will come up with a solution to enable nitrogen fixation to feed the world's growing population (estimated to be 10 billion by 2050)

Photoresists - Scientists will embrace a new approach to materials that lets the tech industry more quickly produce sustainable materials for semiconductors and electronic devices

Taking a closer look at the five predictions reveals the following points:

IBM predicts that it will be possible to capture and reuse carbon dioxide from the atmosphere in a bid to slow down climate change.

It is reported that climate change will lead to higher levels of CO2 by 2025 than those seen during the warmest period of the last 3.3 million years. A team of IBM researchers is creating a cloud-based knowledge base of existing methods and materials to capture CO2.

Progressing carbon capture and sequestration before it is too late requires an acceleration of the discovery process. Sophisticated AI systems and AI-guided automatic lab experiments would test large numbers of chemical reactions.

The goal over the next five years is to make CO2 capture and reuse efficient enough to scale globally so it can reduce the amount of CO2 released into the atmosphere and slow climate change.

IBM predicts medical researchers will identify new opportunities for drug repurposing which would help find a vaccine against COVID-19 and future viruses.

Scientists estimate there are more than a million viruses in nature with the potential to spread like COVID-19. It can cost up to $2.6 billion and take more than a decade for a new drug to reach the market.

One way to kick-start the process is to identify potential therapies from existing drugs - jumpstarting subsequent research to help enable rapid clinical trials and regulatory review.

IBM Research outlines that solutions could include a combination of AI analytics and data that could potentially help with real-world medical evidence to suggest new candidates for drug repurposing.

In the context of COVID-19, researchers used this technology with real-world evidence to suggest the use of two existing drugs. The first was approved for specific immunological and endocrine disorders and the second was one in use for treating prostate cancer.

Energy storage - Rethinking batteries

IBM predicts it will be possible to discover new materials for safer and more environmentally preferable batteries capable of supporting a renewable-based energy grid and more sustainable transportation.

Many renewable energy sources are intermittent and require storage. The use of AI and quantum computing will result in batteries built with safer and more efficient materials for improved performance, stresses the report.

IBM predicts that it will be possible to replicate nature's ability to convert nitrogen in the atmosphere into nitrate-rich fertiliser, feeding the growing world population while reducing the environmental impact of fertilisers.

Using the accelerated discovery cycle, researchers will sift through existing knowledge about catalysts. In a few years, a quantum computer might be able to precisely simulate different nitrogen fixation catalytic processes, further augmenting our knowledge.

"We'll come up with an innovative solution to enable nitrogen fixation at a sustainable scale."

Semiconductor transistors have shrunk, giving us smaller, more powerful gadgets by packing more processing power onto a single chip. This shrinking has been enabled by materials known as photoresists.

But with billions of phones, TVs, and cars in the world it is imperative all the chemicals, materials and processes used in their manufacture are sustainable.

IBM predicts it will be possible to advance materials manufacturing, enabling semiconductor manufacturers to improve the sustainability of their coveted products.


For more information on business topics in the United States and Canada, please take a look at the latest edition of Business Chief North America.

Follow Business Chief on LinkedIn and Twitter.

View post:
IBM: Five ways technology will shape our lives | Technology & AI | Business Chief North America - Business Chief

Cancer. How close are we to winning the war? – Switzer

When I started my medical degree in the 1970s, a diagnosis of cancer was typically a death sentence. Occasionally, if cancer was detected early, there was some hope for surgical resection. The radiotherapy and chemotherapy used back in those days were rather primitive and certainly often extremely toxic to the body.

Now in the year 2020, despite significant emphasis being appropriately placed on improved therapies and vaccines for COVID-19, the medical world is closing in on a cure for cancer. Over the past decade, the widespread use of immunotherapy and the somewhat newer CAR-T therapies and their spinoffs has revolutionised the treatment of many cancers, including not only haematologic cancers such as leukaemia and lymphoma, but also the common solid tumours such as breast, prostate, colon, lung and melanoma, to name a few.

One of the issues with cancers is that they form a shield around individual tumour cells, making them almost invisible to the immune system. Many of the newer immunotherapies help break down this shield, allowing the body's own immune system to attack the tumour cells.

One of the best groups of tumour-killing cells in the immune system is the T cells known as Tumour Infiltrating Lymphocytes (TILs). Although these are probably the best soldiers of the immune system, once they enter the battlefield of the tumour micro-environment they are disabled by many of the stressors present there. All cells, whether they are our own naturally occurring cells or those of tumours that have formed in our body, require a supply of nutrients and oxygen to function correctly.

Thus, when a TIL enters a tumour to do its work, it is competing with the tumour for local nutrients and oxygen. The tumour, being extremely greedy, steals these nutrients, leading to a reduction in the function of a component of all cells known as the mitochondria. Mitochondria are the fuel supply of the cell, creating the energy that allows it to do its work and exist. If we deprive the mitochondria of nutrients, they are converted into a sluggish state known as terminal exhaustion. Also, when a T cell's mitochondria are starting to age, there is a natural process that breaks down the cells and replaces them with younger, healthier T cells to continue the job of trying to destroy any tumours that are present.

The function of any living cell, including T cells and tumour cells, is to survive. Cancer cells create particular antigens which stimulate a protein known as PD-1, which suppresses the T-cell response. Thus, when you have cancer, there is this constant battle being waged inside your body between your own immune system and the tumour.

Now for the good news. Scientists from the US have discovered that a commonly used supplement for anti-ageing known as NAD-riboside enhances mitochondrial function in these failing T cells and allows them to recharge themselves to enhance their attack against tumours. When NAD-riboside was added to specific immunotherapy drugs, this significantly inhibited the growth of a variety of tumours in mice.

Professor David Sinclair from Harvard University has pioneered the use of NAD-riboside for anti-ageing, demonstrating that this is a safe and effective supplement, prolonging the life of laboratory animals by 20%. Prof Sinclair has also demonstrated the same anti-ageing markers in human beings. There now appears to be another feather in the cap for this safe and seemingly effective supplement, which may become standard care added to all new cancer therapies.

A word of caution: this has not been trialled in humans as an additional therapy, but many people, including myself, already take NAD-riboside or a related supplement (e.g. NMN or NAD+) as a potential anti-ageing therapy. I have been saying for a number of years that vitamin B3 and its variety of analogues, such as NAD-riboside, are a vital part of good health. This is yet more evidence to support these claims. Although we have not achieved a cure for cancer in 2020, we are certainly edging closer.

Go here to read the rest:
Cancer. How close are we to winning the war? - Switzer

Brexit casts doubt over UK pension rights – Interactive Investor

Future retirees are facing uncertainty over the level of state pension they will receive if they move abroad and could miss out on around £140,000 of income once the Brexit transitional period ends, Aegon claims.

The pension company is warning that the outcome of Brexit negotiations could have a huge impact on the retirement prospects of UK citizens who plan to move to and retire in the European Union (EU) or Switzerland.

Currently, those who are already living in the EU before 31 December 2020 have been reassured that they will receive the same increases to UK state pensions as paid to those living in the UK.

This is based on the triple lock, which guarantees the state pension will rise by the highest of earnings growth, price inflation or 2.5% each year.

However, aside from those who move to Ireland, there has been no guarantee from the UK on the level of the state pension for those retiring and living in other countries.

Steven Cameron, pensions director at Aegon, says: "The outcome of last-minute Brexit negotiations could have a huge impact on those who may be planning to retire abroad to another EU country."

"With many people living 20 or more years after state pension age, any form of inflation proofing is highly valuable, with the triple lock particularly so."

"An inflation-linked state pension of £175.20 a week is worth £336,500, whereas one that doesn't increase is worth £191,000, which is £145,500 less."
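The gap between a flat and an inflation-linked pension compounds over a typical retirement. The sketch below illustrates the idea with a simple cumulative-payout comparison; the 21-year horizon and the 2.5% escalation rate (the triple-lock floor) are illustrative assumptions for this sketch, not Aegon's actual modelling.

```python
# Rough comparison of cumulative payouts from a flat vs. an escalating
# UK state pension. Horizon and escalation rate are assumptions.

WEEKLY_PENSION = 175.20   # full new state pension, 2020/21 (GBP per week)
YEARS = 21                # assumed retirement horizon
ESCALATION = 0.025        # assumed annual uprating (triple-lock floor)

def total_payout(weekly: float, years: int, growth: float) -> float:
    """Sum of annual pension payments, uprated by `growth` each year."""
    annual = weekly * 52
    return sum(annual * (1 + growth) ** year for year in range(years))

flat = total_payout(WEEKLY_PENSION, YEARS, 0.0)
linked = total_payout(WEEKLY_PENSION, YEARS, ESCALATION)

print(f"Flat pension over {YEARS} years: £{flat:,.0f}")
print(f"Uprated at {ESCALATION:.1%} a year: £{linked:,.0f}")
print(f"Difference: £{linked - flat:,.0f}")
```

With these inputs the flat pension totals roughly £191,000, matching the article's figure; reproducing the quoted £336,500 for the inflation-linked pension would require a higher assumed escalation rate or a longer horizon than the 2.5% floor used here.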

Cameron adds that while the treatment of state pensions may not be top of the agenda for last minute Brexit negotiations, he warns decisions in these areas could make a huge difference to those planning to move abroad in future for their retirement years.

David Sinclair, director of thinktank the International Longevity Centre, says: "For some older people, the state pension provides a significant part of their retirement income."

"These people in particular will want to be confident that they can afford their retirement aspirations irrespective of whether they stay in the UK or move abroad. Older people who want to move abroad need to be confident that the value of their savings alongside their pension will be adequate."

Sinclair adds that retirees will need to make their wealth last 20-plus years and says this uncertainty is unhelpful for those who want to plan for the long term.

These articles are provided for information purposes only. Occasionally, an opinion about whether to buy or sell a specific investment may be provided by third parties. The content is not intended to be a personal recommendation to buy or sell any financial instrument or product, or to adopt any investment strategy as it is not provided based on an assessment of your investing knowledge and experience, your financial situation or your investment objectives. The value of your investments, and the income derived from them, may go down as well as up. You may not get back all the money that you invest. The investments referred to in this article may not be suitable for all investors, and if in doubt, an investor should seek advice from a qualified investment adviser.

Full performance can be found on the company or index summary page on the interactive investor website. Simply click on the company's or index name highlighted in the article.

Read more from the original source:
Brexit casts doubt over UK pension rights - Interactive Investor