Quantum Computing and the Cryptography Conundrum – CXOToday.com

By: Anand Patil

On October 23, 2019, researchers from Google made an official announcement of a major breakthrough, one that scientists compared to the Wright Brothers' first flight, or even man's first moon landing. They claimed to have achieved Quantum Supremacy, meaning that they had created a Quantum Computer that could perform a calculation considered impossible for the classical computers of today. The announcement was a landmark, highlighting the possibilities of Quantum Computing.

The concept of Quantum Computing itself isn't new. It is a field that has been a point of interest for physicists and computer researchers since the 1980s. Google's announcement, however, has brought it to the mainstream and shone a spotlight on the promise that this niche field of innovation holds. Of course, as someone once said, with great power comes great responsibility, so this field isn't without complexities.

The Possibilities of Quantum Computing

Quantum Computing is a branch of computer science focused on leveraging the principles of quantum physics to develop computer technology. Quantum Computers hold the promise of powering major advances in fields that require complex calculations, from materials science and pharmaceuticals to aerospace and artificial intelligence (AI).

So far, Quantum Computers have been nothing more than fancy laboratory experiments, large and expensive, but they have successfully demonstrated that the underlying principles are sound and have the potential to transform industries and accelerate innovation like never before. This has spurred scientific and industrial interest in this nascent field, giving rise to multiple projects across the world in pursuit of creating a viable, general-use Quantum Computer. That said, it may still be many years before Quantum Computers are commercially and generally available.

So Why Does It Matter Today?

The possibility of Quantum Computers poses a serious challenge to the cryptographic algorithms deployed widely today. Today's key-exchange algorithms, like RSA, Diffie-Hellman, and others, rely for their security on very difficult mathematical problems, such as prime factorization, which a Quantum Computer would be able to solve much faster than a classical computer.

For example, it would take a classical computer centuries, or even longer, to break modern algorithms like DH or RSA-2048 using brute-force methods. However, given the power and efficiency of quantum machines in calculations such as finding the prime factors of large numbers, it may be possible for a quantum computer to break current asymmetric algorithms in a matter of days.
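The asymmetry described above, easy to multiply but hard to factor, can be sketched with a toy routine (a minimal illustration, not from the article; the primes chosen are arbitrary):

```python
# Illustrative sketch: RSA-style security rests on the difficulty of factoring.
# Classical trial division must test on the order of sqrt(n) divisors, which is
# hopeless for 2048-bit moduli; Shor's algorithm on a quantum computer factors
# in polynomial time, which is the threat the article describes.

def trial_division_factor(n: int) -> tuple[int, int]:
    """Return a nontrivial factor pair of a composite n by brute force."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    raise ValueError(f"{n} is prime")

# A toy "RSA modulus": the product of two small primes. Trivial at this size,
# but the work grows exponentially in the bit-length of n.
p, q = 1009, 1013
print(trial_division_factor(p * q))  # (1009, 1013)
```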

So, while the encrypted internet is not at risk at the moment, all a bad actor has to do is capture encrypted data today, including the initial key exchange, and then wait until a powerful enough quantum computer is available to decrypt it. This is a particular problem for organizations that have large amounts of sensitive data to protect over the long term, such as banks, governments, and defense agencies.

What Can I Do Now?

For organizations that could be at risk in the future, this is the best time to start evaluating post-quantum cryptography. Simply put, this means moving to algorithms and/or keys that are far more robust and can withstand a brute-force attack by a quantum computer, i.e., that are quantum resistant.

The National Institute of Standards and Technology (NIST) in the US is leading the effort towards the standardization of post-quantum secure algorithms. However, given the lengthy process involved, this may take many years to bear fruit.

An alternative is to use Quantum Key Distribution (QKD) techniques with existing algorithms that are considered quantum-safe. This involves using a dedicated optical channel to exchange keys using the quantum properties of photons. Any attempt to tap this secure channel changes the quantum state of the photons and can be immediately detected, so the key cannot be intercepted without being noticed. One limitation of this method is the need for a dedicated optical channel that cannot span more than about 50 km between the two terminals. It also means that the existing encryption devices or routers must be capable of ingesting such quantum-generated keys.
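The tamper-evidence property described above can be sketched with a toy simulation loosely following the BB84 protocol (my own illustration, not from the article; real QKD encodes bits in photon polarization over the optical channel). The key fact is that measuring in the wrong basis yields a random bit, so an interceptor corrupts roughly a quarter of the sifted key:

```python
import random

def measure(bit: int, basis: int, measure_basis: int, rng: random.Random) -> int:
    """Measure a 'photon' prepared as (bit, basis) in measure_basis.
    A mismatched basis gives a random result, the quantum ingredient here."""
    return bit if basis == measure_basis else rng.randint(0, 1)

def bb84(n: int, eavesdrop: bool, rng: random.Random) -> tuple[int, int]:
    """Run n rounds; return (errors, sifted_key_length) after basis comparison."""
    errors = sifted = 0
    for _ in range(n):
        alice_bit, alice_basis = rng.randint(0, 1), rng.randint(0, 1)
        bit, basis = alice_bit, alice_basis
        if eavesdrop:                          # intercept-resend attack
            eve_basis = rng.randint(0, 1)
            bit = measure(bit, basis, eve_basis, rng)
            basis = eve_basis                  # Eve re-sends in her own basis
        bob_basis = rng.randint(0, 1)
        bob_bit = measure(bit, basis, bob_basis, rng)
        if bob_basis == alice_basis:           # keep only matching-basis rounds
            sifted += 1
            errors += bob_bit != alice_bit
    return errors, sifted

rng = random.Random(7)
print(bb84(2000, eavesdrop=False, rng=rng))  # clean channel: zero errors
print(bb84(2000, eavesdrop=True, rng=rng))   # ~25% of sifted bits now disagree
```

Alice and Bob can therefore reveal a sample of their sifted bits: any significant error rate exposes the tap.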

Post-Quantum Cryptography and Cisco

Cisco is an active contributor to the efforts to standardize post-quantum algorithms. However, recognizing that an implementable standard may be some years away, work is ongoing to ensure that organizations can implement quantum-resistant encryption techniques in the interim, leveraging existing network devices like routers, which are most commonly used as encryptors.

To start with, a team of veteran technical leaders and cryptography experts from Cisco US (David McGrew, Scott Fluhrer, Lionel Florit) and the engineering team in Cisco India, led by Amjad Inamdar and Ramas Rangaswamy, developed an API called the Secure Key Import Protocol (SKIP), through which Cisco routers can securely ingest keys from an external post-quantum key source. This allows existing Cisco routers to be quantum-ready with just the addition of an external QKD system. Going forward, the team is working on a way to deliver quantum-safe encryption keys without the need for short-range point-to-point connections.

The advantage of this method is that organizations can integrate post-quantum key sources with existing networking gear in a modular fashion without the need to replace anything already installed. In this manner, you could create a quantum-ready network for all traffic with minimal effort.

Getting Ready for the Post-Quantum World

Quantum Supremacy is an event which demonstrates that a quantum machine is able to solve a problem that no classical computer can solve in a feasible amount of time. This race has gathered momentum in the recent past, with several companies jumping on the bandwagon, and some even claiming to have achieved it.

There is an unprecedented amount of attention focused on making a commercially viable quantum computer. Many believe it is inevitable, and only a question of time. When it does happen, the cryptography techniques used today will become vulnerable and can no longer be relied on for security. The good news is that there are methods available to adopt strong encryption techniques that will remain secure even after quantum computers are generally available.

If you are an organization that wants to protect its sensitive data over the long term, you should start to evaluate post-quantum secure encryption techniques today. By leveraging existing networking infrastructure and adding suitable post-quantum key distribution techniques, it is possible to take a quantum leap in securing your data.

(The author is Director, Systems Engineering, Cisco India and SAARC, and the views expressed in this article are his own.)


Quantum computing will impact the enterprise–we just don’t know how – TechRepublic

Quantum computing promises to take on problems that were previously unsolvable. This whole new level of compute power will make it possible to crunch incredible volumes of data that traditional computers can't manage. It will allow researchers to develop new antibiotics, polymers, electrolytes, and so much more.

While the options for quantum computing uses may seem endless, the enterprise is still deciding if this is all just a pipe dream or a future reality.

TechRepublic Premium recently surveyed 598 professionals to learn what they know about quantum computing and what they don't. This report will fill in some of those gaps.

The survey asked the following questions:

Quantum computing is unknown territory for almost all of the survey respondents: 90% stated that they had little to no understanding of the topic, and only 11% of the 598 respondents said they had an excellent understanding of quantum computing.

Further, 36% of respondents said they were not sure which company was leading the race to develop a quantum computer. IBM got 28% of the votes, and Google got 18%. 1QBit and D-Wave each got 6% of votes. Honeywell came in at 3%.

In terms of industry impact, more than half of the respondents (58%) said that quantum computing will have either a significant impact or somewhat of an impact on the enterprise. While all industries will benefit through different use cases, because quantum computing allows data to be consumed and processed faster while using less energy, 42% of survey respondents said IT would benefit the most. The pharmaceutical and finance sectors followed at 14% and 12%, respectively.

To read all of the survey results, plus analysis, download the full report.


A Measured Approach to Regulating Fast-Changing Tech – Harvard Business Review

Executive Summary

Innovations driving what many refer to as the Fourth Industrial Revolution are as varied as the enterprises affected. Industries and their supply chains are already being revolutionized by several emerging technologies, including 5G networks, artificial intelligence, and advanced robotics, all of which make possible new products and services that are both better and cheaper than current offerings. Unfortunately, not every application of transformational technology is as obviously beneficial to individuals or society as a whole. But rather than panic, regulators will need to step back and balance costs and benefits rationally.

Amid the economic upheaval caused by Covid-19, technology-driven disruption continues to transform nearly every business at an accelerating pace, from entertainment to shopping to how we work and go to school. Though the crisis may be temporary, many changes in consumer behavior are likely permanent.

Well before the pandemic, however, industries and their supply chains were already being revolutionized by several emerging technologies, including 5G networks, artificial intelligence, and advanced robotics, all of which make possible new products and services that are both better and cheaper than current offerings. That kind of big bang disruption can quickly and repeatedly rewrite the rules of engagement for incumbents and new entrants alike. But is the world changing too fast? And, if so, are governments capable of regulating the pace and trajectory of disruption?

The answers to those questions vary by industry, of course. That's because the innovations driving what many refer to as the Fourth Industrial Revolution are as varied as the enterprises affected. In my recent book, Pivot to the Future, my co-authors and I identified ten transformative technologies with the greatest potential to generate new value for consumers, which is the only measure of progress that really matters. They are: extended reality, cloud computing, 3D printing, advanced human-computer interactions, quantum computing, edge and fog computing, artificial intelligence, the Internet of Things, blockchain, and smart robotics.

Some of these disruptors, such as blockchain, robotics, 3D printing, and the Internet of Things, are already in early commercial use. For others, the potential applications may be even more compelling, though the business cases for reaching them are less obvious. Today, for example, only the least risk-averse investors are funding development in virtual reality, edge computing, and new user interface technologies that interpret and respond to brainwaves.

Complicating both investment and adoption of transformative technologies is the fact that the applications with the biggest potential to change the world will almost certainly be built on unanticipated combinations of several novel and mature innovations. Think of the way ride-sharing services require existing GPS services, mobile networks, and devices, or how video conferencing relies on home broadband networks and high-definition displays. Looking at just a few of the most exciting examples of things to come makes clear just how unusual the next generation of disruptive combinations will be, and how widespread their potential impact on business-as-usual.

Unfortunately, not every application of transformational technology is as obviously beneficial to individuals or society as a whole. Every one of the emerging technologies we identified (and plenty of those already in mainstream use) comes with potential negative side effects that may, in some cases, outweigh the benefits. Often, these costs are both hard to predict and difficult to measure.

As disruption accelerates, so too does anxiety about its unintended consequences, feeding what futurist Alvin Toffler first referred to half a century ago as "future shock." Tech boosters and critics alike are increasingly appealing to governments to intervene, both to promote the most promising innovations and, at the same time, to solve messy social and political conflicts aggravated by the technology revolution.

On the plus side, governments continue to support research and development of emerging technologies, serving as trial users of the most novel applications. The White House, for example, recently committed over $1 billion for continued exploration of leading-edge innovation in artificial intelligence and quantum computing. The Federal Communications Commission has just concluded one of its most successful auctions yet for mobile radio frequencies, clearing bandwidth once considered useless for commercial use but now seen as central to nationwide 5G deployments. Palantir, a data analytics company that works closely with governments to assess terrorism and other complex risks, has just filed for a public offering that values the start-up at over $40 billion.

At the same time, a regulatory backlash against technology continues to gain momentum, with concerns about surveillance, the digital divide, privacy, and disinformation leading lawmakers to consider restricting or even banning some of the most popular applications. And the increasingly strategic importance of continued innovation to global competitiveness and national security has fueled increasingly nasty trade disputes, including some between the U.S., China, and the European Union.

Together with ongoing antitrust inquiries into the competitive behavior of leading technology providers, these negative reactions underscore what author Adam Thierer sees as the growing prevalence of "techno-panics": generalized fears about personal autonomy, the fate of democratic government, and perhaps even apocalyptic outcomes from letting some emerging technologies run free.

Disruptive innovation is not a panacea, but nor is it a poison. As technology transforms more industries and becomes the dominant driver of the global economy, it is inevitable both that users will grow more ambivalent and, as a result, that regulators will become more involved. If, as a popular metaphor of the 1990s had it, the digital economy began as a lawless frontier akin to the American West, it's no surprise that as settlements grow socially complex and economically powerful, the law will continue to play catch up, likely for better and for worse.

But rather than panic, regulators need to step back and balance costs and benefits rationally. That's the only way we'll achieve the exciting promise of today's transformational technologies while still avoiding the dystopias.


IBM and Mastercard among partners of €11.1m Irish quantum project – Siliconrepublic.com

A new €11.1m project has launched with the aim of uniting Ireland's various quantum computer research groups.

Some of the biggest names in tech and research have joined forces with the aim of bolstering Ireland's quantum computing efforts. The €11.1m Quantum Computing in Ireland (QCoir) initiative will work on a software platform integrating multiple quantum-bit technologies being developed in Ireland.

Unlike a traditional binary computer, which uses binary bits that can be either one or zero, a quantum bit (qubit) can be one, zero, or both at the same time. This gives quantum computers the power to solve some of the world's most complex problems in a fraction of the time it would take a binary computer.
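The "one, zero or both at the same time" idea can be made concrete with a minimal statevector sketch (my own illustration, not from the article): a qubit is a pair of amplitudes for |0> and |1>, and superposition simply means both are nonzero at once.

```python
import math

def hadamard(state):
    """Apply a Hadamard gate, which turns a definite |0> into an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)                      # the definite, classical-like state |0>
plus = hadamard(zero)                  # amplitudes on |0> and |1> simultaneously
probs = [abs(x) ** 2 for x in plus]    # Born rule: measurement probabilities
print(probs)                           # ~[0.5, 0.5]: either outcome equally likely
```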

QCoir partners include Equal1 Labs, IBM, Rockley Photonics, Maynooth University, the Tyndall National Institute, University College Dublin and Mastercard. The project received €7.3m in funding under the Disruptive Technologies Innovation Fund, a €500m fund established under Project Ireland 2040.

"Quantum computing is seen as the future of computer technology," said Dr Emanuele Pelucchi, head of epitaxy and physics of nanostructures at Tyndall, based at University College Cork.

"It's computing built on the principles of quantum physics, creating, storing and accessing data at atomic and subatomic levels to create vastly powerful computers.

"Sources of multiple entangled photons uniquely allow for the preparation of highly entangled quantum states. QCoir will leverage the on-chip photonic qubit platform based on site-controlled III-V quantum dots. These unique dots were developed at Tyndall."

Tyndalls CEO, Prof William Scanlon, added that the partnership will set the foundations for a national quantum ecosystem.

"It brings together hardware and software providers with application users, and sees multinationals working side by side with researchers and SMEs," he said.

"These kinds of industry and academic research partnerships are what will allow Ireland to build a quantum value proposition at international scale."

Quantum computing research is continuing to progress in Ireland. Earlier this year, a team from Trinity College Dublin said it had taken a major step towards the holy grail of quantum computing: a stable, small-scale quantum computer.


Quantum Computing Market Research including Growth Factors, Types and Application by regions by 2026 – Eurowire

The Quantum Computing market research report offers a comprehensive analysis of market size, segmentation, market growth, market share, competitive landscape, regional and country-level market size, the impact of Covid-19 on the Quantum Computing industry and revenue pocket opportunities, sales analysis, the impact of domestic and global market players, value chain optimization, new developments, M&A, opportunities analysis, strategic market growth analysis, product launches, marketplace expansion, and technological innovations.

The meticulous data in the Quantum Computing market report helps in understanding the current and future business situation. The report supports decision-making by industry leaders, including business professionals such as chief executive officers (CEOs), general managers, vice presidents, decision-makers, and sales directors. The global Quantum Computing market is showing promising growth opportunities over the forthcoming years.

The Quantum Computing market size is expected to grow at a CAGR of 21.26% over the forecast period of 2020 to 2026 and is expected to reach USD 381.6 Mn by 2026, from USD 81.6 Mn in 2018.
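As a quick sanity check, the report's own figures are internally consistent: compounding the 2018 base at the stated CAGR for the eight years to 2026 lands close to the projected value (values taken from the text above; the arithmetic is mine).

```python
# USD 81.6M in 2018 compounding at 21.26% per year for 2026 - 2018 = 8 years.
base_2018, cagr, years = 81.6, 0.2126, 2026 - 2018
projected_2026 = base_2018 * (1 + cagr) ** years
print(round(projected_2026, 1))  # ~381.4, consistent with the stated USD 381.6 Mn
```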

Browse the full research report along with TOC, tables & figures: https://www.alltheresearch.com/report/150/Quantum Computing

For the product type segment, this report lists the main product types of the Quantum Computing market.

For the applications segment, this report focuses on the status and outlook for key applications. End users are also listed.

This report covers the following regions:

Request a sample copy of the report to get extensive insights into the Quantum Computing market @ https://www.alltheresearch.com/sample-request/150

Key segments covered in the Quantum Computing market report: major key companies, product type segment, end use/application segment, and geography segment.

In the company segment, the report includes global key players of Quantum Computing as well as some smaller players:

The information for each competitor includes:

Any questions, or need help exploring more? Speak to our industry analyst: https://www.alltheresearch.com/speak-to-analyst/150

Key Questions Answered in the Report:

We can also offer a customized report to fulfill the special requirements of our clients. Regional and country-level reports can be provided as well.

Ask for more details or request custom reports from our industry experts @ https://www.alltheresearch.com/customization/150

About AllTheResearch:

AllTheResearch was formed with the aim of making market research a significant tool for managing breakthroughs in the industry. As a leading market research provider, the firm empowers its global clients with business-critical research solutions. The outcome of our study of numerous companies that rely on market research and consulting data for their decision-making made us realise that it's not just sheer data points, but the right analysis, that creates a difference. While some clients were unhappy with the inconsistencies and inaccuracies of data, others expressed concerns over their experience in dealing with the research firm. Also, a same-data-for-all-business-roles approach was making research redundant. We identified these gaps and built AllTheResearch to raise the standards of research support.

FOR ALL YOUR RESEARCH NEEDS, REACH OUT TO US AT:

Contact Name: Rohit B.

Email: [emailprotected]

Phone: 1-888-691-6870


U. Forward Fest to engage community on research and development for the future – The Daily Princetonian

On Oct. 23 and 24, the University will kick off its first monthly Forward Fest event, featuring high-level administrators and accomplished faculty members who work in innovation, as well as alumni hosts and moderators.

According to a press release, the online series, which will continue throughout the year, aims to "spark dialogue across the global University community to engage with and explore big ideas and their infinite possibilities for shaping the future."

The Forward Fest speakers, or "Forward Thinkers," are drawn from a variety of disciplines. Their presentations will discuss how their research and approaches have pivoted to analyze and address urgent contemporary issues.

The first half of the inaugural Forward Fest will take place on Friday, Oct. 23, at 8:00 p.m. EDT. President Christopher Eisgruber '83, Provost Deborah Prentice, and a number of other administrators are slated to speak on what is to come in "A Year of Forward Thinking" and how the University community can engage with topics such as public health and bioengineering with an orientation towards the future.

One of the featured administrators, Dean of Engineering and Applied Science Andrea Goldsmith, wrote in an email to The Daily Princetonian that she will discuss plans to "significantly grow our engineering faculty and to build a new neighborhood for the school that will foster collaboration within engineering and across all of Princeton."

"I also plan to discuss our vision to launch interdisciplinary initiatives in bioengineering, quantum computing, robotics, smart cities, and data science," Goldsmith wrote. "Advances in these topics will enhance health and medicine, spur new computing paradigms, improve the efficiency and robustness of our infrastructure, and mitigate climate change and energy shortages."

The second day of the event will take place on Saturday at 1:00 p.m. EDT and feature three panels of faculty members on the subjects of public health, social justice, and the U.S. election, respectively.

History professor Kevin Kruse, who will participate in the election panel, said in an email to the 'Prince' that the discussions aim to serve community members.

"Forward Fest was designed to focus attention on the 'in service' part of the University's mission, and the webinar on the 2020 election is designed to be a service to students, faculty, alumni and others who have questions and concerns about this pivotal moment," Kruse wrote.

Kruse added that he'll be providing context about a few key issues people have been talking about these past few months: voting rights and voter suppression, possible reforms to the Electoral College, Congress, and the Supreme Court, and generally how this election compares to past ones.

Professor of computer science Andrew Appel '81, another faculty member on the election panel, plans to focus on "the technology of how we vote, and how its inaccuracy, insecurity, and outright hackability can alter the outcome of elections," he wrote in an email to the 'Prince.'

"[M]ost (but not all) jurisdictions vote on technology that is accurate and is securable, though not for the reasons you might think, and now we should pay attention to the audits and procedures that would make our voting systems truly secure and trustworthy," Appel added.

Forward Fest is part of "A Year of Forward Thinking," the University's recently announced community engagement campaign.

Forward Fest events are free and open to the public. All programming will be livestreamed on the Forward Fest website and the University's YouTube channel.


Why AI Geniuses Haven’t Created True Thinking Machines – Walter Bradley Center for Natural and Artificial Intelligence

As we saw yesterday, artificial intelligence (AI) has enjoyed a string of unbroken successes against humans. But these are successes in games, where the map is the territory and everything is therefore computable.

That fact hints at the problem tech philosopher and futurist George Gilder raises in Gaming AI (free download here). Whether all human activities can be treated that way successfully is an entirely different question. As Gilder puts it, AI is a system built on the foundations of computer logic, and when Silicon Valley's AI theorists push the logic of their case to a "singularity," they defy the most crucial findings of twentieth-century mathematics and computer science.

Here is one of the crucial findings they defy (or ignore): philosopher Charles Sanders Peirce (1839–1914) pointed out that, generally, mental activity comes in threes, not twos (so he called it triadic). For example, you see a row of eggs in a carton and think "12." You connect the objects (eggs) with a symbol, "12."

In Peirce's terms, you are the interpretant, the one for whom the symbol "12" means something. But eggs are not "12," and "12" is not eggs. Your interpretation is the third factor that makes "12" mean something with respect to the eggs.

Gilder reminds us that, in such a case, the map is not the territory (p. 37). Just as "12" is not the eggs, a map of California is not California. To mean anything at all, the map must be read by an interpreter. AI supremacy assumes that the machine's map can somehow be big enough to stand in for the reality of California and eliminate the need for an interpreter.

The problem, he says, is that the map is not and never can be reality. There is always a gap:

Denying the interpretant does not remove the gap. It remains intractably present. If the inexorable uncertainty, complexity, and information overflows of the gap are not consciously recognized and transcended, the gap fills up with noise. Congesting the gap are surreptitious assumptions, ideology, bias, manipulation, and static. AI triumphalism allows it to sink into a chaos of constantly changing but insidiously tacit interpretations.

Ultimately AI assumes a single interpretant created by machine learning as it processes ever more zettabytes of data and converges on a single interpretation. This interpretation is always of a rearview mirror. Artificial intelligence is based on an unfathomably complex and voluminous look at the past. But this look is always a compound of slightly wrong measurements, thus multiplying its errors through the cosmos. In the real world, by contrast, where interpretation is decentralized among many individual minds, each person interpreting each symbol, mistakes are limited, subject to ongoing checks and balances, rather than being inexorably perpetuated onward.

Does this limitation make a difference in practice? It helps account for the ongoing failure of Big Data to provide consistently meaningful correlations in science, medicine, or economics research. Economics professor Gary Smith puts the problem this way:

Humans naturally assume that all patterns are significant. But AI cannot grasp the meaning of any pattern, significant or not. Thus, from massive number crunches, we may learn (if that's the right word) that

Stock prices can be predicted from Google searches for the word "debt."

Stock prices can be predicted from the number of Twitter tweets that use calm words.

An unborn baby's sex can be predicted by the amount of breakfast cereal the mother eats.

Bitcoin prices can be predicted from stock returns in the paperboard-containers-and-boxes industry.

Interest rates can be predicted from Trump tweets containing the words "billion" and "great."

If the significance of those patterns makes no sense to you, it's not because you are not as smart as the Big Data machine. Those patterns shouldn't make any sense to you. There's no sense in them because they are meaningless.

Smith, author with Jay Cordes of The Phantom Pattern Problem (Oxford, 2020), explains that these phantom patterns are a natural occurrence within the huge amounts of data that big computers crunch:

even random data contain patterns. Thus the patterns that AI algorithms discover may well be meaningless. Our seduction by patterns underlies the publication of nonsense in good peer-reviewed journals.

Yes, such meaningless findings from Big Data do creep into science and medicine journals. That's partly a function of thinking that a big computer can do our thinking for us, even though it can't recognize the meaning of patterns. It's what happens when there is no interpreter.

Ah, but (so we are told) quantum computers will evolve so as to save the dream of true thinking machines. Gilder has thought about that one too. In fact, he's been thinking about it since 1989, when he published Microcosm: The Quantum Era in Economics and Technology.

It's true that, in the unimaginably tiny quantum world, electrons can do things we can't:

A long-ago thought experiment of Einstein's showed that once any two photons (or other quantum entities) interact, they remain in each other's influence no matter how far they travel across the universe (as long as they do not interact with something else). Schrödinger christened this "entanglement": the spin (or other quantum attribute) of one behaves as if it reacts to what happens to the other, even when the two are impossibly remote.

But, he says, it's also true that continuously observing a quantum system will immobilize it (the quantum Zeno effect). As John Wheeler reminded us, we live in a participatory universe where the observer (Peirce's interpretant) is critical. So quantum computers, however cool they sound, still play by rules where the interpreter matters.
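The entangled-photon behavior Gilder describes can be sketched with a toy two-qubit simulation (my own illustration, not from the book): a Bell pair has amplitude only on |00> and |11>, so the two measurement outcomes always agree, however far apart the particles are.

```python
import random

# (|00> + |11>) / sqrt(2): the canonical entangled Bell state.
bell = {"00": 2 ** -0.5, "11": 2 ** -0.5}

def sample(state: dict, rng: random.Random) -> str:
    """Draw one joint measurement outcome with probability |amplitude|^2."""
    outcomes = list(state)
    weights = [abs(a) ** 2 for a in state.values()]
    return rng.choices(outcomes, weights=weights)[0]

rng = random.Random(3)
results = [sample(bell, rng) for _ in range(20)]
print(all(r in ("00", "11") for r in results))  # True: the two halves always agree
```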

In any event, at the quantum scale, we are trying to measure atoms and electrons using instruments composed of atoms and electrons (p. 41). That is self-referential and introduces uncertainty into everything: With quantum computing, you still face the problem of creating an analog machine that does not accumulate errors as it processes its data (p. 42). Now we are back where we started: Making the picture within the machine much bigger and more detailed will not make it identical to the reality it is supposed to interpret correctly.

And remember, we still have no idea how to make the Ultimate Smart Machine conscious because we don't know what consciousness is. We do know one thing for sure now: if Peirce is right, we could turn most of the known universe into processors and still not produce an interpreter (the consciousness that understands meaning).

Robert J. Marks points out that human creativity is non-algorithmic and therefore uncomputable. From which Gilder concludes, "The test of the new global ganglia of computers and cables, worldwide webs of glass and light and air, is how readily they take advantage of unexpected contributions from free human minds in all their creativity and diversity. These high-entropy phenomena cannot even be readily measured by the metrics of computer science" (p. 46).

It's not clear to Gilder that the AI geniuses of Silicon Valley are taking this in. The next Big Fix is always just around the corner, and the Big Hype is always at hand.

Meanwhile, the rest of us can ponder an idea from technology philosopher George Dyson: "Complex networks (of molecules, people or ideas) constitute their own simplest behavioral descriptions" (p. 53). He was explaining why analog quantum computers would work better than digital ones. But, considered carefully, his idea also means that you are ultimately the best definition of you. And that's not something that a Big Fix can just get around.

Here's the earlier article: Why AI geniuses think they can create true thinking machines. Early on, it seemed like a string of unbroken successes. In Gaming AI, George Gilder recounts the dizzying achievements that stoked the ambition, and the hidden fatal flaw.

Read more:
Why AI Geniuses Haven't Created True Thinking Machines - Walter Bradley Center for Natural and Artificial Intelligence

University of Rhode Island names respected professor, researcher, computational scientist to lead research computing efforts – URI Today

KINGSTON, R.I., Oct. 22, 2020: The University of Rhode Island has named Gaurav Khanna, Ph.D., its founding director of Research Computing. Khanna comes to URI from the University of Massachusetts Dartmouth, where he served as a professor of physics and co-director of the university's Center for Scientific Computing & Visualization Research.

A respected leader in research computing for more than a decade, Khanna has directed several scientific computing efforts at UMass Dartmouth, including supporting the research efforts of faculty members across the campus. He also served as the founding director for the interdisciplinary Engineering & Applied Sciences Ph.D. program, the largest Ph.D. program at UMass Dartmouth.

"I'm looking forward to building a research computing center at the University of Rhode Island that will help support and grow the research efforts of both junior and established researchers across its campuses," says Khanna. "I intend to develop a wide array of computational resources (local, regional, cloud) with full support, to advance the diverse research work underway at Rhode Island's only public research university."

Khanna also served on multiple committees in the UMass system that play a role in the governance of the Massachusetts Green High-Performance Computing Center, and he noted the opportunity to make similar advances at URI: "I look forward to the center innovating in the space of green and energy-efficient computing, and in the emerging area of quantum computing."

As an accomplished researcher in the area of black hole and gravitational physics, Khanna has been funded by the National Science Foundation for nearly two decades and has published nearly 100 papers in top peer-reviewed research journals. His research has been covered widely in outlets including Wired, Forbes, BBC, HPCWire, Discovery, Space.com and the New York Times.

Khanna earned a Bachelor of Technology degree from the Indian Institute of Technology Kanpur, India in 1995. He earned his Ph.D. from Penn State in 2000.


Quantum Computing in Aerospace and Defense Market Trends and Forecast to 2028 – TechnoWeekly

Quantum Computing in Aerospace and Defense

COVID-19 Industry impact

The market research extensively explores the effect of the COVID-19 outbreak on the Quantum Computing in Aerospace and Defense Market. Lockdowns imposed to contain the spread of the virus disrupted operations and weighed on sales across the sector. Demand is expected to recover as pandemic restrictions ease; however, some participants may be forced to leave the sector.

Sample Copy of This Report @ https://www.quincemarketinsights.com/request-sample-29723?utm_source=TW/LY

Features of Key Market Research

Overview of the Market Study:

The market research also applies methods such as Porter's Five Forces analysis, PEST analysis, and SWOT analysis to provide companies with a quality evaluation. It helps arrange and inform companies' investment strategies for a particular business segment in the near future. The review of market attributes, the market overview, the industry chain, historical and future data by category, application, and region, and the competitive landscape are included in this market research. Industry research involves analyzing the global environment in order to estimate the market's vulnerabilities, assets, opportunities, and risks.

Insights on the Market

The purpose of the market study is to include evidence, estimates, statistics, historical data, and market data verified by the industry, as well as the appropriate methodology and evaluation for a full market evaluation. The market research also helps understand the structure by evaluating the dynamics of the market segments. Market segmentation is split on the basis of content, form, end-user, and region.

Segmentation of the Market

This detailed market analysis of the Quantum Computing in Aerospace and Defense Market also provides a thorough summary and description of every segment offered in the analysis. Based on their market size, growth rate, and general attractiveness in terms of investment information and incremental value growth, the main segments are benchmarked. Market segmentation divides a broad customer or business market into sub-groups based on certain significant common attributes.

Segmented By Component (Hardware, Software, Services), By Application (QKD, Quantum Cryptanalysis, Quantum Sensing, Naval)

Get ToC for the overview of the premium report @ https://www.quincemarketinsights.com/request-toc-29723?utm_source=TW/LY

Regional Estimation:

In terms of different geographies, the Quantum Computing in Aerospace and Defense Market report provides a comprehensive perspective on industry growth over the projected period, including revenue estimates for Asia Pacific (APAC), Europe (EU), North America (NA), Latin America (LATAM), and Middle East & Africa (MEA).

Business Competitive Background:

The competitive market for Quantum Computing in Aerospace and Defense is measured by the number of domestic and foreign players participating in the market. The main focus is on company growth, mergers, acquisitions, and alliances, along with new product creation, as key strategies implemented by influential corporations to improve their market presence. D-Wave Systems Inc, Qxbranch LLC, IBM Corporation, Cambridge Quantum Computing Ltd, 1qb Information Technologies Inc., QC Ware Corp., Magiq Technologies Inc., Station Q-Microsoft Corporation, and Rigetti Computing are the prominent market participants examined and profiled in this study.

Highlights of the Market

The market study presents information on key manufacturers in the Quantum Computing in Aerospace and Defense Market and the revenues, profits, recent growth, and market share of key players. In order to evaluate the global and key regional advantages, potential, opportunities, constraints, threats, and risks of the Quantum Computing in Aerospace and Defense Market, the report has divided the breakdown data by category, region, business, and application.

By covering all markets and offering quality analysis and insights to help our customers make the right choices, the market study offers solutions. The latest trends, niche areas, and leading company profiles are included in the study. To provide reliable and useful information, the market research database consists of numerous reports updated on a regular basis.

If You Have Any Query, Ask Our Experts @ https://www.quincemarketinsights.com/enquiry-before-buying/enquiry-before-buying-29723?utm_source=TW/LY

About us:

QMI has the most varied products and services available on the internet for analysis. We provide research from nearly all major publications and periodically update our list to give you instant online access to the world's most extensive and up-to-date set of expert insights into the global economy.

Contact Us:
Quince Market Insights
Office No. A109, Pune, Maharashtra 411028
Phone: APAC +91 706 672 4848 / US +1 208 405 2835 / UK +44 1444 39 0986
Email: [emailprotected]
Web: http://www.quincemarketinsights.com


5 Emerging AI And Machine Learning Trends To Watch In 2021 – CRN: Technology news for channel partners and solution providers

Artificial Intelligence and machine learning have been hot topics in 2020 as AI and ML technologies increasingly find their way into everything from advanced quantum computing systems and leading-edge medical diagnostic systems to consumer electronics and smart personal assistants.

Revenue generated by AI hardware, software and services is expected to reach $156.5 billion worldwide this year, according to market researcher IDC, up 12.3 percent from 2019.

But it can be easy to lose sight of the forest for the trees when it comes to trends in the development and use of AI and ML technologies. As we approach the end of a turbulent 2020, here's a big-picture look at five key AI and machine learning trends, not just in the types of applications they are finding their way into, but also in how they are being developed and the ways they are being used.

The Growing Role Of AI And Machine Learning In Hyperautomation

Hyperautomation, an IT mega-trend identified by market research firm Gartner, is the idea that most anything within an organization that can be automated, such as legacy business processes, should be automated. The pandemic has accelerated adoption of the concept, which is also known as digital process automation and intelligent process automation.

AI and machine learning are key components and major drivers of hyperautomation (along with other technologies like robotic process automation tools). To be successful, hyperautomation initiatives cannot rely on static packaged software. Automated business processes must be able to adapt to changing circumstances and respond to unexpected situations.

That's where AI, machine learning models and deep learning technology come in, using learning algorithms and models, along with data generated by the automated system, to allow the system to automatically improve over time and respond to changing business processes and requirements. (Deep learning is a subset of machine learning that utilizes neural network algorithms to learn from large volumes of data.)
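To make the adapt-over-time idea concrete, here is a minimal, purely illustrative sketch (not drawn from any vendor's product) of an automated process whose simple linear model updates itself each time the system logs a new input/outcome pair; all names and data are hypothetical:

```python
# Illustrative sketch: a model that improves as new operational data
# arrives. Real hyperautomation stacks use full ML frameworks, not a
# hand-rolled update rule like this.

def make_online_model(lr=0.05):
    """Return (predict, learn) for a one-feature linear model."""
    state = {"w": 0.0, "b": 0.0}

    def predict(x):
        return state["w"] * x + state["b"]

    def learn(x, y):
        # One stochastic-gradient step on squared error: the model
        # adapts each time the automated system records a new outcome.
        err = predict(x) - y
        state["w"] -= lr * err * x
        state["b"] -= lr * err
        return abs(err)

    return predict, learn

predict, learn = make_online_model()

# Simulate a stream of data from the automated process: y = 2x + 1.
errors = [learn(x, 2 * x + 1) for x in [1, 2, 3, 1, 2, 3] * 50]
assert errors[-1] < errors[0]  # prediction error shrinks over time
```

The point of the sketch is only the feedback loop: predictions, observed outcomes, and an update step, repeated continuously, are what let an automated process respond to changing conditions.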

Bringing Discipline To AI Development Through AI Engineering

Only about 53 percent of AI projects successfully make it from prototype to full production, according to Gartner research. When trying to deploy newly developed AI systems and machine learning models, businesses and organizations often struggle with system maintainability, scalability and governance, and AI initiatives often fail to generate the hoped-for returns.

Businesses and organizations are coming to understand that a robust AI engineering strategy will improve the performance, scalability, interpretability and reliability of AI models and deliver the full value of AI investments, according to Gartner's list of Top Strategic Technology Trends for 2021.

Developing a disciplined AI engineering process is key. AI engineering incorporates elements of DataOps, ModelOps and DevOps and makes AI a part of the mainstream DevOps process, rather than a set of specialized and isolated projects, according to Gartner.

Increased Use Of AI For Cybersecurity Applications

Artificial intelligence and machine learning technology is increasingly finding its way into cybersecurity systems for both corporate systems and home security.

Developers of cybersecurity systems are in a never-ending race to update their technology to keep pace with constantly evolving threats from malware, ransomware, DDoS attacks and more. AI and machine learning technology can be employed to help identify threats, including variants of earlier threats.

AI-powered cybersecurity tools also can collect data from a company's own transactional systems, communications networks, digital activity and websites, as well as from external public sources, and utilize AI algorithms to recognize patterns and identify threatening activity, such as detecting suspicious IP addresses and potential data breaches.
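As a toy stand-in for the pattern-recognition step described above, the sketch below flags IP addresses whose request volume is a statistical outlier; the addresses, threshold and z-score method are illustrative assumptions, and real products combine many signals with learned models:

```python
# Illustrative only: flag suspicious IPs by request-rate anomaly.
from collections import Counter
from statistics import mean, pstdev

def flag_suspicious_ips(request_log, z_threshold=3.0):
    """Return IPs whose request count is a z-score outlier."""
    counts = Counter(request_log)
    values = list(counts.values())
    mu, sigma = mean(values), pstdev(values)
    if sigma == 0:
        return []  # uniform traffic, nothing stands out
    return [ip for ip, n in counts.items()
            if (n - mu) / sigma > z_threshold]

# Mostly normal traffic (10 requests from each of 20 hosts), plus
# one address hammering the server.
log = ["10.0.0.%d" % (i % 20) for i in range(200)]
log += ["203.0.113.9"] * 400

print(flag_suspicious_ips(log))
```

The same outlier-versus-baseline idea, scaled up with richer features and trained models, underlies how such tools surface threatening activity from large volumes of telemetry.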

AI use in home security systems today is largely limited to systems integrated with consumer video cameras and intruder alarm systems integrated with a voice assistant, according to research firm IHS Markit. But IHS says AI use will expand to create smart homes where the system learns the ways, habits and preferences of its occupants, improving its ability to identify intruders.

The Intersection Of AI/ML and IoT

The Internet of Things has been a fast-growing area in recent years, with market researcher Transforma Insights forecasting that the global IoT market will grow to 24.1 billion devices in 2030, generating $1.5 trillion in revenue.

The use of AI/ML is increasingly intertwined with IoT. AI, machine learning and deep learning, for example, are already being employed to make IoT devices and services smarter and more secure. But the benefits flow both ways, given that AI and ML require large volumes of data to operate successfully: exactly what networks of IoT sensors and devices provide.

In an industrial setting, for example, IoT networks throughout a manufacturing plant can collect operational and performance data, which is then analyzed by AI systems to improve production system performance, boost efficiency and predict when machines will require maintenance.
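A heavily simplified sketch of that predictive-maintenance pattern might look like the following; the sensor values, window size and alert threshold are invented for illustration, whereas production systems train models on many correlated sensor streams:

```python
# Toy sketch: predict a maintenance need from a rising trend in
# vibration sensor readings collected by a plant's IoT network.

def needs_maintenance(vibration_readings, window=5, limit=0.8):
    """Alert when the rolling mean of recent vibration exceeds a limit."""
    if len(vibration_readings) < window:
        return False  # not enough data to judge a trend
    recent = vibration_readings[-window:]
    return sum(recent) / window > limit

healthy = [0.2, 0.3, 0.25, 0.3, 0.2, 0.28]
wearing = healthy + [0.7, 0.9, 1.0, 0.95, 1.1]  # readings drift upward

print(needs_maintenance(healthy))   # steady readings: no alert
print(needs_maintenance(wearing))   # rising trend: schedule service
```

Even this crude rolling-window check captures the core benefit: acting on a trend in operational data before a machine actually fails.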

What some are calling the Artificial Intelligence of Things (AIoT) could redefine industrial automation.

Persistent Ethical Questions Around AI Technology

Earlier this year, as protests against racial injustice were at their peak, several leading IT vendors, including Microsoft, IBM and Amazon, announced that they would limit the use of their AI-based facial recognition technology by police departments until there are federal laws regulating the technology's use, according to a Washington Post story.

That has put the spotlight on a range of ethical questions around the increasing use of artificial intelligence technology. That includes the obvious misuse of AI for deepfake misinformation efforts and for cyberattacks. But it also includes grayer areas such as the use of AI by governments and law enforcement organizations for surveillance and related activities and the use of AI by businesses for marketing and customer relationship applications.

That's all before delving into the even deeper questions about the potential use of AI in systems that could replace human workers altogether.

A December 2019 Forbes article said the first step here is asking the necessary questions, and we've begun to do that. In some applications federal regulation and legislation may be needed, as with the use of AI technology for law enforcement.

In business, Gartner recommends the creation of external AI ethics boards to prevent AI dangers that could jeopardize a company's brand, draw regulatory action, lead to boycotts, or destroy business value. Such a board, including representatives of a company's customers, can provide guidance about the potential impact of AI development projects and improve transparency and accountability around AI projects.
