Quantum venture funding dipped 12% in 2020, but quantum investments rose 46% – VentureBeat

Sorting through the hype surrounding quantum computing these days isn't easy for enterprises trying to figure out the right time to jump in. Skeptics say any real impact is still years away, and yet quantum startups continue to seduce venture capitalists in search of the next big thing.

A new report from CB Insights may not resolve this debate, but it does add some interesting nuance. While the number of venture capital deals for quantum computing startups rose 46% to 37 in 2020 compared to 2019, the total amount raised in this sector fell 12% to $365 million.

Looking at just the number of deals, the annual tally has ticked up steadily from just 6 deals in 2015. As for the funding total, while it was down from $417 million in 2019, it remains well above the $73 million raised in 2015.

There are a couple of conclusions to draw from this.

First, the number of startups being drawn into this space is clearly rising. As research has advanced, more entrepreneurs with the right technical chops feel the time is right to build a company.

Second, the average deal size for 2020 was just under $10 million. And since IQM's $46 million round accounts for a sizable slice of the total, the average across the remaining deals is lower still. That certainly demonstrates optimism, but it's far from the kind of financial gusher or valuations that would indicate any kind of quantum bubble.
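For readers who want to check the arithmetic, here is a quick back-of-the-envelope calculation using the figures reported above (a sketch for illustration, not part of the CB Insights report):

```python
total_raised = 365_000_000  # 2020 total funding, per CB Insights
deals = 37                  # 2020 deal count

# Average across all deals: ~$9.9M, i.e., "just under $10 million."
print(f"${total_raised / deals / 1e6:.1f}M")

# Strip out IQM's $46M round and the remaining 36 deals average ~$8.9M.
print(f"${(total_raised - 46_000_000) / (deals - 1) / 1e6:.1f}M")
```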

Finally, it's important to remember that startups are likely a tiny slice of what's happening in quantum these days. A leading indicator? Perhaps. But a large part of the agenda is still being driven by tech giants who have massive resources to invest in a technology that may have a long horizon and could be years away from generating sufficient revenues. That includes Intel, IBM, Google, Microsoft, and Amazon.

Indeed, Amazon just rolled out a new blog dedicated to quantum computing. Last year, Amazon Web Services launched Amazon Braket, a product that lets enterprises start experimenting with quantum computing. Even so, AWS quantum computing director Simone Severini wrote in the inaugural blog post that business customers are still scratching their heads over the whole phenomenon.

"We heard a recurring question: 'When will quantum computing reach its true potential?' My answer was, 'I don't know,'" she wrote. "No one does. It's a difficult question because there are still fundamental scientific and engineering problems to be solved. The uncertainty makes this area so fascinating, but it also makes it difficult to plan. For some customers, that's a real issue. They want to know if and when they should focus on quantum computing, but struggle to get the facts, to discern the signal from all the noise."

Continued here:
Quantum venture funding dipped 12% in 2020, but quantum investments rose 46% - VentureBeat

Quantum Computers May Steal Bitcoin by Deriving Private Keys once Advanced Enough in 5-30 Years, Experts Claim – Crowdfund Insider

John Smith, who has been regularly keeping up with computer science, quantum computing, and cryptocurrency-related developments, claims that the future of crypto is quantum-resistant, meaning we must build systems that can protect themselves against the potential attack from quantum computers (QCs) when they become powerful enough to present a challenge to digital asset networks.

While discussing what the future threat to Bitcoin (BTC) from quantum computing might be, and how big of a deal it really is, Smith claims that the threat is that quantum computers will eventually be able to break Bitcoin's current digital signatures, which could render the network insecure and cause it to lose value.

He goes on to ask why there isn't already a solution as trivial as simply upgrading the signatures. He explains that this might not be possible due to the decentralized nature of Bitcoin and other large crypto-asset networks such as Ethereum (ETH).

While discussing how long until someone actually develops a quantum computer that can steal BTC by quickly deriving private keys from their associated public keys, Smith reveals that serious estimates range somewhere from 5 to over 30 years, with the median expert opinion being around 15 years.

Smith added:

Banks/govts/etc. will soon upgrade to quantum-resistant cryptography to secure themselves going forward. Bitcoin, however, with large financial incentives for attacking it and no central authority that can upgrade *for* users, faces a unique set of challenges.

Going on to mention the main challenges, Smith notes that we can separate vulnerable BTC into three classes, including lost coins (which are estimated to be several million), non-lost coins residing in reused/taproot/otherwise-vulnerable addresses, and coins in the mempool (i.e., being transacted).

Beginning with lost coins, why are they even an issue? Because it's possible to steal a huge number all at once and then sell them in mass quantities, which could tank the entire crypto market. He added that if such an attack seems imminent, the market could preemptively tank. He also mentioned that an attacker may profit greatly by provoking either of the above and shorting BTC.

While proposing potential solutions, Smith suggests preemptively burning lost coins via a soft fork (a backwards-compatible upgrade). He clarifies that just how well this works will depend on several factors.

He further noted:

Another potential way around the problem of millions of lost BTC is if a benevolent party were to steal & then altruistically burn them. Not clear how realistic this is, given the financial incentives involved & who the parties likely to have this capability would be.

He added:

Moving on, why are non-lost coins with vulnerable public keys an issue? This is self-evident. The primary threat to the wealth of BTC holders is their BTC being stolen. And as with lost coins, a related threat is that the market starts to fear such an attack is possible.

He also mentioned that another solution could be for Bitcoin to add a quantum-resistant signature scheme and for holders to proactively migrate. He points out that how well this works will depend on several factors.

While discussing the vulnerability of coins in the mempool, Smith mentioned that it could complicate migration to quantum-resistant addresses *after* large QCs are built or it could greatly magnify the threat posed by an unanticipated black swan advance in QC.

While proposing other solutions, Smith noted:

A commit-reveal tx scheme can be used to migrate coins without mempool security. This gets around the vulnerability of a users old public key by adding an extra encryption/decryption step based on their new quantum-resistant key but w/ crucial limitations.

He added:

Considerations w/ commit-reveal migration [are that] its not foolproof unless a user starts with their coins stored in a non-vulnerable address, because attackers can steal any vulnerable coins simply by beating the original owner to the punch.

Considerations with commit-reveal migration are also that commit transactions introduce technical hurdles (vs. regular txs) & increase the load on the network. Neither of these are insurmountable by any means, but they suggest that this method should not be relied upon too heavily, Smith claims.
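To make the mechanics concrete, here is a minimal conceptual sketch of a commit-reveal migration (illustrative Python only; the hashing scheme and names are invented for this sketch and do not reflect Bitcoin's actual transaction format):

```python
import hashlib

def make_commitment(utxo_id: str, new_qr_pubkey: bytes) -> bytes:
    # Phase 1 (commit): publish only a hash binding the coins to a new
    # quantum-resistant key. The old, vulnerable public key stays hidden.
    return hashlib.sha256(utxo_id.encode() + new_qr_pubkey).digest()

def reveal_matches(utxo_id: str, new_qr_pubkey: bytes, commitment: bytes) -> bool:
    # Phase 2 (reveal): once the commitment is buried in the chain, the owner
    # reveals the new key; nodes check it against the earlier commitment, so
    # an attacker who derives the old private key cannot redirect the coins.
    return hashlib.sha256(utxo_id.encode() + new_qr_pubkey).digest() == commitment

# The caveat from the text: if the old public key is already exposed, an
# attacker can run the same two phases first, beating the owner to the punch.
c = make_commitment("utxo-123", b"new-quantum-resistant-pubkey")
print(reveal_matches("utxo-123", b"new-quantum-resistant-pubkey", c))  # True
```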

He also noted that how well the commit-reveal transaction type works will depend on several factors.

He added:

One potential way around the network overhead & just plain hassle of commit-reveal migration would be if a highly efficient quantum-resistant zero-knowledge proof were discovered. Current QR ZK algorithms are far too large to use in Bitcoin, but that could change. Worth noting.

While sharing other potential solutions, Smith noted that there's also the option to "tank the attack & rebuild."

He pointed out that Bitcoin's network effects are massive, so it is challenging to accurately estimate or predict what the crypto ecosystem will look like in the future, but the potential economic disruption of BTC failing may incentivize extraordinary measures to save the network.

He added:

Bitcoin's ability to tank a quantum-computing-related market crash will depend on [whether there's] another chain capable of replacing BTC as the main crypto store of value [and whether] BTC [can] avoid a mining death spiral. Also, how far will stakeholders go to ensure the network survives & rebounds?

Smith also mentioned that for people or institutions holding Bitcoin, some good measures may be purchasing insurance, and/or hedging BTC exposure with an asset that would be expected to increase in value in the case of an attack.

Read more from the original source:
Quantum Computers May Steal Bitcoin by Deriving Private Keys once Advanced Enough in 5-30 Years, Experts Claim - Crowdfund Insider

The search for dark matter gets a speed boost from quantum technology – The Conversation US

Nearly a century after dark matter was first proposed to explain the motion of galaxy clusters, physicists still have no idea what it's made of.

Researchers around the world have built dozens of detectors in hopes of discovering dark matter. As a graduate student, I helped design and operate one of these detectors, aptly named HAYSTAC. But despite decades of experimental effort, scientists have yet to identify the dark matter particle.

Now, the search for dark matter has received an unlikely assist from technology used in quantum computing research. In a new paper published in the journal Nature, my colleagues on the HAYSTAC team and I describe how we used a bit of quantum trickery to double the rate at which our detector can search for dark matter. Our result adds a much-needed speed boost to the hunt for this mysterious particle.

There is compelling evidence from astrophysics and cosmology that an unknown substance called dark matter constitutes more than 80% of the matter in the universe. Theoretical physicists have proposed dozens of new fundamental particles that could explain dark matter. But to determine which if any of these theories is correct, researchers need to build different detectors to test each one.

One prominent theory proposes that dark matter is made of as-yet hypothetical particles called axions that collectively behave like an invisible wave oscillating at a very specific frequency through the cosmos. Axion detectors including HAYSTAC work something like radio receivers, but instead of converting radio waves to sound waves, they aim to convert axion waves into electromagnetic waves. Specifically, axion detectors measure two quantities called electromagnetic field quadratures. These quadratures are two distinct kinds of oscillation in the electromagnetic wave that would be produced if axions exist.

The main challenge in the search for axions is that nobody knows the frequency of the hypothetical axion wave. Imagine you're in an unfamiliar city searching for a particular radio station by working your way through the FM band one frequency at a time. Axion hunters do much the same thing: They tune their detectors over a wide range of frequencies in discrete steps. Each step can cover only a very small range of possible axion frequencies. This small range is the bandwidth of the detector.

Tuning a radio typically involves pausing for a few seconds at each step to see if you've found the station you're looking for. That's harder if the signal is weak and there's a lot of static. An axion signal in even the most sensitive detectors would be extraordinarily faint compared with static from random electromagnetic fluctuations, which physicists call noise. The more noise there is, the longer the detector must sit at each tuning step to listen for an axion signal.

Unfortunately, researchers can't count on picking up the axion broadcast after a few dozen turns of the radio dial. An FM radio tunes from only 88 to 108 megahertz (one megahertz is one million hertz). The axion frequency, by contrast, may be anywhere between 300 hertz and 300 billion hertz. At the rate today's detectors are going, finding the axion or proving that it doesn't exist could take more than 10,000 years.

On the HAYSTAC team, we don't have that kind of patience. So in 2012 we set out to speed up the axion search by doing everything possible to reduce noise. But by 2017 we found ourselves running up against a fundamental minimum noise limit because of a law of quantum physics known as the uncertainty principle.

The uncertainty principle states that it is impossible to know the exact values of certain physical quantities simultaneously; for instance, you can't know both the position and the momentum of a particle at the same time. Recall that axion detectors search for the axion by measuring two quadratures, those specific kinds of electromagnetic field oscillations. The uncertainty principle prohibits precise knowledge of both quadratures by adding a minimum amount of noise to the quadrature oscillations.

In conventional axion detectors, the quantum noise from the uncertainty principle obscures both quadratures equally. This noise can't be eliminated, but with the right tools it can be controlled. Our team worked out a way to shuffle around the quantum noise in the HAYSTAC detector, reducing its effect on one quadrature while increasing its effect on the other. This noise manipulation technique is called quantum squeezing.
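In standard quantum-optics notation (textbook background, not a formula quoted from the HAYSTAC paper), squeezing with parameter $r$ rescales the two vacuum quadrature variances in opposite directions while their product stays pinned at the uncertainty bound:

$$\operatorname{Var}(X_1) = \tfrac{1}{4}e^{-2r}, \qquad \operatorname{Var}(X_2) = \tfrac{1}{4}e^{+2r}, \qquad \operatorname{Var}(X_1)\,\operatorname{Var}(X_2) \geq \tfrac{1}{16}$$

(in units where the vacuum variance is 1/4). Whatever noise is squeezed out of the measured quadrature reappears in the other, which is exactly the shuffling described above.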

In an effort led by graduate students Kelly Backes and Dan Palken, the HAYSTAC team took on the challenge of implementing squeezing in our detector, using superconducting circuit technology borrowed from quantum computing research. General-purpose quantum computers remain a long way off, but our new paper shows that this squeezing technology can immediately speed up the search for dark matter.

Our team succeeded in squeezing the noise in the HAYSTAC detector. But how did we use this to speed up the axion search?

Quantum squeezing doesn't reduce the noise uniformly across the axion detector bandwidth. Instead, it has the largest effect at the edges. Imagine you tune your radio to 88.3 megahertz, but the station you want is actually at 88.1. With quantum squeezing, you would be able to hear your favorite song playing one station away.

In the world of radio broadcasting this would be a recipe for disaster, because different stations would interfere with one another. But with only one dark matter signal to look for, a wider bandwidth allows physicists to search faster by covering more frequencies at once. In our latest result we used squeezing to double the bandwidth of HAYSTAC, allowing us to search for axions twice as fast as we could before.
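The payoff follows from simple scaling: total search time is roughly (frequency range to cover) divided by (bandwidth per step), times (dwell time per step), so doubling the bandwidth halves the time. A toy calculation (the range and dwell time below are made-up illustrative numbers, not HAYSTAC's actual parameters):

```python
SEARCH_RANGE_HZ = 1e9  # hypothetical slice of axion frequency space to scan
DWELL_S = 600          # hypothetical integration time per tuning step

def search_time_days(bandwidth_hz: float) -> float:
    steps = SEARCH_RANGE_HZ / bandwidth_hz
    return steps * DWELL_S / 86_400  # seconds -> days

print(search_time_days(5_000))   # baseline bandwidth: ~1389 days
print(search_time_days(10_000))  # squeezed, double bandwidth: ~694 days
```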

Quantum squeezing alone isn't enough to scan through every possible axion frequency in a reasonable time. But doubling the scan rate is a big step in the right direction, and we believe further improvements to our quantum squeezing system may enable us to scan 10 times faster.

Nobody knows whether axions exist or whether they will resolve the mystery of dark matter; but thanks to this unexpected application of quantum technology, we're one step closer to answering these questions.

See more here:
The search for dark matter gets a speed boost from quantum technology - The Conversation US

The Quantum Comprehension Gap and the Emergence of Quantum Ethics – insideHPC

Though years from potential fruition, quantum computing and its control has emerged as an issue among technology ethicists. But if a YouTube video released last week voicing the concerns of six quantum experts is any indication, the level of discourse is at an early and amorphous stage, with only vague notions of solutions.

This is not to belittle the good work of Matt Swayne, an editor at Quantum Daily who co-produced the video with publisher Evan Kubes. To be fair, the video is intended for a general, not technical, audience, and Swayne and Kubes raise critical issues that individual technologists, their companies, their countries and governing bodies will need to come to grips with. It's just to say that quantum ethics, like the technology itself, is at an early stage, and that the thinking, talking and actions taken on quantum ethics will have to progress far and fast if it is to be effective.

The thought of what quantum may someday be able to do, that it could dust today's HPC and supercomputing, is staggering. Altering the human genome, designing super (and super-expensive) drugs, developing new military weapons, along with espionage and law enforcement techniques, all of these and more have major implications not only for the technology but for the existing gaps between rich and poor people and countries, between the normally intelligent and the abnormally intelligent technological elite, gaps that quantum could widen.

As Faye Wattleton, co-founder, EeroQ Quantum Hardware, said in the video, "I think it's a moment for us to pause, and cause us to take a step back to say, 'Wait a minute, if we can do in a few minutes what it would take 10,000 years to do with our current technology, well, that really requires some careful consideration.'"

"If we think about what it can do for good, of course, (many) industries, pharma, molecular simulation, creating new materials, that's wonderful," said Dr. Ilana Wisby, CEO, Oxford Quantum Circuits. "But of course, it could also be used to create new materials for purposes that aren't so wonderful. We start to see and understand why governments, for example, are interested from even a materials science perspective. And, of course, the infamous one is Shor's algorithm and the understanding that quantum computing could one day, likely, break encryption. What we have to understand and address now is: Is it worth the risk? Just because we can do something doesn't mean we should."

The point regarding the gap in quantum comprehension is not raised in the video, but there already is a major divide between those doing quantum R&D and the vast majority of technologists, never mind the public at large, for whom quantum will remain an utter blank, a non-starter, beginning with the head-splitting concept that a qubit can be a 0 and a 1 at the same time (though, we admit, the more often we hear it repeated the less intimidating it becomes, even if it's no more comprehensible). As Nobel Laureate Richard Feynman said, "If you think you understand quantum mechanics, you don't understand quantum mechanics." (It may have been Feynman who also said, "You don't understand quantum mechanics, you just go with it.")


The comprehension gap only adds to the complexities of quantum ethics when we consider that those who will apply the ethics in the form of legislation, i.e., politicians, won't understand the technology at all. The collision of the tech and political worlds was put on display last summer during Congressional hearings on Big Tech, in which members of Congress asked elementary and transparently uninformed questions that the Big Tech company executives struggled mightily to answer without condescension. And that was about social media, a technology every politician uses (one media wag said the hearings at times seemed more like an extended Facebook help session).

There's a truism that when it comes to business, politicians first do too little, then too much. This could pose a problem for FAANG and other companies pursuing quantum that are accustomed to asking for forgiveness, not permission, from local, state and federal governments and regulators.

Perhaps companies in the quantum sector should look for guidance from Germany's approach to governance of autonomous vehicles. Led by the country's transportation minister, an ethics commission was assembled and deliberated on the matter with religious, intellectual and other societal leaders, along with technologists and car makers. The commission's 2017 report recommended that all AVs let humans take control, that if an accident occurs while the car is in control then the automaker is liable, that AVs can't be programmed demographically (such as deciding that an elderly person should die before a baby), and other matters. If these ethical constraints make it harder to produce AVs, then so be it; ethics before technology seemed to be the commission's overriding priority.**


In that vein, one of the experts who participated in the video, Ilyas Khan, CEO, Cambridge Quantum Computing, urged the quantum community not to repeat the ethical lapses of previous decades.

"My generation was asleep at the wheel in the 90s," Khan said. "The pursuit of various different returns overcame our sensibility. If you think 100 years ago, 150 years ago, when mass media first made its appearance in the form of newspapers that millions of people would read, we put controls in place. When railways started to emerge, we put controls in place. In the mid-90s, the combination of the internet revolution and what happened with mobile telephony, we gave up, there were no controls. Now, societies get very excited about things like (the financial crisis of) 2008 and 2009 and the so-called bankers that were at fault, but this is a far, far bigger issue that we're facing today because of being asleep at the wheel in the 90s, and the 80s."

Considering quantum's potential powers, and the natural concern of the bottom 99 percent who can only stand in uncomprehending awe before that power, an ethics-first approach may be the right way to guide quantum through its development if it is to be accepted, not feared, by society at large.

As one of the experts in the video, Nick Farina, founder, EeroQ Quantum Hardware, has said, "The early stage of quantum computing is not a reason to delay ethical considerations, it's actually a great opportunity to create ethical frameworks in advance of large scale impact."

** Source: Steve Conway, senior adviser, HPC market dynamics, at industry analyst firm Hyperion Research.

Read the original here:
The Quantum Comprehension Gap and the Emergence of Quantum Ethics - insideHPC

The Interplay between Quantum Theory And Artificial Intelligence – Analytics India Magazine


Machine Learning Developers Summit (MLDS 2021) is one of the biggest gatherings of machine learning developers in India. With more than 1,500 machine learning developers and 60 speakers from around 200 organisations, the conference corrals India's leading machine learning innovators and practitioners to share their ideas about machine learning tools, advanced development and more.

Anish Agarwal, Director, Data & Analytics, India at NatWest Group, spoke about "The Interplay between Quantum Theory And Artificial Intelligence" at MLDS 2021.

The session started with an introduction to emerging technologies like artificial intelligence, a brief on quantum computing, the different forms of quantum technology used for various military and civilian applications, how quantum computing differs from classical computers, and how it plays a vital role in the advancement of artificial intelligence.

In the field of quantum computing, Agarwal discussed the technique of quantum artificial intelligence, how it can be used for computation of machine learning algorithms and what makes this technology unique.

Quantum AI can help achieve results that are impossible with classical computers. He said that, as per reports, 25 percent of Fortune Global 500 companies will have a competitive edge from quantum computing by the year 2023. Tech giants like Google and Microsoft are doubling down on quantum computing.

He then explained the possibilities of applying quantum computing in AI.

He said, "Quantum machine learning (QML) is not one settled, homogeneous field." This is because machine learning itself is quite diverse in nature. He added, "Quantum machine learning is simply the field exploring the connections between quantum computing and quantum physics on one hand and machine learning and related fields on the other hand."

Agarwal then deliberated on Quantum Game Theory and compared it with classical game theory. He said quantum game theory can be used to overcome critical problems in quantum communications.

He also discussed the advantages of quantum AI.

Agarwal concluded the session by touching upon the key applications of quantum artificial intelligence. Lastly, he mentioned some of the critical milestones for quantum AI and busted a few myths related to quantum computing techniques.



The rest is here:
The Interplay between Quantum Theory And Artificial Intelligence - Analytics India Magazine

The Quantum Computing market is expected to grow from USD 472 million in 2021 to USD 1,765 million by 2026, at a CAGR of 30.2% – GlobeNewswire

New York, Feb. 10, 2021 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Quantum Computing Market with COVID-19 impact by Offering, Deployment, Application, Technology, End-use Industry and Region - Global Forecast to 2026" - https://www.reportlinker.com/p05064748/?utm_source=GNW. Several companies are focusing on the adoption of QCaaS post-COVID-19. This, in turn, is expected to contribute to the growth of the quantum computing market. However, stability and error correction issues are expected to restrain the growth of the market.

Services segment to hold the largest share of the quantum computing market

The growth of the services segment can be attributed to the increasing number of startups across the world that are investing in research and development activities related to quantum computing technology. This technology is used in optimization, simulation, and machine learning applications, leading to optimal utilization costs and highly efficient operations in various end-use industries.

Cloud-based deployment to witness the highest growth in the quantum computing market in coming years

With the development of highly powerful systems, the demand for cloud-based deployment of quantum computing systems and services is expected to increase. This, in turn, is expected to result in a significant revenue source for service providers, with users paying for access to noisy intermediate-scale quantum (NISQ) systems that can solve real-world problems.

The limited lifespan of rapidly advancing quantum computing systems also favors cloud service providers. The flexibility of access offered to users is another factor fueling the adoption of cloud-based deployment of quantum computing systems and services.

For the foreseeable future, quantum computers are not expected to be portable. The cloud can provide users with access to different devices and simulators from their laptops.

Optimization accounted for a major share of the overall quantum computing market

Optimization is the largest application for quantum computing and accounted for a major share of the overall quantum computing market. Companies such as D-Wave Systems, Cambridge Quantum Computing, QC Ware, and 1QB Information Technologies are developing quantum computing systems for optimization applications.

Networked Quantum Information Technologies Hub (NQIT) is expanding to incorporate optimization solutions for resolving problems faced by the practical applications of quantum computing technology.

Trapped ions segment to witness highest CAGR of the quantum computing market during the forecast period

The trapped ions segment of the market is projected to grow at the highest CAGR during the forecast period, as quantum computing systems based on trapped ions offer more stability and better connectivity than quantum computing systems based on other technologies. IonQ, Alpine Quantum Technologies, and Honeywell are a few companies that use trapped-ion technology in their quantum computing systems.

Banking and finance to hold a major share of the quantum computing market during the forecast period

In the banking and finance end-use industry, quantum computing is used for risk modeling and trading applications. It is also used to detect market instabilities by identifying stock market risks and to optimize trading trajectories, portfolios, and asset pricing and hedging.

As the financial sector is complex, the quantum computing approach is expected to help users understand the intricacies of the banking and finance end-use industry. Moreover, it can help traders by suggesting solutions to overcome financial challenges.

APAC to witness the highest growth of the quantum computing market during the forecast period

The APAC region is a leading hub for several industries, including healthcare and pharmaceuticals, banking and finance, and chemicals. Countries such as China, Japan, and South Korea are the leading manufacturers of consumer electronics, including smartphones, laptops, and gaming consoles, in APAC.

There is a requirement to resolve complications in optimization, simulation, and machine learning applications across these industries. The large-scale development witnessed by emerging economies of APAC and the increased use of advanced technologies in the manufacturing sector are contributing to the development of large and medium enterprises in the region.

This, in turn, is fueling the demand for quantum computing services and systems in APAC. In APAC, the investments look promising, as most countries, such as China, Japan, and South Korea, have successfully contained the virus compared with the US and European countries. China is easing the restrictions placed on factory lockdowns and worker movement.

Despite being the epicenter of COVID-19, China has maintained its dominant position as a global network leader.

The break-up of primary participants for the report is shown below:
By Company Type: Tier 1 - 18%, Tier 2 - 22%, and Tier 3 - 60%
By Designation: C-level Executives - 21%, Manager Level - 35%, and Others - 44%
By Region: North America - 45%, Europe - 38%, APAC - 12%, and RoW - 5%

The Quantum Computing market was dominated by International Business Machines (US), D-Wave Systems (Canada), Microsoft (US), Amazon (US), and Rigetti Computing (US).

Research Coverage: This research report categorizes the quantum computing market based on offering, deployment, application, technology, end-use industry and region. The report describes the major drivers, restraints, challenges, and opportunities pertaining to the quantum computing market and forecasts the same till 2026.

Key Benefits of Buying the Report

The report would help leaders/new entrants in this market in the following ways:
1. This report segments the quantum computing market comprehensively and provides the closest market size projection for all subsegments across different regions.
2. The report helps stakeholders understand the pulse of the market and provides them with information on key drivers, restraints, challenges, and opportunities for market growth.
3. This report would help stakeholders understand their competitors better and gain more insights to improve their position in the business. The competitive landscape section includes product launches and developments, partnerships, and collaborations.
4. This report would help stakeholders understand the pre- and post-COVID-19 scenarios and how the penetration of quantum computing will look over the forecast period. The region segment includes a country-wise impact analysis of COVID-19 and initiatives taken to overcome these impacts.

Read the full report: https://www.reportlinker.com/p05064748/?utm_source=GNW

About Reportlinker
ReportLinker is an award-winning market research solution. Reportlinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.

__________________________

Go here to see the original:
The Quantum Computing market is expected to grow from USD 472 million in 2021 to USD 1,765 million by 2026, at a CAGR of 30.2% - GlobeNewswire

Gurucul XDR Uses Machine Learning & Integration for Real-Time Threat Detection, Incident Response – Integration Developers

To improve the speed and intelligence of threat detection and response, Gurucul's cloud-native XDR platform is adding machine learning, integrations, risk scoring and more.

by Anne Lessman

Tags: cloud-native, Gurucul, integration, machine learning, real-time, threat detection,

The latest upgrade to the Gurucul XDR platform adds extended detection and response alongside improved risk scoring to strengthen security operations effectiveness and productivity.

Improvements to Gurucul's cloud-native solution also sport features to enable intelligent investigations and risk-based response automation. New features include extended data linking, additions to its out-of-the-box integrations, contextual machine learning (ML) analytics and risk-prioritized alerting.

The driving force behind these updates is to provide users a "single pane of risk," according to Gurucul CEO Saryu Nayyar.

Most XDR products are based on legacy platforms limited to siloed telemetry and threat detection, which makes it difficult to provide unified security operations capabilities, Nayyar said.

Gurucul Cloud-native XDR is vendor-agnostic and natively built on a Big Data architecture designed to process, contextually link, analyze, detect, and risk score using data at massive scale. It also uses contextual machine learning models alongside a risk scoring engine to provide real-time threat detection, prioritize risk-based alerts and support automated response, Nayyar added.
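Gurucul has not published its scoring internals, but the general shape of risk-prioritized alerting is easy to sketch: combine a model's anomaly score with asset and threat context into a single number, then surface the highest-risk events first (illustrative Python; the weights and fields below are assumptions, not Gurucul's):

```python
from dataclasses import dataclass

@dataclass
class Event:
    entity: str
    anomaly: float       # 0-1, output of an ML behavior model
    asset_value: float   # 0-1, business criticality of the target
    threat_intel: float  # 0-1, match strength against known indicators

def risk_score(e: Event) -> float:
    # Illustrative fixed weights; a real engine would tune or learn these.
    return 100 * (0.5 * e.anomaly + 0.3 * e.asset_value + 0.2 * e.threat_intel)

events = [
    Event("service-account-7", anomaly=0.9, asset_value=0.8, threat_intel=0.4),
    Event("intern-laptop-42", anomaly=0.7, asset_value=0.2, threat_intel=0.0),
]
# Analysts see the riskiest entities first instead of a flat alert stream.
for e in sorted(events, key=risk_score, reverse=True):
    print(f"{e.entity}: risk {risk_score(e):.0f}")
```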

Gurucul XDR provides the following capabilities that are proven to improve incident response times:

AI/ML Suggestive Investigation and Automated Intelligent Responses: Traditional threat hunting tools and SIEMs focus on a limited number of use cases since they rely on data and alerts from a narrow set of resources. With cloud adoption increasing at a record pace, threat hunting must span hybrid on-premises and cloud environments and ingest data from vulnerability management, IoT, medical, firewall, network devices and more.

Gurucul's approach provides agentless, out-of-the-box integrations that support a comprehensive set of threat hunting applications. These include insider threat detection, data exfiltration, phishing, endpoint forensics, malicious processes and network threat analytics.

Incident Timeline, Visualizations, and Reporting: Automated Incident Timelines create a smart link of the entire attack lifecycle for pre-and post-incident analysis. Timelines can span days and even years of data in easy-to-understand visualizations.

Gurucul's visualization and dashboarding enable analysts to view threats from different perspectives using several widgets, including TreeMap, Bubble Chart, etc., that provide full drill-down capabilities into events without leaving the interface. The unique scorecard widget generates a spider chart representation of cyber threat hunting outcomes such as impact, sustaining mitigation measures, process improvement scores, etc.

Risk Prioritized Automated Response: Integration with Gurucul SOAR enables analysts to invoke more than 50 actions and 100 playbooks upon detection of a threat to minimize damages.

Entity Based Threat Hunting: Perform contextual threat hunting or forensics on entities. Automate and contain any malicious or potential threat from a single interface.

Red Team Data Tagging: Teams can leverage red team exercise data and include supervised learning techniques as part of a continuous AI-based threat hunting process.

According to Gartner, XDR products aim to solve the primary challenges with SIEM products, such as effective detection of and response to targeted attacks, including native support for behavior analysis, threat intelligence, behavior profiling and analytics.

Further, the primary value propositions of an XDR product are to improve security operations productivity and enhance detection and response capabilities by including more security components into a unified whole that offers multiple streams of telemetry, Gartner added.

The result, the firm said, is to present options for "multiple forms of detection and ... multiple methods of response."

Gurucul XDR provides the following capabilities that are proven to improve incident response times by nearly 70%:

Surgical Response

Intelligent Centralized Investigation

Rapid Incident Correlation and Causation

Gurucul XDR is available immediately from Gurucul and its business partners worldwide.


Visit link:
Gurucul XDR Uses Machine Learning & Integration for Real-Time Threat Detection, Incident Response - Integration Developers

Retracing the evolution of classical music with machine learning – Design Products & Applications

05 February 2021

Researchers in EPFL's Digital and Cognitive Musicology Lab in the College of Humanities used an unsupervised machine learning model to reveal how modes such as major and minor have changed throughout history.

Many people may not be able to define what a minor mode is in music, but most would almost certainly recognise a piece played in a minor key. That's because we intuitively differentiate the set of notes belonging to the minor scale, which tend to sound dark, tense, or sad, from those in the major scale, which more often connote happiness, strength, or lightness.

But throughout history, there have been periods when multiple other modes were used in addition to major and minor or when no clear separation between modes could be found at all.

Understanding and visualising these differences over time is what Digital and Cognitive Musicology Lab (DCML) researchers Daniel Harasim, Fabian Moss, Matthias Ramirez, and Martin Rohrmeier set out to do in a recent study, which has been published in the open-access journal Humanities and Social Sciences Communications. For their research, they developed a machine learning model to analyze more than 13,000 pieces of music from the 15th to the 19th centuries, spanning the Renaissance, Baroque, Classical, early Romantic, and late-Romantic musical periods.

"We already knew that in the Renaissance [1400-1600], for example, there were more than two modes. But for periods following the Classical era [1750-1820], the distinction between the modes blurs together. We wanted to see if we could nail down these differences more concretely," Harasim explains.

Machine listening (and learning)

The researchers used mathematical modelling to infer both the number and characteristics of modes in these five historical periods in Western classical music. Their work yielded novel data visualizations showing how musicians during the Renaissance period, like Giovanni Pierluigi da Palestrina, tended to use four modes, while the music of Baroque composers, like Johann Sebastian Bach, revolved around the major and minor modes. Interestingly, the researchers could identify no clear separation into modes of the complex music written by Late Romantic composers, like Franz Liszt.

Harasim explains that the DCML's approach is unique because it is the first time that unlabelled data have been used to analyse modes. This means that the pieces of music in their dataset had not been previously categorized into modes by a human.

"We wanted to know what it would look like if we gave the computer the chance to analyse the data without introducing human bias. So, we applied unsupervised machine learning methods, in which the computer 'listens' to the music and figures out these modes on its own, without metadata labels."

Although much more complex to execute, this unsupervised approach yielded especially interesting results which are, according to Harasim, more cognitively plausible with respect to how humans hear and interpret music.

"We know that musical structure can be very complex and that musicians need years of training. But at the same time, humans learn about these structures unconsciously, just as a child learns a native language. That's why we developed a simple model that reverse engineers this learning process, using a class of so-called Bayesian models that are used by cognitive scientists, so that we can also draw on their research."
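The DCML model itself is a bespoke Bayesian one, but the core move, letting clusters of pitch-class distributions emerge from unlabeled pieces (including how many clusters there are), can be sketched with an off-the-shelf mixture model (illustrative only; the random data below stands in for real 12-dimensional pitch-class histograms, one row per piece):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Stand-in corpus: 200 "pieces" x 12 pitch classes (normalized counts).
pieces = rng.dirichlet(np.ones(12), size=200)

# Fit mixtures with k = 1..6 candidate "modes" and let a model-selection
# criterion (BIC) pick k, mirroring the idea of inferring the number of
# modes per historical era rather than assuming major/minor up front.
best = min(
    (GaussianMixture(n_components=k, random_state=0).fit(pieces) for k in range(1, 7)),
    key=lambda m: m.bic(pieces),
)
print("inferred number of modes:", best.n_components)
print("mode profiles (one row per mode):\n", best.means_.round(3))
```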

From class project to publication, and beyond

Harasim notes with satisfaction that this study has its roots in a class project that he and his co-authors Moss and Ramirez did together as students in EPFL professor Robert West's course, Applied Data Analysis. He hopes to take the project even further by applying their approach to other musical questions and genres.

"For pieces within which modes change, it would be interesting to identify exactly at what point such changes occur. I would also like to apply the same methodology to jazz, which was the focus of my PhD dissertation, because the tonality in jazz is much richer than just two modes."

See original here:
- Retracing the evolution of classical music with machine learning - Design Products & Applications

Machine Learning and Artificial Intelligence in Healthcare Market 2021 inclining trends with NVIDIA Corporation, Intel Corporation, GENERAL ELECTRIC…

Travel Guard has specific cruise insurance policies, which makes it simpler than trying to find an add-on. If you're getting a quote online, they'll ask you to specify if you're taking a plane, a cruise, or both. They cover any emergency travel assistance, trip interruption, delay, or cancellation.

Cruise travel insurance secures non-refundable investments related to your trip. It reimburses you if you have to cancel your international cruise unexpectedly prior to departure, and it provides medical coverage for unexpected injuries and illnesses while you are on holiday. A late cancellation can mean a huge financial loss, but a cruise travel insurance policyholder is covered for cancellation or postponement of trips.

The aim of the report is to equip relevant players with essential cues about various real-time market developments, drawing significant references from historical data, to eventually present an effective market forecast that favors a sustainable stance and steady revenue flow despite challenges such as the sudden pandemic, interrupted production and disrupted sales channels in the Cruise Travel Insurance market.

Request a sample copy of report @ https://www.reportconsultant.com/request_sample.php?id=77601

Key players profiled in the report include:

Allianz, AIG, Munich RE, Generali, Tokio Marine, Sompo Japan, CSA Travel Protection, AXA, Pingan Baoxian, Mapfre Asistencia, USI Affinity, Seven Corners, Hanse Merkur, MH Ross, STARR

Market Segmentation by type:

Market Segmentation by application:

This report is well documented to present a crucial analytical review of the Cruise Travel Insurance market amidst the COVID-19 outbreak. The report is designed to lend a versatile understanding of various market influencers, encompassing a thorough barrier analysis as well as an opportunity mapping that together decide the upcoming growth trajectory of the market. In light of the lingering COVID-19 pandemic, this mindfully drafted research offering is in complete sync with current ongoing market developments as well as challenges that together render tangible influence upon the holistic growth trajectory of the Cruise Travel Insurance market.

Besides presenting a discerning overview of historical and current market developments to aid future-ready business decisions, this research report on the Cruise Travel Insurance market also presents vital details on various industry best practices, comprising SWOT and PESTEL analyses, to adequately locate and maneuver profit scope. To enable a flawless market-specific business decision aligned with the best industry practices, it also lends a systematic rundown of vital growth-triggering elements, comprising market opportunities, persistent market obstacles and challenges, and a comprehensive outlook of the various drivers and threats that eventually influence the growth trajectory of the Cruise Travel Insurance market.

Get reports for up to 40% discount @ https://www.reportconsultant.com/ask_for_discount.php?id=77601

Global Cruise Travel Insurance Geographical Segmentation Includes:

North America (U.S., Canada, Mexico)

Europe (U.K., France, Germany, Spain, Italy, Central & Eastern Europe, CIS)

Asia Pacific (China, Japan, South Korea, ASEAN, India, Rest of Asia Pacific)

Latin America (Brazil, Rest of L.A.)

Middle East and Africa (Turkey, GCC, Rest of Middle East)

Some Major TOC Points:

Chapter 1. Report Overview

Chapter 2. Global Growth Trends

Chapter 3. Market Share by Key Players

Chapter 4. Breakdown Data by Type and Application

Chapter 5. Market by End Users/Application

Chapter 6. COVID-19 Outbreak: Cruise Travel Insurance Industry Impact

Chapter 7. Opportunity Analysis in Covid-19 Crisis

Chapter 9. Market Driving Force

And More

In this latest research publication, a thorough overview of the current market scenario has been portrayed, in a bid to help market participants, stakeholders, research analysts, industry veterans and the like borrow insightful cues from this ready-to-use market research report, thus influencing a definitive business decision. The report in its subsequent sections also portrays a detailed overview of the competition spectrum, profiling leading players and their mindful business decisions, influencing growth in the Cruise Travel Insurance market.

About Us:

Report Consultant is a worldwide pacesetter in analytics, research and advisory that can assist you to renovate your business and modify your approach. With us, you will learn to take decisions intrepidly by taking calculated risks leading to lucrative business in the ever-changing market. We make sense of drawbacks, opportunities, circumstances, estimations and information using our experienced skills and verified methodologies.

Our research reports will give you the most realistic and incomparable experience of revolutionary market solutions. We have effectively steered business all over the world through our market research reports with our predictive nature and are exceptionally positioned to lead digital transformations. Thus, we craft greater value for clients by presenting progressive opportunities in the global futuristic market.

Contact us:

Rebecca Parker

(Report Consultant)

sales@reportconsultant.com

http://www.reportconsultant.com

Read this article:
Machine Learning and Artificial Intelligence in Healthcare Market 2021 inclining trends with NVIDIA Corporation, Intel Corporation, GENERAL ELECTRIC...

NTUC LearningHub Survey Reveals Accelerated Business Needs In Cloud Computing And Machine Learning Outpacing Singapore Talent Supply; Skills Gap A…

SINGAPORE - Media OutReach - 5 February 2021 - Despite the majority of Singapore employers (89%) reporting that the COVID-19 pandemic has accelerated the adoption of cloud computing and Machine Learning (ML) in their companies, obstacles abound. Singapore business leaders say that the largest hindrance to adopting cloud computing and ML technologies is the shortage of relevant in-house IT support (64%), amongst other reasons such as 'employees do not have the relevant skill sets' (58%) and 'the lack of financial resources' (46%).

alt="NTUC LearningHub Survey Reveals Accelerated Business Needs In Cloud Computing And Machine Learning Outpacing Singapore Talent Supply; Skills Gap A Hindrance To Implementing These Technologies"

These are some of the key findings from the recently launched NTUC LearningHub (NTUC LHUB) Industry Insights report on cloud computing and ML in Singapore. The report is based on in-depth interviews with industry experts, such as Amazon Web Services (AWS) and NTUC LHUB, and a survey with 300 hiring managers across industries in Singapore.

While organisations are keen to adopt cloud computing and ML to improve the company's business performance (64%), obtain business insights from Big Data (59%) and perform mundane or tedious tasks (53%), a third of Singapore employers (32%) say their companies have insufficient talent to implement cloud computing and ML technologies.

To overcome this shortage, companies say they have been upskilling employees that have relevant skill sets/roles (55%) and reskilling employees that have completely different skill sets/roles (44%). In a further show of how organisations are willing to take steps to overcome this skills gap, three in five (61%) strongly agree or agree that they will be open to hiring individuals with relevant micro-credentials, even if these candidates have no relevant experience or education degrees.

Looking to the future, four in five employers (81%) agree or strongly agree that ML will be the most in-demand Artificial Intelligence (AI) skill in 2021. Meanwhile, seven out of 10 surveyed (70%) indicated they will be willing to offer a premium for talent with AI and ML skills.

"The report reinforces the growing demand for a cloud-skilled workforce inSingapore, and the critical need to upskill and reskill local talent", said TanLee Chew, Managing Director, ASEAN, Worldwide Public Sector, AWS. "Thecollaboration across government, businesses, education and traininginstitutions will be instrumental in helping Singapore employers address theseskills gaps. AWS will continue to collaborate with training providers like NTUCLearningHub to make skills training accessible to help Singaporeans, fromstudents to adult learners, to remain relevant today and prepare for the future."

NTUC LHUB's Head of ICT, Isa Nasser, also adds, "While much of the talent demand encompasses technical positions such as data scientists and data engineers, businesses are also looking for staff to pick up practical ML and data science skill sets that can be applied to their existing work. That is why in today's digital age, most professionals would benefit greatly from picking up some data science skills to enable them to deploy ML applications and use cases in their organization. We highly urge workers to get started on equipping themselves with ML skills, including understanding the core concepts of data science, as well as familiarising themselves with the use of cloud or ML platforms such as Amazon SageMaker."

To download the Industry Insights: Cloud Computing and ML report, visit https://www.ntuclearninghub.com/machine-learning-cloud.

NTUC LearningHub is the leading Continuing Education and Training provider in Singapore, which aims to transform the lifelong employability of working people. Since our corporatisation in 2004, we have been working with employers and individual learners to provide learning solutions in areas such as Cloud, Infocomm Technology, Healthcare, Employability & Literacy, Business Excellence, Workplace Safety & Health, Security, Human Resources and Foreign Worker Training.

To date, NTUC LearningHub has helped over 25,000 organisations and achieved over 2.5 million training places across more than 500 courses with a pool of over 460 certified trainers. As a Total Learning Solutions provider to organisations, we also forge partnerships and offer a wide range of relevant end-to-end training solutions and work constantly to improve our training quality and delivery. In 2020, we accelerated our foray into online learning with our Virtual Live Classes and, through working with best-in-class partners such as IBM, DuPont Sustainable Solutions and GO1, asynchronous online courses.

For more information, visit www.ntuclearninghub.com.

Read the original:
NTUC LearningHub Survey Reveals Accelerated Business Needs In Cloud Computing And Machine Learning Outpacing Singapore Talent Supply; Skills Gap A...

Machine Learning To Bring A Transformation In Software Testing – CIO Applications


FREMONT, CA: Over the last decade, there has been an unwavering drive to deliver applications faster. Automated testing has emerged as one of the most relevant technologies for scaling DevOps, businesses are spending a lot of time and effort to develop end-to-end software delivery pipelines, and containers and their ecosystem are keeping up with their early promise.

Testing is one of the top DevOps controls that companies can use to ensure that their consumers have a delightful brand experience. Others include access management, logging, traceability and disaster recovery.

Quality and access control are preventive controls, while others are reactive. In the future, there will be a growing emphasis on consistency because it prevents consumers from having a bad experience. So delivering value quickly, or better still, delivering the right value quickly at the right quality level, is the main theme that everyone will see this year and beyond.

Here are the five key trends in 2021:

Test automation

The test automation effort will continue to accelerate. Surprisingly, a lot of businesses do have manual checks in their distribution pipeline, but you can't deliver quickly if you have humans on the vital path of the supply chain, slowing things down.

Automation of manual tests is a long process that takes dedicated engineering time. While many companies have at least some kind of test automation, much needs to be done. That's why automated testing will remain one of the top trends in the future.
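As a trivial illustration of what automating a manual check means in practice: a checklist item such as "apply a discount code and confirm the total" becomes a script that runs on every commit (a generic pytest sketch; `apply_discount` is a made-up stand-in for real application code):

```python
# test_checkout.py - run with: pytest test_checkout.py
import pytest

def apply_discount(total: float, code: str) -> float:
    # Stand-in for the application code under test.
    return round(total * 0.9, 2) if code == "SAVE10" else total

@pytest.mark.parametrize("total,code,expected", [
    (100.0, "SAVE10", 90.0),   # valid code applies 10% off
    (100.0, "BOGUS", 100.0),   # unknown code leaves the total unchanged
    (0.0, "SAVE10", 0.0),      # edge case: empty cart
])
def test_apply_discount(total, code, expected):
    assert apply_discount(total, code) == expected
```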

DevOps-driven data

Over the past six to eight years, the industry has concentrated on linking various resources through the development of robust distribution pipelines. Each of these tools produces a significant amount of data, but the data is used minimally, if at all.

The next stage is to add the smarts to the tooling. Expect to see an increased focus on data-driven decision-making by practitioners.

Read more:
Machine Learning To Bring A Transformation In Software Testing - CIO Applications

REACH and Millennium Systems International Partner to offer Machine Learning Driven Booking Automation to the MeevoXchange Marketplace – PRNewswire

REACH is available in award-winning Millennium Systems International's scheduling software product, Meevo 2, and serves thousands of beauty businesses in over 30 countries. "We are thrilled to announce another Meevo 2 business-building integration offering within our MeevoXchange marketplace: REACH by Octopi. REACH delivers the AI-powered smart scheduling features to help keep our salons and spas booked and growing. This partnership aligns with our strategic goals for our award-winning software Meevo 2 as we continuously add value to our platform and ultimately our salon and spa customers," says CEO John Harms, Millennium Systems International.

"REACH is so special because it requires virtually no setup or upkeep as it follows your existing Meevo 2 online booking settings. REACH plays 'matchmaker' by connecting your clients that are due and overdue with open spaces in your Meevo 2 appointment book over the next few days, automatically. It has taken us years of research and development to create such successful and exciting tool that will begin to show value to your business starting on day one!" CEO Patrick Blickman, REACH by Octopi

Performance Guarantee and Affordability

The platform includes the REACH Revenue Guarantee that ensures each location will see a minimum of $600-$1400 in new booking revenue every month. There are never any contracts or commitments with REACH. Simply turn it on and let it start filling your Meevo 2 appointment book. Pricing starts at $149/month.

About REACH by OCTOPI

REACH was founded to make the client booking experience easier and far more automated for the health and beauty businesses we serve. Headquartered in Scottsdale, Arizona, REACH is built on decades of consolidated industry and channel expertise. Visit www.octopi.com/reach

About Millennium Systems International:

Millennium Systems International has been a leading provider of business management software for the salon, spa and wellness industry for more than three decades. The award-winning Meevo 2 platform provides true cloud-based business management software that is HIPAA compliant and fully responsive, so users can gain complete access from any device. It was built by wellness and beauty veterans exclusively for the wellness and beauty industry. Visit https://www.millenniumsi.com

SOURCE Octopi

octopi.com

Read the original here:
REACH and Millennium Systems International Partner to offer Machine Learning Driven Booking Automation to the MeevoXchange Marketplace - PRNewswire

Can Machine Learning be the Best Remedy in the Education Sector? – Analytics Insight

Classrooms in the present era are not only expanding their use of technologies and digital tools; they are also embracing machine learning

Technology in the classroom is becoming more and more popular as we pass through the 21st century. Laptops are replacing our textbooks, and on our smartphones, we can study just about anything we want. Social media has become ubiquitous, and the way we use technology has fully changed the way we live our lives.

Technology has become the core component of distance education programs. It enables teachers and students to digitally interconnect and exchange material and student work while retaining a human link, which is important for the growth of young minds. Enhanced connections and customized experiences can allow educators to recognize opportunities for learning skills and enhance the potential of a student.

Hence, classrooms in the present era are not only expanding their use of technologies and digital tools but are also embracing machine learning.

Machine learning is an element of artificial intelligence (AI) that lets machines or computers learn from previous knowledge and make smart decisions. The architecture for machine learning involves gathering and storing a rich collection of information and turning it into a standardized knowledge base for various uses in different fields. In education, machine learning can save educators time on their non-classroom tasks.

For instance, teachers may use virtual assistants to support their students directly from home. This form of assistance helps to boost the learning environment of students and can promote growth and educational success.

According to ODSC, "Last year's report by MarketWatch revealed that machine learning in education will remain one of the top industries to drive investment, with the U.S. and China becoming the top key players by 2030." Major companies, like Google and IBM, are getting involved in making school education more progressive and innovative.

Analyzing all-round material

By making content more up-to-date and applicable to an exact request, the use of machine learning in education aims to bring the online learning sector to a new stage. How? ML technologies evaluate the content of online courses and help to assess whether the quality of the knowledge presented meets the applicable criteria. They also learn how users interpret the data and understand what is being explained. Users then obtain the data according to their particular preferences and expertise, and the overall learning experience improves dramatically.

Customized Learning

This is the greatest application of machine learning. It is adaptable and it takes care of individual needs. Students are able to guide their own learning through this education system. They can set their own pace and decide what to study and how to learn. They can select the topics they are interested in, the instructor they want to learn from, and the program they want to pursue.

Effective Grading

In education, there is another application of machine learning that deals with grades and scoring. Since the learning skills of a large number of students are expressed in each online course, grading them becomes a challenge. ML technology makes grading a matter of seconds. In this context, we are talking mostly about the exact sciences. There are places where teachers cannot be replaced by computers, but even in such situations, they can contribute to enhancing current approaches to grading and evaluation.
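As an illustration of the general idea (not any specific vendor's technology), a crude short-answer grader can score submissions by their TF-IDF similarity to a reference answer using scikit-learn; the reference text and answers below are made up:

```python
# Score short answers against a reference answer by TF-IDF cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

reference = "Photosynthesis converts light energy into chemical energy stored in glucose."
answers = [
    "Plants use light energy to make glucose, storing chemical energy.",
    "Photosynthesis is when plants drink water.",
]

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform([reference] + answers)
scores = cosine_similarity(matrix[0], matrix[1:])[0]

for answer, score in zip(answers, scores):
    print(f"{score:.2f}  {answer}")
```

Real graders are far more sophisticated, but the pattern is the same: the score is computed in seconds, with the teacher reviewing edge cases.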

According to TechXplore, "Researchers at the University of Tübingen and the Leibniz-Institut für Wissensmedien in Germany, as well as the University of Colorado Boulder, have recently investigated the potential of machine-learning techniques for assessing student engagement in the context of classroom research. More specifically, they devised a deep-neural-network-based architecture that can estimate student engagement by analyzing video footage collected in classroom environments."

They also mentioned: "We used camera data collected during lessons to teach a deep-neural-network-based model to predict student engagement levels," Enkelejda Kasneci, the leading HCI researcher on the multidisciplinary team that carried out the study, told TechXplore. "We trained our model on ground-truth data (e.g., expert ratings of students' level of engagement based on the videos recorded in the classroom). After this training, the model was able to predict, for instance, whether data obtained from a particular student at a particular point in time indicates high or low levels of engagement."
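The researchers' actual system is a deep network trained on classroom video; as a toy stand-in for the final prediction step, one might map per-frame features to an engagement label like this (the features and labels below are synthetic, not from the study):

```python
# Toy sketch: predict high/low engagement from per-frame features.
# Features here are synthetic stand-ins for what a real pipeline would
# extract from classroom video (gaze angle, head pose, motion, etc.).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                   # 4 made-up visual features
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)   # synthetic "engaged" label

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```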

See more here:
Can Machine Learning be the Best Remedy in the Education Sector? - Analytics Insight

Microchip Accelerates Machine Learning and Hyperscale Computing Infrastructure with the World's First PCI Express 5.0 Switches – EE Journal

Switchtec PFX PCIe Gen 5 high performance switches double the data rate of PCIe Gen 4.0 solutions while delivering ultra-low latency and advanced diagnostics

CHANDLER, Ariz., Feb. 02, 2021 (GLOBE NEWSWIRE) -- Applications such as data analytics, autonomous driving and medical diagnostics are driving extraordinary demands for machine learning and hyperscale compute infrastructure. To meet these demands, Microchip Technology Inc. (Nasdaq: MCHP) today announced the world's first PCI Express (PCIe) 5.0 switch solutions, the Switchtec PFX PCIe 5.0 family, doubling the interconnect performance for dense compute, high-speed networking and NVM Express (NVMe) storage. Together with the XpressConnect retimers, Microchip is the industry's only supplier of both PCIe Gen 5 switches and PCIe Gen 5 retimer products, delivering a complete portfolio of PCIe Gen 5 infrastructure solutions with proven interoperability.

"Accelerators, graphics processing units (GPUs), central processing units (CPUs) and high-speed network adapters continue to drive the need for higher-performance PCIe infrastructure. Microchip's introduction of the world's first PCIe 5.0 switch doubles the PCIe Gen 4 interconnect link rates to 32 GT/s to support the most demanding next-generation machine learning platforms," said Andrew Dieckmann, associate vice president of marketing and applications engineering for Microchip's data center solutions business unit. "Coupled with our XpressConnect family of PCIe 5.0 and Compute Express Link (CXL) 1.1/2.0 retimers, Microchip offers the industry's broadest portfolio of PCIe Gen 5 infrastructure solutions with the lowest latency and end-to-end interoperability."

The Switchtec PFX PCIe 5.0 switch family comprises high-density, high-reliability switches supporting 28 to 100 lanes and up to 48 non-transparent bridges (NTBs). The Switchtec technology devices support high-reliability capabilities, including hot- and surprise-plug as well as secure boot authentication. With PCIe 5.0 data rates of 32 GT/s, signal integrity and complex system topologies pose significant development and debug challenges. To accelerate time-to-market, the Switchtec PFX PCIe 5.0 switch provides a comprehensive suite of debug and diagnostic features, including sophisticated internal PCIe analyzers supporting Transaction Layer Packet (TLP) generation and analysis, and on-chip non-obtrusive SerDes eye capture capabilities. Rapid system bring-up and debug is further supported with ChipLink, an intuitive graphical user interface (GUI)-based device configuration and topology viewer that provides full access to the PFX PCIe switch's registers, counters, diagnostics and forensic capture capabilities.

"Intel's upcoming Sapphire Rapids Xeon processors will implement PCI Express 5.0 and Compute Express Link running up to 32.0 GT/s to deliver the low-latency and high-bandwidth I/O solutions our customers need to deploy," said Dr. Debendra Das Sharma, Intel fellow and director of I/O technology and standards. "We are pleased to see Microchip's PCIe 5.0 switch and retimer investment strengthen the ecosystem and drive broader deployment of PCIe 5.0 and CXL enabled solutions."

Development Tools

Microchip has released a full set of design-in collateral, reference designs, evaluation boards and tools to support customers building systems that take advantage of the high bandwidth of PCIe 5.0.

In addition to PCIe technology, Microchip also provides data center infrastructure builders worldwide with total system solutions including RAID over NVMe, storage, memory, timing and synchronization systems, stand-alone secure boot, secure firmware and authentication, wireless products, touch-enabled displays to configure and monitor data center equipment and predictive fan controls.

Availability

The Switchtec PFX PCIe 5.0 family of switches is sampling now to qualified customers. For additional information, contact a Microchip sales representative.


About Microchip Technology

Microchip Technology Inc. is a leading provider of smart, connected and secure embedded control solutions. Its easy-to-use development tools and comprehensive product portfolio enable customers to create optimal designs which reduce risk while lowering total system cost and time to market. The company's solutions serve more than 120,000 customers across the industrial, automotive, consumer, aerospace and defense, communications and computing markets. Headquartered in Chandler, Arizona, Microchip offers outstanding technical support along with dependable delivery and quality. For more information, visit the Microchip website at www.microchip.com.


Visit link:
Microchip Accelerates Machine Learning and Hyperscale Computing Infrastructure with the World's First PCI Express 5.0 Switches - EE Journal

The POWER Interview: The Importance of AI and Machine Learning – POWER magazine

Artificial intelligence (AI) and machine learning (ML) are becoming synonymous with the operation of power generation facilities. The increased digitization of power plants, from equipment to software, involves both thermal generation and renewable energy installations.

Both AI and ML will be key elements for the design of future energy systems, supporting the growth of smart grids and improving the efficiency of power generation, along with the interaction among electricity customers and utilities.

The technology group Wärtsilä is a global leader in using data to improve operations in the power generation sector. The company helps generators make better asset management decisions, which supports predictive maintenance. The company uses AI, along with advanced diagnostics and its deep equipment expertise, to greatly enhance the safety, reliability, and efficiency of power equipment and systems.

Luke Witmer, general manager, Data Science, Energy Storage & Optimization at Wärtsilä, talked with POWER about the importance of AI and ML to the future of power generation and electricity markets.

POWER: How can artificial intelligence (AI) be used in power trading, and with regard to forecasts and other issues?

Witmer: Artificial intelligence is a very wide field. Even a simple if/else statement is technically AI (a computer making a decision). Forecasts for price and power are generated by AI (some algorithm with some historic data set), and represent the expected trajectory or probability distribution of that value.

Power trading is also a wide field. There are many different markets that span different time periods and different electricity (power) services that power plants provide. It's more than just buying low and selling high, though that is a large piece of it. Forecasts are generally not very good at predicting exactly when electricity price spikes will happen. There is always a tradeoff between saving some power capacity for the biggest price spikes versus allocating more of your power at marginal prices. In the end, as a power trader, it is important to remember that historical data is not a picture of the future, but rather a statistical distribution that can be leveraged to inform the most probable outcome of the unknown future. AI is more capable of leveraging statistics than people will ever be.
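A toy illustration of the capacity-reservation tradeoff Witmer describes: dispatch a battery's limited energy against a price forecast while holding some capacity back for spikes the forecast may miss. All prices and parameters below are hypothetical, not from Wärtsilä:

```python
# Dispatch limited energy against an hourly price forecast, keeping a reserve.
forecast = [32, 35, 120, 40, 38, 210, 36, 33]  # $/MWh, hypothetical point forecast
capacity_mwh = 3            # total energy available to sell
reserve_for_spikes = 1      # held back in case an unforecast spike appears

# Sell into the highest-priced forecast hours first, keeping the reserve.
hours = sorted(range(len(forecast)), key=lambda h: forecast[h], reverse=True)
sellable = capacity_mwh - reserve_for_spikes
plan = {h: 1 for h in hours[:sellable]}  # 1 MWh into each top hour

revenue = sum(forecast[h] * mwh for h, mwh in plan.items())
print(f"planned hours: {sorted(plan)}, expected revenue: ${revenue}")
```

A larger reserve costs expected revenue on forecast prices but wins big when an unforecast spike materializes; choosing that balance is exactly the statistical problem Witmer says AI handles better than people.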

POWER: Machine learning and AI in power generation rely on digitalization. As the use of data becomes more important, what steps need to be taken to support AI and machine learning while still accounting for cybersecurity?

Witmer: A lot of steps. Sorry for the lame-duck answer here. Regular white-hat penetration testing by ethical hackers is probably the best first step. The second step should be to diligently and quickly address each critical issue discovered through that process. This can be done by partnering with technology providers who have the right solution (cybersecurity practices, certifications, and technology) to enable the data flow that is required.

POWER: How can the power generation industry benefit from machine learning?

Witmer: The benefit is higher utilization of the existing infrastructure. There is a lot of under-utilized infrastructure in the power generation industry. This can be accomplished with greater intelligence on the edges of the network (out at each substation and at each independent generation facility) coupled with greater intelligence at the points of central dispatch.

POWER: Can machines used in power generation learn from their experiences; would an example be that a machine could perform more efficiently over time based on past experience?

Witmer: Yes and no. It depends on what you mean by machines. A machine itself is simply pieces of metal. An analogy would be that your air conditioner at home can't learn anything, but your smart thermostat can. Your air conditioner needs to just operate as efficiently as possible when it's told to operate, constrained by physics. Power generation equipment is the same. The controls, however, whether at some point of aggregation, or transmission intersection, or at a central dispatch center, can certainly apply machine learning to operate differently as time goes on, adapting in real time to changing trends and conditions in the electricity grids and markets of the world.

POWER: What are some of the uses of artificial intelligence in the power industry?

Witmer: As mentioned in the response to question 1, I think it appropriate to point you to some definitions and descriptions of AI. I find Wikipedia to be the best organized and moderated by experts.

In the end, it's a question of intelligent control. There are many uses of AI in the power industry. To start listing some of them is insufficient, but to give some idea, I would say that we use AI in the form of rules that automatically ramp power plants up and down by speeding up or slowing down their speed governors; in the form of neural networks that perform load forecasting based on historic data and present state data (time of day, metering values, etc.); in the form of economic dispatch systems that leverage these forecasts; and in the form of reinforcement learning for statistically based automated bid generation in open markets. Our electricity grids, combined with their associated controls and markets, are arguably the most complex machines that humans have built.
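To make the load-forecasting use case concrete, here is a minimal sketch of a small neural network mapping time of day to demand. The data is synthetic, not from any utility:

```python
# Minimal load-forecasting sketch: a tiny MLP on synthetic hourly demand data.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
hours = rng.integers(0, 24, size=2000)
# Synthetic daily demand curve with noise.
load = 50 + 20 * np.sin((hours - 7) / 24 * 2 * np.pi) ** 2 + rng.normal(0, 2, size=hours.size)

# Encode the hour cyclically so 23:00 and 00:00 sit close in feature space.
X = np.column_stack([np.sin(hours / 24 * 2 * np.pi), np.cos(hours / 24 * 2 * np.pi)])
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, load)

hour = 18
x = [[np.sin(hour / 24 * 2 * np.pi), np.cos(hour / 24 * 2 * np.pi)]]
print(f"predicted load at {hour}:00 -> {model.predict(x)[0]:.1f} MW")
```

A production forecaster would add weather, calendar, and metering features, but the shape of the problem is the same.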

POWER: How can AI benefit centralized generation, and can it provide cost savings for power customers?

Witmer: Centralized power systems continue to thrive from significant economies of scale. Centralized power systems enable equal access to clean power at the lowest cost, reducing economic inequality. I view large renewable power plants that are owned by independent power producers as centralized power generation, dispatched by centralized grid operators. Regardless of whether the path forward is more or less centralized, AI brings value to all parties. Not only does it maximize revenue for any specific asset (thus the asset owner), it also reduces overall electricity prices for all consumers.

POWER: How important is AI to smart grids? How important is AI to the integration of e-mobility (electric vehicles, etc.) to the grid?

Witmer: AI is very important to smart grids. AI is extremely important to the integration of smart charging of electric vehicles, and to leveraging those mobile batteries for grid services when they are plugged into the grid (vehicle-to-grid, or V2G). However, the more important piece is for the right market forces to be created (economics), so that people can realize the value (actually get paid) for allowing their vehicles to participate in these kinds of services.

The mobile batteries of EVs will be under-utilized if we do not integrate the controls for charging and discharging this equipment in a way that gives consumers the ability to opt in or out of any service while also letting centralized dispatch leverage this equipment. It's less a question of AI, and more a question of economics and human behavioral science. Once the economics are in place and the right tools exist, then AI will be able to forecast the availability and subsequent utility that the grid will be able to extract from the variable infrastructure of plugged-in EVs.

POWER: How important is AI to the design and construction of virtual power plants?

Witmer: Interesting question. On one level, this is a question that raises an existential threat to aspects of my own job (but that's a good thing, because if a computer can do it, I don't want to do it!). It's a bit of a chicken-and-egg scenario. Today, any power plant (virtual or actual) is designed through a process that involves a lot of modeling, or simulations of what-if scenarios. That model must be as accurate as possible, including the controls behavior of not only the new plant in question, but also the rest of the grid and/or markets nearby.

As more AI is used in the actual context of this new potential power plant, the model must also contain a reflection of that same AI. No model is perfect, but as more AI gets used in the actual dispatch of power plants, more AI will be needed in the design and creation process for new power plants or aggregations of power generation equipment.

POWER: What do you see as the future of AI and machine learning for power generation / utilities?

Witmer: The short-term future is simply an extension of what we see today. As more renewables come onto the grids, we will see more negative price events and more price volatility. AI will be able to thrive in that environment. I suspect that as time goes on, the existing market structures will cease to be the most efficient for society. In fact, AI is likely going to be able to take advantage of some of those legacy features (think Enron).

Hopefully the independent system operators of the world can adapt quickly enough to the changing conditions, but I remain skeptical of that in all scenarios. With growing renewables that have free fuel, the model of vertically integrated utilities with an integrated resource planning (IRP) process will likely yield the most economically efficient structure. I think that we will see growing inefficiencies in regions that have too many manufactured rules and structure imposed by legacy markets, designed around marginal costs of operating fossil fuel-burning plants.

Darrell Proctor is associate editor for POWER (@POWERmagazine).

Link:
The POWER Interview: The Importance of AI and Machine Learning - POWER magazine

Five trends in machine learning-enhanced analytics to watch in 2021 – Information Age

AI usage is growing rapidly. What does 2021 hold for the world of analytics, and how will AI drive it?

AI-powered operations look set to keep growing this year.

As the world prepares to recover from the Covid-19 pandemic, businesses will need to increasingly rely on analytics to deal with new consumer behaviour.

According to Gartner analyst Rita Sallam, "In the face of unprecedented market shifts, data and analytics leaders require an ever-increasing velocity and scale of analysis in terms of processing and access to accelerate innovation and forge new paths to a post-Covid-19 world."

Machine learning and artificial intelligence are finding increasingly significant use cases in data analytics for business. Here are five trends to watch out for in 2021.

Gartner predicts that by 2024, 75% of enterprises will shift towards putting AI and ML into operation. A big reason for this is the way the pandemic has changed consumer behaviour. Regression learning models that rely on historical data might not be valid anymore. In their place, reinforcement and distributed learning models will find more use, thanks to their adaptability.

A large share of businesses have already democratised their data through the use of embedded analytics dashboards. The use of AI to generate augmented analytics to drive business decisions will increase as businesses seek to react faster to shifting conditions. Powering data democratisation efforts with AI will help non-technical users make a greater number of business decisions, without having to rely on IT support to query data.

Vendors such as Sisense already offer companies the ability to integrate powerful analytics into custom applications. As AI algorithms become smarter, it's a given that they'll help companies use low-latency alerts so managers can react to quantifiable anomalies that indicate changes in their business. Also, AI is expected to play a major role in delivering dynamic data stories and might reduce a user's role in data exploration.
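A minimal sketch of the low-latency alerting idea: flag metric values that deviate sharply from a rolling baseline. The stream, window size, and threshold below are hypothetical:

```python
# Rolling z-score anomaly alert over a metric stream (hypothetical parameters).
from collections import deque
from statistics import mean, stdev

window, threshold = deque(maxlen=20), 3.0


def check(value: float) -> bool:
    """Return True if `value` is anomalous relative to the recent window."""
    alert = False
    if len(window) >= 5:
        mu, sigma = mean(window), stdev(window)
        alert = sigma > 0 and abs(value - mu) / sigma > threshold
    window.append(value)
    return alert


stream = [100, 101, 99, 102, 98, 100, 101, 97, 100, 250, 101]
for t, v in enumerate(stream):
    if check(v):
        print(f"t={t}: anomaly detected (value={v})")
```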

A fact that's often forgotten in AI conversations is that these technologies are still nascent. Many of the major developments have been driven by open source efforts, but 2021 will see an increasing number of companies commercialise AI through product releases.

This event will truly be a marker of AI going mainstream. While open source has been highly beneficial to AI, scaling these projects for commercial purposes has been difficult. With companies investing more in AI research, expect a greater proliferation of AI technology in project management, data reusability, and transparency products.

Using AI for better data management is a particular focus of big companies right now. A Pathfinder report in 2018 found that a lack of skilled resources in data management was hampering AI development. However, with ML growing increasingly sophisticated, companies are beginning to use AI to manage data, which fuels even faster AI development.

As a result, metadata management becomes streamlined, and architectures become simpler. Moving forward, expect an increasing number of AI-driven solutions to be released commercially instead of on open source platforms.

Vendors such as Informatica are already using AI and ML algorithms to help develop better enterprise data management solutions for their clients. Everything from data extraction to enrichment is optimised by AI, according to the company.


Voice search and data are increasing by the day. With products such as Amazon's Alexa and Google's Assistant finding their way into smartphones, and growing adoption of smart speakers in our homes, natural language processing will increase.

Companies will wake up to the immense benefits of voice analytics and will provide their customers with voice tools. The benefits of enhanced NLP include better social listening, sentiment analysis, and increased personalisation.

Companies such as AX Semantics provide self-service natural language generation software that allows customers to self-automate text commands. Companies such as Porsche, Deloitte and Nivea are among their customers.

As augmented analytics make their way into embedded dashboards, low-level data analysis tasks will be automated. An area that is ripe for automation is data collection and synthesis. Currently, data scientists spend large amounts of time cleaning and collecting data. Automating these tasks by specifying standardised protocols will help companies employ their talent in tasks better suited to their abilities.
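As a small example of what a standardised cleaning protocol might look like, here is a hypothetical pandas step that deduplicates records, normalises types, and drops unusable rows; the column names and data are made up:

```python
# One standardized cleaning pass over a hypothetical raw export.
import pandas as pd

raw = pd.DataFrame({
    "customer_id": [1, 2, 2, 3],
    "signup_date": ["2021-01-05", "2021/01/06", "2021/01/06", None],
    "revenue": ["1,200", "950", "950", "300"],
})


def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Dedupe, normalize separators and types, and drop unusable rows."""
    df = df.drop_duplicates(subset="customer_id").copy()
    df["signup_date"] = pd.to_datetime(
        df["signup_date"].str.replace("/", "-"), errors="coerce"
    )
    df["revenue"] = df["revenue"].str.replace(",", "").astype(float)
    return df.dropna(subset=["signup_date"])


print(clean(raw))
```

Codifying steps like these frees analysts from repetitive scrubbing and makes the downstream reporting reproducible.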

A side effect of data analysis automation will be the speeding up of analytics and reporting. As a result, we can expect businesses to make decisions faster along with installing infrastructure that allows them to respond and react to changing conditions quickly.

As the worlds of data and analytics come closer together, vendors who provide end-to-end stacks will deliver better value to their customers. Combine this with increased data democratisation and it's easy to see why legacy enterprise software vendors such as SAP offer everything from data management to analytics to storage solutions to their clients.


IoT devices are making their way into not just B2C products but B2B, enterprise and public projects as well, from smart cities to industry 4.0.

Data is being generated at unprecedented rates, and to make sense of it, companies are increasingly turning to AI. With so much signal to sift through, AI is a key aid in arriving at insights.

While the rise of embedded and augmented analytics has already been discussed, it's critical to point out that the sources of data are more varied than ever before. This makes the use of AI critical, since manual processes cannot handle such large volumes efficiently.

As AI technology continues to make giant strides, the business world is gearing up to take full advantage of it. We've reached a stage where AI is powering further AI development, and the rate of progress will only increase.

Original post:
Five trends in machine learning-enhanced analytics to watch in 2021 - Information Age

BioSig and Mayo Clinic Collaborate on New R&D Program to Develop Transformative AI and Machine Learning Technologies for its PURE EP System – BioSpace

Westport, CT, Feb. 02, 2021 (GLOBE NEWSWIRE) --

BioSig Technologies, Inc. (NASDAQ: BSGM) ("BioSig" or the "Company"), a medical technology company commercializing an innovative signal processing platform designed to improve signal fidelity and uncover the full range of ECG and intra-cardiac signals, today announced a strategic collaboration with the Mayo Foundation for Medical Education and Research to develop next-generation AI- and machine learning-powered software for its PURE EP system.

The new collaboration will include an R&D program that will expand the clinical value of the Company's proprietary hardware and software with advanced signal processing capabilities, and aims to develop novel technological solutions by combining the electrophysiological signals delivered by the PURE EP with other data sources. The development program will be conducted under the leadership of Samuel J. Asirvatham, M.D., Mayo Clinic's Vice-Chair of Innovation and Medical Director, Electrophysiology Laboratory, and Alexander D. Wissner-Gross, Ph.D., Managing Director of Reified LLC.

The global market for AI in healthcare is expected to grow from $4.9 billion in 2020 to $45.2 billion by 2026 at an estimated compound annual growth rate (CAGR) of 44.9%.[1] According to Accenture, key clinical health AI applications, when combined, can potentially create $150 billion in annual savings for the United States healthcare economy by 2026.[2]

"AI-powered algorithms that are developed on superior data from multiple biomarkers could drastically improve the way we deliver therapies, and therefore may help address the rising global demand for healthcare," commented Kenneth L. Londoner, Chairman and CEO of BioSig Technologies, Inc. "We believe that combining the clinical science of Mayo Clinic with the best-in-class domain expertise of Dr. Wissner-Gross and the technical leadership of our engineering team will enable us to develop powerful applications and help pave the way toward improved patient outcomes in cardiology and beyond."

"Artificial intelligence presents a variety of novel opportunities for extracting clinically actionable information from existing electrophysiological signals that might otherwise be inaccessible. We are excited to contribute to the advancement of this field," said Dr. Wissner-Gross.

BioSig announced its partnership with Reified LLC, a provider of advanced artificial intelligence-focused technical advisory services to the private sector, in late 2019. The new research program builds upon the progress achieved by this collaboration in 2020, which included an abstract, "Computational Reconstruction of Electrocardiogram Lead Placement," presented during the 2020 Computing in Cardiology Conference in Rimini, Italy, and the development of an initial suite of electrophysiological analytics for the PURE EP System.

BioSig signed a 10-year collaboration agreement with Mayo Clinic in March 2017. In November 2019, the Company announced that it signed three new patent and know-how license agreements with the Mayo Foundation for Medical Education and Research.

About BioSig Technologies

BioSig Technologies is a medical technology company commercializing a proprietary biomedical signal processing platform designed to improve signal fidelity and uncover the full range of ECG and intra-cardiac signals (www.biosig.com).

The Company's first product, PURE EP System, is a computerized system intended for acquiring, digitizing, amplifying, filtering, measuring and calculating, displaying, recording and storing of electrocardiographic and intracardiac signals for patients undergoing electrophysiology (EP) procedures in an EP laboratory.

Forward-looking Statements

This press release contains forward-looking statements. Such statements may be preceded by the words "intends," "may," "will," "plans," "expects," "anticipates," "projects," "predicts," "estimates," "aims," "believes," "hopes," "potential" or similar words. Forward-looking statements are not guarantees of future performance, are based on certain assumptions and are subject to various known and unknown risks and uncertainties, many of which are beyond the Company's control, and cannot be predicted or quantified; consequently, actual results may differ materially from those expressed or implied by such forward-looking statements. Such risks and uncertainties include, without limitation, risks and uncertainties associated with (i) the geographic, social and economic impact of COVID-19 on our ability to conduct our business and raise capital in the future when needed, (ii) our inability to manufacture our products and product candidates on a commercial scale on our own, or in collaboration with third parties; (iii) difficulties in obtaining financing on commercially reasonable terms; (iv) changes in the size and nature of our competition; (v) loss of one or more key executives or scientists; and (vi) difficulties in securing regulatory approval to market our products and product candidates. More detailed information about the Company and the risk factors that may affect the realization of forward-looking statements is set forth in the Company's filings with the Securities and Exchange Commission ("SEC"), including the Company's Annual Report on Form 10-K and its Quarterly Reports on Form 10-Q. Investors and security holders are urged to read these documents free of charge on the SEC's website at http://www.sec.gov. The Company assumes no obligation to publicly update or revise its forward-looking statements as a result of new information, future events or otherwise.

[1] Artificial Intelligence in Healthcare Market with COVID-19 Impact Analysis by Offering, Technology, End-Use Application, End User and Region – Global Forecast to 2026; MarketsandMarkets

[2] Artificial Intelligence (AI): Healthcare's New Nervous System, https://www.accenture.com/us-en/insight-artificial-intelligence-healthcare

See the article here:
BioSig and Mayo Clinic Collaborate on New R&D Program to Develop Transformative AI and Machine Learning Technologies for its PURE EP System - BioSpace

When Are We Going to Start Designing AI With Purpose? Machine Learning Times – The Predictive Analytics Times

Originally published in UX Collective, Jan 19, 2021.

For an industry that prides itself on moving fast, the tech community has been remarkably slow to adapt to the differences of designing with AI. Machine learning is an intrinsically fuzzy science, yet when it inevitably returns unpredictable results, we tend to react like it's a puzzle to be solved, believing that with enough algorithmic brilliance, we can eventually fit all the pieces into place and render something approaching objective truth. But objectivity and truth are often far afield from the true promise of AI, as we'll soon discuss.

I think a lot of the confusion stems from language; in particular, the way we talk about machine-like efficiency. Machines are expected to make precise measurements about whatever they're pointed at; to produce data.

But machine learning doesn't produce data. Machine learning produces predictions about how observations in the present overlap with patterns from the past. In this way, it's literally an inversion of the classic if-this-then-that logic that's driven conventional software development for so long. My colleague Rick Barraza has a great way of describing the distinction:

To continue reading this article, click here.
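The excerpt ends there, but the contrast it sets up, hand-written if-this-then-that rules versus rules inferred from past observations, can be sketched in a few lines (the spam heuristic and data below are entirely hypothetical):

```python
# Hand-coded rule vs. rule inferred from labeled examples (synthetic data).
import numpy as np
from sklearn.tree import DecisionTreeClassifier


# Conventional software: the rule is written by hand.
def spam_rule(num_links: int) -> bool:
    return num_links > 5


# Machine learning: the rule is inferred from past observations.
rng = np.random.default_rng(0)
num_links = rng.integers(0, 12, size=200).reshape(-1, 1)
is_spam = (num_links.ravel() > 5).astype(int)  # past labeling behavior

model = DecisionTreeClassifier(max_depth=1).fit(num_links, is_spam)
print("learned prediction for 8 links:", bool(model.predict([[8]])[0]))
```

The first function states a rule; the second recovers one, and its output is a prediction that holds only as long as the future resembles the past.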

Read more:
When Are We Going to Start Designing AI With Purpose? Machine Learning Times - The Predictive Analytics Times

Learn in-demand technical skills in Python, machine learning, and more with this academy – The Next Web

Credit: Clément Hélardot/Unsplash

TLDR: With access to the Zenva Academy, users can take over 250 tech courses packed with real-world programming training to become knowledgeable and hirable professional coders.

The tech industry is expected to grow by as many as 13 million new jobs in the U.S. alone over the next five years, with another 20 million likely to spring up in the EU.

And you can rest assured that coding will be at the heart of almost every single one of those new positions.

It's no surprise that programming courses are being taught to our youngest students these days. From web development to gaming to data science, all the tech innovations we'll see over those next five years and beyond will come from innovators who understand how to make those static lines of code get together and dance.

If you feel behind the programming curve or just want a stockpile of tech training to have you ready for anything, the Zenva Academy ($139.99 for a one-year subscription) may be just the bootcamp you need to grab one of those new jobs.

This access unlocks everything in the Zenva Academy's vast archives, a collection of more than 250 courses that dive into every aspect of learning to build games, websites, apps and more.

With courses taught by knowledgeable industry professionals, even newbies coming in with zero experience receive world-class training on in-demand programming skills on their way to becoming professionals themselves. Classes are based entirely around your own schedule with no deadlines or due dates so you can work at your own pace on bolstering your abilities.

Whether a student is interested in crafting mobile apps, mastering data science, or exploring machine learning and AI, these courses don't just tell you how to work in these disciplines, they actually show you. Zenva coursework is based around creating real projects in tandem with the learning.

As you build a VR or AR app, craft your first artificial neural networks using Python and TensorFlow, or create an awesome game, you'll be building work for a professional portfolio that can help you land one of these prime coding positions. And with their ties to elite developer programs for outlets like Intel, Microsoft, and CompTIA, students can get on the fast track toward getting hired.

Regularly $169 for a year of Zenva Academy access, you can get it for only $139.99 for a limited time.

Prices are subject to change.


See more here:
Learn in-demand technical skills in Python, machine learning, and more with this academy - The Next Web

What is Machine Learning and its Uses? – Technotification

What is Machine Learning?

A useful way to introduce the machine learning methodology is by means of a comparison with the conventional engineering design flow.

This starts with an in-depth analysis of the problem domain, which culminates with the definition of a mathematical model. The mathematical model is meant to capture the key features of the problem under study and is typically the result of the work of a number of experts. The mathematical model is finally leveraged to derive hand-crafted solutions to the problem.

For instance, consider the problem of defining a chemical process to produce a given molecule. The conventional flow requires chemists to leverage their knowledge of models that predict the outcome of individual chemical reactions, in order to craft a sequence of suitable steps that synthesize the desired molecule. Another example is the design of speech translation or image/video compression algorithms. Both of these tasks involve the definition of models and algorithms by teams of experts, such as linguists, psychologists, and signal processing practitioners, not infrequently during the course of long standardization meetings.

The engineering design flow outlined above may be too costly and inefficient for problems in which faster or less expensive solutions are desirable. The machine learning alternative is to collect large data sets, e.g., of labeled speech, images, or videos, and to use this information to train general-purpose learning machines to carry out the desired task. While the standard engineering flow relies on domain knowledge and on design optimized for the problem at hand, machine learning lets large amounts of data dictate algorithms and solutions. To this end, rather than requiring a precise model of the setup under study, machine learning requires the specification of an objective, of a model to be trained, and of an optimization technique.

Returning to the first example above, a machine learning approach would proceed by training a general-purpose machine to predict the outcome of known chemical reactions based on a large data set, and by then using the trained algorithm to explore ways to produce more complex molecules. In a similar manner, large data sets of images or videos would be used to train a general-purpose algorithm with the aim of obtaining compressed representations from which the original input can be recovered with some distortion.
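A minimal sketch of those three ingredients, an objective (squared error), a model (a line), and an optimization technique (gradient descent), fitting a line to synthetic data:

```python
# Objective + model + optimizer: linear regression by gradient descent.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 0.5 + rng.normal(0, 0.1, size=100)  # unknown process to learn

w, b, lr = 0.0, 0.0, 0.1        # model parameters and learning rate
for _ in range(500):            # optimization loop
    pred = w * x + b            # model
    grad_w = 2 * np.mean((pred - y) * x)  # gradient of the squared-error objective
    grad_b = 2 * np.mean(pred - y)
    w, b = w - lr * grad_w, b - lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f} (true values: 3.0, 0.5)")
```

Nothing about the process is hand-crafted to this problem: swap in a different model and objective and the same recipe applies, which is exactly the generality the text describes.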

When to Use Machine Learning?

Based on the discussion above, machine learning can offer an efficient alternative to the conventional engineering flow when development cost and time are the main concerns, or when the problem appears to be too complex to be studied in its full generality. On the flip side, the approach has the key disadvantages of providing generally suboptimal performance, hindering interpretability of the solution, and applying only to a limited set of problems. In order to identify tasks for which machine learning methods may be useful, the following criteria are suggested:

1. the task involves a function that maps well-defined inputs to well-defined outputs;
2. large data sets exist or can be created containing input-output pairs;
3. the task provides clear feedback with clearly definable goals and metrics;
4. the task does not involve long chains of logic or reasoning that depend on diverse background knowledge or common sense;
5. the task does not require detailed explanations for how the decision was made;
6. the task has a tolerance for error and no need for provably correct or optimal solutions;
7. the phenomenon or function being learned should not change rapidly over time; and
8. no specialized dexterity, physical skills, or mobility is required.

These criteria are useful guidelines for the decision of whether the machine learning methods are suitable for a given task of interest. They also offer a convenient demarcation line between machine learning as is intended today, with its focus on training and computational statistics tools, and more general notions of Artificial Intelligence (AI) based on knowledge and common sense.

In short, machine learning is very useful and is advancing rapidly across programming and computing-related fields.

See original here:
What is Machine Learning and its Uses? - Technotification