Solving problems by working together: Could quantum computing hold the key to Covid-19? – ITProPortal

Given the enormous potential for quantum computing to change the way we forecast, model and understand the world, many are beginning to question whether it could have helped to better prepare us all for a global pandemic such as the Covid-19 crisis. Governments, organisations and the public are continuing the quest for answers about when this crisis will end and how we can find a way out of the current state of lockdown, and we are all continuing to learn through incremental and experimental steps. It certainly seems plausible that the high-powered simulation capabilities of this revolutionary technology could hold some of the answers and enable us to respond in a more coherent and impactful way.

Big investments have already been made in quantum computing, as countries and companies battle to create the first quantum supercomputer, so they can harness the power of this awesome technology. The World Economic Forum has also recognised the important role that this technology will play in our future, and has a dedicated Global Future Council to drive collaboration between public and private sector organisations engaged in the development of quantum computing. Although it's unlikely to result in any overnight miracles, it's understandable that many are thinking about whether these huge efforts and investments can be turned towards the mutual challenge we face in finding a solution to the Covid-19 pandemic.

There are already some ground-breaking use cases for quantum computing within the healthcare industry. Where in the past some scientific breakthroughs, such as the discovery of penicillin, came completely by accident, quantum computing puts scientists in a much stronger position to find what they are looking for, faster. Quantum raises capacity to such a high degree that it would be possible to model penicillin using just a third of the processing power a classical computer would require to do the job, meaning it can do more with less, at greater speed.

In the battle against Covid-19, the US Department of Energy's Oak Ridge National Laboratory (ORNL) is already using quantum supercomputers in its search for drug compounds that can treat the disease. IBM has also been using quantum supercomputers to run simulations on thousands of compounds to try and identify which of them is most likely to attach to the spike that Covid-19 uses to inject genetic material into healthy cells, and thereby prevent infection. It has already emerged with 77 promising drugs that are worth further investigation and development: progress that would have taken years if traditional computing power had been used.

Other businesses are likely to be keen to follow in the footsteps of these examples, and play their own part in dealing with the crisis, but to date it's only been the world's largest organisations that have been using quantum power. At present, many businesses simply don't have the skills and resources needed to fabricate, verify, architect and launch a large-scale quantum computer on their own.

It will be easier to overcome these barriers, and enable more organisations to start getting to work with quantum computing, if they open themselves up to collaboration with partners, rather than trying to go it alone. Instead of locking away their secrets, businesses must be willing to work within an open ecosystem; finding mutually beneficial partnerships will make it much more realistic to drive things forward.

The tech giants have made a lot of early progress with quantum, and partnering with them could prove extremely valuable. Google, for example, claims to have developed a machine that can solve a problem in 200 seconds that would take the world's fastest supercomputer 10,000 years: imagine adding that kind of firepower to your computing arsenal. Google, IBM and Microsoft have already got the ball rolling by creating their own quantum partner networks. IBM Q and the Microsoft Quantum Network bring together start-ups, universities, research labs, and Fortune 500 companies, enabling them to enjoy the benefits of exploring and learning together. The Google AI quantum initiative brings together strong academic support along with start-up collaboration on open source frameworks and tools in its lab. By collaborating in this manner, businesses can potentially play their own part in solving the Covid-19 crisis, or preventing future pandemics from doing as much damage.

Those that are leading the way in quantum computing are taking a collaborative approach, acknowledging that no one organisation holds all the answers or all the best ideas. This approach will prove particularly beneficial as we search for a solution to the Covid-19 crisis: it's in everyone's interests to find an exit from the global shutdown and build knowledge that means we are better prepared for future outbreaks.

Looking at the bigger picture, despite all the progress that is being made with quantum, traditional computing will still have an important role to play in the short to medium term. Strategically, it makes sense to have quantum as the exploratory left side of the brain, while traditional systems remain in place for key business-as-usual functions. If they can think about quantum-related work in this manner, businesses should begin to feel more comfortable making discoveries and breakthroughs together. This will allow them to speed up the time to market so that ideas can be explored, and new ground broken, much faster than ever before, and that's exactly what the world needs right now.

Kalyan Kumar, CVP & CTO, IT Services, HCL Technologies


Six things you need to learn about quantum computing in finance – eFinancialCareers

This will come as bad news if you're only just getting to grips with Python, but you should probably be thinking of adding quantum computing to your repertoire if you want to maintain your long-term employability in finance. Both Goldman Sachs and JPMorgan have been investigating the application of quantum computers to their businesses, and many say it's less a question of "if" than "when" quantum computing is more widely applied.

Both Google and IBM are competing for quantum leadership. Google declared that it had achieved 'quantum supremacy' last October, a claim promptly disputed by IBM, which said that Google's assertion was misleading. IBM itself now has 18 quantum computers that can be accessed via the cloud and that are already used by JPMorgan to set derivatives prices. In a new report*, IBM researchers including Daniel Egger, Claudio Gambella, Jakub Marecek, Scott McFaddin, and Martin Mevissen argue that this is just the start.

Over time, the researchers say, banks will use quantum computers for everything from calculating value at risk and liquidity coverage ratios to running simulations that enable more accurate calculations of net stable funding ratios and pricing of financial instruments. In preparation for this future, they suggest you familiarize yourself with the following six quantum algorithms.

1. The Variational Quantum Eigensolver

The Variational Quantum Eigensolver (VQE) is used for optimization applications. It harnesses energy states to calculate the function of the variables it needs to optimize and is useful when standard computers struggle due to the intensity of the computing required. In financial services, IBM says the VQE can be used in portfolio optimization. The only problem is that the number of qubits you need increases significantly with problem size.
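The core idea of the VQE can be mimicked classically: prepare a parameterized trial state, measure the energy expectation value, and let a classical optimizer adjust the parameters. The sketch below does this for a made-up single-qubit Hamiltonian; it is an illustration of the variational principle only, not IBM's implementation, and a real VQE would evaluate the energy on quantum hardware.

```python
import numpy as np

# Toy single-qubit Hamiltonian (hypothetical example): H = Z + 0.5*X
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
H = Z + 0.5 * X

def ansatz(theta):
    # Parameterized trial state |psi(theta)> = Ry(theta)|0>
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    # Expectation value <psi|H|psi> -- the quantity the VQE minimizes
    psi = ansatz(theta)
    return psi @ H @ psi

# Classical outer loop: sweep the parameter and keep the best value,
# standing in for the optimizer that would drive a quantum processor.
thetas = np.linspace(0, 2 * np.pi, 2000)
best = min(thetas, key=energy)
print(energy(best), np.linalg.eigvalsh(H)[0])  # variational vs exact ground energy
```

The variational estimate lands on the exact ground-state energy because the single-parameter ansatz happens to cover the true ground state; for larger problems the qubit count, as noted above, grows quickly.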

2. The Quantum Approximate Optimization Algorithm

The Quantum Approximate Optimization Algorithm (QAOA) is used to optimize combinatorial problems and to find solutions to problems with complex constraints. IBM says it can be combined with the VQE for portfolio optimization.

3. The Quantum Amplitude Estimator

The Quantum Amplitude Estimator (QAE) is used in simulations, optimizations and machine learning. It allows users to create simulation scenarios by estimating an unknown property in the style of the Monte Carlo method. Instead of simply sampling random distributions, the QAE can handle them directly, and this dramatically speeds up simulation time. In finance, it can be used for option pricing, portfolio risk calculations, issuance auctions, anti-money laundering operations and identifying fraud.
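The saving QAE offers is easiest to see against its classical baseline. The sketch below is plain Monte Carlo pricing of a toy option payoff (all distribution parameters are made up for illustration): its error shrinks like 1/sqrt(N) in the number of samples, whereas amplitude estimation converges like 1/N, which is the quadratic speed-up behind the "dramatically speeds up simulation time" claim.

```python
import numpy as np

# Classical Monte Carlo baseline for a toy option payoff E[max(S - K, 0)].
# QAE would reach a given accuracy with quadratically fewer queries; here
# we simply exhibit the classical 1/sqrt(N) behaviour it improves on.
rng = np.random.default_rng(0)

def mc_estimate(n):
    s = np.exp(rng.normal(0.0, 0.2, n))   # lognormal terminal price, S0 = K = 1
    return np.maximum(s - 1.0, 0.0).mean()

reference = mc_estimate(2_000_000)        # high-sample proxy for the true value
for n in (100, 10_000, 1_000_000):
    # Error typically drops ~10x for every 100x more samples (1/sqrt(N)).
    print(n, abs(mc_estimate(n) - reference))
```

A QAE-based pricer would reach the accuracy of the million-sample run with on the order of a thousand oracle calls, which is why option pricing appears first in the list of applications.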

4. Quantum Support Vector Machines

Quantum support vector machines (QSVM) apply supervised machine learning to high-dimensional problem sets. Used for financial forecasting, they map data into a 'quantum-enhanced feature space' that enables the separation of data points and improved forecast accuracy.
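The "feature space" idea has a simple classical analogue, sketched below: data that no straight line can separate becomes linearly separable once it is lifted into a richer space. A quantum feature map does the lifting with circuits in an exponentially large state space; here a hand-picked polynomial map plays the same role purely as an illustration.

```python
import numpy as np

# XOR-labelled points: famously not linearly separable in the raw 2D inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

def feature_map(x):
    # Lift (x1, x2) -> (x1, x2, x1*x2): the product term adds the
    # dimension in which the XOR classes separate.
    return np.array([x[0], x[1], x[0] * x[1]])

Z = np.array([feature_map(x) for x in X])

# In the lifted space a single plane splits the classes perfectly.
w, b = np.array([1.0, 1.0, -2.0]), -0.5
preds = (Z @ w + b > 0).astype(int)
print(preds, y)  # predictions match the XOR labels exactly
```

A QSVM replaces `feature_map` with a parameterized quantum circuit whose induced kernel is hard to compute classically; the separation step is otherwise the same.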

5. Harrow, Hassidim, and Lloyd

The Harrow, Hassidim, and Lloyd (HHL) algorithm is used for optimization and machine learning and enables better measurement of large linear systems by exponentially speeding up calculations. It can be used for credit scoring.

6. Quantum Semidefinite Programming

Quantum Semidefinite Programming (QSDP) is used to optimize a linear objective over a set of positive semi-definite matrices. It can be used for portfolio diversification and "exponentially" speeds up calculations when there are particular constraints.

As the financial services industry is subject to the combined demands of "sophisticated risk analysis, dynamic client management, constant updates to market volatility, and faster transaction speeds," IBM's researchers predict quantum algorithms are primed for take-off. Now might be a good time to start familiarizing yourself with how they work.

*Quantum computing for Finance: state of the art and future prospects




Cybersecurity in the quantum era – ETCIO.com

By Tirthankar Dutta

On October 23rd, 2019, Google claimed that it had achieved quantum supremacy by solving a particularly difficult problem in 200 seconds using its quantum computer, known as "Sycamore". This performance was compared with a supercomputer known as "Summit", built by IBM. According to Google, this classical computer would have taken 10,000 years to solve the same problem.

The advancement of large quantum computers, along with the additional computational power they will bring, could have dire consequences for cybersecurity. It is well known that important problems such as factoring, whose presumed hardness ensures the security of many widely used protocols (RSA, DSA, ECDSA), can be solved efficiently if a sufficiently large, "fault-tolerant" and universal quantum computer is developed. However, addressing the imminent risk that adversaries equipped with quantum technologies pose is not the only issue in cybersecurity where quantum technologies are bound to play a role.

Because quantum computing speeds up prime number factorization, computers enabled with that technology can easily break cryptographic keys by quickly calculating or exhaustively searching secret keys. A task considered computationally infeasible by a conventional computer becomes painfully easy, compromising existing cryptographic algorithms used across the board. In the future, even robust cryptographic algorithms will be substantially weakened by quantum computing, while others will no longer be secure at all.
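Why factoring breaks RSA can be shown with a deliberately tiny key. In the sketch below the attacker's fast factoring step is played by brute-force trial division, standing in for Shor's algorithm on a large quantum computer; real keys use primes of roughly a thousand bits, which makes this step infeasible classically but not quantumly.

```python
# Toy RSA with a tiny modulus, to show why fast factoring breaks the scheme.
p, q = 61, 53                 # secret primes; real keys use vastly larger ones
n, e = p * q, 17              # public key (n = 3233)
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)           # private exponent, kept secret

msg = 42
cipher = pow(msg, e, n)       # anyone can encrypt with the public key

# An attacker who can factor n recovers an equivalent private key at once.
# Trial division stands in for Shor's algorithm here.
f = next(i for i in range(2, n) if n % i == 0)
d_cracked = pow(e, -1, (f - 1) * (n // f - 1))
print(pow(cipher, d_cracked, n))  # recovers 42 without ever knowing d
```

The same structure underlies DSA and ECDSA via the related discrete-logarithm problem, which Shor's algorithm also solves efficiently; only the hard step the attacker must replace differs.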

There is plenty of disagreement about the need to change current cryptographic protocols and infrastructure to counter quantum technologies, but we can't deny the fact that future adversaries might use this kind of technology to their benefit, as it allows them to work on millions of computations in parallel, exponentially speeding up the time it takes to process a task.

According to the National Academies study, "the current quantum computers have very little processing power and are too error-prone to crack today's strong codes. The future code-breaking quantum computers would need 100,000 times more processing power and an error rate 100 times better than today's best quantum computers have achieved. The study does not predict how long these advances might take, but it did not expect them to happen within a decade."

But does this mean that we should wait and watch the evolution of quantum computing, or should we go back to our drawing board to create quantum-resistant cryptography? Thankfully, researchers have been working on public-key cryptography algorithms that can counter code-breaking efforts by quantum computers. The US National Institute of Standards and Technology (NIST) is evaluating 69 potential new methods for what it calls "post-quantum cryptography." The institution expects to have a draft standard by 2024, which would then be added to web browsers and other internet applications and systems.

No matter when dominant quantum computing arrives, it poses a large security threat. Because the process of adopting new standards can take years, it is wise to begin planning for quantum-resistant cryptography now.

The author is SVP and Head of Information Security at Infoedge.

DISCLAIMER: The views expressed are solely of the author and ETCIO.com does not necessarily subscribe to it. ETCIO.com shall not be responsible for any damage caused to any person/organisation directly or indirectly.


Finally we have teleportation for particles with a mass – InTallaght

Teleportation between photons has not been a novelty for a long time, but when it comes to massive particles everything becomes more complicated. Thanks to the strange rules of quantum entanglement, physicists believe they have found a method to teleport information between two electrons distant from each other.

The teleportation of information is not only the first step toward teleportation itself; it also has important applications in the development of quantum computing and in data encryption. The development of teleportation between electrons could allow the construction of quantum computers with an architecture more similar to current ones.

"We got evidence of an entanglement exchange, in which we created an entanglement bond between two electrons even though they hadn't interacted before, and we teleported information, a technique potentially useful for quantum computers," explains John Nichol of the University of Rochester, New York.

Teleport is a word that is part of the jargon of physics and serves to explain a very simple concept. When you buy a pair of shoes, even if you separate them you always know all the characteristics of both even if you cannot observe them directly. In a sense, the shoes are entangled.

Things get strange if you imagine that your shoe can be both right and left at the same time, at least until you look at it. When you look at it, it instantly assumes one of the two states, and the distant shoe becomes right or left accordingly.

This is the mechanism behind the idea of teleportation in physics.

Classical computers use binary logic: states are described by sequences of 0s and 1s. Quantum computers use qubits, which can take on both states simultaneously, providing possibilities that current technology cannot achieve.
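The difference can be made concrete with a few lines of linear algebra: a qubit is a two-component state vector, and putting several qubits in superposition gives amplitudes over every classical bit pattern at once. This is a bare-bones illustrative sketch, not a quantum simulator.

```python
import numpy as np

# A qubit as a state vector |psi> = a|0> + b|1> with |a|^2 + |b|^2 = 1;
# measurement yields 0 or 1 with probabilities |a|^2 and |b|^2.
plus = np.array([1.0, 1.0]) / np.sqrt(2)   # equal superposition of 0 and 1

# Three qubits live in a 2^3 = 8 dimensional space: one joint state
# carries amplitudes for all eight classical bit patterns simultaneously.
state = np.kron(np.kron(plus, plus), plus)
probs = np.abs(state) ** 2
print(len(state), probs.sum())  # 8 amplitudes, probabilities summing to 1
```

Each added qubit doubles the dimension of the state, which is where the "possibilities that current technology cannot achieve" come from.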

Using photons to teleport information is relatively easy and intuitive: they can be separated very quickly after they have been entangled, and it can even be done inside a chip. Separating massive particles is much more difficult, because during transport the mathematical purity of their quantum state could be lost and interference could arise.

"Individual electrons are very promising qubits because they interact very easily with each other, and making long-distance connections between them is essential for quantum computing," says Nichol.

To create this teleportation, scientists exploited some fundamental laws of subatomic physics. When two electrons share the same state, they must necessarily have opposite spins. Researchers had previously shown how this property can be manipulated without acting directly on the electrons, presenting itself as a method for teleportation.

Scientists have managed to achieve a spin exchange between a pair of electrons that had not interacted before. There is still a lot of work to be done to replace photons with electrons, as the latter are very difficult objects to control. But having convincing evidence of electron teleportation is an encouraging step.


Quantum Computing Market: In-Depth Market Research and Trends Analysis till 2030 – Cole of Duty

Prophecy Market Insights' Quantum Computing market research report provides a comprehensive, 360-degree analysis of the targeted market, which helps stakeholders to identify opportunities as well as challenges during the COVID-19 pandemic across the globe.

Quantum Computing Devices Market reports provide in-depth analysis of Top Players, Geography, End users, Applications, Competitor analysis, Revenue, Financial Analysis, Market Share, COVID-19 Analysis, Trends and Forecast 2020-2029. It incorporates a market evolution study, involving the current scenario, growth rate, and capacity inflation prospects, based on Porter's Five Forces and DROT analyses.

Get Sample Copy of This Report @ https://www.prophecymarketinsights.com/market_insight/Insight/request-sample/571

An executive summary provides the market's definition, application, overview, classifications, product specifications, manufacturing processes, raw materials, and cost structures.

Market Dynamics offers drivers, restraints, challenges, trends, and opportunities of the Quantum Computing market.

Detailed analysis of the COVID-19 impact will be given in the report, as our analyst and research associates are working hard to understand the impact of COVID-19 disaster on many corporations, sectors and help our clients in taking excellent business decisions. We acknowledge everyone who is doing their part in this financial and healthcare crisis.

Segment-Level Analysis in terms of types, product, geography, demography, etc., along with market size forecast.

Segmentation Overview:

The Quantum Computing research study comprises 100+ market data tables, graphs, figures and pie charts to support detailed analysis of the market. The predictions in the market report have been derived using proven research techniques, methodologies, and assumptions. This Quantum Computing market report states the market overview and historical data, along with the size, growth, share, demand, and revenue of the global industry.

Request Discount @ https://www.prophecymarketinsights.com/market_insight/Insight/request-discount/571

Regional and Country-level Analysis: different geographical areas are studied in depth, and an economic scenario is offered to support new entrants, leading market players, and investors in assessing emerging economies. The top producers and consumers focus on production, product capacity, value, consumption, growth opportunity, and market share in these key regions, covering

The comprehensive list of key market players comes along with their market overviews, product portfolios, key highlights, key financials, SWOT analyses, and business strategies. The report dedicatedly offers helpful solutions for players to increase their clients on a global scale and expand their favour significantly over the forecast period. The report also serves strategic decision-making solutions for the clients.

Competitive landscape Analysis provides mergers and acquisitions, collaborations along with new product launches, heat map analysis, and market presence and specificity analysis.

Quantum Computing Market Key Players:

Wave Systems Corp, 1QB Information Technologies Inc, QC Ware, Corp, Google Inc, QxBranch LLC, Microsoft Corporation, International Business Machines Corporation, Huawei Technologies Co., Ltd, ID Quantique SA, and Atos SE.

The study analyses the manufacturing and processing requirements, project funding, project cost, project economics, profit margins, predicted returns on investment, etc. With the tables and figures, the report provides key statistics on the state of the industry and is a valuable source of guidance and direction for companies and individuals interested in the market.

Stakeholders Benefit:

About us:

Prophecy Market Insights is a specialized market research, analytics, marketing/business strategy, and solutions firm that offers strategic and tactical support to clients for making well-informed business decisions and for identifying and achieving high-value opportunities in the target business area. We also help our clients to address business challenges and provide the best possible solutions to overcome them and transform their business.

Contact Us:

Mr Alex (Sales Manager)

Prophecy Market Insights

Phone: +1 860 531 2701

Email: [emailprotected]


Nuclear submarines, non-nuclear weapons and the search for strategic stability – The Strategist

The decision to deploy nuclear-powered ballistic missile submarines (SSBNs) in the years to come will be a product of the major paradigms and concepts used to manage nuclear dangers more broadly. Recently, an emerging literature has pointed to a change in the way that at least the major powers plan to mitigate nuclear threats to their interests. This shift in thinking can be summarised as involving a greater reliance on strategic non-nuclear weapons (weapons and enabling systems that can be used to compromise an adversary's nuclear forces using both kinetic and non-kinetic means that don't involve nuclear weapons) and a decreased commitment to mutual vulnerability as the basis of strategic stability between nuclear-armed adversaries.

Strategic non-nuclear weapons include ballistic missile defence, conventional precision-strike missiles, anti-satellite weapons and anti-submarine weapons. When combined with advances in enabling platforms and systems such as elements of cyber, artificial intelligence and quantum technology, they can, in principle, be used to compromise an adversary's nuclear capabilities, with serious implications for issues of deterrence and stability.

Traditional approaches to deterrence based on the threat of punishment now compete with policies based instead on deterrence by denial. Stability based on rational calculations under conditions of mutual vulnerability appears set to be even harder to maintain.

The potential for conventional counterforce strikes makes future scenarios involving 'use them or lose them' logic more likely for states that face adversaries armed with more sophisticated capabilities.

The current challenge to traditional nuclear deterrence relationships has a dual but paradoxical effect on the incentives to deploy sea-based nuclear weapons. In general, as missile silos (and even, over time, mobile land-based missiles), air fields, satellites, and command, control and communications stations become more vulnerable to counterforce attacks, the incentives to diversify a state's nuclear force structure increase. In particular, SSBNs still remain the most secure form of second-strike capability, meaning that the further spread of strategic non-nuclear weapons is likely to result in ever more nuclear weapons being deployed at sea.

On the other hand, one of the key technologies that falls under the banner of strategic non-nuclear weapons is anti-submarine weapons themselves, and much analysis now is focusing on whether advances in this area may in fact undermine the perceived invulnerability of SSBNs. It's important to note that growing concerns over the effects of new anti-submarine capabilities on strategic stability are, at least in part, based on projections about the future. Little serious analysis or commentary predicts that the oceans are going to become effectively transparent overnight. However, advances in sensing and signal processing in particular mean that it's a serious possibility that the oceans will become significantly more transparent than they are today. And when it comes to nuclear force structure planning, serious possibilities are enough to keep decision-makers up at night.

As the development of strategic non-nuclear weapons and the associated shift in thinking about stable deterrence based on mutual vulnerability continues, policymakers and analysts will need to give serious attention to what might become the new determinants of stability in the global nuclear order.

The development of countermeasures will play an important role in mitigating the destabilising effects of disruptive technological breakthroughs in anti-submarine weaponry. The role of countermeasures is already evident in other domains. For example, as a reaction to US missile defence, both China and Russia today are placing increasing emphasis on hypersonic missiles because their combination of speed and manoeuvrability makes them extraordinarily difficult to defend against.

Countermeasures for anti-submarine weapons need not rely on kinetic effects. The development both of ever quieter SSBNs with smaller acoustic signatures and of new techniques of deception (for example, unmanned underwater vehicles designed to produce tonals that match those of SSBNs that are thought to have been identified by an adversary) can increase a state's confidence that at least some of its SSBNs can remain undetected and uncompromised in a crisis.

Developments in anti-submarine weapons aimed at compromising SSBNs and developments in countermeasures aimed at mitigating those breakthroughs will take on a tit-for-tat dynamic in the years to come. This is not a new phenomenon, but as rapid increases in things such as sensing techniques and data processing allow for technological leaps in anti-submarine capabilities, countermeasures should be expected to take on a new and much greater importance.

Defensive measures for SSBNs aimed at increasing their reliability in the face of technological breakthroughs in anti-submarine weaponry are unlikely to solely rely on new technologies themselves. For example, James Holmes has suggested that both bastion strategies for SSBNs (vessels constricted to a much smaller, actively defended area for patrols) and SSBNs being accompanied by convoys of skirmisher-type defensive units (adopting a similar principle to aircraft carrier battle groups) may be necessary to regain confidence in the survivability of SSBNs.

Stability needs to be seen as the most important goal, and that will require a degree of what has been termed 'security dilemma sensibility' among the nuclear-armed powers. Leaders that develop security dilemma sensibility display an openness to the idea that, as Nicholas Wheeler has put it, an adversary is acting out of fear and insecurity and not aggressive intent, as well as a recognition that one's own actions have contributed to that fear.

For example, future Chinese breakthroughs on quantum computing and their application to SSBN communication technology could be a positive development in the US-China strategic relationship. The more confidence Beijing has in the security of its second-strike capability, the less likely it is that a crisis between the US and China will inadvertently escalate.

Beyond unilateral measures, it may be possible, over the longer term, to negotiate, and design, limited multilateral efforts aimed at restoring stability between adversaries, including in relation to sea-based nuclear deployments. History suggests that confidence-building measures can play as important a role as formal arms control measures in reducing nuclear dangers, meaning that finding avenues for dialogue, even at a low level, should now be a top priority.

In the short term, the increasing salience of strategic non-nuclear weapons and the abandonment of deterrence strategies based on mutual vulnerability are likely to continue to encourage states to deploy more SSBNs. Simultaneously, these forces will intensify the pressure to better protect SSBN fleets that are already deployed from technological breakthroughs in the anti-submarine weapons domain. Restraint in the deployment of anti-submarine capabilities may need to become a substitute for the more traditional tools used to instil stability in nuclear-armed relationships: restraint in defensive technology (such as missile defence) and negotiated limits on arms.

This piece was produced as part of the Indo-Pacific Strategy: Undersea Deterrence Project, undertaken by the ANU National Security College. This article is a shortened version of chapter 20, Strategic non-nuclear weapons, SSBNs, and the new search for strategic stability, as published in the 2020 edited volume The future of the undersea deterrent: a global survey. Support for this project was provided by a grant from Carnegie Corporation of New York.


Machine Learning Is Living in the Past – EnterpriseAI


Machine learning algorithms trained on large data sets have proven useful for spotting past patterns. Examples include stable environments like image databases or board games.

When it comes to messy, real-world data, however, those same ML algorithms often fall short, critics say, proving rigid and unable to adapt. "Machine learning algorithms perform remarkably poorly on time-series predictions," assert researchers at causalLens, a platform developer that offers real-time economic predictions.

Current machine learning platforms largely fail to provide time-series predictions because correlations that have held in the past may simply not continue to hold in the future, the London-based company notes. That's a particular problem in areas like finance and business, where time-series data types are ubiquitous.

Those correlations tend to be single data points, unsuited to capturing context or complex relationships. In one example, an algorithm is given access to a data set of dairy commodity prices to predict the price of cheese. The algorithm may seize on butter prices as a guide to predicting the cost of Limburger.

Eluding the algorithm is a fundamental fact about the cost of dairy products: the hidden common cause of price spikes for cheese and butter is the cost of milk. Therefore, a sudden change in the price of butter (driven by consumers' preference for olive oil, for instance) is unrelated to milk prices. Hence, the faulty correlation between butter and cheese can't be used to predict the latter's price.
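The hidden-common-cause trap can be reproduced in a few lines. The simulation below (all prices made up, not causalLens's model) shows butter and cheese correlating strongly through the shared milk driver, while a butter-only demand shock carries no information about cheese at all.

```python
import numpy as np

# Milk cost is a hidden common cause of butter and cheese prices; a model
# trained on the butter-cheese correlation alone breaks when butter moves
# for reasons unrelated to milk.
rng = np.random.default_rng(1)
n = 5000
milk = rng.normal(10.0, 1.0, n)
butter = 2.0 * milk + rng.normal(0.0, 0.5, n)
cheese = 3.0 * milk + rng.normal(0.0, 0.5, n)

# Strong butter-cheese correlation, driven entirely by the shared milk cost.
r_spurious = np.corrcoef(butter, cheese)[0, 1]

# A demand shock (say, a shift to olive oil) moves butter but not cheese:
shock = rng.normal(0.0, 2.0, n)
r_shock = np.corrcoef(shock, cheese)[0, 1]  # carries no signal about cheese
print(round(r_spurious, 3), round(r_shock, 3))
```

A causal model that encodes "milk drives both" would keep its cheese forecast intact under the butter shock, which is the behaviour a correlation-only learner cannot deliver.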

The company touts its causal AI framework as looking beyond correlations to learn obvious relationships and then propose plausible hypotheses about more obscure chains of causality, as it explained in a recent research bulletin. The approach allows data scientists to add domain knowledge and real-world context to improve predictive analytics.

Indeed, new open source libraries have emerged that seek to help data scientists and domain experts develop adaptable models based on causal relationships rather than data correlations alone. For example, the CausalNex library released earlier this year allows data dependencies to be expressed in network graphs that can be scanned by domain experts to eliminate spurious correlations in machine learning models.

CausalNex is its developers' second open source release, after Kedro, a library aimed at production ML code. The new library applies what-if analysis to Bayesian networks, on the assumption that a probabilistic model is more intuitive in describing causality than traditional ML frameworks based on correlation analysis and pattern recognition.

Causal AI proponents also argue their approach makes better use of data to come up with more accurate predictions through the framework's ability to simulate different scenarios.

"Conventional machine learning approaches are, quite literally, stuck in the past," the company concludes. "They are fooled by illusory patterns and are unable to quickly adapt to new conditions."


About the author: George Leopold

George Leopold has written about science and technology for more than 30 years, focusing on electronics and aerospace technology. He previously served as executive editor of Electronic Engineering Times. Leopold is the author of "Calculated Risk: The Supersonic Life and Times of Gus Grissom" (Purdue University Press, 2016).

Link:
Machine Learning Is Living in the Past - EnterpriseAI

My Invisalign app uses machine learning and facial recognition to sell the benefits of dental work – TechRepublic

Align Technology uses DevSecOps tactics to keep complex projects on track and align business and IT goals.


Align Technology's Chief Digital Officer Sreelakshmi Kolli is using machine learning and DevOps tactics to power the company's digital transformation.

Kolli led the cross-functional team that developed the latest version of the company's My Invisalign app. The app combines several technologies into one product including virtual reality, facial recognition, and machine learning. Kolli said that using a DevOps approach helped to keep this complex work on track.

"The feasibility and proof of concept phase gives us an understanding of how the technology drives revenue and/or customer experience," she said. "Modular architecture and microservices allows incremental feature delivery that reduces risk and allows for continuous delivery of innovation."

SEE: Research: Microservices bring faster application delivery and greater flexibility to enterprises (TechRepublic Premium)

The customer-facing app accomplishes several goals at once, the company said.

More than 7.5 million people have used the clear plastic molds to straighten their teeth, the company said. Align Technology has used data from these patients to train a machine learning algorithm that powers the visualization feature in the mobile app. The SmileView feature uses machine learning to predict what a person's smile will look like when the braces come off.

Kolli started with Align Technology as a software engineer in 2003. Now she leads an integrated software engineering group focused on product technology strategy and development of global consumer, customer and enterprise applications and infrastructure. This includes end user and cloud computing, voice and data networks and storage. She also led the company's global business transformation initiative to deliver platforms to support customer experience and to simplify business processes.

Kolli used the development process of the My Invisalign app as an opportunity to move the dev team to DevSecOps practices. Kolli said that this shift represents a cultural change, and making the transition requires a common understanding among all teams on what the approach means to the engineering lifecycle.

"Teams can make small incremental changes to get on the DevSecOps journey (instead of a large transformation initiative)," she said. "Investing in automation is also a must for continuous integration, continuous testing, continuous code analysis and vulnerability scans." To build the machine learning expertise required to improve and support the My Invisalign app, she has hired team members with that skill set and built up expertise internally.

"We continue to integrate data science to all applications to deliver great visualization experiences and quality outcomes," she said.

Align Technology uses Amazon Web Services to run its workloads.

The My Invisalign app accomplished several goals for the company: connecting patients with doctors, creating a new marketing tool with the SmileView feature, and evolving the software development process.

Kolli said that IT leaders should work closely with business leaders to make sure initiatives support business goals such as revenue growth, improved customer experience, or operational efficiencies, and modernize the IT operation as well.

"Making the line of connection between the technology tasks and agility to go to market helps build shared accountability to keep technical debt in control," she said.

Align Technology released the revamped app in late 2019. In May of this year, the company released a digital tool for doctors that combines a photo of the patient's face with their 3D Invisalign treatment plan.

This ClinCheck "In-Face" Visualization is designed to help doctors manage patient treatment plans.

The visualization workflow combines three components of Align's digital treatment platform: Invisalign Photo Uploader for patient photos, the iTero intraoral scanner to capture data needed for the 3D model of the patient's teeth, and ClinCheck Pro 6.0. ClinCheck Pro 6.0 allows doctors to modify treatment plans through 3D controls.

These new product releases are "the first in a series of innovations to reimagine the digital treatment planning process for doctors," Raj Pudipeddi, Align's chief innovation, product, and marketing officer and senior vice president, said in a press release about the product.


Here is the original post:
My Invisalign app uses machine learning and facial recognition to sell the benefits of dental work - TechRepublic

2 books to strengthen your command of python machine learning – TechTalks


This post is part of AI education, a series of posts that review and explore educational content on data science and machine learning. (In partnership with Paperspace)

Mastering machine learning is not easy, even if you're a crack programmer. I've seen many people come from a solid background of writing software in different domains (gaming, web, multimedia, etc.) thinking that adding machine learning to their roster of skills is another walk in the park. It's not. And every single one of them has been dismayed.

I see two reasons why the challenges of machine learning are misunderstood. First, as the name suggests, machine learning is software that learns by itself, as opposed to being instructed on every single rule by a developer. This is an oversimplification that many media outlets with little or no knowledge of the actual challenges of writing machine learning algorithms often use when speaking of the ML trade.

The second reason, in my opinion, is the many books and courses that promise to teach you the ins and outs of machine learning in a few hundred pages (and the ads on YouTube that promise to net you a machine learning job if you pass an online course). Now, I don't want to vilify any of those books and courses. I've reviewed several of them (and will review some more in the coming weeks), and I think they're invaluable sources for becoming a good machine learning developer.

But they're not enough. Machine learning requires both good coding and math skills and a deep understanding of various types of algorithms. If you're doing Python machine learning, you have to have in-depth knowledge of many libraries and also master the many programming and memory-management techniques of the language. And, contrary to what some people say, you can't escape the math.

And all of that can't be summed up in a few hundred pages. Rather than a single volume, the complete guide to machine learning would probably look like Donald Knuth's famous The Art of Computer Programming series.

So, what is all this tirade for? In my exploration of data science and machine learning, I'm always on the lookout for books that take a deep dive into topics that are skimmed over by the more general, all-encompassing books.

In this post, I'll look at Python for Data Analysis and Practical Statistics for Data Scientists, two books that will help deepen your command of the coding and math skills required to master Python machine learning and data science.

Python for Data Analysis, 2nd Edition, is written by Wes McKinney, the creator of pandas, one of the key libraries used in Python machine learning. Doing machine learning in Python involves loading and preprocessing data in pandas before feeding it to your models.

Most books and courses on machine learning provide an introduction to the main pandas components such as DataFrames and Series and some of the key functions such as loading data from CSV files and cleaning rows with missing data. But the power of pandas is much broader and deeper than what you see in a chapter's worth of code samples in most books.
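That introductory slice of pandas looks roughly like the following sketch, using a made-up inline CSV in place of a real data file:

```python
import io
import pandas as pd

# A tiny CSV with a missing value, standing in for a real data file on disk.
raw = io.StringIO(
    "product,price,region\n"
    "butter,2.10,north\n"
    "cheese,,north\n"
    "milk,1.05,south\n"
)

# Load into a DataFrame, as you would with pd.read_csv("prices.csv").
df = pd.read_csv(raw)

# Drop rows with missing data -- a typical first cleaning step.
clean = df.dropna()
assert len(clean) == 2

# Series operations: average price per region via groupby.
avg = clean.groupby("region")["price"].mean()
print(avg)
```

The book goes far beyond this, but nearly every pandas workflow starts with some variation of load, clean, aggregate.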

In Python for Data Analysis, McKinney takes you through the entire functionality of pandas and manages to do so without making it read like a reference manual. There are lots of interesting examples that build on top of each other and help you understand how the different functions of pandas tie in with each other. You'll go in-depth on things such as cleaning, joining, and visualizing data sets, topics that are usually only discussed briefly in most machine learning books.

You'll also get to explore some very important challenges, such as memory management and code optimization, which can become a big deal when you're handling very large data sets in machine learning (which you often do).

What I also like about the book is the finesse that has gone into choosing subjects to fit in the 500 pages. While most of the book is about pandas, McKinney has taken great care to complement it with material about other important Python libraries and topics. You'll get a good overview of array-oriented programming with numpy, another important Python library often used in machine learning in concert with pandas, and some important techniques in using Jupyter Notebooks, the tool of choice for many data scientists.
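As a small, hypothetical illustration of the array-oriented style numpy encourages: broadcasting replaces the explicit Python loops you would otherwise write over rows.

```python
import numpy as np

# Unit prices for three products (made-up numbers): butter, cheese, milk.
prices = np.array([2.10, 4.75, 1.05])

# Quantities sold, one row per store.
quantities = np.array([
    [10, 2, 30],   # store A
    [5,  8, 12],   # store B
])

# Broadcasting: the (3,) price vector is applied across each (2, 3) row
# in a single vectorized operation -- no explicit loop.
revenue_per_item = quantities * prices
revenue_per_store = revenue_per_item.sum(axis=1)
print(revenue_per_store)
```

The same computation with nested for-loops would be slower and noisier; thinking in whole arrays is the habit the book builds.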

All this said, don't expect Python for Data Analysis to be a very fun book. It can get boring because it just discusses working with data (which happens to be the most boring part of machine learning). There won't be any end-to-end examples where you'll get to see the result of training and using a machine learning algorithm or integrating your models in real applications.

My recommendation: You should probably pick up Python for Data Analysis after going through one of the introductory or advanced books on data science or machine learning. Having that introductory background on working with Python machine learning libraries will help you better grasp the techniques introduced in the book.

While Python for Data Analysis improves your data-processing and -manipulation coding skills, the second book we'll look at, Practical Statistics for Data Scientists, 2nd Edition, will be the perfect resource to deepen your understanding of the core mathematical logic behind many key algorithms and concepts that you often deal with when doing data science and machine learning.

The book starts with simple concepts such as different types of data, means and medians, standard deviations, and percentiles. Then it gradually takes you through more advanced concepts such as different types of distributions, sampling strategies, and significance testing. These are all concepts you have probably learned in math class or read about in data science and machine learning books.
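Those opening concepts map directly onto Python's standard library. A quick sketch with simulated data (no claim that the book uses these exact examples):

```python
import random
import statistics

random.seed(1)

# Simulated measurements drawn from a normal distribution (mean 100, sd 15).
sample = [random.gauss(100, 15) for _ in range(1000)]

mean = statistics.fmean(sample)
median = statistics.median(sample)
stdev = statistics.stdev(sample)

# quantiles(n=4) returns the three quartile cut points [Q1, Q2, Q3].
q1, _, q3 = statistics.quantiles(sample, n=4)

print(f"mean={mean:.1f} median={median:.1f} stdev={stdev:.1f} IQR=({q1:.1f}, {q3:.1f})")
```

The book's value is in explaining when each of these summaries is appropriate (and when, say, a median beats a mean), not just how to compute them.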

But again, the key here is specialization.

On the one hand, the depth that Practical Statistics for Data Scientists brings to each of these topics is greater than you'll find in machine learning books. On the other hand, every topic is introduced along with coding examples in Python and R, which makes it more suitable for data science work than classic statistics textbooks. Moreover, the authors have done a great job of disambiguating the way different terms are used in data science and other fields. Each topic is accompanied by a box that provides all the different synonyms for popular terms.

As you go deeper into the book, you'll dive into the mathematics of machine learning algorithms such as linear and logistic regression, K-nearest neighbors, trees and forests, and K-means clustering. In each case, like the rest of the book, there's more focus on what's happening under the algorithm's hood than on using it for applications. But the authors have again made sure the chapters don't read like classic math textbooks, and the formulas and equations are accompanied by nice coding examples.

Like Python for Data Analysis, Practical Statistics for Data Scientists can get a bit boring if you read it end to end. There are no exciting applications or a continuous process where you build your code through the chapters. But on the other hand, the book has been structured in a way that you can read any of the sections independently without the need to go through previous chapters.

My recommendation: Read Practical Statistics for Data Scientists after going through an introductory book on data science and machine learning. I definitely recommend reading the entire book once, though to make it more enjoyable, go topic by topic in-between your exploration of other machine learning courses. Also keep it handy. You'll probably revisit some of the chapters from time to time.

I would definitely count Python for Data Analysis and Practical Statistics for Data Scientists as two must-reads for anyone who is on the path of learning data science and machine learning. Although they might not be as exciting as some of the more practical books, you'll appreciate the depth they add to your coding and math skills.

Read the rest here:
2 books to strengthen your command of python machine learning - TechTalks

Machine Learning in Communication Market 2020 Trends, Growth Factors, Detailed Analysis and Forecast to 2024: Amazon, IBM, Microsoft, Google, Nextiva…

A research report on the Global Machine Learning in Communication Market provides the overall growth forces and present scenario of the Machine Learning in Communication industry. The research report also integrates significant insights for investors seeking to increase their market status in the past and upcoming industry scenario. In addition, the study extensively examines the numerous factors likely to influence the trend of the market over the forecast period. The global Machine Learning in Communication market report offers a holistic view of the industry along with the several factors limiting and driving the expansion of the global Machine Learning in Communication market. Similarly, to assess the complete market size, this study offers an accurate analysis of the market players' landscape and a corresponding detailed study of the manufacturers operating in the Machine Learning in Communication market. Furthermore, the Machine Learning in Communication industry report offers quantitative and qualitative evaluation which helps in understanding the past, current, and potential market scenario.

Request a sample of Machine Learning in Communication Market report @ https://www.orbisresearch.com/contacts/request-sample/4629300?utm_source=Bis

The global Machine Learning in Communication market report also covers present trends across various regions, with a number of opportunities available to service providers across each region. In addition, the study offers a concise overview of the manufacturing plans of the key companies, comprising an extensive analysis of manufacturing units, research & development capacity, and raw material suppliers. This report delivers a complete analysis of the industry segmentation and the growth factors impacting the market. The Machine Learning in Communication market study also provides other significant data such as cost structure, value chain analysis, and Porter's Five Forces analysis, which offer a market outlook.

Major companies of this report:

Amazon, IBM, Microsoft, Google, Nextiva, Nexmo, Twilio, Dialpad, Cisco, RingCentral

Browse the complete report @ https://www.orbisresearch.com/reports/index/global-machine-learning-in-communication-market-report-2020?utm_source=Bis

The global Machine Learning in Communication market report delivers a complete mapping of the providers operating in the Machine Learning in Communication market, with their market status based on business developments and product offerings, giving a complete picture of the market's competitive landscape. In addition, the research report focuses mainly on an expansive analysis of the strategic activities of the market players, such as mergers & acquisitions, partnerships, collaborations, and agreements, which offers a clear idea of their present market position. Similarly, the global Machine Learning in Communication market report covers the major economies of Asia Pacific, Europe, North America, and the Middle East and Africa.

Machine Learning in Communication Market Segmentation by Type:

Cloud-Based, On-Premise

Machine Learning in Communication Market Segmentation by Application:

Network Optimization, Predictive Maintenance, Virtual Assistants, Robotic Process Automation (RPA)

This Machine Learning in Communication market research study provides the business landscape of the prominent players with their revenue, industry overview, and product portfolio by segment and regional outlook. The report also covers a complete analysis of the major strategies adopted by service providers to increase their market footprint relative to competitors.

Major Points from Table of Content:
Section 1 Machine Learning in Communication Product Definition
Section 2 Global Machine Learning in Communication Market Manufacturer Share and Market Overview
Section 3 Manufacturer Machine Learning in Communication Business Introduction
Section 4 Global Machine Learning in Communication Market Segmentation (Region Level)
Section 5 Global Machine Learning in Communication Market Segmentation (Product Type Level)
Section 6 Global Machine Learning in Communication Market Segmentation (Industry Level)
Section 7 Global Machine Learning in Communication Market Segmentation (Channel Level)
Section 8 Machine Learning in Communication Market Forecast 2019-2024
Section 9 Machine Learning in Communication Segmentation Product Type
Section 10 Machine Learning in Communication Segmentation Industry
Section 11 Machine Learning in Communication Cost of Production Analysis

Make an enquiry of this report @ https://www.orbisresearch.com/contacts/enquiry-before-buying/4629300?utm_source=Bis

About Us:

Orbis Research (orbisresearch.com) is a single point of aid for all your market research requirements. We have a vast database of reports from leading publishers and authors across the globe. We specialize in delivering customized reports as per the requirements of our clients. We have complete information about our publishers and hence are sure about the accuracy of the industries and verticals of their specialization. This helps our clients map their needs, and we produce the perfect required market research study for our clients.

Contact Us:

Hector Costello
Senior Manager, Client Engagements
4144N Central Expressway, Suite 600, Dallas, Texas 75204, U.S.A.
Phone No.: +1 (972)-362-8199; +91 895 659 5155

View original post here:
Machine Learning in Communication Market 2020 Trends, Growth Factors, Detailed Analysis and Forecast to 2024: Amazon, IBM, Microsoft, Google, Nextiva...