Qualitest Acquires AI and Machine Learning Company AlgoTrace to Expand Its Offering – PRNewswire

LONDON, Dec. 12, 2019 /PRNewswire/ -- Qualitest, the world's largest software testing and quality assurance company, has acquired AI and machine learning company AlgoTrace for an undisclosed amount. This acquisition marks the first step of Qualitest's growth strategy following an investment from Bridgepoint earlier this year.

The acquisition will allow Qualitest to radically expand the number of AI-powered testing solutions available to clients, as well as develop its capabilities in helping companies test and launch new AI-powered solutions with greater confidence and speed. As software grows in complexity and the pressure to launch faster and more frequently increases, companies that do not use AI to enhance their quality assurance will, according to Gartner, be at a significant disadvantage.

AlgoTrace's machine learning tools help brands answer business critical questions as they launch new software: what, where, when, and how to test and in what order to ensure consistently high quality. With multiple clients already using Qualitest's suite of AI-testing tools, this expansion of capabilities creates opportunities not only for new Qualitest clients, but also allows for the growth of existing relationships with current customers around the world.

Qualitest began working with the AlgoTrace team more than a year ago, with AlgoTrace's AI platform powering Qualitest's market-leading test predictor tool, which applies pioneering autonomous AI capabilities and predictive modeling to unstructured data without the need for code or complex interfaces. Following multiple successful joint projects, the teams saw that, together, they would be able to apply AlgoTrace's powerful prediction engine in a variety of ways across the software development lifecycle to improve quality and speed to market.

Qualitest's AI-testing solutions have two main aims, both focused on increasing confidence and assurance. The first is to assist and enhance quality assurance efforts, rapidly giving brands a high level of confidence that software releases will go smoothly. The second is to help companies that use AI in their own offerings gain greater confidence that their AI algorithms are generating correct, unbiased results.

Ron Ritter, CEO at AlgoTrace, said: "We are thrilled to be joining with Qualitest. Following successful implementations with the company in the past, we have complete faith that we will help Qualitest change the testing paradigm forever, enhancing their quality engineering with machine learning. While there is a lot of hype surrounding AI, we're deploying real, hard-nosed and practical tools that significantly change the rules."

Norm Merritt, CEO of Qualitest, said: "Applying AI to quality engineering is a perfect fit. Just as software becomes increasingly complex, the companies producing it are under competitive pressure to increase the speed and frequency of their rollouts. AI is the only way companies can scale software testing and quality engineering, and the AlgoTrace team have shown that they understand this. In our view, companies that do not use AI to improve quality will be at a significant disadvantage."

Aviram Shotten, Chief Knowledge and Innovation Officer at Qualitest, said: "Ron and his team are just the kind of innovators we love: smart, customer-obsessed and attacking a big market problem with cutting edge technology. This acquisition will not only help us accelerate AI adoption within quality engineering by providing a holistic solution to our clients, it provides an avenue for our teams to access AlgoTrace's unique expertise to build new models, tools and solutions to improve how technology is developed, tested and deployed."

About Qualitest

Qualitest is the world's largest independent managed services provider of quality assurance and testing solutions. As a strategic partner, Qualitest helps brands move beyond functional testing to adopt new innovations such as automation, AI, and crowd-sourced UX testing. It leverages its domain expertise across industries, including financial services, media and entertainment, retail, consumer goods, technology, gaming, telecom, among others. Qualitest's global service delivery platform includes the United States, Israel, UK, India and Romania. To learn more about Qualitest, visit http://www.qualitestgroup.com.

About AlgoTrace

AlgoTrace was founded in 2016 as a data science company focused on building automated machine learning tools. It builds the tools data analysts and data scientists need to simplify and accelerate prediction modelling processes. Its software helps organizations make the right decisions based on clear inputs grounded in facts and patterns discovered by its prediction engine. Its mission is to empower data scientists and analysts to create accurate and stable prediction models faster than ever.

SOURCE Qualitest



Industry Call to Define Universal Open Standards for Machine Learning Operations and Governance – MarTech Series

Defining open standards is essential for deploying and governing machine learning models at scale for enterprise businesses

Cloudera, the enterprise data cloud company, asks for industry participation in defining universal open standards for machine learning operations (MLOps) and machine learning model governance. By contributing to these standards, the community can help companies make the most of their machine learning platforms and pave the way for the future of MLOps.

"Machine learning models are already part of almost every aspect of our lives, from automating internal processes to optimizing the design, creation, and marketing behind virtually every product consumed," said Nick Patience, founder and research vice president, software at 451 Research. "As ML proliferates, the management of those models becomes challenging, as they have to deal with issues such as model drift and repeatability that affect productivity, security and governance. The solution is to create a set of universal, open standards so that machine learning metadata definitions, monitoring, and operations become normalized, the way metadata and data governance are standardized for data pipelines."


"At Cloudera, we don't want to solve the challenge of deploying and governing machine learning models at scale only for our customers; we agree it needs to be addressed at the industry level. Apache Atlas is the best positioned framework to integrate data management and explainable, interoperable, and reproducible MLOps workflows," said Doug Cutting, chief architect at Cloudera. "The Apache Atlas project fits all the needs for defining ML metadata objects and governance standards. It is open source, extensible, and has pre-built governance features."

Industry Call for Standards

"Open source and open APIs have powered the growth of data science in business. But deploying and managing models in production is often difficult because of technology sprawl and siloing," said Peter Wang, CEO of Anaconda. "Open standards for ML operations can reduce the clutter of proprietary technologies and give businesses the agility to focus on innovation. We are very pleased to see Cloudera lead the charge for this important next step."

"As leaders in creating a machine learning oriented data strategy across our organization, we know what is required to address the challenges of deploying ML models into production at scale and building an ML-driven business," said Daniel Stahl, SVP model platforms at Regions Financial Corporation. "A fundamental set of model design principles enables the repeatable, transparent, and governed approaches necessary for scaling model development and deployment. We join Cloudera in calling for open industry standards for machine learning operations."


"At Santander, we focus on using machine learning to preemptively fight fraud and protect our customers," said Luan Vasconcelos Corumba, data science leader for fraud prevention at Santander Bank. "Because there are many different types of fraud across many channels, scaling and maintaining this effort requires dynamic approaches to monitoring and governing models with sometimes hundreds of features to check on an ongoing weekly basis. We endorse these standards because establishing and implementing open universal standards for our production ML workflows can not only help us better protect our customers but will also enable our teams to drive adoption and deliver cost-effective, accurate predictions continuously."



Schneider Electric Wins ‘AI/ Machine Learning Innovation’ and ‘Edge Project of the Year’ at the 2019 SDC Awards – PRNewswire

LONDON, Dec. 12, 2019 /PRNewswire/ -- Schneider Electric, the leader in digital transformation of energy management and automation, has today announced that it has won two categories at the 2019 SDC Awards: 'AI/Machine Learning Innovation of the Year' and 'Edge Project of the Year.'

"I'm delighted to accept these prestigious awards on behalf of Schneider Electric," said Marc Garner Vice President, Secure Power Division UK&I. "As the industry's next-generation data centre infrastructure management (DCIM) platform, EcoStruxure IT leverages AI and ML technologies to proactively prevent downtime in data centre and edge computing environments. The software also provides end-users and partners with increased visibility that streamlines servicing and improves both operational and energy efficiency, something, which was instrumental for the Wellcome Sanger Institute."

The award for 'AI/Machine Learning Innovation of the Year' was presented to Schneider Electric for their next-generation DCIM platform EcoStruxure IT, which brings secure, vendor agnostic, wherever-you-go monitoring for all IoT-enabled physical infrastructure assets. With the ability to integrate securely with other manufacturer applications, the software delivers complete visibility into today's data centre and edge environments, from anywhere, at any time and on any device via the cloud.

In collaboration with Elite Channel Partner EfficiencyIT (EIT), Schneider Electric was awarded a second accolade, 'Edge Project of the Year', for work completed for prestigious customer the Wellcome Sanger Institute. The Wellcome Sanger Institute is one of the world leaders in genomic research, and its research tackles some of the biggest questions across the greatest challenges in human disease - from cancer and malaria to measles and cholera.

Essential to the research function are the Institute's DNA sequencing machines, which produce terabytes of raw information each day. Due to the vast quantity of data, the criticality of local applications and the need for ultra-low latency, cloud hosting would present them with a number of complications and incur significant connectivity costs. The Institute, therefore, hosts Europe's largest, on-premise genomic data centre and uses its high-performance processing capabilities to store and analyse data in real-time.

Under the guidance of EIT, Sanger has deployed Schneider Electric's EcoStruxure IT to proactively manage the data centre, and to improve energy efficiency and resiliency. The campus has issues with power reliability, and any outage could result in loss of important genomic data and costly replacement of sequencing chemicals. Therefore, to protect the laboratory processes from downtime, the Institute has installed individual Schneider Electric Smart-UPS uninterruptible power supplies on each of its sequencers.

"EcoStruxure IT was selected due to its open-based architecture, which allows us to integrate with the technology already in place on campus, and because we considered it best-in-class for the Institute's requirements," said Simon Binley, Data Centre Manager, Wellcome Sanger Institute. "The platform provides us with increased visibility into the entire data centre environment and enables us to improve energy efficiency, meaning in time, more funding will be available for critical research that will benefit all of humankind."

To find out more about Schneider Electric's next generation DCIM platform EcoStruxure IT, please click here.

About Schneider Electric

At Schneider, we believe access to energy and digital is a basic human right. We empower all to make the most of their energy and resources, ensuring Life Is On everywhere, for everyone, at every moment.

We provide energy and automation digital solutions for efficiency and sustainability. We combine world-leading energy technologies, real-time automation, software and services into integrated solutions for Homes, Buildings, Data Centers, Infrastructure and Industries.

We are committed to unleash the infinite possibilities of an open, global, innovative community that is passionate about our Meaningful Purpose, Inclusive and Empowered values.

https://www.se.com/uk/en/


SOURCE Schneider Electric



IQVIA on the adoption of AI and machine learning – OutSourcing-Pharma.com

Artificial intelligence (AI) and machine learning (ML) have become central topics in the pharma industry in 2019. Greater levels of investment are being funneled in this direction and a greater number of partnerships have sprung up in these areas.

The potential in relation to the pharma industry has often centered on drug discovery. The technology has the potential to reduce the cost of developing a new drug, which has been estimated at approximately $2-3bn (€1.8-2.7bn).

As a result, a number of large pharma companies have signed partnership deals to unlock the promise of faster drug discovery, such as Pfizer's deal with CytoReason and Novo Nordisk's with e-therapeutics.

Wider than this, there is the potential to improve patient recruitment to clinical trials, another notorious stumbling block in drug development.

Outsourcing-Pharma (OSP) asked Yilian Yuan (YY), SVP of data science and advanced analytics at IQVIA, for analysis on how the pharma industry is approaching the opportunity provided by AI and ML so far, and how this is likely to develop over the coming years.

OSP: How would you characterize the pharma industry's adoption of AI, so far?

YY: In general, the pharma industry recognizes the value of AI/ML, and some pharma companies have made significant investments to build the infrastructure and talent pool necessary to bring AI/ML capabilities into the R&D and commercial sectors. However, implementation can be challenging. To overcome these challenges pharma companies should undertake the following steps:

Because of the many challenges, pharma has been slow to take up AI/ML. Extra effort will be needed for the industry to fully leverage AI/ML and to realize the positive impact on business and improved patient care.

OSP: How do you see further adoption of AI/ML for pharma in 2020 and beyond?

YY: I see more and more pharma companies taking various approaches to realize the value of AI/ML, and they fall into two categories:

We also see some companies taking a combination approach to get the benefits of both.

OSP: The industry generally has a reputation of being cautious when it comes to the adoption of new technologies - what are the dangers of this when applied to AI/ML?

YY: The development and commercialization of innovative treatment options for the market is costly and competitive. AI/ML can leverage real-world data to innovate clinical trial design and execution, e.g., smart patient recruitment and selecting sites that can quickly enroll patients. On the commercialization side, AI/ML enables proactive and precise engagements with health care providers and patients, and the ability to identify patients at high risk of exacerbation or of noncompliance with the trial regimen, which can trigger interventions by nurse educators.

Pharma companies that are slow to adopt AI/ML will be left behind in the race to bring new products to market and the right products to the right patients at the right time.

OSP: Are there any particular areas of drug discovery where the technology can have the most impact?

YY: There are many areas where AI/ML will have a positive impact on drug discovery:

OSP: Are there any noteworthy industries that are leading the way in using this technology? What can the pharma industry learn from them?

YY: The automotive industry faces fierce competition and has leveraged AI/ML to do precision marketing on its websites, with tailored messages and select products for potential buyers. Perhaps pharma can learn from them and use AI/ML to develop personalized medicine to improve patient care.

Many industries use robotic process automation to automate processes like finance systems, which pharma could do as well. Pharma is a heavily regulated industry with many reporting documents generated for clinical trials and for product usage and adverse events. These documents must be translated into many languages. Tech companies have developed auto translation services with the help of AI/ML. Existing auto translation with AI/ML can be enhanced with pharma vocabulary to cut down on the cost of translating documents into multiple languages and increase the speed of this type of work.

Yilian Yuan leads a team of data scientists, statisticians and research experts to help clients address a broad range of business and industry issues. Dr. Yuan has an extensive background in applying econometric and statistical modeling, predictive modeling and machine learning, discrete choice modeling and quantitative market research, combined with patient-level longitudinal data to provide actionable insights for pharma clients to improve business performance.


Bing: To Use Machine Learning; You Have To Be Okay With It Not Being Perfect – Search Engine Roundtable

Frédéric Dubut of Bing said on Twitter that if you do use machine learning in production, like Bing does, you have to be okay with the results not being perfect. And not all applications are okay with not being perfect, but I assume search can be okay with it.

Frédéric Dubut wrote "To use ML in production you also need to be comfortable with a model that *will* get it wrong occasionally." "There are many applications where the required precision is just 100% - and unfortunately quite a few of these are still using ML," he added. These applications that have to be perfect are often in the financial, health and other areas. Think about the software that helps the airplane fly or software that helps you transfer money from one place to another.


As you know, Bing uses a lot of machine learning in search - upwards of 90%.

Forum discussion at Twitter.


Machine Learning Answers: If Nvidia Stock Drops 10% A Week, What's The Chance It'll Recoup Its Losses In A Month? – Forbes

Jen-Hsun Huang, president and chief executive officer of Nvidia Corp., gestures as he speaks during the company's event at the 2019 Consumer Electronics Show (CES) in Las Vegas, Nevada, U.S., on Sunday, Jan. 6, 2019. CES showcases more than 4,500 exhibiting companies, including manufacturers, developers and suppliers of consumer technology hardware, content, technology delivery systems and more. Photographer: David Paul Morris/Bloomberg

We found that if Nvidia stock drops 10% or more in a week (5 trading days), there is a solid 36% chance it'll recover 10% or more over the next month (about 20 trading days).

Nvidia stock has seen significant volatility this year. While the company has been impacted by the broader correction in the semiconductor space and the trade war between the U.S. and China, the stock is being supported by a strong long-term outlook for GPU demand amid growing applications in Deep Learning and Artificial Intelligence.

Considering the recent price swings, we started with a simple question that investors could be asking about Nvidia stock: given a certain drop or rise, say a 10% drop in a week, what should we expect for the next week? Is it very likely that the stock will recover the next week? What about the next month or a quarter? You can test a variety of scenarios on the Trefis Machine Learning Engine to calculate, if Nvidia stock dropped, what's the chance it'll rise.

For example, after a 5% drop over a week (5 trading days), the Trefis machine learning engine says chances of an additional 5% drop over the next month are about 40%. Quite significant, and helpful to know for someone trying to recover from a loss. Knowing what to expect for almost any scenario is powerful. It can help you avoid rash moves. Given the recent volatility in the market and the mix of macroeconomic events (including the trade war with China and interest rate easing by the U.S. Fed), we think investors can prepare better.
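As a rough sketch of how such conditional odds can be estimated (our own illustration, not the Trefis engine itself), one can scan a history of daily closing prices, find every window with a weekly drop of a given size, and count how often a rebound of a given size follows within the next month. The helper and the synthetic price series below are hypothetical stand-ins; real Nvidia closing prices would be needed to reproduce the article's figures.

```python
import numpy as np
import pandas as pd

def conditional_recovery_probability(closes, drop=-0.05, rebound=0.05,
                                     lookback=5, horizon=20):
    # Empirical estimate of P(return over the next `horizon` days >= rebound,
    # given the return over the previous `lookback` days <= drop).
    closes = pd.Series(closes, dtype=float)
    past = closes.pct_change(lookback)               # trailing 1-week return
    future = closes.shift(-horizon) / closes - 1.0   # forward 1-month return
    valid = (past <= drop) & future.notna()          # days following a big enough drop
    if valid.sum() == 0:
        return float("nan")
    return float((future[valid] >= rebound).mean())

# Synthetic prices stand in for real NVDA closes (hypothetical data).
rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0.0005, 0.03, 2500)))
print(conditional_recovery_probability(prices, drop=-0.05, rebound=0.05))
```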

Below, we also discuss a few scenarios and answer common investor questions:

Question 1: Does a rise in Nvidia stock become more likely after a drop?

Answer:

Not really.

Specifically, chances of a 5% rise in Nvidia stock over the next month:

= 40% after Nvidia stock drops by 5% in a week.

versus,

= 44.5% after Nvidia stock rises by 5% in a week.

Question 2: What about the other way around, does a drop in Nvidia stock become more likely after a rise?

Answer:

No.

Specifically, chances of a 5% decline in Nvidia stock over the next month:

= 40% after Nvidia stock drops by 5% in a week

versus,

= 27% after Nvidia stock rises by 5% in a week

Question 3: Does patience pay?

Answer:

According to data and the Trefis machine learning engine's calculations, largely yes!

Given a drop of 5% in Nvidia stock over a week (5 trading days), while there is only about a 28% chance that Nvidia stock will gain 5% over the subsequent week, there is a more than 58% chance this will happen within 6 months.

The table below shows the trend:


Question 4: What about the possibility of a drop after a rise if you wait for a while?

Answer:

After seeing a rise of 5% over 5 days, the chances of a 5% drop in Nvidia stock are about 30% over the subsequent quarter of waiting (60 trading days). However, this chance drops slightly to about 29% when the waiting period is a year (250 trading days).



Government invests €49m in data analytics, machine learning and AI – Business World

Minister for Business, Enterprise and Innovation, Heather Humphreys and Minister for Training, Skills, Innovation, Research and Development, John Halligan today announced a Government investment of €49 million through Science Foundation Ireland in the Insight SFI Research Centre for Data Analytics.

This Government investment will secure a further €100 million from industry and other international sources, such as the European Union, over the next six years to further harness the power of data analytics, machine learning and artificial intelligence (AI).

Through this new investment, Insight will continue its research via a set of three demonstrator projects under the themes - Augmented Human, Smart Enterprise and Sustainable Societies. In addition, it will significantly expand its Education and Outreach Programme, including a new Citizen Science initiative.

Insight was established in 2013 through an initial SFI investment of €43 million and has delivered an economic impact of €593m to the Irish economy. For every €1 of state investment, €5.54 is returned to the economy on an overall leveraged basis.

This funding was supplemented by €63 million from EU sources and industry. That means for every €1 of SFI funding, another €1.46 in additional investment has come from those other sources. During this period Insight has produced over 2,000 publications, trained 184 postdoctoral graduates and established 11 spin-out companies, and with this new funding will continue to develop these outputs.

Insight was established in 2013 and is hosted at four higher education institutions - Dublin City University, National University of Ireland Galway, University College Cork and University College Dublin - and works in partnership with Maynooth University, Trinity College Dublin, Tyndall National Institute and University of Limerick.

Commenting on the announcement, Science Foundation Ireland's Director General and Chief Scientific Adviser to the Government of Ireland, Professor Mark Ferguson said, "Insight's research is equipping indigenous Irish companies to harness the power of data analytics, machine learning and AI to become more competitive and open new markets. The SFI Research Centres continue to attract and retain multinational organisations who want to conduct high value research in Ireland. Centres like Insight are seeding the next generation of world class innovators in our universities."

Minister Humphreys added, "Many traditional job roles are changing, and with Brexit and other international challenges on the horizon, we must continue to plan ahead, focus on what is within our control domestically and be the masters of our own destiny. Insight is playing an important role in our plans to prepare now for tomorrow's world by keeping Ireland at the cutting edge of innovation in this important sector."

Source: http://www.businessworld.ie


Taking UX and finance security to the next level with IBM’s machine learning – The Paypers


Machine learning is a technology that has been with us for some time now. Sometimes understated or used just as a buzzword, we cannot deny its impact and benefits on human life.

From personal assistants and social media advertising services to medical diagnosis, image processing, and financial prediction, this innovative technology impacts our everyday life and supports business decisions for some of the world's leading companies. For instance, machine learning (ML) solutions could assist financial services institutions to predict financial transaction fraud or the outcomes of investments. Furthermore, banks can apply machine learning models to create targeted upselling and cross-selling marketing campaigns.

Usually, the common ML techniques applied involve dealing with large amounts of data that need to be shared and prepared before the actual learning phase. However, compliance with privacy laws (e.g. GDPR in Europe, the Personal Data Protection Bill in India, etc.) requires that most of the data and the computation be kept in a secure environment, usually in-house, and not outsourced to cloud or multi-tenant shared environments.

At the beginning of October 2019, IBM scientists published a paper demonstrating how homomorphic encryption (HE) enabled a bank to run machine learning algorithms on its sensitive client data while keeping it encrypted.

Towards a homomorphic machine learning Big Data pipeline in finance

As data management and data protection are top concerns for financial institutions, The Paypers has been closely watching this space and has spoken with Flavio Bergamaschi, IBM Senior Research Scientist and one of the scientists behind IBM's pilot, to find out more about the research.

"Imagine what you could do if you could compute on encrypted data without ever decrypting it." This was the message that dominated Flavio's presentation and opened a whole spectrum of possibilities - new scenarios about what we can do today, or are not even considering doing, because we cannot share information.

Broadly speaking, homomorphic encryption (HE) enables us to do the processing of the data without giving access to the data, and this is technically done by computing on encrypted data. The technology promises to generally transform and disrupt how business is currently done in many industries such as, but not limited to, healthcare, medical sciences, and finance.

His explanation recalled an interview that we had in May 2019 with Michael Osborne, a Security Researcher at IBM's Zurich Research Laboratory, one year after GDPR was passed in Europe. Back then, Michael agreed that banks are left with a dilemma: on the one hand, if they do not have sufficient technologies for fraud detection they can be fined and, on the other hand, if they do it in such a way that there is a breach and there is a kind of a risk to data subjects, they can again be fined under GDPR law. So, at the end of the day, it's all about how you can do it; but IBM researchers solved this puzzle, as homomorphic encryption (HE) allows us to resolve the paradox of 'need to know' vs. 'need to share'.

The beginnings of homomorphic encryption

The first fully homomorphic encryption scheme was invented in 2009 by Craig Gentry. Going through the chronology of HE, Flavio explained that Gentry's invention described an encryption scheme that supports both multiplication and addition operations, which can be used to perform arbitrary computation. Before this technology was developed, one could do either one or the other, but not both. But how long did it take to do one multiplication of one bit back in 2009? Flavio's reply was disappointing: the performance was so poor that the technology was branded as "not in my lifetime". However, 10 years later and after many algorithmic improvements, the performance today is very adequate for many use cases where keeping the privacy and confidentiality of the data is paramount.

When it comes to real life applications, the engineering team started developing use cases for genomics (finding similarities between two genomic sequences, predicting a genetic predisposition to a specific condition or disease), oblivious queries (performing queries without revealing the query data), private set intersections (finding intersections of data without revealing anything more than the intersection), and prediction models for finance (investments, risk score determination).

In 2019, IBM managed to significantly reduce the runtime of homomorphic computation, making it orders of magnitude faster than was previously believed possible.

How computing is done today

To stress the breakthrough of the research, Flavio demonstrated how computing is done today using a diagram that involves data exchange between two entities, Alice and Bob, plus Eve trying to eavesdrop on the communication.

When Alice needs some service from an entity we call Bob, she will encrypt the data while it is in storage or in transmission, to prevent Eve from grabbing unprotected data. Still, even if Eve steals that data, she will only get it in encrypted form. But Bob needs to decrypt the data in order to do anything with it.

I guess I seemed a bit puzzled by his diagram, so Flavio came up with a real-life example: when you buy something from an online shopping site you send your credit card details, and, most of the time, the details go to the site through an encrypted channel. But when they get to the destination, the service needs to decrypt that info to process your order. This is the 'honest but curious' threat model: it is honest because the service does what it proposes to do for you, i.e. process a payment/transaction, but it is curious because it wants to learn/extract information from your data.

With homomorphic encryption this model is changed because now the entity that provides the service, Bob, not only cannot see the data, but he doesn't have the ability to decrypt that data either, because he doesn't have the key. Nevertheless, he can still compute on that data and provide the service that he proposed to provide.
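As a toy illustration of that property (our own sketch, using textbook RSA's multiplicative homomorphism rather than the lattice-based fully homomorphic schemes IBM actually uses), "Bob" can multiply two encrypted numbers without ever holding the decryption key; fully homomorphic encryption extends this to both addition and multiplication, which is what enables arbitrary computation:

```python
# Toy sketch only: textbook RSA is multiplicatively homomorphic, so "Bob" can
# multiply two encrypted values without ever seeing them or holding the key.
# Real FHE schemes also support addition, enabling arbitrary computation.
p, q = 61, 53                    # tiny demo primes (never use in practice)
n = p * q                        # public modulus
phi = (p - 1) * (q - 1)
e = 17                           # public exponent
d = pow(e, -1, phi)              # private exponent (Python 3.8+)

def encrypt(m):                  # anyone can encrypt with the public key (n, e)
    return pow(m, e, n)

def decrypt(c):                  # only the key holder ("Alice") can decrypt
    return pow(c, d, n)

a, b = 7, 6
ca, cb = encrypt(a), encrypt(b)
c_product = (ca * cb) % n        # Bob computes on ciphertexts only
assert decrypt(c_product) == (a * b) % n
print(decrypt(c_product))        # 42, computed without Bob ever seeing 7 or 6
```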

Shift in the security paradigm

Both Flavio and I agreed that security is crucial and protecting data privacy has become a major concern for users, and companies need to be careful when handling data.

"Before homomorphic encryption was discovered, you would first implement the business logic of the application, and then the security team would build walls around it, to protect it. Data would be encrypted for storage on the disc or when it was transmitted, but would have to be decrypted whenever you needed to do something with it," he added.

Homomorphic encryption changed the picture because now the cryptography is entangled with the business logic, and we can have the data always encrypted: at rest/in storage, in transmission, and even while we are computing on it.

The finance opportunity

Financial organisations have so many different departments. For instance, a bank could have a retail banking arm, loans, investments, insurance, health insurance, etc. This translates into a lot of information which, due to privacy legislation such as GDPR, and antitrust or anti-competitive business legislation, may not be combined by analysts in the clear, as there is too much risk of data exfiltration and leaks. If all this data is encrypted and computation can still be performed without accessing it, there is less risk if the data leaks, because it is encrypted. Without ever accessing the data in the clear, the machine can perform computations such as running models to analyse and predict data for marketing, fraud detection, loans, and the financial health of the account holder, and be able to offer services.

By using HE-encrypted models and data, the IBM team demonstrated the feasibility of predicting whether a customer might need a personal loan soon, enabling targeted marketing. "Typically, this is done behind a firewall in a segregated environment," Flavio explained, limiting a bank to only using machine learning tools and resources built or installed in-house. Homomorphic encryption can successfully be used to protect the privacy and confidentiality of data used both in the creation of predictive models and in running predictions, theoretically freeing the bank to safely outsource sensitive data to a hybrid and/or public cloud for analysis with peace of mind.

Finally, I was fully convinced. Let's say you are looking to make an investment with a bank, and you want to do so without revealing to your bank what sort of volumes you might want to invest. In this case, the bank could deploy machine learning models on your encrypted data to predict the risk or returns of the investment, and make you an offer, which you might accept or not.

I would like to thank Flavio and the whole IBM team for an insightful presentation on homomorphic encryption, and what better way to conclude than to quote him: "Imagine what you could do if you could compute on encrypted data without ever decrypting it." Feel free to share your thoughts with us at mirelac@thepaypers.com.

About Flavio Bergamaschi

Flavio Bergamaschi is a Senior Research Scientist and currently the leader of the group developing IBM's Fully Homomorphic Encryption (FHE) technology for robustness, serviceability and usability, and designing and developing real world FHE applications. He also represents IBM in the industry-wide homomorphic encryption standards.

His areas of expertise include cryptography, distributed systems (MIMD & SIMD), signal processing and machine learning.

About Mirela Ciobanu

Mirela Ciobanu is a Senior Editor at The Paypers and has been actively involved in covering digital payments and related topics, especially in the cryptocurrency, online security and fraud prevention space. She is passionate about finding the latest news on data breaches, machine learning, digital identity, and blockchain, and she is an active advocate of the need to keep our online data/presence protected. Mirela has a bachelor's degree in English and holds a Master's degree in Marketing.


Quantum computing leaps ahead in 2019 with new power and speed – CNET

A close-up view of the IBM Q quantum computer. The processor is in the silver-colored cylinder.

Quantum computers are getting a lot more real. No, you won't be playing Call of Duty on one anytime soon. But Google, Amazon, Microsoft, Rigetti Computing and IBM all made important advances in 2019 that could help bring computers governed by the weird laws of atomic-scale physics into your life in other ways.

Google's declaration of quantum supremacy was the most headline-grabbing moment in the field. The achievement -- more limited than the grand term might suggest -- demonstrated that quantum computers could someday tackle computing problems beyond the reach of conventional "classical" computers.

Proving quantum computing progress is crucial. We're still several breakthroughs away from realizing the full vision of quantum computing. Qubits, the tiny stores of data that quantum computers use, need to be improved. So do the finicky control systems used to program and read quantum computer results. Still, today's results help justify tomorrow's research funding to sustain the technology when the flashes of hype inevitably fizzle.


Quantum computers will live in data centers, not on your desk, when they're commercialized. They'll still be able to improve many aspects of your life, though. Money in your retirement account might grow a little faster and your packages might be delivered a little sooner as quantum computers find new ways to optimize businesses. Your electric-car battery might be a little lighter and new drugs might help you live a little longer after quantum computers unlock new molecular-level designs. Traffic may be a little lighter from better simulations.

But Google's quantum supremacy step was just one of many needed to fulfill quantum computing's promise.

"We're going to get there in cycles. We're going to have a lot of dark ages in which nothing happens for a long time," said Forrester analyst Brian Hopkins. "One day that new thing will really change the world."

Among the developments in 2019:

Classical computers, which include everything from today's smartwatches to supercomputers that occupy entire buildings, store data as bits that represent either a 1 or a 0. Quantum computers use a different approach called qubits that can represent a combination of 1 and 0 through an idea called superposition.

Ford and Microsoft adapted a quantum computing traffic simulation to run on a classical computer. The result: a traffic routing algorithm that could cut Seattle traffic congestion by 73%.

The states of multiple qubits can be linked, letting quantum computers explore lots of possible solutions to a problem at once. With each new qubit added, a quantum computer can explore double the number of possible solutions, an exponential increase not possible with classical machines.
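A minimal NumPy sketch (an illustration of the arithmetic above, not how real quantum hardware is programmed) shows why each added qubit doubles the classical description: an n-qubit register in equal superposition is a vector of 2^n amplitudes.

```python
import numpy as np

def uniform_superposition(n_qubits):
    # State vector after putting each of n qubits into an equal superposition
    # (a Hadamard gate on every qubit of |00...0>): 2**n equal amplitudes.
    dim = 2 ** n_qubits
    return np.full(dim, 1.0 / np.sqrt(dim), dtype=complex)

for n in range(1, 6):
    state = uniform_superposition(n)
    print(f"{n} qubit(s): {state.size} basis states, "
          f"each with measurement probability {abs(state[0]) ** 2:.4f}")
# Each extra qubit doubles the number of amplitudes a classical simulator
# must track - the exponential growth described above.
```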

Quantum computers, however, are finicky. It's hard to get qubits to remain stable long enough to return useful results. The act of communicating with qubits can perturb them. Engineers hope to add error correction techniques so quantum computers can tackle a much broader range of problems.

Plenty of people are quantum computing skeptics. Even some fans of the technology acknowledge we're years away from high-powered quantum computers. But already, quantum computing is a real business. Samsung, Daimler, Honda, JP Morgan Chase and Barclays are all quantum computing customers. Spending on quantum computers should reach hundreds of millions of dollars in the 2020s, and tens of billions in the 2030s, according to forecasts by Deloitte, a consultancy. China, Europe, the United States and Japan have sunk billions of dollars into investment plans. Ford and Microsoft say traffic simulation technology for quantum computers, adapted to run on classical machines, already is showing utility.

Right now quantum computers are used mostly in research. But applications with mainstream results are likely coming. The power of quantum computers is expected to allow for the creation of new materials, chemical processes and medicines by giving insight into the physics of molecules. Quantum computers will also help for greater optimization of financial investments, delivery routes and flights by crunching the numbers in situations with a large number of possible courses of action.

They'll also be used for cracking today's encryption, an idea spy agencies love, even if you might be concerned about losing your privacy or some snoop getting your password. New cryptography adapted for a quantum computing future is already underway.

Another promising application is artificial intelligence, though that may be years in the future.

"Eventually we'll be able to reinvent machine learning," Forrester's Hopkinssaid. But it'll take years of steady work in quantum computing beyond the progress of 2019. "The transformative benefits are real and big, but they are still more sci-fi and theory than they are reality."


Quantum computing will be the smartphone of the 2020s, says Bank of America strategist – MarketWatch

When asked what invention will be as revolutionary in the 2020s as smartphones were in the 2010s, Bank of America strategist Haim Israel said, without hesitation, quantum computing.

At the bank's annual year-ahead event last week in New York, Israel qualified his prediction, arguing in an interview with MarketWatch that the timing of the smartphone's arrival on the scene in the mid-2000s, and its massive impact on the American business landscape in the 2010s, doesn't line up neatly with quantum-computing breakthroughs, which are only now being seen, just a few weeks before the start of the 2020s.

The iPhone had already debuted in 2007, enabling its real impact to be felt in the 2010s, he said, while the first business applications for quantum computing won't be seen until toward the end of the coming decade.

But, Israel argued, when all is said and done, quantum computing could be an even more radical technology in terms of its impact on businesses than the smartphone has been. This is going to be a revolution, he said.

Quantum computing is a nascent technology based on quantum theory in physics which explains the behavior of particles at the subatomic level, and states that until observed these particles can exist in different places at the same time. While normal computers store information in ones and zeros, quantum computers are not limited by the binary nature of current data processing and so can provide exponentially more computing power.

"Quantum things can be in multiple places at the same time," Chris Monroe, a University of Maryland physicist and founder of IonQ, told the Associated Press. "The rules are very simple, they're just confounding."

In October, Alphabet Inc. (GOOG) subsidiary Google claimed to have achieved a breakthrough by using a quantum computer to complete a calculation in 200 seconds on a 53-qubit quantum computing chip, a task it calculated would take the fastest current supercomputer 10,000 years. Earlier this month, Amazon.com Inc. (AMZN) announced its intention to collaborate with experts to develop quantum computing technologies that can be used in conjunction with its cloud computing services. International Business Machines Corp. (IBM) and Microsoft Corp. (MSFT) are also developing quantum computing technology.
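One back-of-the-envelope way to see the gap behind such claims (our own arithmetic, not from the article): merely storing the full state of an n-qubit register on a classical machine, at 16 bytes per complex amplitude, quickly becomes infeasible.

```python
def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    # Memory to hold a full n-qubit state vector classically, assuming one
    # complex128 (16-byte) amplitude per basis state.
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 53):
    gib = state_vector_bytes(n) / 2 ** 30
    print(f"{n} qubits -> {gib:,.0f} GiB")
# 30 qubits fit on a laptop (16 GiB); 53 qubits need about 134 million GiB
# (roughly 128 PiB), far beyond any single classical machine.
```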

Israel argued these tools will revolutionize several industries, including health care, the internet of things and cyber security. He said that pharmaceutical companies are most likely to be the first commercial users of these devices, given the explosion of data created by health care research.

"Pharma companies are right now subject to Moore's law in reverse," he said. They are seeing the cost of drug development doubling every nine years, as the amount of data on the human body becomes ever more onerous to process. Data on genomics doubles every 50 days, he added, arguing that only quantum computers will be able to solve the pharmaceutical industry's big-data problem.

Quantum computing will also have a major impact on cybersecurity, an issue that affects nearly every major corporation today. Currently, cybersecurity relies on cryptographic algorithms, but quantum computing's ability to solve these equations in a fraction of the time a normal computer does will render current cybersecurity methods obsolete.

In the future, even robust cryptographic algorithms will be substantially weakened by quantum computing, while others will no longer be secure at all, according to Swaroop Sham, senior product marketing manager at Okta.

For investors, Israel said, it is key to realize that the first one or two companies to develop commercially applicable quantum computing will be richly rewarded with access to untold amounts of data, and that will only make their software services more valuable to potential customers, in a virtuous circle.

"What we've learned this decade is that whoever controls the data will win big time," he said.
