100 years old will be the new 60 | Health & Wellness – CL Charlotte

For the first time in history, leading scientists and entrepreneurs believe there's a way to slow aging and maybe even reverse it.

The latest research on longevity suggests there is no reason that people born today can't live to at least 120 years old... perhaps even to 150 and beyond.

How would you change your life if you could live to 120 years old and remain healthy?

What would you do differently today?

Last week we explored the Longevity Mindset. Today and next week, I'll do a quick review of the latest advancements toward rewiring the biology of aging.

Every year, I take a group of my Abundance 360 Members on a Platinum Longevity Trip to meet with the cutting-edge researchers and companies. Following are some of the companies and technologies we observed that have the potential to increase your healthspan: the amount of time you have to live a healthy and functional life, avoiding expensive end-of-life care.

Today's blog will be part one of a two-part series covering these developments.

Let's dive in.

Over the past two decades, the cost of sequencing the human genome has dropped 100,000-fold: from $100 million per genome, to below $1,000 per genome (current estimates are as low as $300).

Genome sequencing can uncover disease susceptibilities years before symptoms present, allowing for personalized preventative care to begin sooner than ever before.

For example, the Cancer Genome Atlas Program at the NIH is currently using gene sequencing to decode the genetic underpinnings of 30 cancer types.

Perhaps the most impactful potential of low-cost genome sequencing is its use in what is called a liquid biopsy: the ability to find free-flowing cancer DNA in your bloodstream that might indicate the existence of an undetected cancer in your body. And, as we know, finding cancer at stage zero or stage one is the key to survival.

There are two major companies we visited with during our Platinum Longevity Trip:

Cancer detection company GRAIL analyzes the mutated, fractionated DNA and RNA from cancer cells in your blood (from a simple blood draw) to diagnose over 50 cancer types in early stages. GRAIL recently received an $8 billion buy-out offer from biotech giant Illumina.

Freenome takes a similar approach to early cancer diagnosis from a real-time blood draw (a liquid biopsy), initially focused on colon cancer. Freenome's multiomics platform analyzes fragments of DNA, RNA, and protein from both the cancer and the host response. This form of precision medicine bridges early detection and early intervention to boost human healthspan.

One of the most powerful technologies now available in the fight for longevity is called gene therapy: a technology theorized in the 1980s that has taken almost 40 years to mature. Gene therapy allows scientists to use a vector (typically an adeno-associated virus, or AAV) to carry a desired gene to a set of desired cells in an organism. Want a specific gene put into retinal cells, or bone marrow, or neurons? No problem, there's a gene therapy approach for that.

A new biotech start-up called Gordian Biotechnology is using the convergence of gene therapy and single-cell sequencing to run hundreds of thousands of independent experiments in a single animal, determining the therapeutic effects of specific gene additions on specific cells of interest. Because aging is such a multifactorial process, this approach can run thousands of parallel experiments to tackle the many complexities of age-related diseases simultaneously.

Next week, we'll learn about a company called Rejuvenate Bio, and an extraordinary researcher named Dr. David Sinclair who is using gene therapy to potentially rejuvenate animals, with the ultimate goal of age reversal in humans.

In addition to gene therapy, the other incredible tool in our longevity research arsenal is CRISPR.

You may know CRISPR as the molecular scissors that can edit genes: think CTRL+X (cut) and CTRL+V (paste). But beyond cutting and pasting, CRISPR can also be used to help find and identify a sequence of DNA in your cells, a sort of CTRL+F functionality. This discovery is so important and transformative that the 2020 Nobel Prize in Chemistry was just awarded this month to Dr. Jennifer Doudna of UC Berkeley and the Gladstone Institutes, together with Dr. Emmanuelle Charpentier, for its discovery.
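The CTRL+F analogy can be made concrete with a toy sketch. Everything below is an invented example, not a bioinformatics tool: it scans a DNA string for a guide-sequence match and, mimicking one real detail of Cas9 targeting, only reports matches followed by an "NGG" PAM motif.

```python
# Toy illustration of CRISPR's "CTRL+F" role: locate a guide sequence in
# DNA. Cas9 binds only where the match is followed by an "NGG" PAM motif,
# which this sketch checks for. The sequences are invented examples.

def find_cas9_sites(dna, guide):
    """Return start positions where `guide` occurs followed by an NGG PAM."""
    hits = []
    for i in range(len(dna) - len(guide) - 2):
        if dna[i:i + len(guide)] == guide:
            pam = dna[i + len(guide):i + len(guide) + 3]
            if pam[1:] == "GG":  # the "N" in NGG can be any base
                hits.append(i)
    return hits

dna = "ATGCGTACGTTAGCTAGGATCCGTACGTTAGCTAGCGGTTT"
guide = "CGTACGTTAGCTAG"
print(find_cas9_sites(dna, guide))  # → [21]
```

Note that the first copy of the guide (at position 3) is skipped because it lacks the PAM; only the PAM-adjacent copy at position 21 is reported. The real search is biochemical, of course; the point is only that CRISPR systems locate targets before they cut, which is exactly what diagnostic applications exploit.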

Here are several other exciting CRISPR developments:

The Gladstone-UCSF Institute of Genomic Immunology is using CRISPR to edit the T-cells of the immune system, which play critical roles in cancer, infection, and autoimmunity. CRISPR can delete mutated genes or add new DNA to reprogram the T-cells. This personalized approach takes advantage of the body's own immune system to tackle complex diseases.

Rather than snipping and replacing genes, Mammoth Biosciences has programmed CRISPR proteins to locate and cleave target genes, acting as molecular shredders. The cleaved gene serves as a molecular readout when the target is successfully bound, enabling CRISPR to serve as a diagnostic tool. Additionally, the company's novel CRISPR proteins (Cas14 and CasΦ) are unusually small, opening the door for new delivery systems at smaller scales than ever before.

With the recent breakthroughs in CRISPR and gene therapy technologies, a variety of strategies for reversing disease have been tried. Yet countless experiments remain, and that's where AI can help.

The explosion of novel imaging, sensing, and sequencing tools has unleashed an abundance of patient data.

But bringing together this information across millions of patients to form actionable insights can only be achieved with Artificial Intelligence.

One of the leading companies in this area is Insilico Medicine, which is leveraging AI in its end-to-end drug pipeline, extending healthy longevity through drug discovery and aging research.

In their comprehensive drug discovery engine, Insilico uses millions of samples and multiple data types to A) discover signatures of diseases, and B) identify the most promising targets for billions of molecules. These molecules either already exist or can be generated de novo with the desired set of parameters.

Insilico uses an AI technique called generative adversarial networks (GANs) to imagine novel molecular structures. With reinforcement learning, Insilico's system lets you generate a molecule with any of up to 20 different properties to hit a specified target.
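The generate-score-select loop at the heart of such systems can be caricatured in a few lines. To be clear, the sketch below is not Insilico's method: it replaces the GAN and reinforcement learning with random proposal plus scoring, and the "fragments" and property numbers are invented. It only illustrates the shape of generative molecular design, where candidates are proposed and ranked against desired property targets.

```python
import random

# Caricature of generative molecular design: propose candidate "molecules"
# (toy strings of fragment labels), score them against desired property
# targets, and keep the best. All fragments and numbers are invented.
random.seed(0)

FRAGMENTS = ["A", "B", "C", "D"]
# Hypothetical per-fragment contributions to two properties,
# e.g. solubility and binding affinity (arbitrary values).
PROPS = {"A": (1.0, 0.2), "B": (0.3, 1.1), "C": (0.8, 0.8), "D": (0.1, 0.5)}

def score(mol, targets=(3.0, 3.0)):
    """Negative squared distance between summed properties and the targets."""
    p = [sum(PROPS[f][k] for f in mol) for k in (0, 1)]
    return -sum((a - b) ** 2 for a, b in zip(p, targets))

def generate(n=500, length=5):
    """Sample n random candidates and return the best-scoring one."""
    cands = ["".join(random.choices(FRAGMENTS, k=length)) for _ in range(n)]
    return max(cands, key=score)

best = generate()
print(best, round(score(best), 3))
```

A real system replaces the random proposal step with a learned generator and the fixed score with a learned reward, but the propose-score-select skeleton is the same.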

Thanks to converging breakthroughs in machine learning, drug discovery and molecular biology, companies like Insilico can now do with 50 people what the pharmaceutical industry can barely do with an army of 5,000.

Another extraordinary company on the Longevity Platinum Trip was a company out of the Buck Institute called Edifice Health, which has developed the ability to determine your inflammatory age using advanced AI to score biomarkers of immune health. Inflammation is a leading contributor to most chronic illnesses, and greater awareness of this symptom will enhance preventative care. Even more important than measuring inflammatory age, Edifice Health is screening thousands of molecules to determine which can quell such inflammation.

An additional company out of the Buck Institute is Gerostate Alpha, a pharmaceutical company that is using large-scale AI to test millions of compounds for their ability to extend the lifespan of nematodes, a family of short-lived worms. Once they get a hit in nematodes, they will then test the molecules in mice and eventually in humans. The company is testing millions of compounds in parallel, hoping to literally discover the pharmaceutical fountain of youth.

In next week's blog, we'll continue to review other exciting companies on the cutting edge of longevity science, diving more into gene therapy, senolytic medicines, vaccines, and stem cells.

If tracking the latest breakthroughs in longevity is something you desire, and if developing a Longevity Mindset is important to you, then consider joining my Abundance 360 Mastermind.

Every year, my team and I select a group of 360 entrepreneurs and CEOs to coach over the course of a year-long program. A360 starts each January with a live event and continues every two months with Implementation Workshops, in which I personally coach members in small groups over Zoom. (In January 2021, you have a choice of live in-person or virtual participation. See the A360 website for more info.)

My mission is to help A360 members identify their massively transformative purpose, select their moonshot, and hone an Abundance, Exponential, and Longevity Mindset. Together we will actively select and reinforce your preferred mindsets.

To learn more and apply, visit abundance360.com


Lantronix Brings Advanced AI and Machine Learning to Smart Cameras With New Open-Q 610 SOM Based on the Powerful Qualcomm QCS610 System on Chip (SOC)…

IRVINE, Calif., Oct. 15, 2020 (GLOBE NEWSWIRE) -- Lantronix Inc. (NASDAQ: LTRX), a global provider of Software as a Service (SaaS), engineering services and hardware for Edge Computing, the Internet of Things (IoT) and Remote Environment Management (REM), today announced the availability of its new Lantronix Open-Q 610 SOM based on the powerful Qualcomm QCS610 System on Chip (SOC). This micro System on Module (SOM) is designed for connected visual intelligence applications with high-resolution camera capabilities, on-device artificial intelligence (AI) processing and native Ethernet interface.

"Our long and successful relationship with Qualcomm Technologies enables us to deliver powerful micro SOM solutions that can accelerate IoT design and implementation, empowering innovators to create IoT applications that go beyond hardware and enable their wildest dreams," said Paul Pickle, CEO of Lantronix.

The new Lantronix ultra-compact (50mm x 25mm), production-ready Open-Q 610 SOM is based on the powerful Qualcomm QCS610 SOC, the latest in the Qualcomm Vision Intelligence Platform lineup targeting smart cameras with edge computing. Delivering up to 50 percent improved AI performance over the previous generation, as well as image signal processing and sensor processing capabilities, it is designed to bring smart camera technology, including powerful artificial intelligence and machine learning features formerly available only in high-end devices, into mid-tier camera segments, including smart cities, commercial and enterprise, homes and vehicles.

Bringing Advanced AI and Machine Learning to Smart Camera Application

Created to bring advanced artificial intelligence and machine learning capabilities to smart cameras in multiple vertical markets, the Open-Q 610 SOM is designed for developers seeking to innovate new products utilizing the latest vision and AI edge capabilities, such as smart connected cameras, video conference systems, machine vision and robotics. With the Open-Q 610 SOM, developers gain a pre-tested, pre-certified, production-ready computing module that reduces risk and expedites innovative product development.

The Open-Q 610 SOM provides the core computing capabilities for:

Connectivity solutions include Wi-Fi/BT, Gigabit Ethernet, multiple USB ports and three-camera interfaces.

"The Lantronix Open-Q 610 SOM provides advanced artificial intelligence and machine learning capabilities that enable developers to innovate new product designs, including smart connected cameras, video conference systems, machine vision and robotics," said Jonathan Shipman, VP of Strategy at Lantronix Inc. "Lantronix micro SOMs and solutions enable IoT device makers to jumpstart new product development and accelerate time-to-market by shortening the design cycle, reducing development risk and simplifying the manufacturing process."

Open-Q 610 Development Kit

The companion Open-Q 610 Development Kit is a full-featured platform with available software tools, documentation and optional accessories. It delivers everything required to immediately begin evaluation and initial product development.

The development kit integrates the production-ready Open-Q 610 SOM with a carrier board, providing numerous expansion and connectivity options to support development and testing of peripherals and applications. The development kit, along with the available documentation, also provides a proven reference design for custom carrier boards, providing a low-risk fast track to market for new products.

In addition to production-ready SOMs, development platforms and tools, Lantronix offers turnkey product development services, driver and application software development and technical support.

For more information, visit Open-Q 610 SOM and Open Q 610 SOM Development kit.

About Lantronix

Lantronix Inc. is a global provider of software as a service (SaaS), engineering services and hardware for Edge Computing, the Internet of Things (IoT) and Remote Environment Management (REM). Lantronix enables its customers to provide reliable and secure solutions while accelerating their time to market. Lantronix's products and services dramatically simplify operations through the creation, development, deployment and management of customer projects at scale while providing quality, reliability and security.

Lantronix's portfolio of services and products addresses each layer of the IoT Stack, including Collect, Connect, Compute, Control and Comprehend, enabling its customers to deploy successful IoT and REM solutions. Lantronix's services and products deliver a holistic approach, addressing its customers' needs by integrating a SaaS management platform with custom application development layered on top of external and embedded hardware, enabling intelligent edge computing, secure communications (wired, Wi-Fi and cellular), location and positional tracking and environmental sensing and reporting.

With three decades of proven experience in creating robust industry and customer-specific solutions, Lantronix is an innovator in enabling its customers to build new business models, leverage greater efficiencies and realize the possibilities of IoT and REM. Lantronix's solutions are deployed inside millions of machines at data centers, offices and remote sites serving a wide range of industries, including energy, agriculture, medical, security, manufacturing, distribution, transportation, retail, financial, environmental, infrastructure and government.

For more information, visit http://www.lantronix.com. Learn more at the Lantronix blog, http://www.lantronix.com/blog, featuring industry discussion and updates. To follow Lantronix on Twitter, please visit http://www.twitter.com/Lantronix. View our video library on YouTube at http://www.youtube.com/user/LantronixInc or connect with us on LinkedIn at http://www.linkedin.com/company/lantronix

Safe Harbor Statement under the Private Securities Litigation Reform Act of 1995: Any statements set forth in this news release that are not entirely historical and factual in nature, including without limitation statements related to our solutions, technologies and products as well as the advanced Lantronix Open-Q 610 SOM, are forward-looking statements. These forward-looking statements are based on our current expectations and are subject to substantial risks and uncertainties that could cause our actual results, future business, financial condition, or performance to differ materially from our historical results or those expressed or implied in any forward-looking statement contained in this news release. The potential risks and uncertainties include, but are not limited to, such factors as the effects of negative or worsening regional and worldwide economic conditions or market instability on our business, including effects on purchasing decisions by our customers; the impact of the COVID-19 outbreak on our employees, supply and distribution chains, and the global economy; cybersecurity risks; changes in applicable U.S. and foreign government laws, regulations, and tariffs; our ability to successfully implement our acquisitions strategy or integrate acquired companies; difficulties and costs of protecting patents and other proprietary rights; the level of our indebtedness, our ability to service our indebtedness and the restrictions in our debt agreements; and any additional factors included in our Annual Report on Form 10-K for the fiscal year ended June 30, 2019, filed with the Securities and Exchange Commission (the SEC) on September 11, 2019, including in the section entitled Risk Factors in Item 1A of Part I of such report, as well as in our other public filings with the SEC. Additional risk factors may be identified from time to time in our future filings. 
The forward-looking statements included in this release speak only as of the date hereof, and we do not undertake any obligation to update these forward-looking statements to reflect subsequent events or circumstances.

Lantronix Media Contact: Gail Kathryn Miller, Corporate Marketing & Communications Manager, media@lantronix.com, 949-453-7158

Lantronix Analyst and Investor Contact: Jeremy Whitaker, Chief Financial Officer, investors@lantronix.com, 949-450-7241

Lantronix Sales: sales@lantronix.com; Americas: +1 (800) 422-7055 (US and Canada) or +1 949-453-3990; Europe, Middle East and Africa: +31 (0)76 52 36 744; Asia Pacific: +852 3428-2338; China: +86 21-6237-8868; Japan: +81 (0) 50-1354-6201; India: +91 994-551-2488

© 2020 Lantronix, Inc. All rights reserved. Lantronix is a registered trademark, and EMG and SLC are trademarks, of Lantronix Inc. Other trademarks and trade names are those of their respective owners.

Qualcomm is a trademark or registered trademark of Qualcomm Incorporated.

Qualcomm Vision Intelligence Platform and Qualcomm QCS610 are products of Qualcomm Technologies, Inc. and/or its subsidiaries.


AI and Machine Learning are Redefining Banking Industry – Analytics Insight

In these unprecedented times, digital transformation is vital. One of the most significant challenges is modernizing banks' legacy business systems without disrupting existing operations. Artificial intelligence (AI) and machine learning (ML) have played a pivotal role in conducting hassle- and risk-free digital transformation. An AI- and ML-led approach to system modernization will enable banks to partner with other fintech services while embracing modern demands and regulations and ensuring safety and security.

In the banking industry, with growing pressure to manage risk alongside increasing governance and regulatory requirements, banks must deliver more distinctive and better customer service. Fintech brands are increasingly applying AI and ML in a wide range of applications across several channels, leveraging all available client data to predict how customers' requirements are evolving, which services will prove beneficial to them, and which types of fraudulent activity are most likely to target customers' systems. Leveraging the power of AI and ML in banking, together with accelerated data science, is required to enhance customer portfolio offerings.

Here are some of the significant roles of artificial intelligence and machine learning in banking and finance:

Loan approval is a practical example of machine learning's benefits. When sanctioning loans, banks traditionally had to rely on a client's history to gauge that customer's creditworthiness. The process was not always seamless or accurate, and banks at times faced challenges in approving loans. With digital transformation, machine-learning algorithms analyze the applicant more thoroughly, making loan processing far more convenient.
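A minimal sketch of what "the algorithm analyzes the applicant" can mean in practice. Everything here is synthetic: invented features (income, debt ratio), simulated repayment labels, and a small logistic regression trained by stochastic gradient descent; real underwriting models use far richer inputs.

```python
import math
import random

# Toy ML credit scoring: fit a logistic regression on synthetic applicants
# described by (income in $10k units, debt-to-income ratio), labelled
# 1 = loan repaid. All data and coefficients are invented.
random.seed(1)

def sigmoid(z):
    z = max(-60.0, min(60.0, z))  # clip to avoid overflow
    return 1.0 / (1.0 + math.exp(-z))

def make_data(n=400):
    data = []
    for _ in range(n):
        income = random.uniform(2, 15)
        debt = random.uniform(0.0, 1.0)
        # Hypothetical ground truth: higher income, lower debt -> repaid.
        p = sigmoid(0.5 * income - 4.0 * debt - 1.0)
        data.append(((income, debt), 1 if random.random() < p else 0))
    return data

def train(data, lr=0.05, epochs=300):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            g = sigmoid(w[0] * x1 + w[1] * x2 + b) - y  # log-loss gradient
            w[0] -= lr * g * x1
            w[1] -= lr * g * x2
            b -= lr * g
    return w, b

def approve(w, b, income, debt, threshold=0.5):
    """Approve when predicted repayment probability clears the threshold."""
    return sigmoid(w[0] * income + w[1] * debt + b) >= threshold

w, b = train(make_data())
print(approve(w, b, income=14, debt=0.0), approve(w, b, income=2, debt=1.0))
```

The learned weights recover the intuition in the text: income pushes the score up, debt pushes it down, and the decision follows from the data rather than from a fixed rulebook.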

Banks are undoubtedly among the most highly regulated institutions and observe strict government regulations in order to prevent defaults and financial crimes such as phishing within their systems. This is one of the primary reasons banking processes shifted to all-digital in such a short span of time. To mitigate fraud, it is essential to be aware of the risk before any suspicious activity begins. Under traditional processes, banks relied on pre-set protocols to protect users from fraudulent activity. Advances in machine learning can sense suspicious activity even before an external threat compromises the customer's account. The underlying benefit is that machines can perform high-level analysis in real time, which is impossible for humans to do manually.
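As a hedged illustration of that real-time screening (not any bank's actual system), the sketch below flags a transaction whose amount deviates from the customer's own history by more than three standard deviations. Production fraud models use far richer features and learned models rather than a simple z-score, but the flag-before-damage idea is the same.

```python
import statistics

# Toy real-time anomaly flagging on transaction amounts (synthetic data).
class TransactionMonitor:
    def __init__(self, history, z_threshold=3.0):
        self.history = list(history)   # the customer's past amounts
        self.z = z_threshold

    def is_suspicious(self, amount):
        mu = statistics.mean(self.history)
        sd = statistics.pstdev(self.history)
        flagged = sd > 0 and abs(amount - mu) / sd > self.z
        if not flagged:                # learn only from normal behaviour
            self.history.append(amount)
        return flagged

monitor = TransactionMonitor([23.5, 41.0, 18.2, 55.9, 30.4, 27.8, 44.1, 36.0])
print(monitor.is_suspicious(38.0))    # → False (typical amount)
print(monitor.is_suspicious(5000.0))  # → True (extreme outlier)
```

The key property the article points to is visible even in this caricature: the check runs on every transaction as it arrives, before the money moves, at a speed no manual review could match.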

Chatbots are AI-led software programs that clone human conversation. The technology embedded in chatbots makes it convenient for banks to respond to customers' questions faster. Chatbots have proven beneficial for financial institutions, resolving users' issues at large scale in a matter of hours.

The ability to identify a user's past behaviour and craft targeted campaigns is a boon for both customers and banks. Such customised campaigns surface all the necessary information a client would require, saving both time and energy. Today's customers also enjoy services customised to their preferences, which enhance their banking experience.

With the rise of fintech companies and the rapid change in technology use, it was only a matter of time before artificial intelligence and machine learning entered modern banking, redefining its dynamics forever. The application of AI and ML will offer predictive data analysis as banks and financial institutions try to offer better services with more actionable information, such as patterned data sets of customers' behaviour and spending. AI adoption will be key for financial institutions to obtain a competitive edge, as it will let them offer a fast, secure, and personalised banking experience to their customers.




Futurism Reinforces Its Next-Gen Business Commerce Platform With Advanced Machine Learning and Artificial Intelligence Capabilities – Yahoo Finance

New AI capabilities pave way for an ultra-personalized customer experience

PISCATAWAY, N.J., Oct. 14, 2020 /PRNewswire/ -- Futurism Technologies, a leading provider of digital transformation solutions, is bringing to life its Futurism Dimensions business commerce suite with additional artificial intelligence and machine learning capabilities. The new AI capabilities will help online companies provide exceptional, personalized online customer experiences and user journeys. Futurism Dimensions will not only help companies put their businesses online but will also help them digitize their entire commerce lifecycle, which includes digital product catalog creation and placement, AI-driven digital marketing, order generation through fulfillment, tracking, shipments, taxes, and financial reporting, all from a unified platform.

With the "new norm," companies are racing to provide a better online experience for their customers. It's not just about putting up a website today, it's about creating personalized and smarter customer experiences. Using customer behavioral analysis, AI, machine learning and bots, Futurism's Dimensions creates that personalized experience. In addition, with Futurism Dimensions, companies become more efficient by transforming the entire commerce value chain and back office to digital.

"Companies such as Amazon have redefined online customer experience and set the bar very high. Every company will be expected to offer personalized, easy-to-use, online experience available from anywhere at any time and on any device," said Sheetal Pansare, CEO of Futurism Technologies. "We've armed Dimensions with advanced AI and ML to help companies provide exceptional personalized experiences to their customers. At the same time, with Dimensions, they can digitize their entire commerce value chain and become more efficient with business automation. Our ecommerce platform is affordable and suited for companies of all sizes," added Mr. Pansare.


Futurism Dimensions highlights:

Secure and stable platform with 24/7 support and migration

As cybercrimes continue to evolve, e-commerce companies must keep up with advances in cybersecurity. Futurism Dimensions prides itself on its security, allowing customers to receive the latest technological advancements in cybersecurity. Dimensions leverages highly secure two-factor authentication and encryption to safeguard your customers' data and your business from potential hackers.

To ensure seamless migration from existing implementations, Dimensions integrates with most legacy systems.

Dimensions offers 24/7 customer support, something you won't find with some of the dead-end platforms of the past. Others will simply have a help page or community forum, but that doesn't necessarily solve the problem. It can also be costly if you need to reach someone for support on other platforms, whereas Dimensions support is included in your plan.

Migrating to Dimensions is a seamless transition with little to no downtime. Protecting online businesses from cyber threats is a top priority while transitioning their websites from another platform or service. You get a dedicated team at your disposal throughout the transition to ensure timely completion and implementation.

Heat Map, Customer Session Playback, Live Chat and Analytics

Dimensions offers intelligent customer insights with Heat Map tracking, full customer session playback, and live chat, allowing you to understand customers' needs. Heat Map will help you identify the most-used areas of your website and what your customers are clicking on. Further, customer session playback will help you identify how customers arrived at certain products or pages. Dimensions also offers live chat during customer sessions, helping you provide prompt support.

Customer insights and analytics are the lifeblood of any e-business in today's digital era. Dimensions offers intelligent insights into demographics to help you market to your target audiences.

Highly personalized user experience using Artificial Intelligence

Dimensions lets you deploy smart AI-powered bots that use machine learning algorithms to come up with smarter replies to customer questions, thus reducing response time significantly. Chatbots can help address customer queries that usually drop in after business hours with automated and pre-defined responses. Eureka! Never lose a sale.

Business Efficiency and Automation using AI and Machine Learning

AI and machine learning can help predict inventory and automate processes such as support, payments, and procurement. It can also expand business intelligence to help create targeted marketing plans. Lastly, it can give you live GPS logistics tracking.

Mobile Application

The Dimensions team will design your mobile application to look and function as if the consumer were viewing your site on a computer: fully optimized and designed for ease of use, without omitting anything from your main site.

About Futurism Technologies

Advancements in digital information technology continue to offer companies opportunities to drive efficiency and revenue, better understand and engage customers, and redefine their business models. At Futurism, we partner with our clients to leverage the power of digital technology. Digital evolution or digital revolution, Futurism helps guide companies on their DX journey.

Whether it is taking a business to the cloud to improve efficiency and business continuity, building a next-generation ecommerce marketplace and mobile app for a retailer, helping to define and implement a new business model for a smart factory, or providing end-to-end cybersecurity services, Futurism brings in the global consulting and implementation expertise it takes to monetize the digital journey.

Futurism provides DX services across the entire value chain including e-commerce, digital infrastructure, business processes, digital customer engagement, and cybersecurity.

Learn more about Futurism Technologies, Inc. at http://www.futurismtechnologies.com

Contact:

Leo J ColeChief Marketing OfficerMobile: +1-512-300-9744Email: communication@futurismtechnologies.com

Website: http://www.futurismtechnologies.com


Related Links

Next-Gen Business Commerce Platform

View original content to download multimedia:http://www.prnewswire.com/news-releases/futurism-reinforces-its-next-gen-business-commerce-platform-with-advanced-machine-learning-and-artificial-intelligence-capabilities-301152696.html

SOURCE Futurism Technologies, Inc.


Purebase Enhances Its Board of Advisors with An Expert on Machine Learning and Cheminformatics – GlobeNewswire

IONE, CA, Oct. 13, 2020 (GLOBE NEWSWIRE) -- Purebase Corporation (OTCQB: PUBC), a diversified resource company headquartered in Ione, California, today announces that Dr. Newell Washburn, an expert on machine learning and cheminformatics applied to complex materials applications, has agreed to join the Purebase Advisory Board.

Dr. Washburn joins Dr. Karen Scrivener, Dr. Kimberly Kurtis, and Mr. Joe Thomas as part of the Purebase Advisory Board team that will provide expert guidance in the development and execution of Purebase's rollout of next-generation, carbon-emission-reducing supplementary cementitious materials (SCMs).

Purebase's Chairman and CEO, Scott Dockter, stated, "We look forward to Dr. Washburn joining our team. He will be an asset and a great resource, as his primary focus is the use of data-driven approaches to formulate cementitious binders with high SCM content and to design chemical admixture systems for broad deployment. He has also partnered with a broad range of chemical admixture and cement companies and with the ARPA-E program in the Department of Energy. We are looking forward to working with him."

Newell R. Washburn, PhD is Associate Professor of Chemistry and Engineering at Carnegie Mellon University and CEO of Ansatz AI. Professor Washburn co-founded Ansatz AI to commercialize the hierarchical machine learning algorithm he and his collaborators developed at CMU for modeling and optimizing complex material systems based on sparse datasets. The company is currently working with clients in the US, Europe, and Japan on using chemical and materials informatics in product development and manufacturing. Professor Washburn received a BS in Chemistry from the University of Illinois at Urbana-Champaign, performed doctoral research at the University of California (Berkeley) on the solid state chemistry of magnetic metal oxides, and then did post-doctoral research in chemical engineering at the University of Minnesota (Twin Cities).

About Purebase Corporation

Purebase Corporation (OTCQB: PUBC) is a diversified resource company that acquires, develops, and markets minerals for use in the agriculture, construction, and other specialty industries.

Contacts

Emily Tirapelle | Purebase Corporation

emily.tirapelle@purebase.com,and please visit our corporate website http://www.purebase.com

Safe Harbor

This press release contains statements, which may constitute forward-looking statements within the meaning of the Securities Act of 1933 and the Securities Exchange Act of 1934, as amended by the Private Securities Litigation Reform Act of 1995. Those statements include statements regarding the intent, belief, or current expectations of Purebase Corporation and members of its management team as well as the assumptions on which such statements are based. Such forward-looking statements are not guarantees of future performance and involve risks and uncertainties, and that actual results may differ materially from those contemplated by such forward-looking statements. Important factors currently known to management that may cause actual results to differ from those anticipated are discussed throughout the Companys reports filed with Securities and Exchange Commission which are available at http://www.sec.gov as well as the Companys web site at http://www.purebase.com. The Company undertakes no obligation to update or revise forward-looking statements to reflect changed assumptions, the occurrence of unanticipated events or changes to future operating results.

Read the original:
Purebase Enhances Its Board of Advisors with An Expert on Machine Learning and Cheminformatics - GlobeNewswire

How to Beat Analysts and the Stock Market with Machine Learning – Knowledge@Wharton

Analyst expectations of firms' earnings are on average biased upwards, and that bias varies over time and stocks, according to new research by experts at Wharton and elsewhere. They have developed a machine-learning model to generate a statistically optimal and unbiased benchmark for earnings expectations, which is detailed in a new paper titled "Man vs. Machine Learning: The Term Structure of Earnings Expectations and Conditional Biases." According to the paper, the model has the potential to deliver profitable trading strategies: to buy low and sell high. When analyst expectations are too pessimistic, investors should buy the stock. When analyst expectations are excessively optimistic, investors can sell their holdings or short stocks as price declines are forecasted.

"[With the machine-learning model], we can predict how the prices of the stocks will behave based on whether or not the analyst forecast is too optimistic or too pessimistic," said Wharton finance professor Jules H. van Binsbergen, who is one of the paper's authors. His co-authors are Xiao Han, a doctoral student at the University of Edinburgh Business School; and Alejandro Lopez-Lira, a finance professor at the BI Norwegian Business School.

The researchers found that the biases of analysts increase in the forecast horizon, or in the period when the earnings announcement date is not anytime soon. However, on average, analysts revise their expectations downwards as the date of the earnings announcement approaches. "These revisions induce negative cross-sectional stock predictability," the researchers write, explaining that stocks with more optimistic expectations earn lower subsequent returns. At the same time, corporate managers have more information about their own firms than investors have, and can use that informational advantage by issuing fresh stock, Binsbergen and his co-authors note.

The Opportunity to Profit

Comparing analysts' earnings expectations with the benchmarks provided by the machine-learning algorithm reveals the degree of analysts' biases, and the window of opportunity it opens. Binsbergen explained how investors could profit from their machine-learning model. "With our machine-learning model, we can measure the mistakes that the analysts are making by taking the difference between what they're forecasting and what our machine-learning forecast estimates," he said.

Using that arbitrage opportunity, investors could short-sell stocks for which analysts are overly optimistic, and book their profits when the prices come down to realistic levels as the earnings announcement date approaches, said Binsbergen. Similarly, they could buy stocks for which analysts are overly pessimistic, and sell them for a profit when their prices rise to levels that correspond with earnings that turn out to be higher than forecasted, he added.

Binsbergen identified two main findings of the latest research. One is that how optimistic analysts are varies substantially over time. "Sometimes the bias is higher, and sometimes it is lower. That holds for the aggregate, but also for individual stocks," he said. "With our method, you can track over time the stocks for which analysts are too optimistic or too pessimistic. That said, there are more stocks for which analysts are optimistic than they're pessimistic," he added.

The second finding of the study is that "there is quite a lot of difference between stocks in how biased the analysts are," said Binsbergen. "So, it's not that we're just making one aggregate statement, that on average for all stocks the analysts are too optimistic."

Capital-raising Window for Corporations

Corporations, too, could use the machine-learning algorithm's measure of analysts' biases. "If you are a manager of a firm who is aware of those biases, then in fact you can benefit from that," said Binsbergen. "If the price is high, you can issue stocks and raise money." Conversely, if analysts' negative biases push down the price of a stock, they serve as a signal for the firm to avoid issuing fresh stock at that time.

When analysts' biases lift or depress a stock's price, it implies that the markets "seem to be buying the analysts' forecasts and are not correcting them for over-optimism or over-pessimism yet," Binsbergen said. With the machine-learning model that he and his co-authors have developed, "you can have a profitable investment strategy," he added. "That also means that the managers of the firms whose stock prices are overpriced can issue stocks. When the stock is underpriced they can either buy back stocks, or at least refrain from issuing stocks."

For their study, the researchers used information from firms' balance sheets, macroeconomic variables, and analysts' predictions. They constructed forecasts for annual earnings one and two years ahead, and for quarterly earnings one, two, and three quarters ahead. With the benchmark expectation provided by their machine-learning algorithm, they then calculated the bias in expectations as the difference between the analysts' forecasts and the machine-learning forecasts.
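The bias-and-signal logic the researchers describe can be sketched in a few lines of code. Everything below is a hypothetical illustration of the idea (invented tickers, forecasts, and threshold), not the paper's actual model:

```python
# Sketch of the paper's core idea: conditional bias = analyst forecast
# minus the machine-learning benchmark forecast. All tickers, numbers,
# and the threshold are invented for illustration.

def conditional_bias(analyst_forecast, ml_forecast):
    """Positive bias means analysts are too optimistic."""
    return analyst_forecast - ml_forecast

def trading_signal(bias, threshold=0.05):
    """Short overly optimistic names, buy overly pessimistic ones."""
    if bias > threshold:
        return "sell"   # price expected to fall as forecasts are revised down
    if bias < -threshold:
        return "buy"    # price expected to rise toward higher realized earnings
    return "hold"

# Hypothetical (analyst EPS forecast, ML benchmark forecast) pairs
stocks = {"AAA": (2.10, 1.80), "BBB": (0.95, 1.10), "CCC": (1.50, 1.48)}
for ticker, (analyst, ml) in stocks.items():
    bias = conditional_bias(analyst, ml)
    print(ticker, round(bias, 2), trading_signal(bias))
```

In the paper the benchmark comes from a trained statistical model over balance-sheet and macro data; here it is simply supplied as a number to keep the sketch self-contained.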

Read this article:
How to Beat Analysts and the Stock Market with Machine Learning - Knowledge@Wharton

Robotic Interviews, Machine Learning And the Future Of Workforce Recruitment – Entrepreneur

These technologies would affect all aspects of HR functions, such as the way HR professionals onboard and hire people, and the way they train them

October 12, 2020

Opinions expressed by Entrepreneur contributors are their own.

You're reading Entrepreneur India, an international franchise of Entrepreneur Media.

Artificial intelligence (AI) is changing all aspects of our lives, and at a rapid pace. This includes our professional lives, too. Experts expect that in the days ahead, AI will become a greater part of our careers as companies move ahead with adopting such technology. They are using more machines with AI technology that affect our daily professional activities. Soon enough, we will see machine learning and deep learning in HR too. It will affect all aspects of HR (human resources), such as the way HR professionals onboard and hire people, and the way they train them.

Impact on onboarding and recruitment

These days, companies are using robotics in HR to make sure they have found the right people for particular job profiles. This means that even before you have stepped into your new office, your company already knows that you are the best person for the job, thanks to such technology. They are using AI to pre-screen candidates before they invite the best candidates for interviews. This especially applies to large companies that offer thousands of new jobs each year and where millions of applicants go looking for jobs.

Impact on training on the job

Companies are also using machine learning and deep learning in HR to help provide on-the-job training to employees. Just because you have landed a job and settled in it, it does not mean that you know it all. You need to get job-related training so you can keep getting better. This is where experts expect that AI will play a major role in the coming years. It will also help one generation of professionals in an organization transfer its skills to its successors. This will make sure that no company will ever suffer from skill gaps.

Workforce augmentation

Robotics in HR will play a major role in supporting the people working in organizations where the management implements such technology. A major reason why people are so apprehensive about using AI in an organization is that they feel it would replace them and do all that they can do now, consequently leading to job losses. However, in the present scenario, AI is all about augmenting such a workforce. This means that it would help you perform your job with greater efficiency. Contrary to popular opinion, it would not replace you.

Workplace surveillance

Companies can also use machine learning and deep learning in HR to improve their workforce surveillance. This is uncomfortable for several employees, as they feel that such technology would encroach on their workplace privacy. Gartner recently conducted a survey which found that more than half of the companies with a yearly turnover above $750 million use digital tools to gather data on the activities of their employees and monitor their overall performance. As part of this, they analyze employees' emails to find out how engaged and content they are with their work.

Usage of workplace robots

Apart from robotics in HR, companies these days are also using physical robots that can move around on their own. This is especially true for warehousing and manufacturing companies. Experts expect that this will soon become a common feature in a lot of other workplaces too. Companies specializing in mobility are creating delivery robots that can move around the workplace and deliver items straight to your desk. Tech companies are also developing security robots, which experts believe will become commonplace because they can protect commercial properties from trespassers. Companies are also developing software to help employees park their cars at the office.

See the rest here:
Robotic Interviews, Machine Learning And the Future Of Workforce Recruitment - Entrepreneur

AI and Machine Learning Can Help Fintechs if We Focus on Practical Implementation and Move Away from Overhyped Narratives, Researcher Says – Crowdfund…

Artificial intelligence (AI) and machine learning (ML) algorithms are increasingly being used by Fintech platform developers to make more intelligent or informed decisions regarding key processes. This may include using AI to identify potentially fraudulent transactions, determine the creditworthiness of a borrower applying for a loan, and many other use cases.
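As a concrete, deliberately simplified illustration of the fraud-detection use case, the sketch below flags transactions that deviate sharply from a customer's history. A production Fintech system would use a trained model over many features; the data here is invented:

```python
# Toy fraud screen: z-score of a new transaction amount against a
# customer's past transactions. Real systems use trained models over
# many features; the history below is invented for illustration.
import statistics

def fraud_score(history, amount):
    """How many standard deviations the amount sits from the historical mean."""
    mean = statistics.mean(history)
    spread = statistics.stdev(history)
    return abs(amount - mean) / spread

history = [20.0, 35.0, 25.0, 30.0, 40.0]   # hypothetical past amounts
for amount in (28.0, 500.0):
    score = fraud_score(history, amount)
    print(amount, "REVIEW" if score > 3 else "ok")   # flag large deviations
```

The "3 standard deviations" cutoff is an arbitrary illustration; real systems tune such thresholds against labelled fraud data.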

Research conducted by Accenture found that 87% of business owners in the United Kingdom claim that they're struggling to find the best ways to adopt AI or ML technologies. Three out of four (75%) of the C-Suite executives responding to Accenture's survey said they really need to effectively adopt AI solutions within 5 years, so that they don't lose business to competitors.

As reported by IT Pro Portal, there's currently a gap between what may be considered just hype and actual or practical implementation of AI technologies and platforms.

Less than 5% of firms have actually managed to effectively apply AI; meanwhile, more than 80% are currently just exploring basic proofs of concept for applying AI or ML algorithms. Many firms are also unfamiliar with these technologies, or don't have the expertise to figure out how to best apply them to specific business use cases.

Yann Stadnicki, an experienced technologist and research engineer, argues that these technologies can play a key role in streamlining business operations. For example, they can help Fintech firms with lowering their operational costs while boosting their overall efficiency. They can also make it easier for a companys CFO to do their job and become a key player when it comes to supporting the growth of their firm.

Stadnicki points out that a research study suggests that company executives weren't struggling to adopt AI solutions due to budgetary constraints or limitations. He adds that the study shows there may be certain operational challenges when it comes to effectively integrating AI and ML technologies.

He also mentions:

The inability to set up a supportive organizational structure, the absence of foundational data capabilities, and the lack of employee adoption are barriers to harnessing AI and machine learning within an organization.

He adds:

For businesses to harness the benefits of AI and machine learning, there needs to be a move away from an overhyped theoretical narrative towards practical implementation. It is important to formulate a plan and integration strategy for how your business will use AI and ML, to both mitigate the risks of cybercrime and fraud and embrace the opportunity of tangible business impact.

Fintech firms and organizations across the globe are now leveraging AI and ML technologies to improve their products and services. In a recent interview with Crowdfund Insider, Michael Rennie, a U.K.-based product manager for Mendix, a Siemens business and the global leader in enterprise low-code, explained how emerging tech can be used to enhance business processes.

He noted:

Prior to low-code, the application and use of cutting-edge technologies within the banking sector have been more academic than actual. But low-code now enables you to apply emerging technologies like AI in a practical way so that they actually make an impact. For example, you could pair a customer-focused banking application built with low-code with a machine learning (ML) engine to identify user behaviors. Then you could make more informed decisions about where to invest in customer experience and most benefit your business.

He added:

Its easy to see the value in this. The problem is that without the correct technology, its too difficult to integrate traditional customer-facing applications with new technology systems. Such integrations typically require millions of dollars in investment and years of work. By the time an organization finishes that intensive work, the market may have moved on. Low-code eliminates that problem, makes integration easy and your business more agile.

Go here to see the original:
AI and Machine Learning Can Help Fintechs if We Focus on Practical Implementation and Move Away from Overhyped Narratives, Researcher Says - Crowdfund...

Top 8 Books on Machine Learning In Cybersecurity One Must Read – Analytics India Magazine

With the proliferation of information technologies and data among us, cybersecurity has become a necessity. Machine learning helps organisations by getting insights from raw data, predicting future outcomes and more.

For a few years now, machine learning techniques have been applied in cybersecurity. They help in several ways, including identifying fraud, malicious code, and other threats.

In this article, we list down the top eight books, in no particular order, on machine learning in cybersecurity that one must read.

About: Written by Sumeet Dua and Xian Du, this book introduces the basic notions in machine learning and data mining. It provides a unified reference for specific machine learning solutions to cybersecurity problems as well as provides a foundation in cybersecurity fundamentals, including surveys of contemporary challenges.

The book details some of the cutting-edge machine learning and data mining techniques that can be used in cybersecurity, such as in-depth discussions of machine learning solutions to detection problems, contemporary cybersecurity problems, categorising methods for detecting, scanning, and profiling intrusions and anomalies, among others.

Get the book here.

About: In Malware Data Science, security data scientist Joshua Saxe introduces machine learning, statistics, social network analysis, and data visualisation, and shows you how to apply these methods to malware detection and analysis.

Youll learn how to analyse malware using static analysis, identify adversary groups through shared code analysis, detect vulnerabilities by building machine learning detectors, identify malware campaigns, trends, and relationships through data visualisation, etc.

Get the book here.

About: This book begins with an introduction of machine learning and algorithms that are used to build AI systems. After gaining a fair understanding of how security products leverage machine learning, you will learn the core concepts of breaching the AI and ML systems.

With the help of hands-on cases, you will understand how to find loopholes as well as surpass a self-learning security system. After completing this book, readers will be able to identify the loopholes in a self-learning security system and will also be able to breach a machine learning system efficiently.

Get the book here.

About: In this book, youll learn how to use popular Python libraries such as TensorFlow, Scikit-learn, etc. to implement the latest AI techniques and manage difficulties faced by the cybersecurity researchers.

The book will lead you through classifiers as well as features for malware, which will help you to train and test on real samples. You will also build self-learning, reliant systems to handle the cybersecurity tasks such as identifying malicious URLs, spam email detection, intrusion detection, tracking user and process behaviour, among others.

Get the book here.
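One of the tasks this kind of book covers, spam email detection, can be illustrated with a toy Naive Bayes classifier. The training set below is invented and far too small for real use; the point is only to show the mechanics:

```python
# Toy Naive Bayes spam classifier in plain Python. The tiny training
# corpus is invented for illustration; real systems train on large
# labelled datasets with proper tokenisation.
import math
from collections import Counter

def train(docs):
    """docs: list of (text, label). Returns per-label word counts and doc totals."""
    counts = {"spam": Counter(), "ham": Counter()}
    totals = Counter()
    for text, label in docs:
        counts[label].update(text.lower().split())
        totals[label] += 1
    return counts, totals

def classify(counts, totals, text):
    vocab = set(counts["spam"]) | set(counts["ham"])
    best_label, best_score = None, -math.inf
    for label in counts:
        # log prior + log likelihoods with add-one (Laplace) smoothing
        score = math.log(totals[label] / sum(totals.values()))
        n = sum(counts[label].values())
        for word in text.lower().split():
            score += math.log((counts[label][word] + 1) / (n + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

docs = [("win free money now", "spam"),
        ("free prize claim now", "spam"),
        ("meeting notes attached", "ham"),
        ("lunch at noon tomorrow", "ham")]
counts, totals = train(docs)
print(classify(counts, totals, "claim your free money"))   # prints "spam"
```

A real pipeline would use a library implementation (for example scikit-learn's MultinomialNB) with a large labelled corpus, but the arithmetic is the same.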

About: This book is for the data scientists, machine learning developers, security researchers, and anyone keen to apply machine learning to up-skill computer security. In this book, you will learn how to use machine learning algorithms with complex datasets to implement cybersecurity concepts, implement machine learning algorithms such as clustering, k-means, and Naive Bayes to solve real-world problems, etc.

You will also learn how to speed up a system using Python libraries with NumPy, Scikit-learn, and CUDA, combat malware, detect spam and fight financial fraud to mitigate cybercrimes, among others.

Get the book here.

About: This book teaches you how to use machine learning for penetration testing. You will learn, in a hands-on and practical manner, how to use machine learning to perform penetration testing attacks, and how to perform penetration testing attacks on machine learning systems. You will also learn techniques that few hackers or security experts know about.

Get the book here.

About: In this book, you will learn machine learning in cybersecurity self-assessment, how to identify and describe the business environment in cybersecurity projects using machine learning, etc.

The book covers all machine learning in cybersecurity essentials, such as extensive criteria grounded in the past and current successful projects and activities by experienced machine learning in cybersecurity practitioners, among others.

Get the book here.

About: This book presents a collection of state-of-the-art AI approaches to cybersecurity and cyber threat intelligence. It offers strategic defence mechanisms for malware, addressing cybercrime, and assessing vulnerabilities to yield proactive rather than reactive countermeasures.

Get the book here.

Read the original here:
Top 8 Books on Machine Learning In Cybersecurity One Must Read - Analytics India Magazine

Experian partners with Standard Chartered to drive Financial Inclusion with Machine Learning, powering the next generation of Decisioning – Yahoo…

Leveraging innovation in technology to provide access to credit during uncertain times to populations underserved by formal financial services.

This social impact was made possible by the Bank's digital first strategy and Experian's best-in-class decisioning platform. Experian's software enables the Bank to analyse a high volume of alternative data and execute machine learning models for better decision-making and risk management.

Since the first pilot implementation in India in December 2019, the Bank saw an improvement in approvals, increasing overall acceptance rates using big data and artificial intelligence. This enhanced the Bank's risk management capabilities to test and learn, helping to expand access to crucial credit and financial services.

The Bank and Experian are committed to financial inclusion, with plans for rollouts across 6 more markets in Asia, Africa and the Middle East.

SINGAPORE, Oct. 15, 2020 /PRNewswire/ -- Experian, a leading global information services company, has announced a partnership with leading international banking group Standard Chartered to drive financial access across key markets in Asia, Africa and the Middle East by leveraging the latest technology innovation in credit decisioning. Without enough credit bureau data for financial institutions to determine their creditworthiness, especially in this time of unprecedented volatility, many underbanked communities are facing difficulties securing access to loans.

The collaboration involves Experian's leading global decisioning solution, PowerCurve Strategy Manager, integrated with machine learning capabilities that will enable deployment of advanced analytics to help organisations make the most of their data. In support of Standard Chartered's digital-first transformation strategy, this state-of-the-art machine learning capability provides the Bank with the ability to ingest and analyse a high volume of non-Bank or, with client consent, alternative data, enabling faster, more effective and accurate credit decisioning, resulting in better risk management for the Bank and better outcomes for clients.
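Score-based credit decisioning over alternative data can be sketched with a simple logistic model. The feature names, weights, and cutoff below are invented for illustration; they are not Experian's or the Bank's actual model:

```python
# Hypothetical sketch of credit decisioning on alternative data using a
# logistic score. Feature names, weights, and the 0.5 cutoff are all
# invented; real deployments learn weights from repayment data.
import math

def approval_probability(features, weights, bias=-1.0):
    """Logistic model: estimated probability that the applicant repays."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))

weights = {"months_of_utility_payments": 0.05,   # alternative-data signals
           "mobile_wallet_activity": 0.8,
           "missed_payments": -1.5}

applicant = {"months_of_utility_payments": 24,
             "mobile_wallet_activity": 1.0,
             "missed_payments": 0}
p = approval_probability(applicant, weights)
print(round(p, 2), "approve" if p > 0.5 else "refer")
```

In practice the weights would be fitted on labelled outcomes and the cutoff set by a risk strategy, but the sketch shows how non-bureau data can feed a decision.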

Launched in India in December 2019, Standard Chartered registered positive business outcomes such as increased acceptance rates and reduced overall delinquencies. The success in India meant that Standard Chartered is now able to improve risk management for more clients who previously would have been underbanked, empowering them with access to crucial credit and financial services in their time of need.

Beyond benefits to consumers, access to credit is vital for overall economic growth, with consumer spending helping businesses continue to operate during these difficult times.

"Social and economic growth in developing markets, especially in the coming period, will be driven by progress in financial inclusion. Experian strongly believes that a technology, advanced analytics and data-driven approach can address this opportunity and we remain deeply committed to the vision of progressing financial inclusion for the world's underserved and underbanked population. Our long-standing collaboration with Standard Chartered across our PowerCurve decisioning suite of solutions, leveraging machine learning and big data to advance to the next generation of credit decisioning, is focused on empowering these underbanked communities to access credit," said Mohan Jayaraman, Managing Director, Southeast Asia & Regional Innovation, Experian Asia Pacific.

Reaffirming a commitment towards financial inclusion, Experian and Standard Chartered are working on plans to deploy the solution to its retail franchise across Asia, Africa and the Middle East, in addition to India.

"We're committed to supporting sustainable social and economic development through our business, operations and communities. This partnership helps the Bank manage risk more effectively with a more robust data-driven credit decisioning which in turn enables more clients to gain access to financial services at a time when they need it the most," said Vishu Ramachandran, Group Head, Retail Banking, Standard Chartered.

"Partnerships are central to our digital banking strategy and how we better serve our clients. Experian was a natural choice as a partner given their strong track record in innovation and in driving financial inclusion," said Aalishaan Zaidi, Global Head, Client Experience, Channels & Digital Banking, Standard Chartered.

For more information, please visit Experian's Decisioning & Credit Risk Management solution.

For further information, please contact:

About Experian

Experian is the world's leading global information services company. During life's big moments, from buying a home or a car, to sending a child to college, to growing a business by connecting with new customers, we empower consumers and our clients to manage their data with confidence. We help individuals to take financial control and access financial services, businesses to make smarter decisions and thrive, lenders to lend more responsibly, and organisations to prevent identity fraud and crime.

We have 17,800 people operating across 45 countries and every day we're investing in new technologies, talented people and innovation to help all our clients maximise every opportunity. We are listed on the London Stock Exchange (EXPN) and are a constituent of the FTSE 100 Index.

Learn more at http://www.experian.com.sg or visit our global content hub at our global news blog for the latest news and insights from the Group.

About Standard Chartered

We are a leading international banking group, with a presence in 60 of the world's most dynamic markets, and serving clients in a further 85. Our purpose is to drive commerce and prosperity through our unique diversity, and our heritage and values are expressed in our brand promise, "Here for good".

Standard Chartered PLC is listed on the London and Hong Kong Stock Exchanges.

For more stories and expert opinions please visit Insights at sc.com. Follow Standard Chartered on Twitter, LinkedIn and Facebook.

SOURCE Experian

View post:
Experian partners with Standard Chartered to drive Financial Inclusion with Machine Learning, powering the next generation of Decisioning - Yahoo...

What is quantum computing?

Quantum computing is an area of study focused on the development of computer-based technologies centered around the principles of quantum theory. Quantum theory explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level. Quantum computing uses quantum bits to perform specific computational tasks, all at a much higher efficiency than their classical counterparts. The development of quantum computers marks a leap forward in computing capability, with massive performance gains for specific use cases. For example, quantum computing excels at tasks like simulations.

The quantum computer gains much of its processing power through the ability of its bits to be in multiple states at one time. They can perform tasks using a combination of 1s, 0s, and both a 1 and 0 simultaneously. Current research centers in quantum computing include MIT, IBM, Oxford University, and the Los Alamos National Laboratory. In addition, developers have begun gaining access to quantum computers through cloud services.

Quantum computing began with finding its essential elements. In 1981, Paul Benioff at Argonne National Labs came up with the idea of a computer that operated with quantum mechanical principles. It is generally accepted that David Deutsch of Oxford University provided the critical idea behind quantum computing research. In 1984, he began to wonder about the possibility of designing a computer that was based exclusively on quantum rules, publishing a breakthrough paper a few months later.

Quantum Theory

Quantum theory's development began in 1900 with a presentation by Max Planck to the German Physical Society, in which Planck introduced the idea that energy and matter exist in individual units. Further developments by a number of scientists over the following thirty years led to the modern understanding of quantum theory.

The Essential Elements of Quantum Theory:

Further Developments of Quantum Theory

Niels Bohr proposed the Copenhagen interpretation of quantum theory. This theory asserts that a particle is whatever it is measured to be, but that it cannot be assumed to have specific properties, or even to exist, until it is measured. This relates to a principle called superposition. Superposition claims that when we do not know what the state of a given object is, it is actually in all possible states simultaneously -- as long as we don't look to check.

To illustrate this theory, we can use the famous analogy of Schrodinger's Cat. First, we have a living cat and place it in a lead box. At this stage, there is no question that the cat is alive. Then throw in a vial of cyanide and seal the box. We do not know if the cat is alive or if it has broken the cyanide capsule and died. Since we do not know, the cat is both alive and dead, according to quantum law -- in a superposition of states. It is only when we break open the box and see what condition the cat is in that the superposition is lost, and the cat must be either alive or dead.

The principle that, in some way, one particle can exist in numerous states opens up profound implications for computing.

A Comparison of Classical and Quantum Computing

Classical computing relies on principles expressed by Boolean algebra, usually operating with a 3- or 7-mode logic gate principle. Data must be processed in an exclusive binary state at any point in time: either 0 (off / false) or 1 (on / true). These values are binary digits, or bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. In addition, there is still a limit as to how quickly these devices can be made to switch states. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold for classical laws of physics to apply.

The quantum computer operates with a two-mode logic gate: XOR and a mode called QO1 (the ability to change 0 into a superposition of 0 and 1). In a quantum computer, a number of elemental particles such as electrons or photons can be used. Each particle is given a charge, or polarization, acting as a representation of 0 and/or 1. Each particle is called a quantum bit, or qubit. The nature and behavior of these particles form the basis of quantum computing and quantum supremacy. The two most relevant aspects of quantum physics are the principles of superposition and entanglement.

Superposition

Think of a qubit as an electron in a magnetic field. The electron's spin may be either in alignment with the field, which is known as a spin-up state, or opposite to the field, which is known as a spin-down state. Changing the electron's spin from one state to another is achieved by using a pulse of energy, such as from a laser. If only half a unit of laser energy is used, and the particle is isolated from all external influences, the particle then enters a superposition of states, behaving as if it were in both states simultaneously.

Each qubit utilized could take a superposition of both 0 and 1. This means the number of states a quantum computer can process is 2^n, where n is the number of qubits used. A quantum computer comprised of 500 qubits would have the potential to do 2^500 calculations in a single step. For reference, 2^500 is more than the number of atoms in the known universe. These particles all interact with each other via quantum entanglement.
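The 2^n scaling can be made concrete with a tiny classical simulation of a register's state vector, which also shows why classically simulating many qubits quickly becomes infeasible:

```python
# Classical simulation of an n-qubit register in equal superposition:
# the state vector stores 2^n amplitudes, one per basis state, which is
# why simulating even ~50 qubits classically is infeasible.
import math

def equal_superposition(n):
    """Amplitudes after putting each of n qubits into (|0> + |1>)/sqrt(2)."""
    amplitude = 1 / math.sqrt(2 ** n)
    return [amplitude] * (2 ** n)

state = equal_superposition(3)
print(len(state))                            # prints 8: three qubits span 8 basis states
print(round(sum(a * a for a in state), 6))   # prints 1.0: probabilities sum to one
```

Doubling the qubit count squares the number of amplitudes: 10 qubits already need 1,024 of them.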

In comparison to classical computing, quantum computing counts as true parallel processing. Classical computers today still only truly do one thing at a time; in classical computing, parallel processing simply means using two or more processors.

Entanglement

Particles (like qubits) that have interacted at some point retain a type of connection and can be entangled with each other in pairs, in a process known as correlation. Knowing the spin state of one entangled particle -- up or down -- gives away the spin of the other in the opposite direction. In addition, due to the superposition, the measured particle has no single spin direction before being measured. The spin state of the particle being measured is determined at the time of measurement and communicated to the correlated particle, which simultaneously assumes the opposite spin direction. The reason why is not yet explained.

Quantum entanglement allows qubits that are separated by large distances to remain correlated with each other instantaneously (though no usable information travels faster than light). No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated.

Taken together, quantum superposition and entanglement create an enormously enhanced computing power. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00, 01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can store all four numbers simultaneously. This is because each qubit represents two values. If more qubits are added, the capacity expands exponentially.
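The 2-qubit example above can be sketched with a toy state-vector simulation in plain Python: starting from the state |00>, a Hadamard gate on the first qubit followed by a CNOT gate produces an entangled (Bell) state, and all four amplitudes of the register are visible at once.

```python
# Toy 2-qubit simulation: one 4-entry state vector holds the amplitudes
# of all four configurations 00, 01, 10, 11 simultaneously.
import math

def apply(gate, state):
    """Multiply a 4x4 gate matrix by a 4-entry state vector."""
    return [sum(gate[r][c] * state[c] for c in range(4)) for r in range(4)]

h = 1 / math.sqrt(2)
H1 = [[h, 0, h, 0],        # Hadamard on qubit 1 (tensored with identity on qubit 2)
      [0, h, 0, h],
      [h, 0, -h, 0],
      [0, h, 0, -h]]
CNOT = [[1, 0, 0, 0],      # flips qubit 2 exactly when qubit 1 is 1
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

state = [1, 0, 0, 0]                   # |00>
state = apply(CNOT, apply(H1, state))  # Bell state: (|00> + |11>) / sqrt(2)
print([round(a, 3) for a in state])    # [0.707, 0.0, 0.0, 0.707]
```

Measuring either qubit of the final state immediately fixes the other: the only configurations with nonzero amplitude are 00 and 11, which is the correlation described in the entanglement discussion.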

Quantum Programming

Quantum computing offers the ability to write programs in a completely new way. For example, a quantum computer could incorporate a programming sequence that would be along the lines of "take all the superpositions of all the prior computations." This would permit extremely fast ways of solving certain mathematical problems, such as factorization of large numbers.

A landmark in quantum programming came in 1994, when Peter Shor developed a quantum algorithm that could efficiently factorize large numbers.
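The core of Shor's method can be illustrated classically: once the period r of f(x) = a^x mod N is known, the factors of N fall out of two gcd computations. The sketch below, in plain Python, uses brute force to find the period, standing in for the quantum subroutine; this only works for tiny N such as 15, which is exactly the step a quantum computer accelerates exponentially.

```python
# Classical skeleton of Shor's algorithm: period finding -> factors.
from math import gcd

def period(a, n):
    """Smallest r > 0 with a^r mod n == 1 (the quantum subroutine's job)."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    r = period(a, n)
    if r % 2:                       # an odd period is unusable; retry with another a
        return None
    y = pow(a, r // 2, n)
    return sorted((gcd(y - 1, n), gcd(y + 1, n)))

print(shor_classical(15, 7))        # [3, 5]
```

For a = 7 and N = 15 the period is 4, so y = 7^2 mod 15 = 4 and gcd(3, 15), gcd(5, 15) recover the factors 3 and 5.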

The Problems - And Some Solutions

The benefits of quantum computing are promising, but there are still huge obstacles to overcome.

There are many problems to overcome, such as how to handle security and quantum cryptography. Long-term quantum information storage has also been a problem in the past. However, breakthroughs in the last 15 years and in the recent past have made some form of quantum computing practical. There is still much debate as to whether this is less than a decade away or a hundred years into the future. However, the potential that this technology offers is attracting tremendous interest from both the government and the private sector. Military applications include the ability to break encryption keys via brute-force searches, while civilian applications range from DNA modeling to complex material science analysis.
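The brute-force threat mentioned above can be put into rough numbers. Grover's algorithm searches an unstructured space of 2^k keys in on the order of sqrt(2^k) quantum queries, versus about 2^(k-1) classical guesses on average; this back-of-the-envelope sketch (plain Python, illustrative only) shows why the usual advice is to double symmetric key lengths rather than abandon them.

```python
# Rough query counts for brute-force key search, classical vs. Grover.
# These are order-of-magnitude figures, ignoring constant factors and
# the cost of each quantum query.

def queries(k_bits):
    space = 2 ** k_bits
    return {"classical_avg": space // 2,    # expected classical guesses
            "grover": int(space ** 0.5)}    # ~sqrt(N) quantum queries

for k in (64, 128, 256):
    q = queries(k)
    print(f"{k}-bit key: classical ~2^{k - 1}, Grover ~2^{k // 2} "
          f"({q['grover']:.3e} quantum queries)")
```

A 128-bit key drops to roughly 2^64 quantum queries, which is why a 256-bit key (still ~2^128 under Grover) is considered the conservative post-quantum choice for symmetric ciphers. Public-key schemes face the far more drastic exponential speedup of Shor's algorithm instead.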

The rest is here:
What is quantum computing?

Fellow at Sandia Labs Appointed to National Quantum Computing Advisory Committee – HPCwire

ALBUQUERQUE, N.M., Oct. 13, 2020 - Sandia National Laboratories Fellow Gil Herrera has been appointed to the newly established U.S. National Quantum Initiative Advisory Committee.

Herrera is one of two committee members representing the Department of Energy national laboratories. He joins 20 others from government, industry and academia tasked with advising the nation's highest offices on matters concerning quantum information science. His appointment is for three years.

Quantum information science, a broad field of study that includes quantum computing, concerns machines that accomplish extraordinary tasks by manipulating matter at the smallest scales.

"Quantum computing represents both an exceptional opportunity and a dire threat," Herrera said. "On the positive side, when useful quantum computers can be built, they could solve molecular chemistry problems that could significantly reduce worldwide energy consumption or facilitate the rapid development of pharmaceuticals. On a more negative note, a quantum computer threatens the public key encryption that protects almost all secure web communications."

In August, Sandia and more than a dozen collaborators, collectively called the Quantum Systems Accelerator, were selected as one of five national quantum research centers.

The national advisory committee, established on Aug. 28, advises offices including the president's and the secretary of energy's on how to maintain U.S. leadership in this area of technology.

"To me, leadership means that U.S. companies have the highest-performing quantum computers, from qubits through apps, and the best quantum sensors and communication systems," Herrera said. "Of equal importance, the U.S. quantum information technologies are not reliant on supply chains or intellectual property outside of the U.S., and the benefits of the U.S. government investments in quantum information science extend to all Americans, including those who manufacture quantum computing, sensing and communications systems."

A qubit is the basic processing unit of a quantum computer, analogous to a bit in a conventional computer.

Of his new appointment, which he will hold concurrently with his position at Sandia, Herrera said, "I hope to help the program achieve a balance between the needs of scientific advancement, commercial interests of U.S. businesses, and national security interests."

Herrera has recently been coordinating COVID-19 research efforts across Sandia's 14,000-strong workforce. A Sandia fellow since 2018, he also has spearheaded efforts to expand discovery research, served on an independent review team for a U.S. Department of Defense microelectronics program and has mentored staff members ranging from new hires to directors.

He previously served as the director of Sandia's Microsystems Engineering, Science and Applications complex, which researches and produces quantum technology in addition to its main mission of producing specialized microelectronics for the nation's nuclear stockpile.

Herrera has been director of the Laboratory for Physical Sciences, a joint University of Maryland and U.S. government research institute, and served at the White House Office of Science and Technology Policy as an American Association for the Advancement of Science/Sloan Fellow under President George H.W. Bush, where he worked on semiconductor and technology transfer policies.

He has received numerous awards for his service, including three Civilian Service medals from the Pentagon and the National Security Agency Research Medallion, and has received two distinguished alumni awards from the University of New Mexico.

Herrera earned his master's degree in electrical engineering from the University of California, Berkeley. An Albuquerque native, he received his bachelor's degree in computer engineering from UNM.

Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy's National Nuclear Security Administration. Sandia Labs has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California.

Source: Sandia National Laboratories


Bringing the promise of quantum computing to nuclear physics – MSUToday

Quantum mechanics, the physics of atoms and subatomic particles, can be strange, especially compared to the everyday physics of Isaac Newton's falling apples. But this unusual science is enabling researchers to develop new ideas and tools, including quantum computers, that can help demystify the quantum realm and solve complex everyday problems.

That's the goal behind a new U.S. Department of Energy Office of Science (DOE-SC) grant awarded to Michigan State University (MSU) researchers, led by physicists at the Facility for Rare Isotope Beams (FRIB). Working with Los Alamos National Laboratory, the team is developing algorithms (essentially programming instructions) for quantum computers to help these machines address problems that are difficult for conventional computers: for example, explaining the fundamental quantum science that keeps an atomic nucleus from falling apart.

The $750,000 award, provided by the Office of Nuclear Physics within DOE-SC, is the latest in a growing list of grants supporting MSU researchers developing new quantum theories and technology.

"The aim is to improve the efficiency and scalability of quantum simulation algorithms, thereby providing new insights on their applicability for future studies of nuclei and nuclear matter," said principal investigator Morten Hjorth-Jensen, an FRIB researcher who is also a professor in MSU's Department of Physics and Astronomy and a professor of physics at the University of Oslo in Norway.

Morten Hjorth-Jensen (Credit: Hilde Lynnebakken)

Although this grant focuses on nuclear physics, the algorithms it yields could benefit other fields looking to use quantum computing's promise to more rapidly solve complicated problems. This includes scientific disciplines such as chemistry and materials science, but also areas such as banking, logistics, and data analytics.

"There is a lot of potential for transferring what we are developing into other fields," Hjorth-Jensen said. "Hopefully, our results will lead to an increased interest in theoretical and experimental developments of quantum information technologies." All the algorithms developed as part of this work will be publicly available, he added.

What makes quantum computers attractive tools for these applications is a freedom afforded by quantum mechanics.

Classical computers are constrained to a binary system of zeros and ones with transistors that are either off or on. The restrictions on quantum computers are looser.

Instead of transistors, quantum computers use technology called qubits (pronounced q-bits) that can be both on and off at the same time. Not somewhere in between, but in both opposite states at once.

Combined with the proper algorithms, this freedom enables quantum computers to run certain calculations much faster than their classical counterparts: for instance, the type of calculations capable of helping scientists explain precisely how swarms of elementary particles known as quarks and gluons hold atomic nuclei together.

"It is really hard to do those problems, said Huey-Wen Lin, a co-investigator on the grant. I dont see a way to solve them any time soon with classical computers.

Huey-Wen Lin

Lin is an assistant professor in the Department of Physics and Astronomy and the Department of Computational Mathematics, Science and Engineering at MSU.

She added that quantum computers won't solve these problems immediately, either. But the timescales could be measured in years rather than careers.

Hjorth-Jensen believes this project will also help accelerate MSU's collaborations in quantum computing. Formally, this grant supports a collaboration of eight MSU researchers and staff scientist Patrick Coles at Los Alamos National Laboratory.

But Hjorth-Jensen hopes the project will spark more discussions and forge deeper connections with the growing community of quantum experts across campus and prepare the next generation of researchers. The grant will also open up new opportunities in quantum computing training for MSU students who are studying in the nation's top-ranked nuclear physics graduate program.

The grant, titled "From Quarks to Stars: A Quantum Computing Approach to the Nuclear Many-Body Problem," was awarded as part of "Quantum Horizons: Quantum Information Systems Research and Innovation for Nuclear Science," a funding opportunity issued by DOE-SC.

Hjorth-Jensen and Lin are joined on this grant by their MSU colleagues Alexei Bazavov and Matthew Hirn from the Department of Computational Mathematics, Science and Engineering; Scott Bogner, Heiko Hergert, Dean Lee and Andrea Shindler from FRIB, and the Department of Physics and Astronomy. Hirn is also an assistant professor in the Department of Mathematics.

MSU is establishing FRIB as a new user facility for the Office of Nuclear Physics in the U.S. Department of Energy Office of Science. Under construction on campus and operated by MSU, FRIB will enable scientists to make discoveries about the properties of rare isotopes in order to better understand the physics of nuclei, nuclear astrophysics, fundamental interactions, and applications for society, including in medicine, homeland security, and industry.

The U.S. Department of Energy Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of today's most pressing challenges. For more information, visit energy.gov/science.


Ten-year Forecasts for Quantum Networking Opportunities and Deployments Over the Coming Decade – WFMZ Allentown

DUBLIN, Oct. 12, 2020 /PRNewswire/ -- The "Quantum Networking: A Ten-year Forecast and Opportunity Analysis" report has been added to ResearchAndMarkets.com's offering.

This report presents detailed ten-year forecasts for quantum networking opportunities and deployments over the coming decade.

Today there is increasing talk about the Quantum Internet. This network will have the same geographical breadth of coverage as today's Internet, but where the Internet carries bits, the Quantum Internet will carry qubits, represented by quantum states. The Quantum Internet will provide a powerful platform for communications among quantum computers and other quantum devices. It will also further enable a quantum version of the Internet-of-Things. Finally, quantum networks can be the most secure networks ever built: if constructed properly, they are effectively invulnerable to eavesdropping.

Already there are sophisticated roadmaps showing how the Quantum Internet will come to be. At the present time, however, real-world quantum networking consists of three research programs and commercialization efforts: Quantum Key Distribution (QKD), which adds effectively unbreakable key distribution to public-key encryption; cloud/network access to quantum computers, which is core to the business strategies of leading quantum computer companies; and quantum sensor networks, which promise enhanced navigation and positioning, more sensitive medical imaging modalities, and more. This report provides ten-year forecasts for all three of these sectors.
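To make the QKD idea concrete, here is a highly simplified sketch of the sifting step of the BB84 protocol in plain Python. It is illustrative only: Alice sends random bits in random bases, Bob measures in random bases, and the two keep only the positions where their bases matched. Real QKD systems add error-rate estimation (to detect eavesdroppers) and privacy amplification on top of this.

```python
# Toy BB84 sifting: bases are compared publicly, bits are not.
import random

def bb84_sift(n, seed=0):
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.choice("+x") for _ in range(n)]
    bob_bases   = [rng.choice("+x") for _ in range(n)]
    # Bob's measurement recovers Alice's bit when bases match,
    # and yields a random bit otherwise.
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Keep only positions where the bases matched: the sifted key.
    return [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases)
            if ab == bb]

key = bb84_sift(16)
print(key)   # roughly half the positions survive sifting
```

The security argument rests on quantum mechanics rather than computational hardness: an eavesdropper who measures in the wrong basis disturbs the qubits, raising the error rate that Alice and Bob later check.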

This report provides a detailed quantitative analysis of where the emerging opportunities can be found today and how they will emerge in the future.

With regard to the scope of the report, the focus is, of course, on quantum networking opportunities of all kinds. It looks especially, however, at three areas: quantum key distribution (QKD), quantum computer networking/quantum clouds, and quantum sensor networks. The report also includes forecast breakouts by all the end-user segments of this market, including military and intelligence, law enforcement, banking and financial services, and general business applications, as well as niche applications. There are also breakouts by hardware, software and services as appropriate.

In addition, there is also some discussion of the latest research into quantum networking, including the critical work on quantum repeaters. Quantum repeaters allow entanglement between quantum devices over long distances. Most experts predict repeaters will start to prototype in real-world applications in about five years, but this is far from certain.

This report will be essential reading for equipment companies, service providers, telephone companies, data center managers, cybersecurity firms, IT companies and investors of various kinds.

Key Topics Covered:

Executive Summary
E.1 Goals, Scope and Methodology of this Report
E.1.1 A Definition of Quantum Networking
E.2 Quantum Networks Today: QKD, Quantum Clouds and Quantum Networked Sensors
E.2.1 Towards the Quantum Internet: Possible Business Opportunities
E.2.2 Quantum Key Distribution
E.2.3 Quantum Computer Networks/Quantum Clouds
E.2.4 Quantum Sensor Networks
E.3 Summary of Quantum Networking Market by Type of Network
E.4 The Need for Quantum Repeaters to Realize Quantum Networking's Potential
E.5 Plan of this Report

Chapter One: Ten-year Forecast of Quantum Key Distribution
1.1 Opportunities and Drivers for Quantum Key Distribution Networks
1.1.1 QKD vs. PQC
1.1.2 Evolution of QKD
1.1.3 Technology Assessment
1.2 Ten-year Forecasts of QKD Markets
1.2.1 QKD Equipment and Services
1.2.2 A Note on Mobile QKD
1.3 Key Takeaways from this Chapter

Chapter Two: Ten-Year Forecast of Quantum Computing Clouds
2.1 Quantum Computing: State of the Art
2.2 Current State of Quantum Clouds and Networks
2.3 Commercialization of Cloud Access to Quantum Computers
2.4 Ten-Year Forecast for Cloud Access to Quantum Computers
2.4.1 Penetration of Clouds in the Quantum Computing Space
2.4.2 Revenue from Network Equipment for Quantum Computer Networks by End-User Industry
2.4.3 Revenue from Network Equipment Software by End-User Industry
2.5 Key Takeaways from this Chapter

Chapter Three: Ten-Year Forecast of Quantum Sensor Networks
3.1 The Emergence of Networked Sensors
3.1.1 The Demand for Quantum Sensors Seems to be Real
3.2 The Future of Networked Sensors
3.3 Forecasts for Networked Quantum Sensors
3.4 Five Companies that will Shape the Future of the Quantum Sensor Business: Some Speculations

Chapter Four: Towards the Quantum Internet
4.1 A Roadmap for the Quantum Internet
4.1.1 The Quantum Internet in Europe
4.1.2 The Quantum Internet in China
4.1.3 The Quantum Internet in the U.S.
4.2 Evolution of Repeater Technology: Ten-year Forecast
4.3 Evolution of the Quantum Network
4.4 About the Analyst
4.5 Acronyms and Abbreviations Used in this Report

For more information about this report visit https://www.researchandmarkets.com/r/rksyxu

About ResearchAndMarkets.com

ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.

Research and Markets also offers Custom Research services providing focused, comprehensive and tailored research.

Media Contact:

Research and Markets
Laura Wood, Senior Manager
press@researchandmarkets.com

For E.S.T. Office Hours Call +1-917-300-0470
For U.S./CAN Toll Free Call +1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

U.S. Fax: 646-607-1907
Fax (outside U.S.): +353-1-481-1716


The Contest to Protect Almost Everything on the Internet – The Wall Street Journal

Cryptographers are in the business of being paranoid, but their fears over quantum computers might be justified. Within the next 10 to 15 years, a quantum computer could solve some problems many millions of times faster than a classical computer and, one day, crack many of the defenses used to secure the internet.

"The worst-case scenario is quite bad," says Chris Peikert, associate professor of computer science and engineering at the University of Michigan, who has been studying cryptography for two decades.

That is why Dr. Peikert and hundreds of the world's top cryptographers are involved in a competition to develop new encryption standards for the U.S., which would guard against both classical and quantum-computing cyberattacks.

This summer, federal officials announced the 15 algorithms that will be considered for standardization, meaning the winners would become a part of the architecture of the internet, protecting people's sensitive data.

Next, researchers will spend about a year trying to break them to see which ones hold up, and test them to get the best balance of performance and security.


This Week in Washington IP: Antitrust in the Ninth Circuit, Shaping Artificial Intelligence and Promoting Security in 5G Networks – IPWatchdog.com

This week in Washington IP events, the House of Representatives remains quiet during district work periods, while the Senate focuses this week on the nomination of Amy Coney Barrett to serve on the U.S. Supreme Court. Various tech-related events will take place at policy institutes this week, including several at the Center for Strategic & International Studies exploring efforts to maintain American leadership in semiconductor manufacturing and innovation in the intelligence community. The Hudson Institute is hosting a virtual event this week to discuss the impacts of the Ninth Circuit's recent decision to overturn Judge Lucy Koh's injunction against Qualcomm's patent licensing practices.

The Hudson Institute

Antitrust in the 21st Century: The Ninth Circuits Decision in FTC v. Qualcomm

At 12:00 PM on Monday, online video webinar.

In early August, a panel of circuit judges in the U.S. Court of Appeals for the Ninth Circuit issued a unanimous 3-0 decision in favor of Qualcomm in its appeal against Judge Lucy Koh's ruling in favor of the Federal Trade Commission (FTC), which featured an injunction against Qualcomm for its patent licensing practices in the semiconductor industry. While the FTC pursues en banc review of the Ninth Circuit's decision, this event will explore the FTC's chances for success on that petition as well as current guiding principles for those operating at the intersection of intellectual property rights and antitrust law. Speakers at this event will include Judge Paul R. Michel (Ret.), Former Chief Judge, U.S. Court of Appeals for the Federal Circuit; Richard A. Epstein, Professor of Law, New York University, Senior Fellow, Hoover Institution, and Professor Emeritus and Senior Lecturer, University of Chicago; Dina Kallay, Head of Competition (IPR, Americas, and Asia-Pacific), Ericsson; and Urška Petrovčič, Senior Fellow, Hudson Institute.

Center for Strategic & International Studies

American Leadership in Semiconductor Manufacturing

At 2:00 PM on Tuesday, online video webinar.

This June, Representatives Michael McCaul (R-TX) and Doris Matsui (D-CA) introduced H.R. 7178, the Creating Helpful Incentives to Produce Semiconductors (CHIPS) for America Act. If enacted, the bill would create a tax credit for entities investing in semiconductor manufacturing facilities, among other incentives meant to support domestic chipmakers. This event, which will focus on the importance of maintaining dominance in the semiconductor sector in the face of growing challenges from China, will feature a discussion between Rep. McCaul, who is also Co-Chair, House Semiconductor Caucus & Lead Republican, House Foreign Affairs Committee; and James Andrew Lewis, Senior Vice President and Director, Technology Policy Program.

Information Technology & Innovation Foundation

How Will Quantum Computing Shape the Future of AI?

At 9:00 AM on Wednesday, online video webinar.

The power of quantum computing to compute algorithms more quickly than classical computing relies in large part upon the nascent technology's ability to model extremely complex problems, giving quantum computers the ability to create stronger forecasts in sectors where many variables come into play, such as weather predictions. In artificial intelligence (AI), quantum algorithms could be a great boon in solving complex problems like climate forecasts and discovering novel drug compounds, so those nations which can take the lead in quantum computing will also likely have an edge in AI development. This event will feature a discussion with a panel including Hodan Omaar, Policy Analyst, Center for Data Innovation, ITIF; Freeke Heijman, Director, Strategic Development, QuTech Delft; Joseph D. Lykken, Deputy Director of Research, Fermi National Accelerator Laboratory; Markus Pflitsch, Chairman and Founder, Terra Quantum AG; and moderated by Eline Chivot, Senior Policy Analyst, Center for Data Innovation, ITIF.

The Hudson Institute

The Future of American Spectrum Policy: Is DoD's Request for Information the Best Direction?

At 3:00 PM on Wednesday, online video webinar.

In early August, the White House and the U.S. Department of Defense (DoD) announced a plan to devise a spectrum sharing framework that frees up 100 megahertz (MHz) of continuous mid-band spectrum currently held by the DoD to be auctioned by the Federal Communications Commission (FCC) for supporting the growth of 5G networks across the U.S. A request for information (RFI) issued by the DoD on September 18 seeks innovative solutions for dynamic spectrum sharing that effectively support national security while freeing up additional spectrum to be used by the 5G industry. Speakers at this event will include Harold Furchtgott-Roth, Director, Center for the Economics of the Internet; Michael O'Rielly, Commissioner, FCC; Robert McDowell, Former Commissioner, FCC; and Grace Koh, Ambassador and Special Advisor, Bureau of Economic and Business Affairs.

U.S. Patent and Trademark Office

Hear From USPTO Experts at the State Bar of Texas Advanced Intellectual Property Litigation Course

At 9:00 AM on Thursday, online video webinar.

On Thursday morning, the USPTO kicks off a two-day series of intellectual property litigation workshops being offered in partnership with the Intellectual Property Law Section of the State Bar of Texas. USPTO experts speaking at this event will include Molly Kocialski, Director, Rocky Mountain Regional Office; Miriam L. Quinn, Administrative Patent Judge, Patent Trial and Appeal Board; Todd J. Reves, Office of Policy and International Affairs; and Megan Hoyt, Dallas Regional Outreach Officer.

Center for Strategic & International Studies

Innovation in the Intelligence Community

At 3:00 PM on Thursday, online video webinar.

The U.S. intelligence community is careful to maintain secrecy in its operations, but this can come at a cost to that sector's ability to support the development of innovative technologies like quantum computing and artificial intelligence. However, a recent report on the innovation race in the intelligence community issued by the House Permanent Select Committee on Intelligence's Subcommittee on Strategic Technologies and Advanced Research provides several recommendations for the intelligence community to support tech development in areas crucial for national security. This event will feature a discussion on the report between Representative Jim Himes (D-CT), Chairman, House Permanent Select Committee on Intelligence's Subcommittee on Strategic Technologies and Advanced Research; and James Andrew Lewis, Senior Vice President and Director, Technology Policy Program.

Center for Strategic & International Studies

Sharpening America's Innovative Edge

At 11:00 AM on Friday, online video webinar.

Although the United States leapt to the forefront of global tech dominance thanks in large part to federal investment in R&D programs, the nation's research funding continues to follow an outdated Cold War-era funding model. This event coincides with a report released by CSIS's Trade Commission on Affirming American Leadership, which outlines a national strategy for developing important technology sectors so that the U.S. can remain ahead of its global counterparts in those fields. This event will feature a discussion with a panel including Ajay Banga, CEO, Mastercard; Richard Levin, Former President, Yale University; Kavita Shukla, Founder and CEO, The FRESHGLOW Co.; and moderated by Matthew P. Goodman, Senior Vice President for Economics and Simon Chair in Political Economy, CSIS.

The Heritage Foundation

5G: The Emerging Markets Trojan Horse

At 1:00 PM on Friday, online video webinar.

The United States and several governments across Europe have sounded the alarm in recent years over the risks of foreign surveillance of domestic networks enabled by the use of network infrastructure hardware developed by growing Chinese telecom firms like Huawei and ZTE, which have close ties with the Chinese communist government. While these developed nations have taken steps to prevent such issues in the 5G supply chain, governments in Africa and other developing areas of the world are forced to choose between protecting national security and building these crucial networks. Speakers at this event will include Bonnie Glick, Deputy Administrator, United States Agency for International Development; Joshua Meservey, Senior Policy Analyst, Africa and the Middle East; and hosted by Klon Kitchen, Director, Center for Technology Policy.

U.S. Patent and Trademark Office

2020 Patent Public Advisory Committee Annual Report Discussion

At 1:00 PM on Friday, online video webinar.

On Friday afternoon, the USPTO's Patent Public Advisory Committee (PPAC) will convene a meeting to discuss the annual report that the committee will prepare on the agency's policies, performance and user fees, which will be delivered to the White House and Congress by the end of the fiscal year.

Image Source: Deposit Photos; Copyright: jovannig; Image ID: 12633480


Global quantum computing market is projected to register a healthy CAGR of 29.5% in the forecast period of 2019 to 2026. – re:Jerusalem

An all-inclusive report will suit business requirements in many ways while also assisting in informed decision making and smart working. Company profiles of the key market competitors are analysed with respect to company snapshot, geographical presence, product portfolio, and recent developments. To figure out the market landscape, brand awareness, latest trends, possible future issues, industry trends and customer behaviour, a fine market research report is essential. A market research report provides myriad benefits for a prosperous business, and this report is a way to gain a competitive advantage in a quickly transforming marketplace.

Data Bridge Market Research recently released Global Quantum Computing Market research with more than 250 market data tables and figures and an easy-to-understand TOC, so you can get a variety of ways to maximize your profits. Quantum Computing is predicted until 2026. The Quantum Computing market research report classifies the competitive spectrum of this industry in elaborate detail.

Access Insightful Study about Quantum Computing market! Click Here to Get FREE PDF Sample Market Analysis:https://www.databridgemarketresearch.com/request-a-sample/?dbmr=global-quantum-computing-market&sc

Be the first to uncover the potential that the Global Quantum Computing market is holding. Uncover the gaps and opportunities to derive the most relevant insights from our research document on market size.

Global Quantum Computing Market :

The global quantum computing market is projected to register a healthy CAGR of 29.5% in the forecast period of 2019 to 2026.

If you are involved in the Quantum Computing industry or intend to be, then this study will provide you a comprehensive outlook, and it is vital you keep your market knowledge up to date. The market is segmented By System (Single Qubit Quantum System and Multiple Qubit System), Qubits (Trapped Ion Qubits, Semiconductor Qubits and Super Conducting), Deployment Model (On-Premises and Cloud), Component (Hardware, Software and Services), Application (Cryptography, Simulation, Parallelism, Machine Learning, Algorithms, Others), Logic Gates (Toffoli Gate, Hadamard Gate, Pauli Logic Gates and Others), Verticals (Banking and Finance, Healthcare & Pharmaceuticals, Defence, Automotive, Chemical, Utilities, Others) and Geography (North America, South America, Europe, Asia-Pacific, Middle East and Africa): Industry Trends and Forecast to 2026.

Unlock new opportunities with DBMR reports to gain insightful analyses of the Quantum Computing market and have a comprehensive understanding. Learn about the market strategies being adopted by your competitors and leading organizations, as well as potential and niche segments/regions exhibiting promising growth.

New vendors in the market are facing tough competition from established international vendors as they struggle with technological innovations, reliability and quality issues. The report will answer questions about the current market developments and the scope of competition, opportunity, cost and more.

According to the regional segmentation, the Quantum Computing Market report covers the following regions:

The key countries in each region are taken into consideration as well, such as United States, Canada, Mexico, Brazil, Argentina, Colombia, Chile, South Africa, Nigeria, Tunisia, Morocco, Germany, United Kingdom (UK), the Netherlands, Spain, Italy, Belgium, Austria, Turkey, Russia, France, Poland, Israel, United Arab Emirates, Qatar, Saudi Arabia, China, Japan, Taiwan, South Korea, Singapore, India, Australia and New Zealand etc.

Market Dynamics:

A set of qualitative information that includes PESTEL analysis, Porter's Five Forces model, value chain analysis, macroeconomic factors and the regulatory framework, along with industry background and overview.

Some of the major highlights of the TOC:

Chapter 1: Methodology & Scope

Definition and forecast parameters

Methodology and forecast parameters

Data Sources

Chapter 2: Executive Summary

Business trends

Regional trends

Product trends

End-use trends

Chapter 3: Quantum Computing Industry Insights

Industry segmentation

Industry landscape

Vendor matrix

Technological and innovation landscape

Chapter 4: Quantum Computing Market, By Region

North America

South America

Europe

Asia-Pacific

Middle East and Africa

Chapter 5: Company Profile

Business Overview

Financial Data

Product Landscape

Strategic Outlook

SWOT Analysis

Thanks for reading this article; you can also get individual chapter-wise sections or region-wise report versions, such as North America, Europe or Asia.

BROWSE FREE | TOC with selected illustrations and example pages of the Global Quantum Computing Market @ https://www.databridgemarketresearch.com/inquire-before-buying/?dbmr=global-quantum-computing-market&sc

In addition, the years considered for the study are as follows:

Historical year 2014-2019 | Base year 2019 | Forecast period 2020 to 2027

Key insights the study will provide:

A 360-degree overview of the quantum computing market at the global and regional level

Market Share & Sales Revenue by Key Players & Emerging Regional Players

Competitors: leading players in the quantum computing industry are studied with respect to their company profile, product portfolio, capacity, price, cost and revenue.

A separate chapter on market entropy offers insights into leaders' aggressiveness toward the market (mergers & acquisitions, recent investments and key developments)

Patent analysis: number of patents/trademarks filed in recent years.

A complete and useful guide for new market aspirants

Forecast information will drive strategic, innovative and profitable business plans, and SWOT analysis of players will pave the way for growth opportunities, risk analysis, investment feasibility and recommendations.

Supply and consumption: in continuation of sales, this section studies supply and consumption for the quantum computing market, sheds light on the gap between the two, and gives import and export figures.

Production analysis: production of quantum computing systems is analyzed with respect to different regions, types and applications; a price analysis of key market players is also covered.

Sales and revenue analysis: both sales and revenue are studied for the different regions of the quantum computing market; price, which plays an important part in revenue generation, is also assessed for the various regions.

Other analyses: apart from the above, trade and distribution analysis for the quantum computing market.

Competitive landscape: company profiles for listed players, with SWOT analysis, business overview, product/service specifications, business headquarters, downstream buyers and upstream suppliers.

(Coverage may vary depending on the availability and feasibility of data for the targeted industry.)

Inquire for further detailed information on the Global Quantum Computing Market report @ https://www.databridgemarketresearch.com/inquire-before-buying/?dbmr=global-quantum-computing-market&sc


Research Methodology: Global Quantum Computing Market

Primary Respondents: OEMs, manufacturers, engineers, industrial professionals.

Industry Participants: CEOs, VPs, marketing/product managers, market intelligence managers and national sales managers.

About Data Bridge Market Research:

The surest way to forecast what the future holds is to understand the trends of today.

Data Bridge has set itself forth as an unconventional and neoteric market research and consulting firm with an unparalleled level of resilience and integrated approaches. We are determined to unearth the best market opportunities and foster efficient information for your business to thrive in the market. Data Bridge endeavors to provide appropriate solutions to complex business challenges and initiates an effortless decision-making process.

Data Bridge excels at creating satisfied clients who rely on our services and trust our hard work with certitude. We are proud of our 99.9% client satisfaction rate.

Contact:

Data Bridge Market Research
US: +1 888 387 2818
UK: +44 208 089 1725
Hong Kong: +852 8192 7475
Email: Corporatesales@databridgemarketresearch.com


Commentary: Can AI and machine learning improve the economy? – FreightWaves

The views expressed here are solely those of the author and do not necessarily represent the views of FreightWaves or its affiliates.

In this installment of the AI in Supply Chain series (#AIinSupplyChain), I tried to discern the outlines of an answer to the question posed in the headline above by reading three academic papers. This article distills what I consider the most important takeaways from the papers.

Although the investigations that resulted in these papers look at the economy as a whole, the implications also apply at the level of an individual firm. So, if you are responsible for innovation, corporate development and strategy at your company, it's probably worth your time to read each of them and then interpret the findings for your own firm.

In the first paper, Erik Brynjolfsson, Daniel Rock and Chad Syverson explore the paradox that while systems using artificial intelligence are advancing rapidly, measured economywide productivity has declined.

Optimism about AI and machine learning is driven by recent and dramatic improvements in machine perception and cognition. These skills are essential to the ways in which people get work done, so this has fueled hopes that machines will rapidly approach, and possibly surpass, people in their ability to do many tasks that today are the preserve of humans.

However, productivity statistics do not yet reflect growth that is driven by the advances in AI and machine learning. If anything, the authors cite statistics to suggest that labor productivity growth fell in advanced economies starting in the mid-2000s and has not recovered to its previous levels.

Therein lies the paradox: AI and machine learning boosters predict it will transform entire swathes of the economy, yet the economic data do not point to such a transformation taking place. What gives?

The authors offer four possible explanations.

First, it is possible that the optimism about AI and machine learning technologies is misplaced. Perhaps they will be useful in certain narrow sectors of the economy, but ultimately their economywide impact will be modest and insignificant.

Second, it is possible that the impact of AI and machine learning technologies is not being measured accurately. Here it is pessimism about the significance of these technologies that prevents society from accurately measuring their contribution to economic productivity.

Third, perhaps these new technologies are producing positive returns to the economy, BUT these benefits are being captured by a very small number of firms and as such the rewards are enjoyed by only a minuscule fraction of the population.

Fourth, the benefits of AI and machine learning will not be reflected in the wider economy until investments have been made to build up complementary technologies, processes, infrastructure, human capital and other types of assets that make it possible for society to realize and measure the transformative benefits of AI and machine learning.

The authors argue that AI, machine learning and their complementary new technologies embody the characteristics of general purpose technologies (GPTs). A GPT has three primary features: It is pervasive or can become pervasive; it can be improved upon as time elapses; and it leads directly to complementary innovations.

Electricity. The internal combustion engine. Computers. The authors cite these as examples of GPTs with which readers are familiar.

Crucially, the authors state that a GPT can at one moment both be present and yet not affect current productivity growth if there is a need to build a sufficiently large stock of the new capital, or if complementary types of capital, both tangible and intangible, need to be identified, produced, and put in place to fully harness the GPT's productivity benefits.

It takes a long time for economic production at the macro- or micro-scale to be reorganized to accommodate and harness a new GPT. The authors point out that computers took 25 years before they became ubiquitous enough to have an impact on productivity. It took 30 years for electricity to become widespread. As the authors state, the changes required to harness a new GPT take substantial time and resources, contributing to organizational inertia. Firms are complex systems that require an extensive web of complementary assets to allow the GPT to fully transform the system. Firms that are attempting transformation often must reevaluate and reconfigure not only their internal processes but often their supply and distribution chains as well.

The authors end the article by stating: Realizing the benefits of AI is far from automatic. It will require effort and entrepreneurship to develop the needed complements, and adaptability at the individual, organizational, and societal levels to undertake the associated restructuring. Theory predicts that the winners will be those with the lowest adjustment costs and that put as many of the right complements in place as possible. This is partly a matter of good fortune, but with the right roadmap, it is also something for which they, and all of us, can prepare.

In the second paper, Brynjolfsson, Xiang Hui and Meng Liu explore the effect that the introduction of eBay Machine Translation (eMT) had on eBay's international trade. The authors describe eMT as an in-house machine learning system that statistically learns how to translate among different languages. They also state: As a platform, eBay mediated more than 14 billion dollars of global trade among more than 200 countries in 2014. Basically, eBay represents a good approximation of a complex economy within which to examine the economywide benefits of this type of machine translation.

The authors state: We show that a moderate quality upgrade increases exports on eBay by 17.5%. The increase in exports is larger for differentiated products, cheaper products, and listings with more words in their title. Machine translation also causes a greater increase in exports to less experienced buyers. These heterogeneous treatment effects are consistent with a reduction in translation-related search costs, which comes from two sources: (1) an increased matching relevance due to improved accuracy of the search query translation and (2) better translation quality of the listing title in buyers' language.

They report an accompanying 13.1% increase in revenue, even though they only observed a 7% increase in the human acceptance rate.

They also state: To put our result in context, Hui (2018) has estimated that a removal of export administrative and logistic costs increased export revenue on eBay by 12.3% in 2013, which is similar to the effect of eMT. Additionally, Lendle et al. (2016) have estimated that a 10% reduction in distance would increase trade revenue by 3.51% on eBay. This means that the introduction of eMT is equivalent of [sic] the export increase from reducing distances between countries by 37.3%. These comparisons suggest that the trade-hindering effect of language barriers is of first-order importance. Machine translation has made the world significantly smaller and more connected.
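The 37.3% equivalence the authors cite follows from linearly scaling the Lendle et al. estimate. A quick back-of-envelope check, using only the figures quoted above:

```python
# Back-of-envelope reproduction of the comparison cited above (my own
# arithmetic, using only the numbers quoted from the papers).
revenue_gain_emt = 13.1          # % export revenue increase from eMT
gain_per_10pct_distance = 3.51   # % revenue gain per 10% distance reduction

equivalent_distance_reduction = revenue_gain_emt / gain_per_10pct_distance * 10
print(f"{equivalent_distance_reduction:.1f}%")  # close to the 37.3% in the paper
```

In other words, eMT's effect on export revenue is what a roughly 37% shrinking of the distance between countries would have produced, which is why the authors call language barriers a first-order trade friction.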

In the third paper, Brynjolfsson, Rock and Syverson develop a model that shows how GPTs like AI enable and require significant complementary investments, including co-invention of new processes, products, business models and human capital. These complementary investments are often intangible and poorly measured in the national accounts, even when they create valuable assets for the firm. The authors show how this leads to an underestimation of productivity growth in the early years of a new GPT and, later, when the benefits of intangible investments are harvested, to an overestimation of productivity growth. Their model generates a Productivity J-Curve that can explain the productivity slowdowns that often accompany the advent of GPTs, as well as the later increase in productivity.

The authors find that, first, "As firms adopt a new GPT, total factor productivity growth will initially be underestimated because capital and labor are used to accumulate unmeasured intangible capital stocks." Second, "Later, measured productivity growth overestimates true productivity growth because the capital service flows from those hidden intangible stocks generates measurable output." Finally, "The error in measured total factor productivity growth therefore follows a J-curve shape, initially dipping while the investment rate in unmeasured capital is larger than the investment rate in other types of capital, then rising as growing intangible stocks begin to contribute to measured production."
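The mechanics can be seen in a stylized simulation. All numbers here are my own toy choices, not the authors' calibration:

```python
# Stylized simulation of the Productivity J-Curve. In the build-up phase,
# inputs are diverted into unmeasured intangible investment, dragging
# measured growth below the true rate; in the harvest phase the hidden
# intangible stock yields measurable output and measured growth overshoots.

TRUE_GROWTH = 0.03  # underlying "true" productivity growth rate

def measured_growth_path(periods=10, invest_share=0.2, payoff=0.025):
    intangible_stock = 0.0
    path = []
    for t in range(periods):
        if t < periods // 2:                      # build-up phase
            intangible_stock += invest_share
            g = TRUE_GROWTH - invest_share * 0.1  # unmeasured investment drag
        else:                                     # harvest phase
            g = TRUE_GROWTH + intangible_stock * payoff
        path.append(g)
    return path

path = measured_growth_path()
# Measured growth starts below TRUE_GROWTH and ends above it: the J shape.
```

The point of the toy model is only the sign pattern, a dip followed by an overshoot, which is exactly the measurement error the authors describe.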

This explains the observed phenomenon that when a new technology like AI and machine learning, or something like blockchain and distributed ledger technology, is introduced into an area such as supply chain, it generates furious debate about whether it creates any value for incumbent suppliers or customers.

If we consider the reported time it took before other GPTs like electricity and computers began to contribute measurably to firm-level and economywide productivity, we must admit that it is perhaps too early to write off blockchains and other distributed ledger technologies, or AI and machine learning, and their applications in sectors of the economy that are not usually associated with internet and other digital technologies.

Give it some time. However, I think we are near the inflection point of the AI and Machine Learning Productivity J-curve. As I have worked on this #AIinSupplyChain series, I have become more convinced that the companies that are experimenting with AI and machine learning in their supply chain operations now will have the advantage over their competitors over the next decade.

I think we are a bit farther away from the inflection point of a Blockchain and Distributed Ledger Technologies Productivity J-Curve. I cannot yet make a cogent argument about why this is true, although in March 2014, I published #ChainReaction: Who Will Own The Age of Cryptocurrencies? part of an ongoing attempt to understand when blockchains and other distributed technologies might become more ubiquitous than they are now.

Examining this topic has added to my understanding of why disruption happens. The authors of the Productivity J-Curve paper state that the more transformative the new technology, the more likely its productivity effects will initially be underestimated.

The long duration during which incumbent firms underestimate the productivity effects of a relatively new GPT is what contributes to the phenomenon studied by Rebecca Henderson and Kim Clark in Architectural Innovation: The Reconfiguration of Existing Product Technologies and the Failure of Established Firms. It is also described as Supply Side Disruption by Joshua Gans in his book, The Disruption Dilemma, and summarized in this March 2016 HBR article, The Other Disruption.

If we focus on AI and machine learning specifically, in an exchange on Twitter on Sept. 27, Brynjolfsson said, The machine translation example is in many ways the exception. More often it takes a lot of organizational reinvention and time before AI breakthroughs translate into productivity gains.

By the time entrenched and industry-leading incumbents awaken to the threats posed by newly developed GPTs, a crop of challengers who had no option but to adopt the new GPT at the outset has become powerful enough to threaten the financial stability of an industry.

One example? E-commerce and its impact on retail in general.

If you are an executive, what experiments are you performing to figure out if and how your company's supply chain operations can be made more productive by implementing technologies that have so far been underestimated by you and other incumbents in your industry?

If you are not doing anything yet, are you fulfilling your obligations to your company's shareholders, employees, customers and other stakeholders?

If you are a team working on innovations that you believe have the potential to significantly refashion global supply chains, we'd love to tell your story in FreightWaves. I am easy to reach on LinkedIn and Twitter. Alternatively, you can reach out to any member of the editorial team at FreightWaves at media@freightwaves.com.

Dig deeper into the #AIinSupplyChain Series with FreightWaves.

Commentary: Optimal Dynamics the decision layer of logistics?

Commentary: Combine optimization, machine learning and simulation to move freight

Commentary: SmartHop brings AI to owner-operators and brokers

Commentary: Optimizing a truck fleet using artificial intelligence

Commentary: FleetOps tries to solve data fragmentation issues in trucking

Commentary: Bulgarias Transmetrics uses augmented intelligence to help customers

Commentary: Applying AI to decision-making in shipping and commodities markets

Commentary: The enabling technologies for the factories of the future

Commentary: The enabling technologies for the networks of the future

Commentary: Understanding the data issues that slow adoption of industrial AI

Commentary: How AI and machine learning improve supply chain visibility, shipping insurance

Commentary: How AI, machine learning are streamlining workflows in freight forwarding, customs brokerage

Author's disclosure: I am not an investor in any early-stage startups mentioned in this article, either personally or through REFASHIOND Ventures. I have no other financial relationship with any entities mentioned in this article.


The secrets of small data: How machine learning finally reached the enterprise – VentureBeat

Over the past decade, big data has become Silicon Valley's biggest buzzword. When they're trained on mind-numbingly large data sets, machine learning (ML) models can develop a deep understanding of a given domain, leading to breakthroughs for top tech companies. Google, for instance, fine-tunes its ranking algorithms by tracking and analyzing more than one trillion search queries each year. It turns out that the Solomonic power to answer all questions from all comers can be brute-forced with sufficient data.

But there's a catch: Most companies are limited to small data; in many cases, they possess only a few dozen examples of the processes they want to automate using ML. If you're trying to build a robust ML system for enterprise customers, you have to develop new techniques to overcome that dearth of data.

Two techniques in particular, transfer learning and collective learning, have proven critical in transforming small data into big data, allowing average-sized companies to benefit from ML use cases that were once reserved for Big Tech alone. And because just 15% of companies have deployed AI or ML already, there is a massive opportunity for these techniques to transform the business world.

Above: Using the data from just one company, even modern machine learning models are only about 30% accurate. But thanks to collective learning and transfer learning, Moveworks can determine the intent of employees' IT support requests with over 90% precision.

Image Credit: Moveworks

Of course, data isn't the only prerequisite for a world-class machine learning model; there's also the small matter of building that model in the first place. Given the short supply of machine learning engineers, hiring a team of experts to architect an ML system from scratch is simply not an option for most organizations. This disparity helps explain why a well-resourced tech company like Google benefits disproportionately from ML.

But over the past several years, a number of open source ML models, including the famous BERT model for understanding language that Google released in 2018, have started to change the game. The complexity of creating a model the caliber of BERT, whose aptly named large version has about 340 million parameters, means that few organizations can even consider quarterbacking such an initiative. However, because it's open source, companies can now tweak that publicly available playbook to tackle their specific use cases.

To understand what these use cases might look like, consider a company like Medallia, a Moveworks customer. On its own, Medallia doesn't possess enough data to build and train an effective ML system for an internal use case, like IT support. Yet its small data does contain a treasure trove of insights waiting for ML to unlock. And by leveraging new techniques to glean these insights, Medallia has become more efficient, from recognizing which internal workflows need attention to understanding the company-specific language its employees use when asking for tech support.

So here's the trillion-dollar question: How do you take an open source ML model designed to solve a particular problem and apply that model to a disparate problem in the enterprise? The answer starts with transfer learning, which, unsurprisingly, entails transferring knowledge gained from one domain to a different domain that has less data.

For example, by taking an open source ML model like BERT, designed to understand generic language, and refining it at the margins, it is now possible for ML to understand the unique language employees use to describe IT issues. And language is just the beginning, since we've only begun to realize the enormous potential of small data.

Above: Transfer learning leverages knowledge from a related domain typically one with a greater supply of training data to augment the small data of a given ML use case.

Image Credit: Moveworks
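To make the transfer-learning recipe concrete, here is a deliberately tiny sketch. The "pretrained encoder" is a toy stand-in for a model like BERT, with invented word vectors; the key idea survives the simplification: the encoder stays frozen, and only a small head is trained on a handful of domain-specific examples.

```python
# Minimal transfer-learning sketch. The pretrained vectors are invented
# stand-ins for a real checkpoint; only the small head below is trained.

PRETRAINED_VECTORS = {  # frozen "generic language" word vectors
    "vpn": [1.0, 0.0], "password": [0.9, 0.1], "reset": [0.8, 0.2],
    "lunch": [0.0, 1.0], "party": [0.1, 0.9],
}

def encode(text):
    """Frozen encoder: average the pretrained vectors of known words."""
    vecs = [PRETRAINED_VECTORS[w] for w in text.lower().split()
            if w in PRETRAINED_VECTORS]
    if not vecs:
        return [0.0, 0.0]
    return [sum(col) / len(vecs) for col in zip(*vecs)]

def train_head(examples, epochs=20, lr=0.5):
    """Train only a small linear head on a few labeled examples."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for text, label in examples:  # label: 1 = IT issue, 0 = not
            x = encode(text)
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = label - pred        # perceptron update on mistakes
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

w, b = train_head([("reset my vpn password", 1), ("team lunch party", 0)])

def classify(text):
    x = encode(text)
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
```

Because the encoder already "knows" the generic language, two labeled examples are enough for the head to separate IT requests from everything else, which is the whole appeal of the approach for small-data companies.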

More generally, this practice of feeding an ML model a very small and very specific selection of training data is called few-shot learning, a term that's quickly become one of the new big buzzwords in the ML community. Some of the most powerful ML models ever created, such as the landmark GPT-3 model, whose 175 billion parameters are orders of magnitude more than BERT's, have demonstrated an unprecedented knack for learning novel tasks with just a handful of examples as training.

Taking essentially the entire internet as its tangential domain, GPT-3 quickly becomes proficient at these novel tasks by building on a powerful foundation of knowledge, in the same way Albert Einstein wouldn't need much practice to become a master at checkers. And although GPT-3 is not open source, applying similar few-shot learning techniques will enable new ML use cases in the enterprise, ones for which training data is almost nonexistent.

With transfer learning and few-shot learning on top of powerful open source models, ordinary businesses can finally buy tickets to the arena of machine learning. But while training ML with transfer learning takes several orders of magnitude less data, achieving robust performance requires going a step further.

That step is collective learning, which comes into play when many individual companies want to automate the same use case. Whereas each company is limited to small data, third-party AI solutions can use collective learning to consolidate those small data sets, creating a large enough corpus for sophisticated ML. In the case of language understanding, this means abstracting sentences that are specific to one company to uncover underlying structures:

Above: Collective learning involves abstracting data in this case, sentences with ML to uncover universal patterns and structures.

Image Credit: Moveworks
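A minimal sketch of that abstraction step follows. The company names, app names and placeholder scheme are all invented for illustration; the point is that stripping company-specific entities collapses utterances from different customers onto shared templates that one model can learn from.

```python
import re

# Sketch of the abstraction step in collective learning: replace
# company-specific entities with generic placeholders so each company's
# small data pools into one larger corpus.

COMPANY_APPS = {
    "acme":   ["AcmeMail", "AcmeWiki"],
    "globex": ["GlobexChat", "GlobexDocs"],
}

def abstract(utterance, company):
    """Replace company-specific app names with a generic placeholder."""
    for app in COMPANY_APPS[company]:
        utterance = re.sub(re.escape(app), "<APP>", utterance)
    return utterance

pooled = [
    abstract("I can't log in to AcmeMail", "acme"),
    abstract("I can't log in to GlobexChat", "globex"),
]
# Both sentences now share one template, so the two companies' examples
# count as two observations of the same underlying structure.
```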

The combination of transfer learning and collective learning, among other techniques, is quickly redrawing the limits of enterprise ML. For example, pooling together multiple customers' data can significantly improve the accuracy of models designed to understand the way their employees communicate. Well beyond understanding language, of course, we're witnessing the emergence of a new kind of workplace: one powered by machine learning on small data.


Bespoken Spirits raises $2.6M in seed funding to combine machine learning and accelerated whiskey aging – TechCrunch

Bespoken Spirits, a Silicon Valley spirits company that has developed a new data-driven process to accelerate the aging of whiskey and create specific flavors, today announced that it has raised a $2.6 million seed funding round. Investors include Clos de la Tech owner T.J. Rodgers and baseball's Derek Jeter.

The company was co-founded by former Bloom Energy, BlueJeans and Mixpanel exec Stu Aaron and another Bloom Energy alum, Martin Janousek, whose name can be found on a fair number of Bloom Energy patents.

Bespoken isn't the first startup to venture into accelerated aging, a process that tries to minimize the time it takes to age these spirits, which is typically done in wooden barrels. The company argues that it's the first to combine that with a machine learning-based approach, through what it calls its ACTivation technology.

Rather than putting the spirit in a barrel and passively waiting for nature to take its course, and just rolling the dice and seeing what happens, we instead use our proprietary ACTivation technology, with the A, C and T standing for aroma, color and taste, to instill the barrel into the spirit and actively control the process and the chemical reactions, in order to deliver premium-quality tailored spirits, and to be able to do that in just days rather than decades, explained Aaron.

Image Credits: Bespoken Spirits

And while there is surely a lot of skepticism around this technology, especially in a business that typically prides itself on its artisanal approach, the company has won prizes at a number of competitions. The team argues that traditional barrel aging is a wasteful process, where you lose 20% of the product through evaporation, and one that is hard to replicate. And because of how long it takes, it also creates financial challenges for upstarts in this business and it makes it hard to innovate.

As the co-founders told me, there are three pillars to the business: selling its own brand of spirits, maturation-as-a-service for rectifiers and distillers, and producing custom private-label spirits for retailers, bars and restaurants. At first, the team mostly focused on the latter two, especially its maturation-as-a-service business. Right now, Aaron noted, a lot of craft distilleries are facing financial strains and need to unlock their inventory and get their product to market sooner, and maybe at a better quality and hence a higher price point than they previously could.

Theres also the existing market of rectifiers, who, at least in the U.S., take existing products and blend them. These, too, are looking for ways to improve their processes and make it more replicable.

Interestingly, a lot of breweries, too, are now sitting on excess or expired beer because of the pandemic. They're realizing that rather than paying somebody to dispose of that beer and take it back, they can actually recycle, or maybe upcycle is a better word, the beer by distilling it into whiskey, Aaron said. But unfortunately, when a brewery distills beer into whiskey, it's typically not very good whiskey. And that's where we come in. We can take that beer bin, as a lot of people call the initial distillation, and we can convert it into a premium-quality whiskey.

Image Credits: Bespoken Spirits

Bespoken is also working with a few grocery chains, for example, to create bespoke whiskeys for their house brands that match the look and flavor of existing brands or that offer completely new experiences.

The way the team does this is by collecting a lot of data throughout its process and then having a tasting panel describe the product for them. Using that data and feeding it into its systems, the company can then replicate the results or tweak them as necessary without having to wait for years for a barrel to mature.
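One way to picture that loop, as a toy sketch: record each run's process settings alongside the tasting panel's scores, then match a new flavor target to the closest recorded run. Every parameter and score below is invented for illustration, not Bespoken's data.

```python
# Toy sketch of the data loop described above (all numbers invented).
# Each run pairs process parameters with the panel's (aroma, color, taste)
# scores; a new flavor target is matched to the closest recorded run.

runs = [
    ({"temp_C": 30, "days": 3, "wood": "oak"},   (7.0, 6.5, 7.2)),
    ({"temp_C": 45, "days": 5, "wood": "maple"}, (5.5, 8.0, 6.1)),
    ({"temp_C": 38, "days": 4, "wood": "oak"},   (6.8, 7.4, 7.9)),
]

def closest_run(target):
    """Return the recorded process whose panel scores best match the target."""
    def dist(scores):
        return sum((s - t) ** 2 for s, t in zip(scores, target))
    return min(runs, key=lambda run: dist(run[1]))[0]

params = closest_run((7.0, 7.0, 8.0))  # desired aroma/color/taste profile
```

In practice the mapping would be a learned model rather than a nearest-neighbor lookup, but the shape of the loop, process data in, panel scores out, then invert to hit a target profile, is the same.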

We're collecting all this data, and some of the data that we're collecting today, we don't even know yet what we're going to use it for, Janousek said. Using its proprietary techniques, Bespoken will often create dozens of samples for a new customer and then help them whittle those down.

I often like to describe our company as a cross between 23andMe, Nespresso and Impossible Foods, Aaron said. We're like 23andMe because, again, we're trying to map the customer preference to the recipe to results. There is this big data, genome-mapping kind of a thing. And we're like Nespresso because our machine takes spirit and supply pods and produces results, although obviously we're industrial scale and they're not. And it's like Impossible Foods, because it's totally redefining an age-old, antiquated model to be completely different.

The company plans to use the new funding to accelerate its market momentum and build out its technology. Its house brand is currently available for sale in California, Wisconsin and New York.

The company's ability to deliver both quality and variety is what really caught my attention and made me want to invest, said T.J. Rodgers. In a short period of time, they've already produced an incredible range of top-notch spirits, from whiskeys to rum, brandy and tequila, all independently validated time and again in blind tastings and prestigious competitions.

Full disclaimer: The company sent me a few samples. I'm not enough of a whiskey aficionado to review them, but I did enjoy them (responsibly).
