University of Exeter: Speeding up machine learning by means of light – India Education Diary

An international team of researchers has developed a next-generation computer accelerator chip that processes data using light rather than electronics.

Scientists have developed a pioneering new approach that will rapidly speed up machine learning using light.

An international team of researchers from the Universities of Münster, Oxford, Exeter, Pittsburgh, École Polytechnique Fédérale (EPFL) and IBM Research Zurich has developed a next-generation computer accelerator chip that processes data using light rather than electronics.

The results are published in the leading scientific journal Nature on Wednesday, January 6th.

Professor C. David Wright of the University of Exeter, who leads the EU project Fun-COMP which funded this work, said: "Conventional computer chips are based on electronic data transfer and are comparatively slow, but light-based processors such as that developed in our work enable complex mathematical tasks to be processed at speeds hundreds or even thousands of times faster, and with hugely reduced energy consumption."

The team of researchers, led by Prof. Wolfram Pernice from the Institute of Physics and the Center for Soft Nanoscience at the University of Münster, combined integrated photonic devices with phase-change materials (PCMs) to deliver super-fast, energy-efficient matrix-vector (MV) multiplications.

MV multiplications lie at the heart of modern computing, from AI and machine learning to neural network processing. The imperative to carry out such calculations at ever-increasing speeds, but with ever-decreasing energy consumption, is driving the development of a whole new class of processor chips, so-called tensor processing units (TPUs).
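To make concrete what that workload is (this is just the underlying arithmetic, not the photonic implementation), here is a minimal NumPy sketch of the matrix-vector product at the core of a neural-network layer; the sizes are illustrative assumptions:

```python
import numpy as np

# A dense neural-network layer boils down to one matrix-vector multiplication:
# a stored weight matrix W (held, on the photonic chip, in phase-change
# material) applied to an input activation vector x. Sizes are arbitrary.
rng = np.random.default_rng(0)
W = rng.standard_normal((128, 784))  # weights: 128 outputs, 784 inputs
x = rng.standard_normal(784)         # input vector, e.g. a flattened 28x28 image

y = W @ x                            # the matrix-vector product a TPU accelerates
out = np.maximum(y, 0.0)             # a ReLU nonlinearity applied afterwards
print(out.shape)                     # (128,)
```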

The team developed a new type of photonic TPU, one capable of carrying out multiple MV multiplications simultaneously and in parallel, using a chip-based frequency comb as a light source along with wavelength-division multiplexing.

The matrix elements were stored using PCMs, the same material currently used for re-writable DVD and Blu-ray optical discs, making it possible to preserve matrix states without the need for an energy supply.

In their experiments, the team used their photonic TPU in a so-called convolutional neural network for the recognition of handwritten numbers and for image filtering. "Our study is the first to apply frequency combs in the field of artificial neural networks," says Prof. Wolfram Pernice.

"Our results could have a wide range of applications," explained Prof. Harish Bhaskaran from the University of Oxford, a key member of the team. "A photonic TPU could quickly and efficiently process huge data sets used for medical diagnoses, such as those from CT, MRI and PET scanners," he continued.

Further applications could also be found in self-driving vehicles, which depend on the fast evaluation of data from multiple sensors, as well as in the provision of IT infrastructure such as cloud computing.


Meet the AAS Keynote Speakers: Dr. Brian Nord – Astrobites

In this series of posts, we sit down with a few of the keynote speakers of the 237th AAS meeting to learn more about them and their research. You can see a full schedule of their talks here, and read our other interviews here!

You might have noticed a rise in the number of astronomy publications – and a corresponding increase in the number of Astrobites! – about machine learning (ML). Over the last decade, ML has become a powerful statistical tool, but as ML expert Dr. Brian Nord knows, it's not a one-size-fits-all solution. "You can apply machine learning to everything, which is not always the best idea," Nord says, then smiles. "Which took me a few years to learn."

So what exactly is the role of ML in astronomy? Nord (who has an impressive list of positions: Scientist at Fermilab, CASE Scientist in the Department of Astronomy and Astrophysics and Senior Member of the Kavli Institute for Cosmological Physics at the University of Chicago, and co-founder of the Deep Skies Lab) plans to discuss this exact question in his plenary talk at #AAS237.

"I'd like to talk about the winding road that machine learning has gone through in astronomy," Nord says. "I think there's a lot to learn from the path that was taken as a community, and I want to review that and give a sense of where I think we'll get the most use out of deep learning." But Nord's plenary talk will also go one step further. "The other part will be: what's the role of the scientist who develops deep learning tools [...] in the societal implications of the work we're doing? If I'm trying to improve an algorithm to make it better at recognizing galaxies, how much am I also contributing to algorithms that are good at facial recognition – which we know is biased against people of color? What role do I play as a scientist? What role do I play as a Black scientist?"

Nord's own research in machine learning has taken three primary directions: using ML to analyze data, using ML to design experiments, and studying ML itself.

Nord, a cosmologist by training, has often used ML to classify objects like gravitational lenses, which are useful for cosmology (see, for example, this recent Astrobite).

"One of the big things that you might want to do with machine learning is detect rare objects. Strong gravitational lenses are a rare type of object in the way they look on the sky," Nord says. "The field [of gravitational lens studies], for a better part of the last 30 or 40 years, has been mostly using human visual inspection to detect whether something is a strong lens or not. As you start thinking about [Rubin Observatory] and JWST and other big telescopes and surveys, that's not really gonna cut it anymore."
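As a rough illustration of what an automated lens-finder looks like in code – a hypothetical miniature, not Nord's actual pipeline – here is a small PyTorch convolutional classifier that scores survey cutouts as lens vs. non-lens; the architecture and the 64x64 single-band input size are assumptions for the sketch:

```python
import torch
import torch.nn as nn

# A minimal convolutional classifier of the general kind used to flag
# strong-lens candidates in survey image cutouts. Everything here is
# illustrative: a real search would train on labeled (or simulated) lenses.
class LensClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 1),  # one logit: lens vs. non-lens
        )

    def forward(self, x):
        return self.head(self.features(x))

model = LensClassifier()
cutouts = torch.randn(8, 1, 64, 64)     # a batch of stand-in survey cutouts
scores = torch.sigmoid(model(cutouts))  # probability each cutout is a lens
print(scores.squeeze())
```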

Machine learning is useful not just for analyzing data, but also for getting data – specifically, improving experiment design. The Dark Energy Spectroscopic Instrument (DESI) is a massive spectrograph that is just beginning to obtain spectra for tens of millions of galaxies. As DESI was being built in the 2010s, Nord says, "some colleagues and I started asking [...] Why aren't we simulating the entire instrument at once?" Nord and colleagues built SPectrOscopic KEn Simulation (SPOKES), a tool that simulates the instrument from end to end and lets you see the scope of our knowledge – our ken – about the instrument. Now, Nord is thinking even bigger. "I'm starting to ask questions like: Should we be fully operating surveys manually?" Nord says. Perhaps ML can be used with tools like SPOKES to automate experiment design.

Finally, and perhaps most importantly, Nord tries to understand ML itself, and the problems inherent in ML approaches. Deep learning is not great at giving error estimates that are immediately interpretable to a physicist. "It doesn't come out in terms of, you know, statistical uncertainty and systematic uncertainty," Nord says. He recently worked with a postdoc to test different ML methods to simulate a simple pendulum system. "We showed that if you just take these [algorithms] off the shelf and think you're going to get an answer out of it – you're not. So we're trying to develop new tools to do this uncertainty quantification."
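For a flavor of the problem Nord describes, here is a sketch of one common off-the-shelf uncertainty technique, Monte Carlo dropout. This is a generic illustration, not the methods his group developed, and the untrained network below stands in for a model fit to pendulum data:

```python
import torch
import torch.nn as nn

# Monte Carlo dropout: keep dropout active at prediction time and sample the
# network many times; the spread of the samples is a crude uncertainty band.
# Note the resulting "error bar" is not a physicist's calibrated statistical
# or systematic uncertainty - which is exactly the gap Nord points to.
net = nn.Sequential(
    nn.Linear(1, 64), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p=0.2),
    nn.Linear(64, 1),
)

t = torch.linspace(0, 10, 50).unsqueeze(1)  # query times for the pendulum stand-in

net.train()  # leave dropout on so each forward pass is a different sample
with torch.no_grad():
    samples = torch.stack([net(t) for _ in range(100)])

mean = samples.mean(dim=0)  # point prediction
std = samples.std(dim=0)    # sample spread: a rough, uncalibrated error estimate
print(mean.shape, std.shape)
```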

So what is the path moving forward for ML as a statistical tool for astronomy? "People should be asking what their ultimate goal is, and working backwards from there," Nord explains. "So if your goal is to classify and find things, then the main thing one would need to worry about is bias from the training sample to the test sample. That's not a solved problem, but there are tools to mitigate that [...]. But if we want to go beyond that, into actually getting a measurement, then we need to be careful that we're not treating deep learning like it's anything magic. It's a tool that we need to really deeply understand."

We also need to think about how ML can be used as a tool in power structures, both in science and in society at large. Nord points out this 2016 ProPublica article that discusses how in several states in the US, judges can use closed-source ML algorithms to help decide how long to send someone to jail – and how these algorithms are biased against people of color, especially Black people.

"When we see a new technology come about that is even half as disruptive as artificial intelligence, I think one of the questions that we should be asking is: How will this be used to concentrate power? And, alongside that, how can it be used to equilibrate power dynamics?" Nord says. "The further along we get in the development of artificial intelligence as a fundamental tool – in science and in other places in society – without asking those questions, I think the landscape of our opportunity for using it as a chance to equilibrate power changes. [...] I think there's still time to get the word out, but it feels like we're losing time. I don't want to lose this opportunity."

How did Nord end up studying ML and cosmology? "I started in physics in college, and at the end of college I was going to do string theory," Nord says. But as he began his PhD at the University of Michigan, he found that string theory "started to sit farther and farther away from the questions I wanted to ask about the universe." So he went into cosmology, which he felt provided a more direct conduit to "philosophical questions about existence and my place in the universe and my childhood dreams of space travel," and he eventually decided to switch into the subfield of strong lensing after his PhD. That's when he got into ML. "I faced this problem that made me sad, to have to hand-scan tens of thousands of images, [and] I started thinking about deep learning."

Nord is grateful for his experiences switching fields. "I'm glad that I tested different kinds of science. [...] You don't need to like all of physics. You can pick one thing, and then like it and do it." For students who are trying to pick a field to work in, Nord suggests thinking carefully about "the sociology of the scientific community that [you're] in. [...] If the project's great but the advisor is terrible, you're gonna have to live with that for years to come."

Nord also encourages current students to remember that at the end of the day, working in physics and astronomy is just a job. "You can love it, but it doesn't have to be the thing that owns our lives. I think there's this idea in academia that if you're not a full part of it in every single way, if you don't give up yourself completely, then you're not worthy. And I think that's terrible," he says. "Even as a PI, that's still out there. [...] The systems that we work in still dictate a lot of the power."

As a result, people who choose to work to change these systems should be careful when setting their expectations. "The institution of the academy has significant flaws that allow people to be disenfranchised and oppressed," Nord says. "Systemic racism exists in academia, full stop, it's still there. Systemic misogyny exists in academia, full stop, it's still there." And even though equity, diversity, and inclusion efforts sometimes claim to work against these flaws, they are not equivalent to justice. "Those three terms in tandem form this conceptual framework that are often used to, either in purpose or by accident, detract from actual justice efforts."

Finally, Nord reminds students that while the goal of science is to learn about nature in an objective and unbiased way, scientists "are not ourselves objective. And when we try to convince anyone that we are, we just look more and more foolish. This subjectivity that we have, it exists because we're human and we're social creatures, so we need to accept that and figure out ways to create a just community for ourselves."

Interested in machine learning in astronomy and society? Check out Dr. Nord's plenary talk at 3:10 PM ET on Monday, January 11 at #AAS237!

Astrobite edited by: Gloria Fonseca Alvarez

Featured image credit: American Astronomical Society

About Mia de los Reyes: I'm a grad student at Caltech, where I study the chemical compositions of nearby dwarf galaxies. Before coming to sunny California, I spent a year as a postgrad at the University of Cambridge, studying star formation in galaxies. Now that I've escaped to warmer climates, my hobbies include rock climbing, aerial silks, and finding free food on campus.


Global Machine Learning (ML) Platforms Market Growth, Size, Analysis, Outlook by 2020 – Trends, Opportunities and Forecast to 2025 – AlgosOnline

A recent report added by Market Study Report, LLC, on the 'Machine Learning (ML) Platforms Market' provides a detailed analysis of the industry size, revenue forecasts and geographical landscape pertaining to this business space. Additionally, the report highlights the primary obstacles and latest growth strategies adopted by key players that form a part of the competitive spectrum of this business.

The research study on the Machine Learning (ML) Platforms market projects this industry to garner substantial proceeds by the end of the projected duration, with a commendable growth rate liable to be registered over the estimated timeframe. Elucidating a pivotal overview of this business space, the report includes information pertaining to the remuneration presently held by this industry, in tandem with a meticulous illustration of the Machine Learning (ML) Platforms market segmentation and the growth opportunities prevailing across this vertical.

Request a sample Report of Machine Learning (ML) Platforms Market at: https://www.marketstudyreport.com/request-a-sample/3132103?utm_source=algosonline.com&utm_medium=SHR

A brief run-through of the industry segmentation encompassed in the Machine Learning (ML) Platforms market report:

Competitive landscape:

Companies involved:

Vital pointers enumerated:

The Machine Learning (ML) Platforms market report provides an outline of the vendor landscape that includes companies such as

The study mentions the products manufactured by these esteemed companies as well as the product pricing models, profit margins, valuation accrued, and product sales.

Ask for Discount on Machine Learning (ML) Platforms Market Report at: https://www.marketstudyreport.com/check-for-discount/3132103?utm_source=algosonline.com&utm_medium=SHR

Geographical landscape:

Regions involved: USA, Europe, Japan, China, India, South East Asia

Vital pointers enumerated:

Segmented into USA, Europe, Japan, China, India, South East Asia, as per the regional spectrum, the Machine Learning (ML) Platforms market apparently covers most of the pivotal geographies, claims the report, which compiles a highly comprehensive analysis of the geographical arena, including details about the product consumption patterns, revenue procured, as well as the market share that each zone holds.

The study presents details regarding the consumption market share and product consumption growth rate of the regions in question, in tandem with the geographical consumption rate with regard to the products and the applications.

Product landscape

Product types involved:

Vital pointers enumerated:

The Machine Learning (ML) Platforms market report enumerates information with respect to every product type among

Application landscape:

Application sectors involved:

Vital pointers enumerated:

The Machine Learning (ML) Platforms market report, with respect to the application spectrum, splits the industry into

The Machine Learning (ML) Platforms market report also includes substantial information about the driving forces impacting the commercialization landscape of the industry as well as the latest trends prevailing in the market. Also included in the study is a list of the challenges that this industry will portray over the forecast period.

Other parameters like the market concentration ratio, enumerated with reference to numerous concentration classes over the projected timeline, have been presented as well, in the report.

For More Details On this Report: https://www.marketstudyreport.com/reports/global-machine-learning-ml-platforms-market-growth-status-and-outlook-2020-2025

Related Reports:

2. Global Transformer Dismantling & Recycling Services Market Growth (Status and Outlook) 2020-2025
The Transformer Dismantling & Recycling Services Market Report covers manufacturers' data, including shipments, price, revenue, gross profit, interview records, business distribution and so forth; this information helps the buyer understand the competitors better. The report also covers all the regions and countries of the world, showing each region's development status, including market size, volume and value, as well as price data. It additionally covers different industries' customer data, which is critical for producers.
Read More: https://www.marketstudyreport.com/reports/global-transformer-dismantling-recycling-services-market-growth-status-and-outlook-2020-2025

Read More Reports On: https://www.marketwatch.com/press-release/manganese-sulfate-market-trends-and-opportunities-by-types-and-application-in-grooming-regions-2021-01-06?tesla=y

Contact Us:
Corporate Sales, Market Study Report LLC
Phone: 1-302-273-0910
Toll Free: 1-866-764-2150
Email: [emailprotected]


Here’s Why Short Story Dispensers Are The New Normal

The short story is the newest trend globally, and the desire to read and write short stories is always there. The question, however, is where to find them. You can find short stories in books, magazines, e-books, on websites, in libraries, or in short story dispensers. Short story dispensers are the latest arrival on the market. There are books on all sorts of subjects, such as home improvement contracts, short stories, or e-books, so you can benefit from those as well.

Above all, short story dispensers are the new normal. People have been using them since their invention, and they have made it easier to read and enjoy short stories, both to pass the time and for educational purposes. This literary vending machine prints a blend of original, classic and contemporary works of literature at various locations, encouraging people to engage with literary art. Here are some of the reasons why short story dispensers are the new normal.

IT IS EASY TO USE FOR ALL AGE GROUPS

Short story dispensers are one of the best ways to read and experience more creativity. They are usable by, and highly accessible to, all age groups, so anyone can enjoy literature anywhere, anytime. Students have easy access if their schools and colleges have dispensers installed on campus, and you can search for locations of short story dispensers online as well; you will surely find one near you.

HIGHLY EFFECTIVE AND CUSTOMER-FRIENDLY


One of the best things about these dispensers is that they are highly effective and customer-friendly. You can use them easily even on your first try, which lets you work efficiently as well. In addition, the dispensers are easy to install and operate automatically: you simply select the type of story and the reading duration, and the dispenser does the rest. This makes them very effective to use.

THE DISPENSER BRINGS CULTURE TO UNEXPECTED PLACES

Many underdeveloped or developing countries do not have proper libraries, so it is often very difficult for students to study and enjoy quality content. Dispensers can be used in such areas to promote literature. They are very light on the budget and easy to install, so they can be placed almost anywhere, helping bring culture to even unexpected places. This is part of why short story dispensers are so common these days.

PAPER AND THE ENVIRONMENT


Nothing could be better than an eco-friendly literature vending machine. The short story dispenser uses thermal printing, so no ink or cartridges are needed. Moreover, the paper used in the machine is recycled and of good quality. It is highly nature-friendly, letting you enjoy your favourite stories without causing any harm to the environment. Printing each short edition story on quality paper also encourages people to keep the story instead of throwing it away.

LOW PRICE

These machines are very light on the pocket, so libraries and colleges can easily install these dispensers. The subscription is paid by whoever installs one; readers use it for free. With the help of a 3G internet connection, you choose what type of story you want to read, and you enjoy your favourite stories free of cost. You can also customize the message at the end of each story. This low-cost, highly efficient machine can do wonders to bring literature back to life.

THE BOTTOM LINE:

Short story dispensers are one of the best things to have in 2020. They are best used in colleges, coffee shops, restaurants and other public places, and they are simple for all age groups to use. Moreover, they spark creativity and give venues a platform to showcase the diversity of the arts in the world. Users can print a randomized story from a catalogue of short stories, which brings back the passion for reading as well. You can find these machines in almost every developed country and city, and they give students an accessible way to read and enjoy creative content apart from using social media. This is why these literature vending machines are so popular these days.

Quantum computing is so last-decade. Get ready to invest in the final frontier… teleportation – MarketWatch

If 2020 had you wishing you could say "Beam me up, Scotty," you're not alone. You may be one tiny step closer to getting your wish in a few decades or so.

Scientists from Fermilab, Caltech, NASA's Jet Propulsion Laboratory and the University of Calgary achieved long-distance quantum teleportation in mid-2020, they confirmed in an academic journal article published last month. It's another step toward realizing what's often called quantum computing, and also toward understanding physics on a different level than we do now, perhaps well enough to someday teleport humans. And while there is no ETF specifically for that yet, here are some broad guidelines for thinking about how to invest in very nascent technologies.

For starters, it's good to understand the broad contours of the industry supporting the idea. A 2020 market research analysis estimates the quantum computing market will top $65 billion per year by 2030, while a 2019 BCG report makes the case for investing now, rather than waiting for things to take off. As MarketWatch reported in late 2019, quantum computing is expected to remake everything from pharmaceuticals to cybersecurity.

Right now, there are several blue-chip biggies involved in the quantum race. Scientists from AT&T were involved in the 2020 experiments, and big companies like Microsoft MSFT, -2.13%, Tencent TCEHY, +1.34%, and IBM IBM, -1.54% all have initiatives.

It's easy enough to find exchange-traded funds with big holdings of those giants – likely easier than finding publicly traded small companies on the bleeding edge of these technologies – but it's also important to remember how small a share of their revenues experimental ventures like these are.

There are still some good models for funds constructed around developing industries like this one, noted Todd Rosenbluth, head of mutual fund and ETF research at CFRA. One is the Procure Space ETF UFO, -1.49%, which sports the ticker UFO. UFO launched before Virgin Galactic SPCE, -2.19% went public, at a moment when it was hard to call it a true pure-play space fund. As MarketWatch noted at the time, UFO is composed of companies involved in existing space-related business lines: ground equipment manufacturing that uses satellite systems, rocket and satellite manufacturing and operation, satellite-based telecommunications and broadcasting, and so on.

The one ETF that might now be said to be closest to offering access to quantum technology takes a similar approach. The Defiance Quantum ETF QTUM, +0.71% has "quantum" in its name, but says it provides exposure to companies "on the forefront of cloud computing, quantum computing, machine learning, and other transformative computing technologies."

Another consideration might be an ETF specializing in very early-stage technology. In December, MarketWatch profiled the Innovator Loup Frontier Technology ETF LOUP, -0.08%. Rosenbluth has also been watching the Direxion Moonshot Innovators ETF MOON, -0.66%.

"Disruptive technology themes have gotten a boost from one of the biggest success stories of 2020," he said in an interview. ARK Invest's fund lineup took in billions of dollars and enjoyed triple-digit gains as their bets on technology had a moment.

The next-gen narrative seems to resonate with investors, and complex themes like these make a good case for investing in actively-managed funds that benefit from researchers' expertise. That means that when it succeeds, "There's a snowball effect of investors coming to see the benefits of using ETFs for these kinds of themes," Rosenbluth said.

"I think the future is bright for these types of ETFs," Rosenbluth told MarketWatch. "There's less white space in the ETF world than there was before, but it's inevitable that there will be a teleportation-related ETF."

Read next: What will 2021 bring for ETFs?


Major Quantum Computing Projects And Innovations Of 2020 – Analytics India Magazine

Quantum computing has opened multiple doors of possibilities for quick and accurate computation of complex problems, something traditional methods fail to do. The pace of experimentation in quantum computing has naturally increased in recent years. 2020, too, saw its share of such breakthroughs, which lay the groundwork for future innovations. We list some of the significant quantum computing projects and experiments of 2020.

IT services company Atos devised Q-Score for measuring quantum performance. As per the company, this is the first universal quantum metric that applies to all programmable quantum processors. The company said that in comparison to qubits, the standard figure of merit for performance assessment, Q-Score provides explicit, reliable, objective, and comparable results when solving real-world optimisation problems.

The Q-Score is calculated against three parameters: application-driven, ease of use, and objectiveness and reliability.

Google's AI Quantum team performed the largest chemical simulation to date on a quantum computer. Explaining the experiment in a paper titled "Hartree-Fock on a superconducting qubit quantum computer," the team said it used a variational quantum eigensolver (VQE) to simulate chemical mechanisms using quantum algorithms.

It was found that the calculations performed in this experiment were two times larger than the previous similar experiments and contained about ten times the number of quantum gate operations.

The University of Sydney developed an algorithm for characterising noise in large scale quantum computers. Noise is one of the major obstacles in building quantum computers. With this newly developed algorithm, they have tried to tame the noise by reducing interference and instability.

A new method was introduced to return an estimate of the effective noise with relative precision. The method could also detect all correlated errors, enabling the discovery of long-range two-qubit correlations in the 14-qubit device. In comparison, the previous methods would be infeasible for device sizes above 10 qubits.

The tool is highly scalable, and it has been tested successfully on the IBM Quantum Experience device. The team believes that with this, the efficiency of quantum computers in solving computing problems will be addressed.

Canadian quantum computing company D-Wave Systems announced the general availability of its next-generation quantum computing platform. This platform offers new hardware, software, and tools for accelerating the delivery of quantum computing applications. The platform is now available in the Leap quantum cloud service and has additions such as the Advantage quantum system, with 5,000 qubits and 15-way qubit connectivity.

It also has an expanded solver service that can perform calculations of up to one million variables. With these capabilities, the platform is expected to assist businesses that are running real-time quantum applications for the first time.

Physicists at MIT reported evidence of Majorana fermions on the surface of gold. Majorana fermions are particles that are theoretically their own antiparticle; it is the first time these have been observed on a metal as common as gold. With this discovery, physicists believe this could prove to be a breakthrough for stable and error-free qubits for quantum computing.

Future innovation in this direction would be based on the idea that combinations of Majorana fermion pairs can build qubits in such a way that if a noise error affects one of them, the other would still remain unaffected, thereby preserving the integrity of the computations.

In December, Intel introduced Horse Ridge II, the second generation of its cryogenic control chip, considered a milestone towards developing scalable quantum computers. Building on its predecessor, Horse Ridge I, it supports a higher level of integration for quantum system control. It can read qubit states and control several gates simultaneously to entangle multiple qubits. One of its key features is qubit readout, which provides the ability to read the current qubit state.

With this feature, Horse Ridge II allows for faster on-chip, low latency qubit state detection. Its multigate pulsing helps in controlling the potential of qubit gates. This ability allows for the scalability of quantum computers.

I am a journalist with a postgraduate degree in computer network engineering. When not reading or writing, one can find me doodling away to my heart's content.


Quantum Computing Entwined with AI is Driving the Impossible to Possible – Analytics Insight

Merging quantum computing with artificial intelligence (AI) has been on the priority list for researchers and scientists. Even though quantum computing is still in the early phases of development, there have been many innovations and breakthroughs. However, it is still unclear whether the world will change for good or bad when AI is fully influenced by quantum computing.

Quantum computing is similar to traditional computing, which relies on bits – 0s and 1s – to encode information. The amount of data keeps growing despite attempts to limit it. Moore's law observes that the number of transistors on integrated circuits doubles roughly every two years, setting tech giants racing to make the smallest chips. This has also induced tech companies to compete for the first launch of a viable quantum computer that would be exponentially more powerful than today's computers. Such a futuristic computer would process all the data we generate and solve increasingly complex problems.

Remarkably, the use of quantum algorithms in artificial intelligence techniques will boost machines' learning abilities, leading to improvements in an unprecedented way. The main goal of the merger is to achieve a so-called quantum advantage, where complex algorithms can be calculated significantly faster than with the best classical computer. The expected change would be a breakthrough in AI. Experts and business leaders predict that quantum computing's processing power could begin to improve AI systems within about five years. However, combining AI and quantum is considered scary from one angle. The late researcher and scientist Stephen Hawking said that the development of full AI could spell the end of the human race: once humans develop AI, it will take off on its own and redesign itself at an ever-increasing rate, and humans, limited by slow biological evolution, couldn't compete and would be superseded.

Can solve complex problems quickly

One of the major expectations that people have of quantum computing is increased computational power. It is predicted that quantum computers will be able to complete calculations within seconds that would otherwise take thousands of years. Google claims that the company has a quantum computer that is 100 million times faster than any existing computer. This futuristic and quick way of calculating would solve all the data problems in minutes, if not seconds. The key to enabling this transition is converting all the existing data into quantum language.

Enhance warfighter capabilities

Even though quantum computing is still at an early stage of development, it is expected to enhance warfighter capabilities significantly in the future. It is predicted that quantum computing is likely to impact ISR (intelligence, surveillance and reconnaissance), solving logistics problems more quickly. While we know the types of problems and the general application space, optimisation problems will be some of the first where we will see advantages.

Applications in the banking sector

Malpractice and constant forgeries are common in the banking and financial sector. Fortunately, the combination of AI with quantum computing might help improve fraud detection. Models trained using a quantum computer will be capable of detecting patterns that are hard to spot using conventional equipment. Meanwhile, the acceleration of algorithms will yield great advantages in terms of the volume of information that the machines handle for this purpose.

Help integrate data from different datasets

Quantum computers are anticipated to be experts at merging different datasets. Although this seems quite impossible without human intervention in the initial phase, computers will eventually learn to integrate data in the future. For example, if there are different raw data sources with unique schemas attached to them and a research team wants to compare them, a computer would have to understand the relationship between the schemas before the data could be compared.

All is not good though

In some ways, AI and quantum computing worry people as much as they raise expectations. Quantum computing technology will be very futuristic, but we cannot assure you that it will be human-friendly. It could become far better than humans, supplanting people in their jobs. Quantum computing also poses a threat to security. The latest Thales Data Threat report says that 72% of surveyed security experts worldwide believe quantum computing will have a negative impact on data security within the next five years.


Quantum Computing Technologies Market, Share, Application Analysis, Regional Outlook, Competitive Strategies & Forecast up to 2025 – AlgosOnline

Market Study Report, LLC, has added a detailed study on the Quantum Computing Technologies market which provides a brief summary of the growth trends influencing the market. The report also includes significant insights pertaining to the profitability graph, market share, regional proliferation and SWOT analysis of this business vertical. The report further illustrates the status of key players in the competitive setting of the Quantum Computing Technologies market, while expanding on their corporate strategies and product offerings.

The report on the Quantum Computing Technologies market presents insights regarding major growth drivers, potential challenges, and key opportunities that shape the industry's expansion over the analysis period.

Request a sample Report of Quantum Computing Technologies Market at: https://www.marketstudyreport.com/request-a-sample/2673446?utm_source=algosonline.com&utm_medium=AG

According to the study, the industry is predicted to witness a CAGR of XX% over the forecast timeframe (2020-2025) and is anticipated to gain significant returns by the end of the study period.

The COVID-19 outbreak has caused ups and downs across industries, introducing uncertainty into the business space. Along with the immediate short-term impact of the pandemic, some industries are estimated to face challenges on a long-term basis.

Most businesses across various industries have taken measures to cope with the uncertainty and have revisited their budgets to draft a roadmap for profit-making in the coming years. The report helps companies operating in this particular business vertical prepare a robust contingency plan that takes all notable aspects into consideration.

Key inclusions of the Quantum Computing Technologies market report:

Ask for Discount on Quantum Computing Technologies Market Report at: https://www.marketstudyreport.com/check-for-discount/2673446?utm_source=algosonline.com&utm_medium=AG

Quantum Computing Technologies Market segments covered in the report:

Regional segmentation: North America, Europe, Asia-Pacific, South America, Middle East and Africa

Product types:

Applications spectrum:

Competitive outlook:

For More Details On this Report: https://www.marketstudyreport.com/reports/global-quantum-computing-technologies-market-2020-by-company-regions-type-and-application-forecast-to-2025

Some of the Major Highlights the TOC covers:

Chapter 1: Methodology & Scope

Definition and forecast parameters

Methodology and forecast parameters

Data Sources

Chapter 2: Executive Summary

Business trends

Regional trends

Product trends

End-use trends

Chapter 3: Quantum Computing Technologies Industry Insights

Industry segmentation

Industry landscape

Vendor matrix

Technological and innovation landscape

Chapter 4: Quantum Computing Technologies Market, By Region

Chapter 5: Company Profile

Business Overview

Financial Data

Product Landscape

Strategic Outlook

SWOT Analysis

Related Reports:

2. Global Mechanical Computer Aided Design (MCAD) Market 2021 by Company, Regions, Type and Application, Forecast to 2026
The Mechanical Computer Aided Design (MCAD) Market Report covers manufacturers' data, including shipments, price, revenue, gross profit, interview records, business distribution and so forth; this information helps the buyer understand the competitors better. The report also covers all the regions and countries of the world, showing each region's development status, including market size, volume and value, as well as price data. It additionally covers different industries' customer data, which is critical for producers.
Read More: https://www.marketstudyreport.com/reports/global-mechanical-computer-aided-design-mcad-market-2021-by-company-regions-type-and-application-forecast-to-2026

Contact Us:
Corporate Sales, Market Study Report LLC
Phone: 1-302-273-0910
Toll Free: 1-866-764-2150
Email: [emailprotected]


Quantum Computing And Investing – ValueWalk

At a conference on quantum computing and finance on December 10, 2020, William Zeng, head of quantum research at Goldman Sachs, told the audience that quantum computing could have a revolutionary impact on the bank, and on finance more broadly. In a similar vein, Marco Pistoia of JP Morgan stated that new quantum machines will boost profits by speeding up asset pricing models and digging up better-performing portfolios. While there is little dispute that quantum computing has great potential to perform certain mathematical calculations much more quickly, whether it can revolutionize investing by so doing is an altogether different matter.



The hope is that the immense power of quantum computers will allow investment managers to earn superior investment returns by uncovering patterns in prices and financial data that can be exploited. The dark side is that quantum computers will open the door to finding patterns that either do not actually exist, or, if they did exist at one time, no longer do. In more technical terms, quantum computing may allow for a new level of unwarranted data mining and lead to further confusion regarding the role of nonstationarity.


Any actual sequence of numbers, even one generated by a random process, will have certain statistical quirks. Physicist Richard Feynman used to make this point with reference to the first 767 digits of Pi, replicated below. Allegedly (but unconfirmed) he liked to reel off the first 761 digits, and then say "9-9-9-9-9 and so on."[1] If you only look at the first 767 digits, the replication of six straight nines is clearly an anomaly – a potential investment opportunity. In fact, there is no discernible pattern in the digits of Pi. Feynman was purposely making fun of data mining by focusing on the first 767 digits.

3 .1 4 1 5 9 2 6 5 3 5 8 9 7 9 3 2 3 8 4 6 2 6 4 3 3 8 3 2 7 9 5 0 2 8 8 4 1 9 7 1 6 9 3 9 9 3 7 5 1 0 5 8 2 0 9 7 4 9 4 4 5 9 2 3 0 7 8 1 6 4 0 6 2 8 6 2 0 8 9 9 8 6 2 8 0 3 4 8 2 5 3 4 2 1 1 7 0 6 7 9 8 2 1 4 8 0 8 6 5 1 3 2 8 2 3 0 6 6 4 7 0 9 3 8 4 4 6 0 9 5 5 0 5 8 2 2 3 1 7 2 5 3 5 9 4 0 8 1 2 8 4 8 1 1 1 7 4 5 0 2 8 4 1 0 2 7 0 1 9 3 8 5 2 1 1 0 5 5 5 9 6 4 4 6 2 2 9 4 8 9 5 4 9 3 0 3 8 1 9 6 4 4 2 8 8 1 0 9 7 5 6 6 5 9 3 3 4 4 6 1 2 8 4 7 5 6 4 8 2 3 3 7 8 6 7 8 3 1 6 5 2 7 1 2 0 1 9 0 9 1 4 5 6 4 8 5 6 6 9 2 3 4 6 0 3 4 8 6 1 0 4 5 4 3 2 6 6 4 8 2 1 3 3 9 3 6 0 7 2 6 0 2 4 9 1 4 1 2 7 3 7 2 4 5 8 7 0 0 6 6 0 6 3 1 5 5 8 8 1 7 4 8 8 1 5 2 0 9 2 0 9 6 2 8 2 9 2 5 4 0 9 1 7 1 5 3 6 4 3 6 7 8 9 2 5 9 0 3 6 0 0 1 1 3 3 0 5 3 0 5 4 8 8 2 0 4 6 6 5 2 1 3 8 4 1 4 6 9 5 1 9 4 1 5 1 1 6 0 9 4 3 3 0 5 7 2 7 0 3 6 5 7 5 9 5 9 1 9 5 3 0 9 2 1 8 6 1 1 7 3 8 1 9 3 2 6 1 1 7 9 3 1 0 5 1 1 8 5 4 8 0 7 4 4 6 2 3 7 9 9 6 2 7 4 9 5 6 7 3 5 1 8 8 5 7 5 2 7 2 4 8 9 1 2 2 7 9 3 8 1 8 3 0 1 1 9 4 9 1 2 9 8 3 3 6 7 3 3 6 2 4 4 0 6 5 6 6 4 3 0 8 6 0 2 1 3 9 4 9 4 6 3 9 5 2 2 4 7 3 7 1 9 0 7 0 2 1 7 9 8 6 0 9 4 3 7 0 2 7 7 0 5 3 9 2 1 7 1 7 6 2 9 3 1 7 6 7 5 2 3 8 4 6 7 4 8 1 8 4 6 7 6 6 9 4 0 5 1 3 2 0 0 0 5 6 8 1 2 7 1 4 5 2 6 3 5 6 0 8 2 7 7 8 5 7 7 1 3 4 2 7 5 7 7 8 9 6 0 9 1 7 3 6 3 7 1 7 8 7 2 1 4 6 8 4 4 0 9 0 1 2 2 4 9 5 3 4 3 0 1 4 6 5 4 9 5 8 5 3 7 1 0 5 0 7 9 2 2 7 9 6 8 9 2 5 8 9 2 3 5 4 2 0 1 9 9 5 6 1 1 2 1 2 9 0 2 1 9 6 0 8 6 4 0 3 4 4 1 8 1 5 9 8 1 3 6 2 9 7 7 4 7 7 1 3 0 9 9 6 0 5 1 8 7 0 7 2 1 1 3 4 9 9 9 9 9 9

When it comes to investing, there is only one sequence of historical returns. With sufficient computing power and with repeated torturing of the data, anomalies are certain to be detected. A good example is factor investing. A highly influential paper by Professors Eugene Fama and Kenneth French identified three systematic investment factors, and its publication started an industry focused on searching for additional factors. Research by Arnott, Harvey, Kalesnik and Linnainmaa reports that by year-end 2018 an implausibly large 400 significant factors had been discovered. One wonders how many such anomalies quantum computers might find.
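A short simulation makes the multiple-testing mechanism concrete: test enough candidate factors against pure noise and some will clear conventional significance thresholds by chance alone. This sketch is a generic illustration with made-up numbers, not the Arnott et al. methodology:

```python
import numpy as np

# Generate 400 candidate "factor" return series that are pure noise (mean zero
# by construction), then count how many look statistically significant anyway.
rng = np.random.default_rng(42)
n_months, n_factors = 240, 400  # 20 years of monthly returns, 400 candidates

returns = rng.standard_normal((n_factors, n_months)) * 0.04
t_stats = returns.mean(axis=1) / (returns.std(axis=1, ddof=1) / np.sqrt(n_months))

false_hits = int(np.sum(np.abs(t_stats) > 1.96))
print(f"'significant' factors found in pure noise: {false_hits} of {n_factors}")
# Roughly 5% of 400, about 20 spurious discoveries, is expected at the 95% level.
```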

Factor investing is just one example among many. Richard Roll, a leading academic financial economist with in-depth knowledge of the anomalies literature has also been an active financial manager. Based on his experience Roll stated that his money management firms attempted to make money from numerous anomalies widely documented in the academic literature but failed to make a nickel.

The simple fact is that if you have machines that can look closely enough at any historical data set, they will find anomalies. For instance, what about the anomalous sequence 0123456789 in the expansion of Pi? That anomaly can be found beginning at digit 17,387,594,880.

The digits of Pi may be random, but they are stationary. The process that generates the first million digits is the same as the one which generates the million digits beginning at one trillion. The same is not true of investing. Consider, for example, providing a computer the sequence of daily returns on Apple stock from the day the company went public to the present. The computer could sift through the returns looking for patterns, but this is almost certainly a fruitless endeavor. The company that generated those returns is far from stationary. In 1978, Apple was run by two young entrepreneurs and had total revenues of $0.0078 billion. By 2019, the company was run by a large, experienced, management team and had revenues of $274 billion, an increase of about 35,000 times. The statistical process generating those returns is almost certainly nonstationary due to fundamental changes in the company generating them. To a lesser extent, the same is true of nearly every listed company. The market is constantly in flux and the companies are constantly evolving as consumer demands, government regulation, and technology, among other things, continually change. It is hard to imagine that even if there were past patterns in stock prices that were more than data mining, they would persist for long due to nonstationarity.

In the finance arena, computers and artificial intelligence work by using their massive data processing skills to find patterns that humans may miss. But in a nonstationary world the ultimate financial risk is that by the time they are identified, those patterns will be gone. As a result, computerized trading comes to resemble a dog chasing its tail. This leads to excessive trading and ever-rising costs without delivering superior results on average. Quantum computing risks simply adding fuel to the fire. Of course, there are individual cases where specific quant funds make highly impressive returns, but that too could be an example of data mining. Given the large number of firms in the money management business, the probability that a few do extraordinarily well is essentially one.

These criticisms are not meant to imply that quantum computing has no role to play in finance. For instance, it has great potential to improve the simulation analyses involved in assessing risk. The point here is that it will not be a holy grail for improving investment performance.

Despite the drawbacks associated with data mining and nonstationarity, there is one area in which the potential for quantum computing is particularly bright: marketing quantitative investment strategies. Selling quantitative investment has always been an art. It involves convincing people that the investment manager knows something that will make them money, but which is too complicated to explain to them and, in some cases, too complicated for the manager to understand. Quantum computing takes that sales pitch to a whole new level, because virtually no one will be able to understand how the machine decided that a particular investment strategy is attractive.

This skeptic's take is that quantum computing will have little impact on what is ultimately the source of successful investing: allocating capital to companies that have particularly bright prospects for developing profitable business in a highly uncertain and nonstationary world. Perhaps at some future date a computer will develop the business judgment to determine whether Tesla's business prospects justify its current stock price. Until then, being able to comb through historical data in search of obscure patterns at ever-increasing rates is more likely to produce profits through the generation of management fees than through the enhancement of investor returns.

[1] The Feynman story has been repeated so often that the sequence of 9s starting at digit 762 is now referred to as the Feynman point in the expansion of Pi.


Farewell 2020: Bleak, Yes. But a Lot of Good Happened Too – HPCwire

Here on the cusp of the new year, the catchphrase "2020 hindsight" has a distinctly different feel. Good riddance, yes. But also proof of science's power to mobilize and do good when called upon. There's gratitude by those who came through less scathed, and, maybe, more willingness to assist those who didn't.

Despite the unrelenting pandemic, high performance computing (HPC) proved itself an able member of the worldwide community of pandemic fighters. We should celebrate that, perhaps quietly since the work isn't done. HPC made a significant difference in speeding up and enabling vastly distributed research and funneling the results to those who could turn them into patient care, epidemiology guidance, and now vaccines. Remarkable really. Necessary, of course, but actually got done too. (Forget the quarreling; that's who we are.)

Across the Tabor family of publications, we've run more than 200 pandemic-related articles. I counted nearly 70 significant pieces in HPCwire. The early standing up of Fugaku at RIKEN, now comfortably astride the Top500 for a second time and by a significant margin, to participate in COVID-19 research is a good metaphor for HPC's mobilization. Many people and organizations contributed to the HPC v. pandemic effort, and that continues.

Before spotlighting a few pandemic-related HPC activities and digging into a few other topics, lets do a speed-drive through the 2020 HPC/AI technology landscape.

Consolidation continued among chip players (Nvidia/Arm, AMD/Xilinx) while the AI chip newcomers (Cerebras, Habana (now Intel), SambaNova, Graphcore et al.) were winning deals. Nvidia's new A100 GPU is amazing, and virtually everyone else is taking potshots for just that reason. Suddenly RISC-V looks very promising. Systems makers weathered 2020's storm with varying success, while IBM seems to be winding down its HPC focus; it also plans to split/spin off its managed infrastructure services. Firing up Fugaku (notably a non-accelerated system) quickly was remarkable. The planned Frontier (ORNL) supercomputer now has the pole position in the U.S. exascale race, ahead of the delayed Aurora (ANL).

The worldwide quantum computing frenzy is in full froth as the U.S. looks for constructive ways to spend its roughly $1.25 billion (U.S. Quantum Initiative) and, impressively, China just issued a demonstration of quantum supremacy. There's a quiet revolution going on in storage and memory (just ask VAST Data). Nvidia/Mellanox introduced its line of 400 Gb/s network devices while Ethernet launched its 800 Gb/s spec. HPC-in-the-cloud is now a thing, not a soon-to-be thing. AI is no longer an oddity but quickly infusing throughout HPC. (That happened fast.)

Last but not least, hyperscalers demonstrably rule the IT roost. Chipmakers used to, consistently punching above their weight (sales volume). Not so much now.

Ok then. Apologies for the many important topics omitted (e.g. exascale and leadership systems, neuromorphic tech, software tools (can oneAPI flourish?), newer fabrics, optical interconnect, etc.).

Let's start.

I want to highlight two HPC pandemic-related efforts, one current and one early on, and also single out the efforts of Oliver Peckham, HPCwire's editor who leads our pandemic coverage, which began in earnest with articles on March 6 (Summit Joins the Fight Against the Coronavirus) and March 13 (Global Supercomputing Is Mobilizing Against COVID-19). Actually, the very first piece – Tech Conferences Are Being Canceled Due to Coronavirus, March 3 – was more about interrupted technology events, and we picked it up from our sister pub, Datanami, which ran it on March 2. We've since become a virtualized event world.

Here's an excerpt from the first Summit piece about modeling COVID-19's notorious spike:

Micholas Smith, a postdoctoral researcher at the University of Tennessee/ORNL Center for Molecular Biophysics (UT/ORNL CMB), used early studies and sequencing of the virus to build a virtual model of the spike protein. [A]fter being granted time on Summit through a discretionary allocation, Smith and his colleagues performed a series of molecular dynamics simulations on the protein, cycling through 8,000 compounds within a few days and analyzing how they bound to the spike protein, if at all.

"Using Summit, we ranked these compounds based on a set of criteria related to how likely they were to bind to the S-protein spike," Smith said in an interview with ORNL. In total, the team identified 77 candidate small-molecule compounds (such as medications) that they considered worthy of further experimentation, helping to narrow the field for medical researchers.

"It took us a day or two whereas it would have taken months on a normal computer," said Jeremy Smith, director of UT/ORNL CMB and principal researcher for the study. "Our results don't mean that we have found a cure or treatment for the Wuhan coronavirus. We are very hopeful, though, that our computational findings will both inform future studies and provide a framework that experimentalists will use to further investigate these compounds. Only then will we know whether any of them exhibit the characteristics needed to mitigate this virus."
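Schematically, the screening loop described above amounts to scoring a compound library against the target and keeping the top-ranked binders. The sketch below is a toy with a stand-in scoring function; the real work derived its rankings from molecular dynamics simulations on Summit, not from anything this simple:

```python
import heapq
import random

def binding_score(compound_id: int) -> float:
    """Stand-in for a binding score (lower = tighter predicted binding).
    In the actual study, scores came from molecular dynamics on Summit."""
    random.seed(compound_id)          # deterministic toy score per compound
    return random.gauss(0.0, 1.0)

library = range(8000)                 # 8,000 candidate compounds, as in the study
scored = ((binding_score(c), c) for c in library)
top_candidates = heapq.nsmallest(77, scored)  # keep the 77 best-ranked, as the team did

for score, compound in top_candidates[:5]:
    print(f"compound {compound}: score {score:.3f}")
```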

The flood (and diversity) of efforts that followed was startling. Oliver's advice on what to highlight catches the flavor of the challenge: "You could go with something like the Fugaku vs. COVID-19 piece or the grocery store piece, maybe contrast them a bit, earliest vs. current simulations of viral particle spread – or something like the LANL retrospective piece vs. the piece I just wrote up on their vaccine modeling. Think that might work for a 'how far we've come' angle, either way."

There's too much to cover.

Last week we ran Olivers article on LANL efforts to optimize vaccine distribution (At Los Alamos National Lab, Supercomputers Are Optimizing Vaccine Distribution). Heres a brief excerpt:

The new vaccines from Pfizer and Moderna have been deemed highly effective by the FDA; unfortunately, doses are likely to be limited for some time. As a result, many state governments are struggling to weigh difficult choices: should the most exposed, like frontline workers, be vaccinated first? Or perhaps the most vulnerable, like the elderly and immunocompromised? And after them, who's next?

LANL was no stranger to this kind of analysis: earlier in the year, the lab had used supercomputer-powered tools like EpiCast to simulate virtual cities populated by individuals with demographic characteristics to model how COVID-19 would spread under different conditions. "The first thing we looked at was whether it made a difference to prioritize certain populations such as healthcare workers or to just distribute the vaccine randomly," said Sara Del Valle, the LANL computational epidemiologist who is leading the lab's COVID-19 modeling efforts. "We learned that prioritizing healthcare workers first was more effective in reducing the number of COVID cases and deaths."
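For intuition only, here is a toy allocation comparison in the same spirit as (but vastly simpler than) LANL's agent-based EpiCast runs; every parameter is an illustrative assumption:

```python
import numpy as np

# Compare two ways of spending a limited vaccine supply in a toy population
# where infection risk scales with an individual's contact rate.
def infections(prioritize_high_contact: bool, seed: int = 1) -> int:
    rng = np.random.default_rng(seed)
    n, doses = 10_000, 2_000
    contacts = rng.lognormal(mean=1.0, sigma=0.6, size=n)  # heterogeneous contact rates
    if prioritize_high_contact:
        protected = np.argsort(contacts)[-doses:]          # vaccinate most-exposed first
    else:
        protected = rng.choice(n, doses, replace=False)    # vaccinate at random
    at_risk = np.ones(n, dtype=bool)
    at_risk[protected] = False
    p = np.clip(0.05 * contacts / contacts.mean(), 0.0, 1.0)  # toy infection probability
    return int((at_risk & (rng.random(n) < p)).sum())

print("random allocation:      ", infections(False))
print("prioritized allocation: ", infections(True))
```

Even this crude sketch typically shows fewer infections when high-contact individuals are protected first, the qualitative effect Del Valle describes; the real EpiCast simulations capture demographics, geography, and far more besides.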

You get the idea. The well of HPC efforts to tackle and stymie COVID-19 is extremely deep. Turning unproven mRNA technology into a vaccine in record time was awe-inspiring and required many disciplines. For those unfamiliar with the mRNA mechanism, here's a brief CDC explanation as it relates to the new vaccines. Below are links to a few HPCwire articles on the worldwide effort to bring HPC computational power to bear. (The last is a link to the HPCwire COVID-19 Archive, which has links to all our major pandemic coverage):

COVID COVERAGE LINKS

Global Supercomputing Is Mobilizing Against COVID-19 (March 12, 2020)

Gordon Bell Special Prize Goes to Massive SARS-CoV-2 Simulations (November 19, 2020)

Supercomputer Research Leads to Human Trial of Potential COVID-19 Therapeutic Raloxifene (October 29, 2020)

AMDs Massive COVID-19 HPC Fund Adds 18 Institutions, 5 Petaflops of Power (September 14, 2020)

Supercomputer-Powered Research Uncovers Signs of Bradykinin Storm That May Explain COVID-19 Symptoms (July 28, 2020)

Researchers Use Frontera to Investigate COVID-19s Insidious Sugar Coating (June 16, 2020)

COVID-19 HPC Consortium Expands to Europe, Reports on Research Projects (May 28, 2020)

At SC20, an Expert Panel Braces for the Next Pandemic (December 17, 2020)

Whats New in Computing vs. COVID-19: Cerebras, Nvidia, OpenMP & More (May 18, 2020)

Billion Molecules Against COVID-19 Challenge to Launch with Massive Supercomputing Support (April 22, 2020)

Pandemic Wipes Out 2020 HPC Market Growth, Flat to 12% Drop Expected (March 31, 2020)

Folding@home Turns Its Massive Crowdsourced Computer Network Against COVID-19 (March 16, 2020)

2020 HPCwire Awards Honor a Year of Remarkable COVID-19 Research (December 23, 2020)

HPCWIRE COVID-19 COVERAGE ARCHIVE

Making sense of the processor world is challenging. Microprocessors are still the workhorses in mainstream computing, with Intel retaining its giant market share despite AMD's encroachment. That said, the rise of heterogeneous computing and blended AI/HPC requirements has shifted focus to accelerators. Nvidia's A100 GPU (54 billion transistors on 826 mm2 of silicon, the world's largest seven-nanometer chip) was launched this spring. Then at SC20 Nvidia announced an enhanced version of the A100, doubling its memory to 80GB; it now delivers 2TB/s of bandwidth. The A100 is an impressive piece of work.

The A100's most significant advantage, says Rick Stevens, associate lab director at Argonne National Laboratory, is its multi-instance GPU (MIG) capability.

"For many people the problem is achieving high occupancy, that is, being able to fill the GPU up, because that depends on how much work you have to do. [By] introducing this MIG, this multi-instance stuff that they have, they're able to virtualize it. Most of the real-world performance wins are actually kind of throughput wins by using the virtualization. What we've seen is our big performance improvement is not that individual programs run much faster; it's that we can run up to seven parallel things on each GPU. When you add up the aggregate performance, you get these factors of three to five improvement over the V100," said Stevens.
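
Stevens' arithmetic is easy to reproduce. A rough sketch follows (the per-slice throughput figure is an assumption for illustration, not a measured value):

```python
# Back-of-envelope aggregate-throughput math behind the MIG claim.
# Assumption: each small job runs at roughly half the speed of a full V100.
V100 = 1.0                  # normalize one V100 to 1.0
MIG_SLICES = 7              # an A100 can be partitioned into up to 7 instances
PER_SLICE_VS_V100 = 0.5     # assumed relative throughput of one slice

aggregate = MIG_SLICES * PER_SLICE_VS_V100
print(f"Aggregate throughput vs. one V100: {aggregate:.1f}x")  # 3.5x
```

Varying the assumed per-slice figure between roughly 0.45x and 0.7x reproduces the "factors of three to five" range Stevens cites.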

Meanwhile, Intel's Xe GPU line is slowly trickling to market, mostly in card form. At SC20 Intel announced plans to make its high-performance discrete GPUs available to early-access developers. Notably, the new chips have been deployed at ANL and will serve as a transitional development vehicle for the future (2022) Aurora supercomputer, subbing in for the delayed Intel Xe-HPC (Ponte Vecchio) GPUs that are the computational backbone of the system.

AMD, also at SC20, launched its latest GPU, the MI100. AMD says it delivers 11.5 teraflops peak double-precision (FP64), 46.1 teraflops peak single-precision matrix (FP32), 23.1 teraflops peak single-precision (FP32), 184.6 teraflops peak half-precision (FP16) floating-point performance, and 92.3 peak teraflops of bfloat16 performance. HPCwire reported: "AMD's MI100 GPU presents a competitive alternative to Nvidia's A100 GPU, rated at 9.7 teraflops of peak theoretical performance. However, the A100 is returning even higher performance than that on its FP64 Linpack runs." It will be interesting to see the specs of the GPU AMD eventually fields for use in its exascale system wins.

The stakes are high in what could become a GPU war. Today, Nvidia is the market leader in HPC.

Turning back to CPUs: many in HPC/AI have begun to regard them as the lesser half of the CPU/GPU pairing. Perhaps that will change with the spectacular showing of Fujitsu's A64FX at the heart of Fugaku. Nvidia's proposed acquisition of Arm, not a done deal yet (regulatory concerns), would likely inject fresh energy into what was already a surging Arm push into the datacenter. Of course, Nvidia has jumped into the systems business with its DGX line and presumably wants a home-grown CPU. The big mover of the last couple of years, AMD's Epyc microprocessor line, continues its steady incursion into Intel x86 territory.

There's not been much discussion around Power10 beyond IBM's summer announcement that it would offer a ~3x performance gain and ~2.6x core efficiency gain over Power9. The new executive director of the OpenPOWER Foundation, James Kulina, says attracting more chipmakers to build Power devices is a top goal. We'll see. RISC-V is definitely drawing interest, but exactly how it fits into the processor puzzle is unclear. Esperanto unveiled a machine learning chip with 1,100 low-power cores based on the open-source RISC-V ISA and reported a goal of 4,000 cores on a single device. Europe is betting on RISC-V. However, at least near-term, RISC-V variants are seen as specialized chips.

The CPU waters are murkier than ever.

Sort of off in a land of their own are the AI chip/system players. Their proliferation continues, with the early movers winning important deployments. Some observers think 2021 will start sifting winners from the losers. Let's not forget that last year Intel stopped development of its newly-acquired Nervana line in favor of its even more newly-acquired Habana products. It's a high-risk, high-reward arena still.

PROCESSOR COVERAGE LINKS

Intel Xe-HP GPU Deployed for Aurora Exascale Development

Is the Nvidia A100 GPU Performance Worth a Hardware Upgrade?

LLNL, ANL and GSK Provide Early Glimpse into Cerebras AI System Performance

David Patterson Kicks Off AI Hardware Summit Championing Domain Specific Chips

Graphcore's IPU Tackles Particle Physics, Showcasing Its Potential for Early Adopters

Intel Debuts Cooper Lake Xeons for 4- and 8-Socket Platforms

Intel Launches Stratix 10 NX FPGAs Targeting AI Workloads

Nvidia's Ampere A100 GPU: Up to 2.5X the HPC, 20X the AI

AMD Launches Three New High-Frequency Epyc SKUs Aimed at Commercial HPC

IBM Debuts Power10; Touts New Memory Scheme, Security, and Inferencing

AMD's Road Ahead: 5nm Epyc, CPU-GPU Coupling, 20% CAGR

AI Newcomer SambaNova GAs Product Lineup and Offers New Service

Japan's AIST Benchmarks Intel Optane; Cites Benefit for HPC and AI

Storage and memory don't get the attention they deserve. 3D XPoint memory (Intel and Micron), declining flash costs, and innovative software are transforming this technology segment. Hard disk drives and tape aren't going away, but traditional storage management approaches, such as tiering based on media type (speed/capacity/cost), are under attack. Newcomers WekaIO, VAST Data, and MemVerge are all-in on solid state, and a few leading-edge adopters (NERSC/Perlmutter) are taking the plunge. The data flood and AI compute requirements (gotta keep those GPUs busy!) are the big drivers of this data-intensive computing.

"Our storage systems typically see over an exabyte of I/O annually. Balancing this I/O-intensive workload with the economics of storage means that at NERSC, we live and breathe tiering. And this is a snapshot of the storage hierarchy we have on the floor today at NERSC. Although it makes for a pretty picture, we don't have storage tiering because we want to, and in fact, I'd go so far as to say it's the opposite of what we and our users really want. Moving data between tiers has nothing to do with scientific discovery," said NERSC storage architect Glenn Lockwood during an SC20 panel.

"To put some numbers behind this, last year we did a study that found that between 15% and 30% of that exabyte of I/O is not coming from our users' jobs, but instead coming from data movement between storage tiers. That is to say that 15% to 30% of the I/O at NERSC is a complete waste of time in terms of advancing science. But even before that study, we knew that both the changing landscape of storage technology and the emerging large-scale data analysis and AI workloads arriving at NERSC required us to completely rethink our approach to tiered storage," said Lockwood.
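
To put those percentages on a concrete scale, a quick back-of-envelope calculation using the round figures Lockwood quotes:

```python
# Rough scale of the tier-movement overhead described above.
EXABYTE = 10**18                      # bytes
SECONDS_PER_YEAR = 365 * 24 * 3600

annual_io = 1 * EXABYTE
low, high = 0.15 * annual_io, 0.30 * annual_io

print(f"Tier movement: {low/1e15:.0f}-{high/1e15:.0f} PB per year")
print(f"Sustained rate: {low/SECONDS_PER_YEAR/1e9:.1f}-"
      f"{high/SECONDS_PER_YEAR/1e9:.1f} GB/s, around the clock")
```

Call it 150-300 petabytes a year, or roughly 5-10 GB/s of continuous bandwidth spent shuffling data between tiers rather than on science.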

Not surprisingly, Intel and Micron (Optane/3D XPoint) are trying to accelerate the evolution. Micron released what it calls a heterogeneous-memory storage engine (HSE) designed for solid-state drives, memory-based storage and, ultimately, applications requiring persistent memory. "Legacy storage engines born in the era of hard disk drives have historically failed to architecturally provide for the increased performance and reduced latency of next-generation nonvolatile media," said the company. Again, we'll see.

Software-defined storage leveraging newer media has all the momentum at the moment, with all of the established players (IBM, DDN, Panasas, etc.) mixing those capabilities into their product sets. WekaIO and Intel have battled it out for the top IO500 spot the last couple of years, and Intel's DAOS (distributed asynchronous object store) is slated for use in Aurora.

"The concept of asynchronous I/O is very interesting," noted Ari Berman, CEO of the BioTeam research consultancy. "It's essentially a queue mechanism at the system write level, so system waits in the processors don't have to happen while a confirmed write-back comes from the disks. So asynchronous I/O allows jobs to keep running while you're waiting on storage to happen, to a limit of course. That would really improve the data input-output pipelines in those systems. It's a very interesting idea. I like asynchronous data writes and asynchronous storage access. I can see corruption very easily creeping into those types of things and into data without very careful sequencing. It will be interesting to watch. If it works, it will be a big innovation."
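
A minimal sketch of the idea in Python (purely illustrative; real asynchronous I/O in HPC systems is implemented at the file-system and OS level, not in application code like this):

```python
# Illustration of asynchronous writes: the "job" keeps computing while
# confirmed write-backs happen in the background.
import asyncio

def blocking_write(path: str, data: bytes) -> None:
    # An ordinary synchronous write, as the storage system sees it.
    with open(path, "ab") as f:
        f.write(data)

async def job() -> None:
    pending = []
    for step in range(5):
        data = f"step {step} output\n".encode()   # simulate compute output
        # Queue the write on a worker thread and keep computing immediately.
        task = asyncio.create_task(asyncio.to_thread(blocking_write, "out.dat", data))
        pending.append(task)
    await asyncio.gather(*pending)  # confirm all write-backs before exiting

asyncio.run(job())
```

Note that Berman's caveat applies even to this toy: with several writes in flight at once, ordering is no longer guaranteed without careful sequencing.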

Change is afoot and the storage technology community is adapting. Memory technology is also advancing.

Micron introduced a 176-layer 3D NAND flash memory at SC20 that it says increases read and write densities by more than 35 percent. JEDEC published the DDR5 SDRAM spec, the next-generation standard for random access memory (RAM), in the summer. Compared to DDR4, the DDR5 spec will deliver twice the performance and improved power efficiency, addressing ever-growing demand from datacenter and cloud environments, as well as artificial intelligence and HPC applications. At launch, DDR5 modules will reach 4.8 Gbps, providing a 50 percent improvement versus the previous generation. Density goes up four-fold, with maximum density increasing from 16 gigabits per die to 64 gigabits per die in the new spec. JEDEC representatives indicated there will be 8 Gb and 16 Gb DDR5 products at launch.
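
For scale, the launch numbers translate as follows (assuming a conventional 64-bit DRAM channel; illustrative arithmetic, not part of the JEDEC announcement):

```python
# Quick arithmetic on the DDR5 launch figures quoted above.
transfer_rate = 4.8e9          # 4.8 gigatransfers per second at launch
channel_width_bytes = 8        # a conventional 64-bit channel moves 8 bytes/transfer

bandwidth = transfer_rate * channel_width_bytes
print(f"Per-channel bandwidth: {bandwidth/1e9:.1f} GB/s")   # 38.4 GB/s

print(f"Die density gain: {64 // 16}x (16 Gb -> 64 Gb per die)")
```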

There are always the wildcards. IBM's memristive technology is moving closer to practical use. One outlier is DNA-based storage. Dave Turek, longtime IBMer, joined DNA storage start-up Catalog this year and says Catalog is working on proof of concepts with government agencies and a number of Fortune 500 companies. "Some of these are who's-who HPC players, but some are non-HPC players, many names you would recognize... We're at what I would say is the beginning of the commercial beginning." Again, we'll see.

STORAGE & MEMORY LINKS

SC20 Panel: OK, You Hate Storage Tiering. What's Next Then?

Intel's Optane/DAOS Solution Tops Latest IO500

Startup MemVerge on Memory-centric Mission

HPC Strategist Dave Turek Joins DNA Storage (and Computing) Company Catalog

DDN-Tintri Showcases Technology Integration with Two New Products

Intel Refreshes Optane Persistent Memory, Adds New NAND SSDs

Micron Boosts Flash Density with 176-Layer 3D NAND

DDR5 Memory Spec Doubles Data Rate, Quadruples Density

IBM Touts STT MRAM Technology at IEDM 2020

The Distributed File Systems and Object Storage Landscape: Who's Leading?

It's tempting to omit quantum computing this year. Too much happened to summarize easily, and the overall feel is of steady carry-on progress from 2019. There was, perhaps, a stronger pivot, at least by press-release count, towards seeking early applications for near-term noisy intermediate-scale quantum (NISQ) computers. Ion-trap qubit technology got another important player in Honeywell, which formally rolled out its effort and first system. Intel also stepped out from the shadows a bit in terms of showcasing its efforts. D-Wave launched a giant 5,000-qubit machine (Advantage), again using a quantum annealing approach that's different from universal gate-based quantum systems. IBM announced a stretch goal of achieving one million qubits!

Calling quantum computing a market is probably premature, but monies are being spent. The Quantum Economic Development Consortium (QED-C) and Hyperion Research issued a forecast that projects the global quantum computing (QC) market, worth an estimated $320 million in 2020, to grow at 27% CAGR between 2020 and 2024, reaching approximately $830 million by 2024. Chump change? Perhaps, but it's real activity.
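
The forecast's internal arithmetic checks out, for what it's worth:

```python
# Sanity check of the QED-C/Hyperion projection quoted above.
market_2020 = 320e6   # estimated 2020 market, in dollars
cagr = 0.27           # projected compound annual growth rate

market_2024 = market_2020 * (1 + cagr) ** 4
print(f"Projected 2024 market: ${market_2024/1e6:.0f}M")  # ~$832M
```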

IBM's proposed Quantum Volume (QV) metric has drawn support as a broad benchmark of quantum computer performance. Honeywell promoted the 128 QV score of its launch system. In December IBM reported it too had achieved a 128 QV. The first QV reported by IBM was 16, in 2019 at the APS March meeting. Just what a QV of 128 means in determining practical usefulness is unclear, but it marks steady progress, and even Intel agrees that QV is as good a measure as any at the moment. DoE is also working on benchmarks, focusing a bit more on performance on given workloads.
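
For the unfamiliar: IBM defines Quantum Volume as a power of two, where the exponent is (roughly) the largest n for which the machine can reliably run random circuits n qubits wide and n layers deep. In that notation:

```latex
\mathrm{QV} = 2^{\,n},
\qquad
n = \max\{\, k : \text{random } k\text{-qubit, depth-}k \text{ circuits pass the heavy-output test} \,\}
```

So a QV of 128 = 2^7 says a machine can reliably handle circuits roughly seven qubits wide and seven layers deep; it says nothing directly about performance on any practical workload, which is why DoE's workload-oriented benchmarks are a useful complement.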

"[One] major component of benchmarking is asking what kind of resources it takes to run this or that interesting problem. Again, these are problems of interest to DoE, so basic science problems in chemistry and nuclear physics and things like that. What we'll do is take applications in chemistry and nuclear physics and convert them into what we consider a benchmark. We consider it a benchmark when we can distill a metric from it. So the metric could be the accuracy, the quality of the solution, or the resources required to get a given level of quality," said Raphael Pooser, PI for DoE's Quantum Testbed Pathfinder project at ORNL, during an HPCwire interview.

Next year seems likely to bring more benchmarking activity around system quality, qubit technology, and performance on specific problem sets. Several qubit technologies still vie for sway: superconducting, trapped ion, optical, quantum dots, cold atoms, et al. The need to operate at near-zero (kelvin) temperatures complicates everything. Google claimed to achieve Quantum Supremacy last year. This year a group of researchers in China did so as well. The groups used different qubit technologies (superconducting vs. optical), and China's effort tried to skirt criticisms that were lobbed at Google's effort. Frankly, both efforts were impressive. Early last year Russia reported it would invest $790 million in quantum, with achieving quantum supremacy as one goal.

What's happening now is a kind of pell-mell rush among a larger and increasingly diverse quantum ecosystem (hardware, software, consultants, governments, academia). Fault-tolerant quantum computing still seems distant, but clever algorithms and error-mitigation strategies to make productive use of NISQ systems, likely on narrow applications, look more and more promising.

Here are a few snapshots:

The persistent question is when all of these efforts will pay off, and whether they will be as game-changing as many believe. With new money flowing into quantum, one has the sense there will be few abrupt changes in the next couple of years, barring untoward economic turns.

QUANTUM COVERAGE LINKS

IBM's Quantum Race to One Million Qubits

Google's Quantum Chemistry Simulation Suggests Promising Path Forward

Intel Connects the (Quantum) Dots in Accelerating Quantum Computing Effort

D-Wave Delivers 5000-qubit System; Targets Quantum Advantage

Honeywell Debuts Quantum System, Subscription Business Model, and Glimpse of Roadmap

Global QC Market Projected to Grow to More Than $800 million by 2024

ORNL's Raphael Pooser on DoE's Quantum Testbed Project

Rigetti Computing Wins $8.6M DARPA Grant to Demonstrate Practical Quantum Computing

Braket: Amazon's Cloud-First Quantum Environment Is Generally Available

IBM-led Webinar Tackles Quantum Developer Community Needs

Microsoft's Azure Quantum Platform Now Offers Toshiba's Simulated Bifurcation Machine

As always, there's personnel shuffling. Lately hyperscalers have been taking HPC folks. Two long-time Intel executives, Debra Goldfarb and Bill Magro, recently left for the cloud: Goldfarb to AWS as director for HPC products and strategy, and Magro to Google as CTO for HPC. Going in the other direction, John Martinis left Google's quantum development team and recently joined Australian start-up Silicon Quantum Computing. Ginni Rometty, of course, stepped down as CEO and chairman at IBM. IBM's long-time HPC exec Dave Turek left to take a position with DNA storage start-up Catalog, and last January, IBMer Brad McCredie joined AMD as corporate VP, GPU platforms.

View post:
Farewell 2020: Bleak, Yes. But a Lot of Good Happened Too - HPCwire

Collaboration is the Future – Mediate.com

Lawyers love conflict. They thrive on it. If anyone can coexist with conflict, it's a lawyer.

At least that's how most people think of lawyers. In reality, the opposite is more often true. The only people who love conflict might be candidates for the therapist's couch. Most of us, especially lawyers, are averse to it.

The lawyer turned clinical psychologist Larry Richard has given personality assessments to over 5,000 lawyers over 20 years. As a tribe, lawyers are disproportionately low in the personality traits of resilience and sociability. Resilience is the mark of emotional intelligence that allows one to accept failure, rejection and loss. We're not so good at that, it turns out.

That may be, but what does that have to do with the economics of a successful legal practice or law department? It might surprise a few of us who subscribe to the zealous advocacy theory of legal practice that collaboration is more economically sustainable than exclusive competition.

Hold this thought in mind: in 2017 $10 billion in legal services revenue went from the BigLaw vault into the pockets of alternative legal service providers that are not law firms.

Why? Our conflict aversion is our greatest enemy in the Exponential Age of digital data, artificial intelligence and blockchain technologies. Doing better, faster and cheaper is the mantra of the collaborative economy. The legal business model that has worked extremely well in the competitive economy is on the verge of collapse, though that claim may seem a bit grandiose, even for a lawyer. But let's examine the evidence.

Unresolved Conflict in Workplaces is Expensive

Howatt HR Consulting provides a conflict cost calculator to gauge the cost of unresolved conflict in law firms and legal departments. I recently ran the calculator from the perspective of the most conflict-rich workplace I remember being a part of. It "only" cost $100,000 per year in lost productivity, absenteeism, health-care claims, turnover and other profit-destroying contributors, and that was the impact of a single person in that workplace! Howatt points out that the Canadian economy suffers a loss of over $16 billion each year due to unresolved conflict in the nation's workplaces.

It's customarily calculated that an employee's turnover, through termination or voluntary departure and then replacement, costs 120 percent of that employee's annual compensation. For a $55,000-a-year paralegal, the cost of losing him or her is $66,000. Lost productivity, training and bringing a replacement to the same level of performance as a predecessor are not cheap.

At the British Legal Technology Forum 2018, Kevin Gold, a Mishcon de Reya managing partner, stated in a plenary session that the firm had calculated the cost of bringing a new young attorney to the point of return on investment; it was £250,000, or roughly $340,000.

I have listened as partners proudly describe the economic brilliance of their firm's leverage model in terms such as, "We have one associate make partner for every eight associates we hire. They're expendable. If they can't figure out how to succeed in our business model, we don't need them. There are more waiting for the empty chair." But losing seven associates for every one who makes partner is a very expensive proposition. Most associates who aren't going to make partner are gone, voluntarily or otherwise, before they achieve third-year status.

According to Gold, the young lawyers at Mishcon de Reya become revenue-neutral somewhere close to their third year. Under the business model of my partner-friend's firm, then, the firm sinks roughly $340,000 into each of the seven associates who leave for every one who makes partner, losing about $2.5 million for every successful associate. Adjust the variables however you wish; treating associate attorneys as fungible is economically foolhardy, if not disastrous.

Similarly, the numerous accounts and studies of lateral attorney hires reflect how rarely the transition is economically beneficial for the firm. The laterally hired partner usually makes out like a bandit, but the firm often breaks even at best. More often the transaction is a loss leader. It may be worth the headlines, but the effect on the bottom line can be less than rosy.

Of course, the law is one of the only professions that prohibits noncompete agreements for its practitioners. A high-value executive can be bound by a non-compete, but not a lawyer. When I served on my firm's executive committee, we often said that a law firm is the only business that allows its inventory to walk out the door each night. If the lawyer doesn't return the next day, neither, in most cases, do their clients. When negotiating with a lateral attorney, the deal is usually cut on the basis of the attorney's portable business.

What's the cause of all this lost revenue and profit? Unresolved conflict is usually the culprit. Perhaps it's the associate who isn't popular enough with the firm's power brokers and influencers to be worth the effort to resource, train, develop and treat as the asset Mishcon de Reya recognizes him or her to be. Or perhaps it's partners at odds with each other over origination credits in the last compensation wars, who are more likely to engage in passive-aggressive behavior than have a conversation intended to reach agreement over a proper allocation of credit.

Admit it, you know it's true. After 40 years of legal practice, I've witnessed more unresolved conflict in law firms and legal departments than in prisons. Prisoners just take it outside. Lawyers demonstrate what we call "Nashville Nice" around these parts. You learn how to smile to their faces and then stab them in the back with a politically correct criticism in the Nashville fashion: "Oh, she's a nice person, and I would never say anything bad about her, bless her heart." That's conflict aversion.

Frankly, it's more than an economic problem. It's a societal, emotional and health problem. Lawyer addiction, suicide and relational dysfunction exceed the general norm by a large margin. That, too, is an economic scourge.

The statistics cannot be questioned. Gender diversity in law school is far superior to that in law firms, legal departments, firm management committees, partnerships and the executive suite. Racial diversity doesn't even begin to reflect the population. The steady reduction in diversity as the organizational level of power and status increases is an indictment of our entire profession. What are the economic costs? The answer is simply unimaginable, and totally unacceptable.

Thriving in the Collaborative Economy

We all remember the 1L experience, when the most intimidating professor in our assigned classes made the recurrent sobering remark: "Look to your right, look to your left . . . ." Thus began our steady march into the competitive mindset of thinking like a lawyer. Unfortunately, for those of us wired that way, this culture of competition fed all our worst instincts. For others it was soul-destroying. Richard, the lawyer turned clinical psychologist, indicates that's the reason he became a psychologist.

While the law has perfected radical competitiveness, the rest of the business world is becoming radically collaborative. This transformative transition is due to the inevitability of digital power and pace. For a full exploration of the exponential nature of the Digital Age and its impact on commerce and culture, read The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies by Erik Brynjolfsson and Andrew McAfee. The authors brilliantly compare the attributes of the first half of the Machine Age, from the steam engine up to 2006, to the second half. The first was competitive, leading to scarcity. The second, also known as the Exponential Age, is collaborative, leading to abundance.

A recent visit to Silicon Valley revealed how cooperative business has become. I spoke with a software engineer working for Dell who supervises a software development team. Nothing abnormal about that. However, he manages a team whose members change every day, on projects that change every day. A Dell engineer manages a team that one day might consist of developers from Microsoft, SAP, Google, Apple and others. They are working on open-source software that builds open-source software, for the benefit of all.

Some say attorneys could never do that. It would be unethical, wouldn't it? Ask Pfizer and the small number of law firms that won the privilege of doing Pfizer's legal work. A few years ago the pharmaceutical company required its successful law firm bidders to share work product, lessons learned and mistakes made with the other Pfizer core counsel after each matter. That's distinctly unconventional, and the hallmark of successful business models in the Exponential Age.

Many other professions have already arrived in the cooperative age of business. Preparing for a recent training program for the Vanderbilt Medical School Leadership College, I discovered Quantum Leadership: Building Better Partnerships for Sustainable Health by Tim Porter-O'Grady and Kathy Malloch. Remove the word "health" and replace it with "law" and the parallels are unmistakable. The tools of technology, artificial intelligence, blockchain, the internet of things and cryptocurrency are, or will be, changing everything. Even quantum computing has arrived, making traditional computing look like the tortoise versus the hare: on certain problems, quantum computers can calculate on the order of 100,000 times faster. As a result, the old keep-it-so-no-one-else-can-get-it mindset is evaporating. Do you want to work on IBM's quantum computer, operating at 20 qubits and soon to be 50 qubits? It's free and open source. Go right ahead.

When did all this happen, you ask? Seemingly overnight, and without warning. That's exponential. As a result, no disciplinary expertise is sufficient in itself. Cognitive diversity is the fuel of innovation. Seeing a problem from the same perspective leads to the same old solutions. Seeing the same problem from multiple perspectives (gender, racial, religious, sexual orientation, disability and national origin) brings creativity to the table, and competition is inimical to its success.

What quantum leadership requires is a new form of leadership: one that's radically collaborative. The old commercial model is hierarchical, structured and highly command-and-control oriented. The new model is flat, team-based and relational.

The new commercial model is focused on accountability rather than responsibility, and output rather than effort. My life as a lawyer was spent selling effort, not output. Time has been the coin of the realm in the law since 1956, when the ABA informed lawyers that time is their most valuable asset. Man, did we buy that, and so did our clients, until they tired of it. Now they want value, not effort.

The difference between the old commercial order and the new is stunning. Working in teams is not taught in law school. I have been teaching Legal Project Management at Vanderbilt Law School for six years. Law students routinely report that this class is the first time they have been asked to work in a team in law school, unless they are joint J.D./M.B.A. candidates. Business students don't understand why law school doesn't value teamwork. Therein lies one of our greatest problems: our clients are team-based, and we don't know how to do that.

Replacing Hypercompetition with Collaboration

Let's return to the question of the missing $10 billion. How could BigLaw lose that much value in a year? Let's examine the data.

The data isn't secret. It's been building for over 10 years. It's more than an aberration; it's a statistical trend. The data is submitted voluntarily by the nation's largest law firms, namely the Am Law 300, on a monthly basis and reported in the Thomson Reuters Peer Monitor Index reports. Although anonymized, the data collected over the last 10 years is stunning. Law firms are losing market share steadily, relentlessly and without response.

Spend time with the data reported in the Georgetown Law Center's and Thomson Reuters Legal Executive Institute's annual Report on the State of the Legal Market. Ten years of BigLaw self-reporting reveals the following: all the data reflecting financial progress, in time billed and billings realized, collected and banked in firm treasuries, is in long-term decline. Only two trends are rising: rates and costs. This dangerous economic state is obvious to everyone, yet nothing is being done, except by a few high-flying firms that have figured out the antidote to demise.

Check out Table 15 in the Georgetown/Thomson Reuters report. The missing $10 billion went to nonlawyers and nonlaw firms such as PwC, Deloitte, Axiom, UnitedLex, Pangea3, LegalZoom and a growing host of alternative legal service providers doing law better, faster and cheaper, and sometimes without a law license. That's what the market wants.

The report pulls no punches this year. It states: "Stop doubling down on your failing strategy!" Citing the Harvard Business Review analysis of the same title, the report warns BigLaw leaders that their conflict aversion could make these hallmark firms irrelevant.

How so? Harvard and Georgetown Law cite the power of our mind-blindness in the face of economic peril. It's all about heuristics, the states of mind that partially determine how we react to stress and threat. Our worldview is only valuable in the context of how it was formed. Another way of saying it is: you can't tell a room full of millionaires their business model is broken. They can't hear it. This is not a function of intelligence but of experience. We can't know what we don't know.

Specifically, the mental heuristics that take over our cognitive capacity in times of economic peril can be summarized with startling reality in the following ways:

When combined, these mental heuristics, which reflect simply how the human brain works, can be a toxic brew of mind-blindness, obscuring paths to rescue and ways out of a dilemma of our own making.

What's a body to do? We must overcome our conflict aversion and welcome a path to open, respectful and strategic conflict competence, rather than our preferred resort to passive-aggressive behavior.

The Harvard Business Review article suggests rules to follow to achieve conflict competence:

Embracing the Cooperative Economy

Although unfamiliar to those of us steeped in a competitive model of economic success, the world has moved on and is continuing to stake out new opportunities for economic success through previously unheard-of degrees of cooperative effort.

Start small and learn as you go. Discover the power and the scope of building bridges rather than silos. As our digital world continues to explode in data and the power to process it, learn to learn from other disciplines. Make friends with a data scientist, a software engineer or a legal project manager. Learn to see from their perspectives.

And, most importantly, jump in; the water's fine.

Follow this link:
Collaboration is the Future - Mediate.com

Tech trends in 2021: How artificial intelligence and technology will reshape businesses – The Financial Express

What better time than now to unveil what to look out for in the world of AI and technology in 2021?

By Prithwis De

The year 2020 will be marked as an unprecedented year in history due to the adverse impact of the coronavirus worldwide. The pandemic has started bringing extraordinary changes in some key areas, and the trends of faster drug development, effective remote care, efficient supply chains, etc., will continue into 2021. Drone technology is already playing a vital role in delivering food and other essentials alongside relief activities.

With Covid-19 came a new concept, the Internet of Behaviour, used within organisations to track human behaviour in the work environment and trace any slack in maintaining guidelines. From now on, organisations are set to capture, combine and use behaviour-related data from different sources. We can assertively say it will affect the way organisations interact with people going forward. Students are experiencing distance learning, taking examinations under remotely monitored and proctored surveillance systems with identity verification and authentication in real time.

All of this will have a high impact on technology, which will shape our outlook in the future. Businesses around the globe are taking the giant leap to become tech-savvy with quantum computing, artificial intelligence (AI), cybersecurity, etc. AI and cloud computing are alluring us all towards an environment of efficiency, security, optimisation and confidence. What better time than now to unveil what to look out for in the world of AI and technology in 2021?

What 2020 has paved the way for is quantum computing. Now, be prepared to adapt to a hybrid computing approach (conventional-cum-quantum computing) to problem-solving. This paradigm shift in computing will result in the emergence of once-implausible ways to solve existing business problems and ideate new opportunities. Its effects will be visible in our ability to perform better in diverse areas: financial forecasting, weather prediction, drug and vaccine development, blood-protein analysis, supply chain planning and optimisation, etc. Quantum Computing as a Service (QCaaS) will be a natural choice for organisations to plug into the experiments as we advance. Forward-thinking businesses are excited to take the quantum leap, but the transition is still in a nascent stage. This new year will be a crucial stepping stone towards the changes of the following years.

Cloud providers such as Amazon (AWS), Microsoft (Azure) and Google will continue to hog the limelight as the AI tool providers for most companies leaning towards real-time experiments in their business processes in the months to follow. Efficiency, security and customisation are the advantages for which serverless and hybrid cloud computing are gaining firm ground with big enterprises, and they will continue to do so in 2021.

Going forward, the aim is to make the black box of AI transparent with explainable AI; the current lack of clarity hampers our ability to trust AI. Automated machine learning (AutoML), another crucial area, is likely to be very popular in the near future. One more trend that caught on like wildfire in 2020 is Machine Learning Operations (MLOps). It gives organisations visibility into their models and has become an efficient tool for steering clear of duplicated effort in AI. Most companies have been graduating from AI experimentation and pilot projects to implementation. This endeavour is bound to grow further and enable AI experts to have more control over their work from end to end.

Cybersecurity will gain prime importance in 2021 and beyond: there is no doubt that hacking and cybercrime prevention are priorities for all businesses, with sensitive data becoming easily accessible through advanced phishing tools. Advanced prediction algorithms, along with AI, will play a decisive role in preventing such breaches of data security.

AI and the Internet of Things, along with edge computing (data processing nearer the source, closer to the device at the edge of the network), will usher in a new era of actionable insights from vast amounts of data. In-memory-accelerated, real-time AI will be needed, particularly now that 5G has started creating new opportunities for disruption.

In 2020, there was a dip in overall funding as the pandemic badly impacted the investment sector due to a reduction in activity. Some technology start-ups are still unable to cope with the challenges created by Covid-19 and the consequent worsening economic conditions. According to NASSCOM, around 40% of Indian start-ups were forced to stop their operations. In 2021, mergers and acquisitions of start-ups are expected, with larger companies likely to target smaller companies specialised in niche and innovative areas such as drug development, cybersecurity, AI chips, cloud computing, MLOps, etc.

Businesses in 2021 and beyond will develop into efficient workplaces for everybody who believes in the power of technology. It is important to bear in mind that these trends are not necessarily independent of each other; rather, they support one another and work in tandem with human intervention. So, are hybrid trends and solutions here to stay for the next few years for the smooth running of organisations? Only time will tell. But the need for AI and newer technology adoption and modernisation will increase manifold.

The author is an analytics and AI professional, based in London, working in a big IT company. Views are personal


Go here to read the rest:
Tech trends in 2021: How artificial intelligence and technology will reshape businesses - The Financial Express

Tech trends to watch in 2021 – India Today

The year 2020 has been one of the most unpredictable years, and in parallel we have seen transitions in technology across various sectors that have really helped humanity predict and prepare for catastrophic conditions. With the Covid-19 pandemic as one such situation, many scientists, engineers and other techies have realized that a lot of development is still required to make life easier with accessible technology. Therefore, we bring you some of the top tech trends to watch in 2021:

In the last decade, we have seen that there is no limit to technology, and with the rise of digitalization in India, there will be a need for quantum computing to protect banking systems and IT security from cybercrime. With database processing a critical strength of quantum computing, technologies such as artificial intelligence will be one application that gets significant benefit from the superior processing power of quantum computers.

Therefore, it can be seen that there will be massive competition among the big IT giants to provide services in cybersecurity, drug development, climatic condition prediction, etc., with the help of quantum computing.

In IoT applications, there were two challenges: range and battery life. These two challenges are now overcome with the help of NB-IoT (narrowband IoT). Considering that approximately 21 billion devices will be connected by 2025, there will be huge competition between telecoms like Jio, Airtel, Vodafone and others to provide cost-effective and efficient solutions to their consumers in SaaS (Software as a Service) and PaaS (Platform as a Service) models. Moreover, India is working actively on NB-IoT. In a first, BSNL with Skylo has launched the world's first satellite-based NB-IoT service to streamline various sectors, including fishing, farming, construction, mining and logistics enterprises.


IPA, or Intelligent Process Automation, is the advanced version of RPA (Robotic Process Automation); it is a combination of RPA and machine learning. Due to the outbreak of Covid-19, much of the IT industry has indicated that permanent work from home is a possibility, and some companies, including TCS, Deloitte and Twitter, have already announced it. It is imperative for any industry to check the activeness, productivity and relative output of its workforce in this scenario.

Therefore, IPA techniques are expected to increase process efficiency, improve customer experience, optimize workforce productivity and generate a corresponding surge in revenue. In 2019-2020 we saw how chatbots helped firms automate customer interaction, thereby reducing operational costs. Similarly, various IPA techniques will help firms of any kind turn raw data into structured data. In consequence, IPA techniques will reduce human error and enhance customer satisfaction.

Artificial intelligence will expand its footprint in various sectors, including military, defence, agriculture, automotive, education, medical and construction; the power and scope of AI are unimaginable, even endless. According to Fox News, an artificial intelligence algorithm developed by Heron Systems swept a human F-16 pilot in a simulated dogfight, 5-0, in August 2020. Another landmark was the launch of GPT-3, an autoregressive language model that uses deep learning to produce human-like text, developed by the OpenAI lab team.

The model is expected to generate text of such quality that it is difficult to distinguish whether it was written by a human or a machine. In the agriculture sector, too, there is an expectation that AI techniques will increase crop productivity, and thereby farmers' incomes.
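
"Autoregressive" here simply means the model produces text one token at a time, feeding each prediction back in as context for the next. A minimal sketch of that loop (the model and tokenizer objects are placeholders, not the actual GPT-3 API):

```python
# Sketch of autoregressive text generation (placeholder interfaces,
# not the real GPT-3 API): predict a token, append it, repeat.

def generate(model, tokenizer, prompt: str, max_new_tokens: int = 50) -> str:
    tokens = tokenizer.encode(prompt)             # text -> list of token ids
    for _ in range(max_new_tokens):
        next_token = model.predict_next(tokens)   # sample/argmax the next token
        tokens.append(next_token)                 # feed it back in as context
        if next_token == tokenizer.eos_id:        # stop at end-of-sequence
            break
    return tokenizer.decode(tokens)               # token ids -> text
```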

With the announcement of NEP 2020 by the Ministry of Education, there will be a change in learning patterns across all institutions. We can expect a rise in technologies such as artificial intelligence, machine learning, big data, blockchain, etc. Hence, the education ministry will put strenuous effort into upgrading the quality of India's education to build a skilled workforce.

The much-awaited 5G, or fifth-generation cellular network technology, is expected to launch in 2021, as telecom giants including Bharti Airtel, Jio and Vodafone Idea ramp up to move from early trials to commercialization with their respective partners.

Meanwhile, Reliance CMD Mukesh Ambani has already declared that Jio is ready with the infrastructure and will pioneer the 5G revolution in India in the second half of 2021. In these ways, technology has made our lives easier and better in many unfortunate situations. Hence, it will be our primary need in the future to let humans and machines work together to protect humans.

-Article by Abhishek Gupta, CEO & Co-founder, Hex N Bit


See the rest here:
Tech trends to watch in 2021 - India Today

The science of looking ahead – Deccan Herald

At the turn of the millennium, when scientists sequenced the human genome, its full implications escaped popular imagination. Amid debates over its possible benefits and risks, genome science gave an unprecedented push to advances in biology, never as evident as now, two decades later, as the world battles a pandemic.

No one, after the coronavirus pandemic, can deny the capacity of science to surpass human imagination. Never before in the history of science have multiple vaccines emerged within months of the discovery of a new virus. Production and even immunisation started before 2020 ended. What the past year has shown us is what science can do when research advances, political will and coordinated global efforts merge.

With this backdrop in mind, we do some crystal gazing to explore what might become reality in the next 10 years in select scientific areas. All may not fructify, but many could, particularly if science is backed by society.

SPACE: Are we alone in this universe?

This is a query that has enamoured scientists for decades. It received a boost half a century ago when Cornell University physicist Frank Drake, in a famous formula, demonstrated the theoretical possibility of millions of advanced civilisations in the Milky Way galaxy alone. Soon the Search for Extra-Terrestrial Intelligence (SETI) began, and to date there is no dearth of excitement. The cigar-shaped 'Oumuamua that zipped through the solar system two years ago has added more fuel to the interest.

The next decade is likely to provide several crucial clues to answering this long-standing query. Astrophysicists are of the opinion that it will be an epoch-making decade in human understanding of the cosmos, because of the 6-metre-class James Webb Space Telescope, which will be three times more powerful than the Hubble Space Telescope and will probe deep space as never before. The James Webb Space Telescope is expected to provide unprecedented information about the atmospheres of extrasolar planets and perhaps help identify the molecular building blocks necessary for life there.

The grandiose space telescope will receive able support from three giant ground-based telescopes, the European Extremely Large Telescope, the Thirty Meter Telescope and the Giant Magellan Telescope, which will allow astronomers to penetrate the farthest parts of the visible universe and probe the faintest objects in our own galaxy. The next-generation radio telescope, the Square Kilometre Array, will add heft to the quest by unveiling the most enigmatic, yet-to-be-discovered radio signals from the universe.

Some likely discoveries include bio-signatures in the atmospheres of Earth-like exoplanets, implying the presence of life; the elusive ninth solar planet; exomoons; first-generation stars; and a better understanding of the dark matter and dark energy that make up the bulk of the universe.

But a human landing on Mars or colonisation of the moon is unlikely. More travel to the moon is possible, but is there a chance of settling there? Certainly not in the coming decade.

NANOTECHNOLOGY: 'Plenty of room at the bottom'

The late American Nobel laureate Richard Feynman had observed in a 1959 lecture that there is "plenty of room at the bottom," spawning the genesis of nanotechnology, the science of the ultra-small. But the beauty of Feynman's staggeringly small world has become evident only over the last two decades, with the realisation of the tools to see, measure and manipulate matter at the nanoscale. To give an idea of the scale we are talking about: a single strand of human hair measures 50,000 nanometres across.

Research in nanotechnology has diversified enormously, fuelled by massive improvements in electron microscopy, physical and chemical synthesis routes, the emergence of new classes of materials (starting with graphene in 2004), and device technology to translate nanomaterials into products. The general physical properties of matter at the nanoscale are relatively well understood now, and there is a global effort to exploit these properties to achieve unique therapeutic methodologies, as well as materials and devices that can impact life directly.

Medicine is one area where the technology holds enormous promise. Breakthroughs are likely in areas ranging from wearable fitness technology that monitors our health daily to electronic tattoos that sense vital signs. There could even be sensors inside the body; the multi-billion-dollar pharmaceutical firm GSK is already pursuing research on electroceuticals. Scientists also envision nano-robots in the blood: nanobots that would swim in the bloodstream to deliver cancer drugs to targeted cells without damaging others. This, however, is unlikely to be realised in the next 10 years, as scientists must first understand the toxic effects of swarms of nanobots inside the blood and how to mitigate them.

More realistic possibilities are advancements in device miniaturisation and improvements in performance. It is entirely possible we will have computers with 10 times more storage capacity, completely foldable laptops and mobile screens, and foldable electronic newspapers. There could be nano-sensors on aircraft, bridges and nuclear power plants to monitor structural health so that minor problems don't turn into major operational issues. The paint industry may also be transformed, with paints containing nanomaterials that keep walls dry even in rain, resist scratches and make a tank vanish before the eyes of the enemy.

WATER: The hunt to harvest

Nanotechnology will play a crucial role in improving people's access to water. Although oceans cover two-thirds of the planet, scarcity of fresh water severely threatens both agriculture and the availability of drinking water for regular household use. The solutions that may be realised in the next decade will depend largely on nanotechnology and nanomaterials. Technological breakthroughs are expected to lower the energy requirements of desalination so that it becomes commercially viable. Removal of arsenic and fluoride using new materials and technology is entirely doable. Scientists have made progress in harvesting water from natural sources like humidity and fog, which may come closer to reality in the next 10 years.

With the advancement of artificial intelligence and better solutions to big-data problems, what is likely to be realised is a Google Earth-like platform for water resources, mapping the water usage of every household in the world and the nature of that spending. Scientists believe this would not only automatically lead to enormous savings in water use, but also convert every piece of civil infrastructure into a place to harvest and conserve water.

COMPUTATION: The big wave is coming

There are several low-hanging fruits to be picked within the next 10 years, but it will take decades to witness the full potential of quantum computing, the holy grail of computing. The foundation of quantum computing's backbone may be laid in the next 10 years.

Artificial intelligence, big-data processing and IoT are beginning to change urban lives, even though their potential is far greater. AI is the next big thing, and it will bring self-driving vehicles, swarms of drones and rockets, robotic manufacturing, the management of complex logistics, and vertical farming. From stock markets to healthcare, AI will rule everywhere.

Riding on a 5G backbone, the Internet of Things will make smart homes and offices a reality with remote and intelligent operations. In such homes and offices, every appliance is connected and can be operated remotely. By 2025, it is projected that nearly 100 trillion devices will be connected through smart interfaces, with an economic impact between $2.7 trillion and $6.2 trillion annually, and IoT will change the fundamental nature of business. But all of this will pale before quantum communication technologies.

A future quantum computer could, for example, crack any of the common modern security systems, such as 128-bit AES encryption, the best on the market, in seconds; even the best supercomputer would take millions of years to do the same job. However, it will not be easy to get there, even though the US National Institute of Standards and Technology has predicted that quantum computers will be able to crack 128-bit AES encryption by 2029. Scientists hope that in the next 10 years a backbone for a global secure quantum communication network will be in place, but problems like what materials to use in quantum computers, what architecture to follow and what types of protocols quantum communication needs may take far longer to resolve. A better understanding of the quantum world would also equip scientists with the tools to cross the final frontier: the brain.
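
The scale gap behind the "millions of years" claim is easy to make concrete. A back-of-envelope comparison (illustrative only; Grover's algorithm gives a quadratic speedup on key search, and running it at this scale assumes fault-tolerant hardware that does not yet exist):

```latex
\underbrace{2^{128} \approx 3.4\times 10^{38}}_{\text{classical brute-force key trials}}
\qquad \text{vs.} \qquad
\underbrace{\sqrt{2^{128}} = 2^{64} \approx 1.8\times 10^{19}}_{\text{Grover-style quantum search}}
```

Even at a billion guesses per second, exhausting $2^{128}$ keys classically would take on the order of $10^{22}$ years, which is why even a partial quantum shortcut alarms cryptographers.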

BRAIN: Cracking the cerebral codes

Every advancement in biology in the last century was aimed at the ultimate goal of treating diseases of the body. The ongoing century will see an equal, if not greater, thrust on treating diseases of the mind as well, with an increasing pool of top-class biologists, physicists and computer scientists joining hands to unravel the mysteries of the brain.

Dementia is one such area that will progress enormously in the next 10 years, as the disease now gets worldwide attention due to its huge economic consequences. The goal now is to identify early biomarkers that are activated two to three decades before the disease sets in. Early detection would lead to early intervention and better management of many such neurological illnesses.

As scientists try to crack the cerebral codes, they often face a handicap due to the absence of relevant disease models for developing new drugs and diagnostics. Advancements in stem cell technology and the creation of organoids have provided good leads so far, but the next decade will witness rapid progress, leading to an accelerated pace of drug development. Increasing numbers of scientists will also explore the brain as an integrated system, along with the body's immune system or microbiome. The aim, once again, will be to find cures for diseases of the mind.

More fundamental questions, like what defines cognition or whether there is free will, will have to wait longer for an answer.

GENETIC ENGINEERING: Look before you leap

Now, this one is a minefield. No doubt engineered microbes will bring revolutions in chemical and industrial processes, while advancements in RNA technology (as seen in Covid-19 vaccines) will overhaul vaccine development with the potential to create life-saving shots within weeks. But the big fear is whether technological progress will usher in an era of eugenics 2.0.

At the core lies CRISPR gene-editing technology, a tool so powerful that humans can even think of playing God. Chinese scientist He Jiankui's feat of producing designer babies exacerbated such fears. There are two ways to use gene alteration. It can be done through somatic editing, to cure a particular disease or disorder caused by defective genes; this, in all probability, will emerge as a therapy. More dangerous is germline editing, which allows genetic changes to be transmitted to the next generation. Just think what would happen if traits like good looks, athleticism and intelligence became modifiable and hereditary. It is a complete no-no at the global scale, and there are really tough scientific challenges to overcome, but scientists do fear the creation of a grey market for such designer babies somewhere in the world.

See the article here:
The science of looking ahead - Deccan Herald

World Finance offers in-depth and high-quality journalism on a huge variety of topics in its eagerly anticipated Winter 2021 issue, released today -…

LONDON, Jan. 4, 2021 /PRNewswire/ -- The cover story of this 226-page issue features the impressive Dame Jayne-Anne Gadhia, formerly of Virgin Money, who founded Snoop in 2020. And her start-up leads us into a feature-length piece of investigative journalism from Emily Cashen on open banking and fintech.

The use of physical cash has been decreasing for many years now and a global pandemic with enforced shutdowns hastened that trend. Laura French explores whether we are now ready to embrace a cashless world.

Elsewhere in the magazine, Alex Katsomitros explores the potential impact of a set of proposals from the OECD that would completely reform corporation tax, put together after continued concerns over inequality and the need for a post-pandemic economic recovery.

Meanwhile, with governments the world over providing loans and financial aid packages at levels never seen before, Selwyn Parker discusses what happens next as we potentially venture into a sea of debt.

Additionally, Richard Willsher looks at how the forex markets navigated a pandemic by seamlessly shifting operations to a WFH environment thanks to the rise of e-platforms and online tools.

Topics also covered in the winter edition of World Finance include cryptocurrencies, corporate art, shipbuilding, Zoom's breakthrough year, quantum computing and recession success stories.

To read about all of this and more, pick up the latest issue of World Finance magazine, available in print, on tablet and online now.

http://www.worldfinance.com

World News Media is a leading publisher of quality financial and business magazines. It benefits from a global distribution network that includes subscriber lists of prominent decision-makers around the world.

CONTACT INFORMATION

World News Media, Richard Willcox, +44 (0)207 553 4151, [emailprotected]

SOURCE World News Media

Read more:
World Finance offers in-depth and high-quality journalism on a huge variety of topics in its eagerly anticipated Winter 2021 issue, released today -...

The silver lining of 2020 – SouthCoastToday.com

Tyler Cowen | Bloomberg Opinion

Columns share an author's personal perspective and are often based on facts in the newspaper's reporting.

For obvious reasons, 2020 will not go down as a good year. At the same time, it has brought more scientific progress than any year in recent memory, and these advances will last long after COVID-19 is gone as a major threat.

Two of the most obvious and tangible signs of progress are the mRNA vaccines now being distributed across America and around the world. These vaccines appear to have very high levels of efficacy and safety, and they can be produced more quickly than more conventional vaccines. They are the main reason to have a relatively optimistic outlook for 2021. The mRNA technology also may have broader potential, for instance by helping to mend damaged hearts.

Other advances in the biosciences may prove no less stunning. A very promising vaccine candidate against malaria, perhaps the greatest killer in human history, is in the final stages of testing. Advances in vaccine technology have created the real possibility of a universal flu vaccine, and work is proceeding on that front. New CRISPR techniques appear on the verge of vanquishing sickle-cell anemia, and other CRISPR methods have allowed scientists to create a new smartphone-based diagnostic test that would detect viruses and offer diagnoses within half an hour.

It has been a good year for artificial intelligence as well. GPT-3 technology allows for the creation of remarkably human-like writing of great depth and complexity. It is a major step toward the creation of automated entities that can react in very human ways. DeepMind, meanwhile, has used computational techniques to make major advances in protein folding. This is a breakthrough in biology that may lead to the easier discovery of new pharmaceuticals.

One general precondition behind many of these advances is the decentralized access to enormous computing power, typically through cloud computing. China seems to be progressing with a photon method for quantum computing, a development that is hard to verify but could prove to be of great importance.

Computational biology, in particular, is booming. The mRNA for the Moderna vaccine was designed in two days, without access to COVID-19 itself, a remarkable achievement that would not have been possible only a short while ago. This likely heralds the arrival of many other future breakthroughs from computational biology.

Internet access itself will be spreading. Starlink, for example, has a plausible plan to supply satellite-based internet connections to the entire world.

It also has been a good year for progress in transportation.

Driverless vehicles appeared to be stalled, but Walmart will be using them on some truck deliveries in 2021. Boom, a startup that is pushing to develop feasible and affordable supersonic flight, now has a valuation of over $1 billion, with prototypes expected next year. SpaceX achieved virtually every launch and rocket goal it had announced for the year. Toyota and other companies have announced major progress on batteries for electric vehicles, and the related products are expected to debut in 2021.

All this will prove a boon for the environment, as will progress in solar power, which in many settings is as cheap as any relevant alternative. China is opening a new and promising fusion reactor. Despite the absence of a coherent U.S. national energy policy, the notion of a mostly green energy future no longer appears utopian.

In previous eras, advances in energy and transportation typically have brought further technological advances, by enabling humans to conquer and reshape their physical environments in new and unexpected ways. We can hope that general trend will continue.

Finally, while not quite meeting the definition of a scientific advance, the rise of remote work is a real breakthrough. Many more Zoom meetings will be held, and many business trips will never return. Many may see this as a mixed blessing, but it will improve productivity significantly. It will be easier to hire foreign workers, easier for tech or finance workers to move to Miami, and easier to live in New Jersey and commute into Manhattan only once a week. The most productive employees will be able to work from home more easily.

Without a doubt, it has been a tragic year. Alongside the sadness and failure, however, there has been quite a bit of progress. That's something worth keeping in mind, even if we can't quite bring ourselves to celebrate, as we look back on 2020.

Tyler Cowen is a Bloomberg Opinion columnist. He is a professor of economics at George Mason University and writes for the blog Marginal Revolution. His books include "Big Business: A Love Letter to an American Anti-Hero."


See the original post here:
The silver lining of 2020 - SouthCoastToday.com

How long does numbness last after tooth extraction?

Numbness is one of the typical problems that people experience after tooth extraction.

That is, you may experience numbness in your gums, lower lip, chin, and some other parts of your mouth after extraction.

But that's not a big problem, as it is common to feel some numbness after a tooth extraction.

However, people still worry about how to cure numbness and how long it lasts after tooth extraction.

Here's the complete guide regarding this issue:

Numbness is a lack of feeling around your gums, chin, and some other oral parts after the surgery.

According to dental experts, numbness can last for 10 - 12 hours or sometimes 24 hours after tooth extraction.

It usually lasts no more than a day after a typical tooth extraction.

However, numbness may last for weeks or even months if it was caused by nerve damage during the tooth extraction procedure. In that case, you need to take strict care of your oral health to avoid any severe complications.

What causes numbness after tooth extraction?

Numbness is a typical condition usually caused by the dental anesthesia given during the tooth extraction procedure.

Not only anesthesia but also other oral treatments, like implant surgery, denture placement, and root canals, can cause numbness in your mouth.

Thus, you shouldn't worry about this issue unless it lasts longer than a few days.

Consult your dentist if numbness lasts longer than usual.

How to get rid of numbness after tooth extraction?

It is genuinely possible to reduce or completely heal numbness at home.

Here are some professionally recommended remedies you can try.

Watch what you eat

If you've recently gone through a tooth extraction, it's highly likely your dentist has already given you a list of things to avoid for a while. These may include eating solid foods, drinking soda, and even smoking, all of which you should refrain from until you fully heal, so be sure you're not doing any of them!

Massage your cheeks

Massage and a warm compress can help increase blood flow to the affected area.

Just soak a piece of cloth in hot water and squeeze it well. Then apply it directly to your cheeks where you are feeling numbness.

Repeat this remedy twice a day until you get complete relief.

Also, massage the affected areas with your fingers.

Cold compression

It is a beneficial way to minimize swelling after extraction.

Just put some ice cubes in a plastic bag and apply it to your cheeks where you feel numbness.

Hold it for at least 5 minutes.

Repeat this remedy twice a day.

Note: You can skip this remedy if it causes you any discomfort.

Anti-inflammatory medicines

Anti-inflammatory medications can help to reduce swelling, numbness, and pain from the extraction site.

Ask your dentist to prescribe you the medicines that may help you get rid of this issue.

Do not take any medications without a prescription from your medical expert.

Keep your head up

It is always recommended to keep your head in an upright position after tooth extraction.

That helps blood flow properly and prevents excessive bleeding and numbness at the extraction site.

Keep extra pillows under your head so you can easily elevate your head even while sleeping.

Take a nap

A nap is crucial after tooth extraction because falling asleep can take your mind off the pain or numbness.

Sit back on a sofa or bed, relax, and let the numbing sensation go away.

Sleep for at least 8 - 10 hours at night after tooth extraction.

Rinse after every meal

Rinse your mouth with warm water after taking any food or drink.

That will help clean your entire mouth, especially the extraction site, of any stuck food particles.

But don't rinse forcefully, and be careful around your extraction site while doing so.

Be cautious while brushing

Wrong brushing technique can affect your tooth extraction site and may lengthen the healing time.

Do not use the brush over the extraction site for at least 24 – 48 hours after the extraction.

Use brushes with soft bristles and brush in light movements.

Also, don't spit forcefully, as this can also affect the extraction site.

Avoid tough activities

Eating, talking, and other mouth movements can cause post-extraction problems.

That is, you should avoid hard and spicy foods on the first day after tooth extraction.

Consume soft and semi-soft foods like yogurt, mashed potatoes, oatmeal, and less spicy soups on the first day.

Avoid smoking, coffee, soda, alcohol, and tobacco, as they can cause infection.

Also, do not talk too much and give rest to your mouth and tooth extraction site.

How long does numbness last after filling?

As we mentioned above, numbness can also be caused after a tooth filling.

Numbness after filling lasts for a few days to a couple of weeks.

But you shouldn't worry about that, as numbness is typical for most people after dental filling.

Can your ear go numb after wisdom teeth removal?

Ear numbness is a rare condition after teeth removal.

If you are experiencing this issue, you should consult with an ENT specialist.

Don't delay, or the condition may worsen.

The Bottom Line

Numbness is not a thing to worry about after tooth extraction.

You can quickly get rid of it using some of the simple tips we mentioned above.

All in all, we hope you've found the answer you were searching for and learned a few new things about your problem along the way.

Four Benefits Of Artificial Intelligence And Machine Learning In Banking – CIO Applications

Artificial intelligence in banking helps evaluate vast amounts of client information, down to a user's requests on social networks, to make informed and safe decisions.

Fremont, CA: Artificial intelligence and machine learning in banking offer many opportunities for personalization, data analysis, and task solving, as well as reasonable implementation costs.

The widespread rise in the importance of artificial intelligence and machine learning for banking has strong foundations, as the technologies offer new and useful benefits.

Here are four benefits of artificial intelligence and machine learning in banking:

A Cutting Edge Advantage:

Machine learning in banking has the capability to make its users more competitive, depending on the task they want to solve.

Advanced Data Analysis:

Banks used to evaluate data with limited access to information: when a client came with a request for a loan, the decision was made based only on the client's statement of income, current assets and liabilities, and credit history. Today, artificial intelligence in banking helps evaluate vast amounts of client information, down to a user's requests on social networks, to make informed and safe decisions.

Better Security:

Artificial intelligence in banking can be implemented in various ways to achieve higher security. Credit card fraud detection using machine learning has become a common application of the technology, and innovative cameras with face recognition can identify whether a client has wrong intentions by judging their facial expressions.
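
To make the fraud-detection use case concrete, here is a minimal, illustrative Python sketch (not drawn from the article); the transaction features and data are invented, and a real system would be trained on labeled historical transactions rather than a toy distribution.

```python
# Illustrative anomaly-based card fraud detection (invented features and data).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Toy transaction features: [amount_usd, hour_of_day, distance_from_home_km]
normal = rng.normal(loc=[40, 14, 5], scale=[20, 4, 3], size=(500, 3))
suspicious = np.array([[2500, 3, 900], [1800, 4, 1200]])  # unusual patterns

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# predict() returns -1 for transactions flagged as anomalous, 1 for normal ones
print(model.predict(suspicious))   # expected: [-1 -1]
print(model.predict(normal[:3]))   # mostly 1s
```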

Costs Cut:

Artificial intelligence and machine learning can help cut costs for banks and financial institutions based on how these technologies are used. Integrating robo-advisors in the support team can help reduce the cost of staff maintenance.

See Also:

Top Banking Technology Solution Companies

Top Banking Technology Consulting/Service Companies

The rest is here:
Four Benefits Of Artificial Intelligence And Machine Learning In Banking - CIO Applications

National Grid sees machine learning as the brains behind the utility business of the future – TechCrunch

If the portfolio of a corporate venture capital firm can be taken as a signal for the strategic priorities of their parent companies, then National Grid has high hopes for automation as the future of the utility industry.

The heavy emphasis on automation and machine learning from one of the nation's largest privately held utilities, with a customer base numbering around 20 million people, is significant, and a sign of where the industry could be going.

Since its launch, National Grid's venture firm, National Grid Partners, has invested in 16 startups that featured machine learning at the core of their pitch. Most recently, the company backed AI Dash, which uses machine learning algorithms to analyze satellite images and infer the encroachment of vegetation on National Grid power lines to avoid outages.

Another recent investment, Aperio, uses data from sensors monitoring critical infrastructure to predict loss of data quality from degradation or cyberattacks.

Indeed, of the $175 million in investments the firm has made, roughly $135 million has been committed to companies leveraging machine learning for their services.

"AI will be critical for the energy industry to achieve aggressive decarbonization and decentralization goals," said Lisa Lambert, the chief technology and innovation officer at National Grid and the founder and president of National Grid Partners.

National Grid started the year off slowly because of the COVID-19 epidemic, but the pace of its investments picked up and the company is on track to hit its investment targets for the year, Lambert said.

Modernization is critical for an industry that still mostly runs on spreadsheets and collective knowledge that has locked in an aging employee base, with no contingency plans in the event of retirement, Lambert said. It's that situation that's compelling National Grid and other utilities to automate more of their business.

"Most companies in the utility sector are trying to automate now for efficiency reasons and cost reasons. Today, most companies have everything written down in manuals; as an industry, we basically still run our networks off spreadsheets, and the skills and experience of the people who run the networks. So we've got serious issues if those people retire. Automating [and] digitizing is top of mind for all the utilities we've talked to in the Next Grid Alliance."

To date, a lot of the automation work that's been done has been around basic automation of business processes. But there are new capabilities on the horizon that will push the automation of different activities up the value chain, Lambert said.

"ML is the next level: predictive maintenance of your assets, delivering for the customer. Uniphore, for example: you're learning from every interaction you have with your customer, incorporating that into the algorithm, and the next time you meet a customer, you're going to do better. So that's the next generation," Lambert said. "Once everything is digital, you're learning from those engagements, whether engaging an asset or a human being."

Lambert sees another source of demand for new machine learning tech in the need for utilities to rapidly decarbonize. The move away from fossil fuels will necessitate entirely new ways of operating and managing a power grid. One where humans are less likely to be in the loop.

"In the next five years, utilities have to get automation and analytics right if they're going to have any chance at a net-zero world; you're going to need to run those assets differently," said Lambert. "Windmills and solar panels are not [part of] traditional distribution networks. A lot of traditional engineers probably don't think about the need to innovate, because they're building out the engineering technology that was relevant when assets were built decades ago, whereas all these renewable assets have been built in the era of OT/IT."

Follow this link:
National Grid sees machine learning as the brains behind the utility business of the future - TechCrunch

An introduction to data science and machine learning with Microsoft Excel – TechTalks

This article is part of AI education, a series of posts that review and explore educational content on data science and machine learning. (In partnership with Paperspace)

Machine learning and deep learning have become an important part of many applications we use every day. There are few domains that the fast expansion of machine learning hasn't touched. Many businesses have thrived by developing the right strategy to integrate machine learning algorithms into their operations and processes. Others have lost ground to competitors after ignoring the undeniable advances in artificial intelligence.

But mastering machine learning is a difficult process. You need to start with a solid knowledge of linear algebra and calculus, master a programming language such as Python, and become proficient with data science and machine learning libraries such as Numpy, Scikit-learn, TensorFlow, and PyTorch.

And if you want to create machine learning systems that integrate and scale, you'll have to learn cloud platforms such as Amazon AWS, Microsoft Azure, and Google Cloud.

Naturally, not everyone needs to become a machine learning engineer. But almost everyone who is running a business or organization that systematically collects and processes data can benefit from some knowledge of data science and machine learning. Fortunately, there are several courses that provide a high-level overview of machine learning and deep learning without going too deep into math and coding.

But in my experience, a good understanding of data science and machine learning requires some hands-on experience with algorithms. In this regard, a very valuable and often-overlooked tool is Microsoft Excel.

To most people, MS Excel is a spreadsheet application that stores data in tabular format and performs very basic mathematical operations. But in reality, Excel is a powerful computation tool that can solve complicated problems. Excel also has many features that allow you to create machine learning models directly in your workbooks.

While I've been using Excel's mathematical tools for years, I didn't come to appreciate its use for learning and applying data science and machine learning until I picked up Learn Data Mining Through Excel: A Step-by-Step Approach for Understanding Machine Learning Methods by Hong Zhou.

Learn Data Mining Through Excel takes you through the basics of machine learning step by step and shows how you can implement many algorithms using basic Excel functions and a few of the application's advanced tools.

While Excel will in no way replace Python machine learning, it is a great window into learning the basics of AI and solving many basic problems without writing a line of code.

Linear regression is a simple machine learning algorithm that has many uses for analyzing data and predicting outcomes. Linear regression is especially useful when your data is neatly arranged in tabular format. Excel has several features that enable you to create regression models from tabular data in your spreadsheets.

One of the most intuitive is the data chart tool, which is a powerful data visualization feature. For instance, the scatter plot chart displays the values of your data on a Cartesian plane. But in addition to showing the distribution of your data, Excel's chart tool can create a machine learning model that can predict the changes in the values of your data. The feature, called Trendline, creates a regression model from your data. You can set the trendline to one of several regression algorithms, including linear, polynomial, logarithmic, and exponential. You can also configure the chart to display the parameters of your machine learning model, which you can use to predict the outcome of new observations.
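
For readers who want to see the same idea in code, here is a small Python sketch, not taken from the book, that fits a linear trendline to invented data and uses the fitted parameters to predict a new value, much as Excel's Trendline feature does.

```python
# Fit a linear "trendline" to toy data, analogous to Excel's chart Trendline.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])    # roughly y = 2x

slope, intercept = np.polyfit(x, y, deg=1)  # degree 1 = linear regression
print(f"y = {slope:.2f}x + {intercept:.2f}")

# Use the fitted parameters to predict the outcome of a new observation
x_new = 6.0
print(slope * x_new + intercept)
```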

You can add several trendlines to the same chart. This makes it easy to quickly test and compare the performance of different machine learning models on your data.

In addition to exploring the chart tool, Learn Data Mining Through Excel takes you through several other procedures that can help develop more advanced regression models. These include formulas such as LINEST and LINREG, which calculate the parameters of your machine learning models based on your training data.

The author also takes you through the step-by-step creation of linear regression models using Excel's basic formulas such as SUM and SUMPRODUCT. This is a recurring theme in the book: You'll see the mathematical formula of a machine learning model, learn the basic reasoning behind it, and create it step by step by combining values and formulas in several cells and cell arrays.

While this might not be the most efficient way to do production-level data science work, it is certainly a very good way to learn the workings of machine learning algorithms.
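
For comparison, here is a rough Python translation of that cell-by-cell arithmetic, a sketch of the same least-squares calculation rather than the book's actual workbook.

```python
# The least-squares arithmetic a workbook builds from SUM/SUMPRODUCT cells.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

x_mean, y_mean = x.mean(), y.mean()

# Equivalent of a SUMPRODUCT of deviations divided by a SUMPRODUCT of
# squared x deviations, followed by the intercept formula
slope = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
intercept = y_mean - slope * x_mean

print(slope, intercept)  # matches the chart Trendline / np.polyfit result
```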

Beyond regression models, you can use Excel for other machine learning algorithms. Learn Data Mining Through Excel provides a rich roster of supervised and unsupervised machine learning algorithms, including k-means clustering, k-nearest neighbor, naïve Bayes classification, and decision trees.

The process can get a bit convoluted at times, but if you stay on track, the logic will easily fall into place. For instance, in the k-means clustering chapter, you'll get to use a vast array of Excel formulas and features (INDEX, IF, AVERAGEIF, ADDRESS, and many others) across several worksheets to calculate cluster centers and refine them. This is not a very efficient way to do clustering, but you'll be able to track and study your clusters as they become refined in every consecutive sheet. From an educational standpoint, the experience is very different from programming books, where you provide a machine learning library function with your data points and it outputs the clusters and their properties.
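
To make the refinement loop concrete, here is a compact, illustrative k-means sketch in Python; each pass through the loop plays the role of one of those consecutive worksheets. The data and parameters are invented for the example.

```python
# Bare-bones k-means: each loop iteration mirrors one "refinement" worksheet.
import numpy as np

def kmeans(points, k, n_iter=10, seed=0):
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign every point to its nearest center
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each center as the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return centers, labels

rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
centers, labels = kmeans(data, k=2)
print(centers)  # two centers, near (0, 0) and (5, 5)
```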

In the decision tree chapter, you will go through the process of calculating entropy and selecting features for each branch of your machine learning model. Again, the process is slow and manual, but seeing under the hood of the machine learning algorithm is a rewarding experience.
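
The entropy formula at the heart of that chapter is compact enough to show directly; this is a generic sketch of the standard calculation, not the book's worksheet.

```python
# Shannon entropy of a label column, the quantity behind decision-tree splits.
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

print(entropy(["yes", "yes", "no", "no"]))    # 1.0: a maximally mixed node
print(entropy(["yes", "yes", "yes", "yes"]))  # 0.0: a pure node (may print -0.0)
```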

In many of the book's chapters, you'll use the Solver tool to minimize your loss function. This is where you'll see the limits of Excel, because even a simple model with a dozen parameters can slow your computer down to a crawl, especially if your data sample is several hundred rows in size. But the Solver is an especially powerful tool when you want to fine-tune the parameters of your machine learning model.
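
Outside Excel, the Solver's role is usually played by a numerical optimizer. As an illustrative sketch (assuming SciPy and toy data), the same pattern of tuning parameters to minimize a loss function looks like this:

```python
# Tuning two parameters to minimize a squared-error loss, as Excel's Solver would.
import numpy as np
from scipy.optimize import minimize

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

def loss(params):
    slope, intercept = params
    return np.sum((slope * x + intercept - y) ** 2)

result = minimize(loss, x0=[0.0, 0.0])  # start from an arbitrary guess
print(result.x)  # converges near the least-squares slope and intercept
```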

Learn Data Mining Through Excel shows that Excel can even run advanced machine learning algorithms. There's a chapter that delves into the meticulous creation of deep learning models. First, you'll create a single-layer artificial neural network with fewer than a dozen parameters. Then you'll expand on the concept to create a deep learning model with hidden layers. The computation is very slow and inefficient, but it works, and the components are the same: cell values, formulas, and the powerful Solver tool.
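
To give a sense of scale, a model in that parameter range can be written out in a few lines of Python. This sketch is my own illustration, not the book's spreadsheet: it trains a single sigmoid neuron, just three parameters, on the AND function.

```python
# One sigmoid neuron (three parameters) learning the AND function.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 0.0, 0.0, 1.0])

w, b = np.zeros(2), 0.0
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for _ in range(5000):
    out = sigmoid(X @ w + b)   # forward pass over all four inputs
    grad = out - y             # cross-entropy gradient through the sigmoid
    w -= 0.5 * X.T @ grad      # gradient-descent updates
    b -= 0.5 * grad.sum()

print(out.round(2))  # approaches [0, 0, 0, 1]
```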

In the last chapter, you'll create a rudimentary natural language processing (NLP) application, using Excel to build a sentiment analysis machine learning model. You'll use formulas to create a bag-of-words model, preprocess and tokenize hotel reviews, and classify them based on the density of positive and negative keywords. In the process, you'll learn quite a bit about how contemporary AI deals with language and how different it is from the way we humans process written and spoken language.
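
The keyword-density idea is simple enough to sketch in a few lines of Python; the word lists and reviews below are invented for illustration, and real preprocessing would be considerably more careful.

```python
# Toy sentiment scoring by positive/negative keyword density (bag-of-words style).
POSITIVE = {"clean", "friendly", "great", "comfortable", "excellent"}
NEGATIVE = {"dirty", "rude", "noisy", "broken", "terrible"}

def sentiment(review: str) -> str:
    tokens = [t.strip(".,!?") for t in review.lower().split()]  # crude tokenizer
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return "positive" if pos >= neg else "negative"

print(sentiment("Great location, friendly staff, very clean rooms!"))  # positive
print(sentiment("The room was dirty and the heater was broken."))      # negative
```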

Whether you're making C-level decisions at your company, working in human resources, or managing supply chains and manufacturing facilities, a basic knowledge of machine learning will be important if you'll be working with data scientists and AI people. Likewise, if you're a reporter covering AI news or a PR agency working on behalf of a company that uses machine learning, writing about the technology without knowing how it works is a bad idea (I will write a separate post about the many awful AI pitches I receive every day). In my opinion, Learn Data Mining Through Excel is a smooth and quick read that will help you gain that important knowledge.

Beyond learning the basics, Excel can be a powerful addition to your repertoire of machine learning tools. While it's not good for dealing with big data sets and complicated algorithms, it can help with the visualization and analysis of smaller batches of data. The results you obtain from a quick round of data mining in Excel can provide pertinent insights for choosing the right direction and machine learning algorithm to tackle the problem at hand.

Visit link:
An introduction to data science and machine learning with Microsoft Excel - TechTalks