The technical realities of functional quantum computers – is Google's ten-year plan for Quantum Computing viable? – Diginomica

In March, I explored the enterprise readiness of quantum computing in "Quantum computing is right around the corner, but cooling is a problem. What are the options?" I also detailed potential industry use cases, from supply chain to banking and finance. But what are the industry giants pursuing?

Recently, I listened to two somewhat different perspectives on quantum computing. One is Google's (public) ten-year plan.

Google plans to search for commercially viable applications in the short term, but they don't think there will be many for another ten years - a time frame I've heard referred to as "bound but loose." What that meant was: no more than ten years, maybe sooner. In the industry, the term for the current state of the art is NISQ: Noisy Intermediate-Scale Quantum computing.

The largest quantum computers are in the 50-70 qubit range, and Google feels NISQ has a ceiling of maybe two hundred. The "noisy" part of NISQ is because the qubits need to interact and be nearby. That generates noise. The more qubits, the more noise, and the more challenging it is to control the noise.
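To see why noise caps the useful size of NISQ machines, a rough back-of-the-envelope sketch helps. Assuming, purely for illustration (this is not Google's actual error model), that every gate fails independently with probability p, the chance that a circuit finishes without a single error shrinks exponentially with the gate count:

```python
# Rough illustration: probability that a circuit runs error-free, assuming each
# gate fails independently with probability p. A deliberate simplification;
# real devices also suffer correlated errors, readout errors and decoherence.

def error_free_probability(num_gates: int, gate_error: float) -> float:
    return (1.0 - gate_error) ** num_gates

for qubits in (50, 200, 1000):
    # Suppose circuit depth roughly matches the qubit count, so the gate count
    # grows roughly quadratically with the number of qubits (illustrative only).
    gates = qubits * qubits
    p_ok = error_free_probability(gates, gate_error=1e-3)  # 0.1% error per gate
    print(f"{qubits:>5} qubits, ~{gates} gates -> P(no error) = {p_ok:.3g}")
```

Even with an optimistic 0.1% error per gate, anything much beyond a couple of hundred qubits yields essentially no error-free runs, which is one way to read the "ceiling of maybe two hundred" above.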

But Google suggests the real unsolved problems in fields like optimization, materials science, chemistry, drug discovery, finance, and electronics will take machines with thousands of qubits, and Google even envisions one million on a planar array etched in aluminum. Major problems need solving first, such as noise elimination, coherence, and lifetime (a qubit holds its state for only a tiny slice of time).

In the meantime, Google is seeking customers to work with its researchers to find applications. Quantum computing needs algorithms as much as it needs qubits. It requires customers with a strong in-house science team and a commitment of three years. Whatever is discovered will be published as open source.

In summary, Google does not see commercial value in NISQ. It is using NISQ to discover what quantum computing can do that has commercial potential.

First of all, if you have a picture in your mind of a quantum computer, chances are you are not including an essential element: a conventional computer. According to Quantum Computing: Progress and Prospects:

Although reports in the popular press tend to focus on the development of qubits and the number of qubits in the current prototypical quantum computing chip, any quantum computer requires an integrated hardware approach using significant conventional hardware to enable qubits to be controlled, programmed, and read out.

The author is undoubtedly correct. Most material about quantum computers never mentions this, and it raises quite a few issues that can potentially dilute the gee-whiz aspect. I'd heard this first from Itamar Sivan, Ph.D., CEO of Quantum Machines. He followed with the quip that, technically, quantum computers aren't computers. It's that simple. They are not Turing Machines. File this under the category of "You're Not Too Old to Learn Something New."

From (Hindi) Theory of Computation - Turing Machine:

A Turing machine is a mathematical model of computation that defines an abstract machine, which manipulates symbols on a strip of tape according to a table of rules. Despite the model's simplicity, given any computer algorithm, a Turing machine capable of simulating that algorithm's logic can be constructed.

Dr. Sivan clarified this as follows:

Any computer to ever be used, from the early-days computers, to massive HPCs, are all Turing-machines, and are therefore equivalent to one another. All computers developed and manufactured in the last decades, are all merely bigger and more compact variations of one another. A quantum computer however is not MERELY a more advanced Turing machine, it is a different type of machine, and classical Turing machines are not equivalent to quantum computers as they are equivalent to one another.

Therefore, the complexity of running particular algorithms on quantum computers is different from the complexity of running them on classical machines. Just to make it clear, a quantum computer can be degenerated to behave like a classical computer, but NOT vice-versa.

There is a lot more to this concept, but most computers you've ever seen or heard of are Turing Machines; quantum computers are not. This should come as no surprise: everything about quantum mechanics is weird and counter-intuitive, so why would a quantum computer be any different?
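To make the abstraction concrete, here is a minimal sketch of a Turing machine simulator. The toy machine below, which simply flips the bits of its input, is my own illustration rather than anything from the article:

```python
# A minimal Turing machine: a tape of symbols, a read/write head, a current
# state, and a rule table mapping (state, symbol) -> (new symbol, move, new state).

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))      # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Toy machine: walk right, flipping 0 <-> 1, and halt at the first blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine(rules, "010011"))  # prints 101100_
```

Every classical computer, however large, is equivalent to a machine of this kind; Sivan's point is that a quantum computer is not.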

According to Sivan, a quantum computer needs three elements to perform: the quantum processor itself, plus an orchestration platform of (conventional) hardware and software. There is no software in a quantum computer. The platform manages the progress of the algorithm, mostly through pulses of laser beams. The logic needed to operate the quantum computer resides with, and is controlled by, the orchestration platform.
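Purely to illustrate that division of labor, here is a sketch of what the classical side of such a system might look like: compile abstract gates into control pulses, fire them at the hardware, and read the results back. Every class, method and calibration number below is hypothetical; this is not Quantum Machines' actual API, just a picture of where the logic lives.

```python
# Hypothetical sketch of a classical orchestration loop for a quantum processor.
# None of these names correspond to a real vendor API; they only illustrate that
# the control logic lives entirely in conventional hardware and software.

from dataclasses import dataclass

@dataclass
class Pulse:
    target_qubit: int
    frequency_ghz: float
    duration_ns: float
    phase: float

def compile_gate(gate: str, qubit: int) -> Pulse:
    # Map an abstract gate to a control pulse (made-up calibration values).
    calibration = {"X": (5.1, 20.0, 0.0), "H": (5.1, 10.0, 1.57)}
    freq, dur, phase = calibration[gate]
    return Pulse(qubit, freq, dur, phase)

class FakeHardware:
    """Stand-in for the control electronics plus the quantum processor."""
    def play(self, pulse: Pulse) -> None:
        print(f"sending {pulse}")
    def measure_all(self):
        return [0, 1]   # dummy read-out

def run_circuit(gates, hardware):
    for gate, qubit in gates:
        hardware.play(compile_gate(gate, qubit))   # classical electronics drive the qubits
    return hardware.measure_all()                  # read-out is digitized classically

print(run_circuit([("H", 0), ("X", 1)], FakeHardware()))
```

The quantum processor itself stores no program; everything above the pulses is conventional computing.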

The crucial difference between Google's and Quantum Machines' strategies is that Google views the current NISQ state of affairs as a testbed for finding algorithms and applications for future development, while Sivan and his company have produced an orchestration platform to put the current technology into play. Their platform is quantum-computer agnostic: it can operate with any of them. Sivan feels the number of qubits is just part of the equation. According to Dr. Sivan:

While today's most advanced quantum computers only have a relatively small number of available qubits (53 for IBM's latest generation and 54 for Google's Sycamore processor), we cannot maximize the potential of even this relatively small count. We are leaving a lot on the table with regards to what we can already accomplish with the computing power we already have. While we should continue to scale up the number of qubits, we also need to focus on maximizing what we already have.

I've asked a few quantum computer scientists if quantum computers can solve the Halting Problem. From Wikipedia:

The halting problem is determining, from a description of an arbitrary computer program and an input, whether the program will finish running, or continue to run forever. Alan Turing proved in 1936 that a general algorithm to solve the halting problem for all possible program-input pairs could not exist.

That puts it in a class of problems that are undecidable. Oddly, opinion was split on the question, despite Turing's proof. Like Simplicio said to Galileo in Dialogues Concerning Two New Sciences, "If Aristotle had not said otherwise I would have believed it."
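Turing's argument can be sketched in a few lines: assume a general halts() oracle existed, then build a program that does the opposite of whatever the oracle predicts about it, and a contradiction follows. The sketch below is the standard diagonal argument, not a runnable decision procedure, precisely because halts() cannot be implemented:

```python
# The classic contradiction behind the halting problem.

def halts(program, argument) -> bool:
    """Hypothetical oracle: True iff program(argument) eventually halts."""
    raise NotImplementedError("Turing showed this cannot exist in general")

def troublemaker(program):
    # Do the opposite of whatever the oracle predicts about a program fed to itself.
    if halts(program, program):
        while True:       # loop forever if the oracle says "halts"
            pass
    return "done"         # halt if the oracle says "loops forever"

# Now ask: does troublemaker(troublemaker) halt?
# If halts() answers yes, troublemaker loops forever; if it answers no, it halts.
# Either answer is wrong, so no general halts() can exist.
```

Since a quantum computer computes the same set of functions as a classical one (only the complexity changes, as Sivan notes above), the argument applies to it just as well.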

There are so many undecidable problems in math that I wondered if some of these might fall. For example, straight from current AI problems, planning in a partially observable Markov decision process is considered undecidable. A million qubits? Maybe not. After all, Dr. Sivan pointed out that replicating the information in just a 300-qubit quantum processor on a classical machine would require more transistors than there are atoms in the universe.
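The arithmetic behind that last claim is easy to check: an n-qubit state is described by 2^n complex amplitudes, and 2^300 already dwarfs the commonly cited estimate of roughly 10^80 atoms in the observable universe.

```python
# Amplitudes needed to describe an n-qubit state classically: 2**n.
amplitudes_300_qubits = 2 ** 300
atoms_in_universe_estimate = 10 ** 80   # common order-of-magnitude estimate

print(f"2^300 = {amplitudes_300_qubits:.3e}")   # about 2.04e+90
print(f"ratio = {amplitudes_300_qubits / atoms_in_universe_estimate:.1e}")  # tens of billions
```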

I've always believed that actions speak louder than words. While Google is taking the long view, Quantum Machines provides the platform to see how far we can go with current technology. Google's tactics are familiar. Every time you use TensorFlow, it gets better. Every time you ride in their autonomous car, it gets better. Their collaboration with a dozen or so technically advanced companies makes their quantum technology better.


What’s New in HPC Research: Hermione, Thermal Neutrons, Certifications & More – HPCwire

In this bimonthly feature, HPCwire highlights newly published research in the high-performance computing community and related domains. From parallel programming to exascale to quantum computing, the details are here.

Developing a performance model-based predictor for parallel applications on the cloud

As cloud computing becomes an increasingly viable alternative to on-premises HPC, researchers are turning their eyes to addressing latency and unreliability issues in cloud HPC environments. These researchers (a duo from the Egypt-Japan University of Science and Technology and Benha University) propose a predictor for the execution time of MPI-based cloud HPC applications, finding 88% accuracy on ten benchmarks.

Authors: Abdallah Saad and Ahmed El-Mahdy.

Investigating portability, performance and maintenance tradeoffs in exascale systems

As the exascale era swiftly approaches, researchers are increasingly grappling with the difficult tradeoffs between major system priorities that will be demanded by such massive systems. These researchers (a team from the University of Macedonia) explore these tradeoffs through a case study measuring the effect of runtime optimizations on code maintainability.

Authors: Elvira-Maria Arvanitou, Apostolos Ampatzoglou, Nikolaos Nikolaidis, Aggeliki-Agathi Tzintzira, Areti Ampatzoglou and Alexander Chatzigeorgiou.

Moving toward a globally acknowledged HPC certification

Skillsets are incredibly important in the HPC world, but certification is far from uniform. This paper, written by a team from four universities in the UK and Germany, describes the HPC Certification Forum: an effort to categorize, define and examine competencies expected from proficient HPC practitioners. The authors describe the first two years of the community-led forum and outline plans for the first officially supported certificate in the second half of 2020.

Authors: Julian Kunkel, Weronika Filinger, Christian Meesters and Anja Gerbes.

Uncovering the hidden cityscape of ancient Hermione with HPC

In this paper, a team of researchers from the Digital Archaeology Laboratory at Lund University describe how they used a combination of HPC and integrated digital methods to uncover the ancient cityscape of Hermione, Greece. Using drones, laser scanning and modeling techniques, they fed their inputs into an HPC system, where they rendered a fully 3D representation of the city's landscape.

Authors: Giacomo Landeschi, Stefan Lindgren, Henrik Gerding, Alcestis Papadimitriou and Jenny Wallensten.

Examining thermal neutrons threat to supercomputers

Off-the-shelf devices are performant, efficient and cheap, making them popular choices for HPC and other compute-intensive fields. However, the cheap boron used in these devices makes them susceptible to thermal neutrons, which these authors (a team from Brazil, the UK and Los Alamos National Laboratory) contend pose a serious threat to those devices' reliability. The authors examine RAM, GPUs, accelerators, an FPGA and more, tinkering with variables that affect the thermal neutron flux and measuring the threat posed by the neutrons under various conditions.

Authors: Daniel Oliveira, Sean Blanchard, Nathan DeBardeleben, Fernando Fernandes dos Santos, Gabriel Piscoya Dávila, Philippe Navaux, Andrea Favalli, Opale Schappert, Stephen Wender, Carlo Cazzaniga, Christopher Frost and Paolo Rech.

Deploying scientific AI networks at petaflop scale on HPC systems with containers

The computational demands of AI and ML systems are rapidly increasing in the scientific research sphere. These authors (a duo from LRZ and CERN) discuss the complications surrounding the deployment of ML frameworks on large-scale, secure HPC systems. They highlight a case study deployment of a convolutional neural network with petaflop performance on an HPC system.

Authors: David Brayford and Sofia Vallecorsa.

Running a high-performance simulation of a spiking neural network on GPUs

Spiking neural networks (SNNs) are the most commonly used computational model for neuroscience and neuromorphic computing, but simulations of SNNs on GPUs have imperfectly represented the networks, leading to performance and behavior shortfalls. These authors from Tsinghua University propose a series of technical approaches to more accurately representing SNNs on GPUs, including a code generation framework for high-performance simulations.

Authors: Peng Qu, Youhui Zhang, Xiang Fei and Weimin Zheng.

Do you know about research that should be included in next month's list? If so, send us an email at [email protected]. We look forward to hearing from you.


The future of quantum computing is Azure bright and you can try it – The American Genius

As time goes on, the value of efficiency and convenience becomes more and more important. We've seen this in many examples, from talk-to-text to ordering food directly to your door without ever even speaking to another human.

Now coming into the convenience game is a keyboard that allows you to scan instead of type. Anyline is the new keyboard that instantly collects data with the snap of a camera.

Scan ID information, serial numbers, vouchers, IBANs, and barcodes in an instant with your smartphone, as it is compatible with Android and iOS. The app also allows you to scan things such as gift card barcodes, phone numbers you see on street advertisements, and more; so, in a sense, it brings CTRL + C to real life.

With your smartphone, you can instantly collect data with the scan function on your keyboard. The platform is compatible with messenger, email, and browser apps. You scan the data and instantly paste it where you want it, saving the time of manual data entry.

This would be useful for scanning things to your notes section that you may refer to often, like your health insurance ID number, your WiFi router information, credit card info and whatnot. As with anything else like this, the concern of privacy is always there, so make sure you're doing what you can to protect your information (using a passcode and/or Face ID, not using shared/public networks, etc.). While you should know it by heart, I would recommend never scanning your social security number.

However, something like this does save a lot of time, as it doesn't involve mistyping; it picks up a barcode accurately. Also, you won't need someone reading something back to you so you can accurately type it into your phone.

This could be a simple way to save time and become a more efficient person in general, and it makes it easier to share information with others. This is also super helpful for people who have trouble reading the teeny tiny type that barcodes are often displayed in.

Comment your thoughts below, and share any tips you use to help further your efficiency!


European quantum computing startup takes its funding to €32M with fresh raise – TechCrunch

IQM Finland Oy (IQM), a European startup which makes hardware for quantum computers, has raised a €15M equity investment round from the EIC Accelerator program for the development of quantum computers. This is in addition to a raise of €3.3M from the Business Finland government agency. This takes the company's funding to over €32M. The company previously raised an €11.4M seed round.

IQM has hired a lot of engineers in its short life, and now says it plans to hire one quantum engineer per week on the pathway to commercializing its technology through the collaborative design of quantum-computing hardware and applications.

"Quantum computers will be funded by European governments, supporting IQM's expansion strategy to build quantum computers in Germany," said Dr. Jan Goetz, CEO and co-founder of IQM, in a statement.

The news comes as the Finnish government announced only last week that it would acquire a quantum computer, with €20.7M earmarked for the Finnish state research centre VTT.

"It has been a mind-blowing forty-million past week for quantum computers in Finland. IQM staff is excited to work together with VTT, Aalto University, and CSC in this ecosystem," rejoices Prof. Mikko Möttönen, Chief Scientist and co-founder of IQM.

Previously, the German government said it would put €2 billion into commissioning at least two quantum computers.

IQM thus now plans to expand its operations in Germany via its team in Munich.

"IQM will build co-design quantum computers for commercial applications and install testing facilities for quantum processors," said Prof. Enrique Solano, CEO of IQM Germany.

The company is focusing on superconducting quantum processors, which are streamlined for commercial applications in a co-design approach. This works by providing the full hardware stack for a quantum computer, integrating different technologies, and then inviting collaborations with quantum software companies.

IQM was one of the 72 companies to succeed in the EIC's selection process. Altogether, 3,969 companies applied for this funding.


The cost of training machines is becoming a problem – The Economist

Jun 11th 2020

THE FUNDAMENTAL assumption of the computing industry is that number-crunching gets cheaper all the time. Moore's law, the industry's master metronome, predicts that the number of components that can be squeezed onto a microchip of a given size (and thus, loosely, the amount of computational power available at a given cost) doubles every two years.

"For many comparatively simple AI applications, that means that the cost of training a computer is falling," says Christopher Manning, the director of Stanford University's AI Lab. But that is not true everywhere. A combination of ballooning complexity and competition means costs at the cutting edge are rising sharply.

Dr Manning gives the example of BERT, an AI language model built by Google in 2018 and used in the firm's search engine. It had more than 350m internal parameters and a prodigious appetite for data. It was trained using 3.3bn words of text culled mostly from Wikipedia, an online encyclopedia. These days, says Dr Manning, Wikipedia is not such a large data-set. "If you can train a system on 30bn words it's going to perform better than one trained on 3bn." And more data means more computing power to crunch it all.

OpenAI, a research firm based in California, says demand for processing power took off in 2012, as excitement around machine learning was starting to build. It has accelerated sharply since. By 2018, the computer power used to train big models had risen 300,000-fold, and was doubling every three and a half months. It should know: to train its own OpenAI Five system, designed to beat humans at Defense of the Ancients 2, a popular video game, it scaled machine learning to unprecedented levels, running thousands of chips non-stop for more than ten months.
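Those two figures are consistent with each other, as a quick check shows: a 300,000-fold increase is about 18 doublings, and 18 doublings at roughly three and a half months each spans a little over five years, which matches the window the article describes.

```python
import math

growth = 300_000                    # reported increase in training compute
doublings = math.log2(growth)       # about 18.2 doublings
months_per_doubling = 3.5
years = doublings * months_per_doubling / 12
print(f"{doublings:.1f} doublings at {months_per_doubling} months each = {years:.1f} years")
```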

Exact figures on how much this all costs are scarce. But a paper published in 2019 by researchers at the University of Massachusetts Amherst estimated that training one version of Transformer, another big language model, could cost as much as $3m. Jerome Pesenti, Facebook's head of AI, says that one round of training for the biggest models can cost millions of dollars in electricity consumption.

Facebook, which turned a profit of $18.5bn in 2019, can afford those bills. Those less flush with cash are feeling the pinch. Andreessen Horowitz, an influential American venture-capital firm, has pointed out that many AI startups rent their processing power from cloud-computing firms like Amazon and Microsoft. The resulting bills (sometimes 25% of revenue or more) are one reason, it says, that AI startups may make for less attractive investments than old-style software companies. In March Dr Manning's colleagues at Stanford, including Fei-Fei Li, an AI luminary, launched the National Research Cloud, a cloud-computing initiative to help American AI researchers keep up with spiralling bills.

The growing demand for computing power has fuelled a boom in chip design and specialised devices that can perform the calculations used in AI efficiently. The first wave of specialist chips were graphics processing units (GPUs), designed in the 1990s to boost video-game graphics. As luck would have it, GPUs are also fairly well-suited to the sort of mathematics found in AI.

Further specialisation is possible, and companies are piling in to provide it. In December, Intel, a giant chipmaker, bought Habana Labs, an Israeli firm, for $2bn. Graphcore, a British firm founded in 2016, was valued at $2bn in 2019. Incumbents such as Nvidia, the biggest GPU-maker, have reworked their designs to accommodate AI. Google has designed its own tensor-processing unit (TPU) chips in-house. Baidu, a Chinese tech giant, has done the same with its own Kunlun chips. Alfonso Marone at KPMG reckons the market for specialised AI chips is already worth around $10bn, and could reach $80bn by 2025.

"Computer architectures need to follow the structure of the data they're processing," says Nigel Toon, one of Graphcore's co-founders. The most basic feature of AI workloads is that they are embarrassingly parallel, which means they can be cut into thousands of chunks which can all be worked on at the same time. Graphcore's chips, for instance, have more than 1,200 individual number-crunching cores, and can be linked together to provide still more power. Cerebras, a Californian startup, has taken an extreme approach. Chips are usually made in batches, with dozens or hundreds etched onto standard silicon wafers 300mm in diameter. Each of Cerebras's chips takes up an entire wafer by itself. That lets the firm cram 400,000 cores onto each.
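"Embarrassingly parallel" simply means the work splits into independent chunks that need no coordination while they run, which is why thousands of cores can be thrown at it. A minimal sketch in Python, with partial dot products standing in for the independent pieces of a neural-network workload:

```python
from multiprocessing import Pool

def chunk_dot(pair):
    """One independent piece of work: a partial dot product."""
    xs, ys = pair
    return sum(x * y for x, y in zip(xs, ys))

if __name__ == "__main__":
    n, chunk = 1_000_000, 100_000
    a = list(range(n))
    b = list(range(n))
    # Split the vectors into chunks; no chunk needs anything from the others.
    pairs = [(a[i:i + chunk], b[i:i + chunk]) for i in range(0, n, chunk)]
    with Pool() as pool:
        partials = pool.map(chunk_dot, pairs)   # all chunks can run at the same time
    print(sum(partials))                        # combine the independent results
```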

Other optimisations are important, too. Andrew Feldman, one of Cerebras's founders, points out that AI models spend a lot of their time multiplying numbers by zero. Since those calculations always yield zero, each one is unnecessary, and Cerebras's chips are designed to avoid performing them. Unlike many tasks, says Mr Toon at Graphcore, ultra-precise calculations are not needed in AI. That means chip designers can save energy by reducing the fidelity of the numbers their creations are juggling. (Exactly how fuzzy the calculations can get remains an open question.)
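Both tricks are easy to picture in code. The toy dot product below skips multiplications whose operand is zero and rounds the remaining numbers to a coarser precision; whether any particular accelerator implements them exactly this way is a hardware question, but the arithmetic idea is the same:

```python
def sparse_low_precision_dot(weights, activations, precision_bits=8):
    """Toy dot product that (a) skips multiply-by-zero and (b) uses fewer bits."""
    scale = 2 ** precision_bits
    total = 0.0
    for w, a in zip(weights, activations):
        if w == 0.0 or a == 0.0:
            continue                      # a zero operand contributes nothing: skip the work
        w_q = round(w * scale) / scale    # crude quantisation of the fractional part
        a_q = round(a * scale) / scale
        total += w_q * a_q
    return total

weights     = [0.0, 0.5, 0.0, -1.25, 0.0, 2.0]   # sparse: many zeros
activations = [3.0, 4.0, 5.0,  2.0,  1.0, 0.5]
print(sparse_low_precision_dot(weights, activations))  # only 3 of 6 multiplies actually run
```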

All that can add up to big gains. Mr Toon reckons that Graphcore's current chips are anywhere between ten and 50 times more efficient than GPUs. They have already found their way into specialised computers sold by Dell, as well as into Azure, Microsoft's cloud-computing service. Cerebras has delivered equipment to two big American government laboratories.

Moore's law isn't possible any more

Such innovations will be increasingly important, for the AI-fuelled explosion in demand for computer power comes just as Moore's law is running out of steam. Shrinking chips is getting harder, and the benefits of doing so are not what they were. Last year Jensen Huang, Nvidia's founder, opined bluntly that "Moore's law isn't possible any more".

Other researchers are therefore looking at more exotic ideas. One is quantum computing, which uses the counter-intuitive properties of quantum mechanics to provide big speed-ups for some sorts of computation. One way to think about machine learning is as an optimisation problem, in which a computer is trying to make trade-offs between millions of variables to arrive at a solution that minimises as many as possible. A quantum-computing technique called Grover's algorithm offers big potential speed-ups, says Krysta Svore, who leads the Quantum Architectures and Computation Group at Microsoft Research.
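Grover's algorithm gives a quadratic speed-up for unstructured search: finding a marked item among N possibilities takes on the order of the square root of N quantum queries, against roughly N/2 classical lookups on average. A quick comparison of the query counts (pi/4 times the square root of N is the standard Grover iteration count):

```python
import math

def classical_queries(n_items: int) -> float:
    return n_items / 2                          # expected lookups for unstructured search

def grover_queries(n_items: int) -> float:
    return (math.pi / 4) * math.sqrt(n_items)   # standard Grover iteration count

for bits in (20, 30, 40):
    n = 2 ** bits
    print(f"N = 2^{bits}: classical ~ {classical_queries(n):.2e}, "
          f"Grover ~ {grover_queries(n):.2e}")
```

The gap widens as N grows, which is why Grover-style search is interesting for the optimisation-flavoured problems described above; it is a speed-up, not magic.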

Another idea is to take inspiration from biology, which proves that current brute-force approaches are not the only way. Cerebras's chips consume around 15kW when running flat-out, enough to power dozens of houses (an equivalent number of GPUs consumes many times more). A human brain, by contrast, uses about 20W of energy (about a thousandth as much) and is in many ways cleverer than its silicon counterpart. Firms such as Intel and IBM are therefore investigating neuromorphic chips, which contain components designed to mimic more closely the electrical behaviour of the neurons that make up biological brains.

For now, though, all that is far off. Quantum computers are relatively well-understood in theory, but despite billions of dollars in funding from tech giants such as Google, Microsoft and IBM, actually building them remains an engineering challenge. Neuromorphic chips have been built with existing technologies, but their designers are hamstrung by the fact that neuroscientists still do not understand what exactly brains do, or how they do it.

That means that, for the foreseeable future, AI researchers will have to squeeze every drop of performance from existing computing technologies. Mr Toon is bullish, arguing that there are plenty of gains to be had from more specialised hardware and from tweaking existing software to run faster. To quantify the nascent field's progress, he offers an analogy with video games: "We're past Pong," he says. "We're maybe at Pac-Man by now." All those without millions to spend will be hoping he is right.

This article appeared in the Technology Quarterly section of the print edition under the headline "Machine, learning"


Archer looks to commercialisation future with graphene-based biosensor tech – ZDNet

Prototype of a portable, battery-powered biosensing device - a few centimetres in size.

Archer Materials has announced progressing work on its graphene-based biosensor technology.

The Australian company told shareholders on Thursday it has developed a new set of graphene materials that could be applied for enhanced biosensing and to aid in the development of biocompatible inks in water-based solvents.

Archer said doing so could eliminate the use of hazardous and non-biocompatible chemicals, increasing the scope of biomolecules that can be detected.

"There is no doubt that diseases have a devastating effect on economies and there is value in advancing disease diagnosis using simpler, more accurate biosensors," Archer CEO Dr Mohammad Choucair said. "However, there are only a limited number of materials that can perform [biosensing], and they require innovative development."

Archer said laboratory synthesis was complemented with computational chemistry to calculate and visualise the materials candidates at the atom-level for their suitability in biomolecular sensing.

"We have rapidly advanced from raw material feedstock to prototypes of a portable battery-powered sensing device that can incorporate biological material," Choucair said. "This early stage work has the potential to allow much simpler and more effective sensing where early diagnosis of life-threatening diseases can lead to much improved outcomes."

With Australia traditionally not so good at commercialising research and development, Archer touted its graphene-based biotechnology as being at an early stage of commercialisation.

It said it has been working with commercial advisors within the Australian biotech industry to produce a roadmap.

Archer's commercial strategy involves applying the "triple-helix business model" for biotechnology innovation to develop printable graphene-based biosensor componentry and sublicense the associated intellectual property rights.

It's hoping to do this by developing commercial-grade prototypes; pursuing patent applications in Australia, the United States, and Europe; and establishing commercial partnerships.

Last month, Archer announced its plan to raise up to AU$3 million, offering shares at AU$0.60 per share.

The funds raised will be used to increase the pace of Archer's current work programs and to start hiring additional staff to do this work, it said.

Also in May, Archer announced a new agreement with IBM which it hopes will advance quantum computing and progress work towards solutions for the greater adoption of the technology.

Joining the IBM Q Network, Archer will gain access to IBM's quantum computing expertise and resources, seeing the Sydney-based company use IBM's open-source software framework, Qiskit.



Duke’s Labs Are Back in Business, But In a New Way – Duke Today

The herculean efforts to re-start Duke's campus and medical school research laboratories are nearly complete.

Thousands of lab workers, kept from their benches and equipment for months by the COVID pandemic, are shaking off the cobwebs and getting back to work generating data. But with some significant differences.

"I think it'll take me a few weeks to actually get back into the rhythm," said Tatiana Segura, a professor of biomedical engineering who has a large team in two laboratory spaces in Fitzpatrick-CIEMAS.

Segura had asked all of her trainees to review their lab protocols and have a detailed plan for what they should be doing in their newly limited lab hours. But, she notes, "It takes time when you start out, like if you are trying to cook or do something you haven't done for a long time, it still takes you a while to remember how to do it."

Each of the reopened labs has been left to decide the finer details about spacing and timing for itself, in a move campus leadership has called "states' rights." For many labs, that means coordinating through instant messages on Slack and a shared calendar in the cloud. Smaller rooms and shared equipment pose a particular scheduling challenge because of personal spacing requirements.

In all cases, the new safety rules mean fewer people in the lab and a highly structured end to the old free-wheeling, all-hours culture of laboratory work. Now lab workers start each day by recording their temperature and filling out a symptom survey. Their badges give them entry to buildings and elevators that used to be wide open. And they wear masks at all times. When it's time to leave, they have to leave.

"This whole thing has been pretty challenging for us because we're used to working in teams," said research scientist Stephen Crain, who typically works on Jungsang Kim's quantum computing hardware with two grad students in a fourth-floor lab of the Chesterfield Building downtown. "It's normally a pretty collaborative effort where, if we get stuck, we kind of work together. But now we're one person at a time. It's just hard to be as productive."

As an assistant professor of chemistry and mother, Amanda Hargrove said she talks about time management with her trainees all the time. Now they're living it. "If you're working between daycare hours, that eight hours is crazy-efficient," she said. "So I'm a little interested to see how much more efficient people become in the four hours that they can be there."

Hargrove has split her lab into three four-hour shifts from 8 a.m. to 8 p.m. with the mandated hour for cleaning and leaving the lab between shifts. It's all coordinated through an online calendar and Slack messaging so trainees can arrange when and how they want to work, including taking two four-hour shifts if need be.

Third-year graduate student Martina Zafferani of Hargrove's lab in the French Family Science Center prefers to work 10 or 12 hours at a time, much of it standing. Now she's taking two four-hour shifts, with an hour between to go outside, have lunch and get some Vitamin D.

Zafferani works in a lab that typically might hold up to 15 workers, from undergraduates to post-docs, flowing in and out during the day. On one day in early June, it was just her and fifth-year grad student Sarah Wicks, wearing masks with their heads down, trying to restart their experiments on making molecules to control RNA and getting as much out of their limited time as they can.

"It hasn't been completely lonesome," Wicks said. "But I do miss the chatter of other group members being here. We usually have about nine lab members working, we have music playing, equipment humming, and to have such silence now does make it lonelier than it was."

Still, it's great to be back, said first-year master's student Ameya Chaudhari as he returned to work on bio-compatible polymers in Segura's lab. "I had been feeling lethargic at home, but now I'm energized being back in the lab," Chaudhari said. He looked long overdue for a haircut but was wearing one of the lab's sharp, Duke-blue lab coats.

Duke labs that were working on questions related to COVID-19 stayed on duty throughout the shutdown, of course. And others, including Hargrove and dermatology associate professor Amanda MacLeod, are shifting some of their attention to COVID-adjacent questions as they come back.

"It's weird being careful with everyone around," says MSRB-III postdoctoral researcher Paula Mariottoni of MacLeod's group. She did a sort of pausing dance to move from her bench to a tissue culture room as a colleague walked past. "Even if it's a slow pace, moving forward is good," Mariottoni said.

Red tape Xs mark the floors about eight feet apart in each bay of MacLeod's MSRB-III laboratory space, indicating where people should stand to communicate or pass. The elevator is designated for one rider at a time, up only; exiting is by a designated stairway down.

Third-year medical student Vivian Lei was working in the next bay over in the MacLeod group. It was designed for four, but occupied by just her. "My time during lockdown was reexamining data," she said from behind a white cloth mask. "Weirdly enough, this was one of the most productive times in our lab."

The timing of the shutdown in late March hit MacLeod's group perfectly, in fact. They had just completed a move from Duke South to MSRB-III, and anticipating the move would be disruptive, the group did a lot of experiments in January and February to create a hoard of data that kept them busy.

"Our people had run a lot of the wet-lab experiments up front, and now all the analysis had to be done," said MacLeod, an associate professor of dermatology who is studying how the skin helps combat pathogens, including viruses, as the body's first line of defense. During the shutdown, the lab submitted two manuscripts, three grant applications and a few fellowship applications.

Environmental toxicologist Richard DiGiulio's lab in the Levine Science Research Center includes colorful tanks of living fish, so somebody was coming in to feed them the entire time, even though the science stopped.

"We didn't see anybody else, except for the occasional custodial worker, for 2 1/2 months," said DiGiulio lab manager Melissa Chernick. "We just never intersected with anybody, which I thought was good because it made me feel more comfortable that nobody was in the building -- it was less possible contacts."

Now DiGiulio, the Sally Kleberg Distinguished Professor in the Nicholas School of the Environment, is waiting to hear about safety rules for field work. The lab's supply of killifish, collected from a Superfund site on the Elizabeth River in Virginia, is beginning to wane. "To get more from the river, we all pack in a van for four hours up there and four back," Chernick said.

The lab buildings feel different without students studying or hanging out in public areas and hallways. The coffee shops are shuttered. There are no voices, no footfalls, just whooshing air.

"It was very creepy to be alone in French," said Zafferani, who was one of the first people back in the building. Still, it was better than working at home, she said.

"I think the beginning is going to be a little bit slower maybe than everyone hopes," MacLeod said. "It's challenging, but it's doable."

"And even though you have this mask on," she said, "you can still talk and be friendly and kind to everyone."


Top 10 emerging technologies of 2020: Winners and losers – TechRepublic

Artificial intelligence and 5G will drive the technology revolution, according to CompTIA.

Technology solutions built around artificial intelligence (AI) and 5G offer the most immediate opportunities for tech firms to generate new business and revenue, according to CompTIA's third annual Top 10 Emerging Technologies report released on Wednesday.

Each year, the Emerging Technology Community of CompTIA, the nonprofit association for the global technology industry, releases its list of the top emerging technologies.


"Our ranking represents a consensus viewpoint that emerged after some spirited debate and discussion with the community," said Michael Haines, director of partner incentive strategy and program design for Microsoft and chair of the CompTIA Emerging Technology Community, in a press release.

"We're not proposing that every solution provider and channel partner needs to immediately add these technologies to their menu of products and services," Haines added. "But these innovations will have a sweeping impact on the business of technology. Companies need to prepare now for the changes ahead."

AI and 5G each moved up one spot from last year's list. The Internet of Things (IoT), which claimed the top spot in 2019, dropped to third on this year's list. Augmented and virtual reality and biometrics also moved up, while blockchain and robotics slipped a bit.

"We always saw the marvelous opportunity in AI," Haines said in a blog post. "It's literally been moving up the list. It's one of those interesting ones to watch. AI is now being evaluated as we see it by nearly every organization for possible application to drive insights and better solutions."

Some technologies such as 3D printing and drones fell completely off the list, after claiming a spot since the list began in 2018, while Natural Language Processing made its first appearance.

1. AI

AI claimed the top spot on the list. Artificial intelligence refers to programmed algorithms that automatically parse and apply knowledge. It's the largest force in emerging technology, and includes security and sales applications for businesses.

2. 5G

5G offers improvements over 4G, such as low latency, intelligent power consumption and high device density. 5G will make augmented reality, smart cities and connected vehicles possible.

3. IoT

The Internet of Things combines information from connected devices and allows for analytics of systems. These platforms, devices and datasets provide additional insights and efficiencies for the enterprise.

4. Serverless Computing

Serverless computing, or Function as a Service (FaaS), allows companies to build applications that scale in real time, responding to demand that can change instantly by orders of magnitude. FaaS offers a consumption-based platform so that developers can quickly and cost-effectively deploy applications (a minimal handler sketch follows this list).

5. Biometrics

Biometrics will improve security by allowing people and devices to authenticate and move seamlessly through the world.

6. Augmented Reality/Virtual Reality

AR and VR transform how people engage with machines, data and each other. The enterprise is using mixed reality, AI and sensor technologies to enhance execution flexibility, operational efficiency and individual productivity.

7. Blockchain

There's an ever-increasing need to secure and manage transactions across the internet, and blockchain is the answer. Blockchain helps manage data and supply-chain challenges.

8. Robotics

Robotics is shifting from industrial use to service delivery and is impacting homes and businesses, both physically and virtually.

9. Natural Language Processing

NLP is a field of AI that enables computers to analyze and understand human language. Speech-to-text converts spoken language into text a computer can process; text-to-speech converts a computer's output into an audible response.

10. Quantum Computing

Quantum computing will transform our ability to process and analyze big data. It is key to leveraging machine learning and the power of AI.
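Returning to the serverless entry above: in practice, a FaaS deployment boils down to writing a small handler that the platform invokes on demand and scales for you. Here is a minimal sketch in the common event/context style; exact signatures and payload shapes differ by provider, so treat this as illustrative rather than any vendor's precise contract:

```python
import json

def handler(event, context):
    """A minimal function-as-a-service handler: stateless and invoked per request.

    The platform, not this code, decides how many copies run at once, so load
    spikes are absorbed by spinning up more instances and billing per invocation.
    """
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local smoke test; in production the cloud platform supplies event and context.
if __name__ == "__main__":
    print(handler({"body": json.dumps({"name": "CompTIA"})}, context=None))
```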

For comparison, in 2019, these were the top 10 from CompTIA:

The list is intended to be used as a starting place for debate. Haines said in a blog post, "What I like about it a lot is that people will disagree with the list. They'll say, 'Oh, well I think this one ought to be in there or that one ought to be in there.' And you know what? That's really one of the reasons for the list: it's a living document. It's the view of this community, but it fosters great discussion."



Top Artificial Intelligence Investments and Funding in May 2020 – Analytics Insight

Investment and deal activity around intelligent automation, artificial intelligence, big data and machine learning is reshaping the startup landscape. The data plainly shows that startups with AI as a core product are building narrow AI technology and attracting the heaviest investment from leading VC firms and investors, who are betting heavily on deep-tech startups in big data, enterprise AI and automation. It also underscores that much of the funding is going into domain-specific breakthrough innovations rather than general-purpose AI tech.

Investment funds, venture capital (VC) firms and corporate investors are stepping up equity investments in artificial intelligence (AI) startups, reflecting growing worldwide demand for AI technologies and their business applications.

The total amount invested and the worldwide number of deals have grown enormously since 2011, yet investment profiles vary widely among countries and regions.

Let's look at some of the top AI investments that took place in May 2020.

Runa Capital has closed its third investment fund with $157 million to back startups in deep tech areas such as artificial intelligence and quantum computing. The firm said Runa Capital Fund III surpassed its target of $135 million. The new capital will allow the company to continue its strategy of making investments that range between $1 million and $10 million in early-stage companies.

Cybersecurity threat remediation provider Dtex recently announced it has raised $17.5 million. The funds will be used to expand into new and existing verticals, including banking and financial services, critical infrastructure, government, defense, pharmaceuticals, life sciences, and manufacturing.

GigaSpaces, a startup developing in-memory computing solutions for AI and machine learning workloads, last month announced it has raised $12 million. The funds will be used to scale expansion and accelerate product R&D, according to CEO Adi Paz. Fortissimo Capital led the investment in three-year-old, New York-based GigaSpaces, joined by existing investors Claridge Israel and BRM Group. The round brings GigaSpaces' total raised to $53 million, following a $20 million series D in January 2016.

Omilia, a startup developing natural language technologies, today announced it raised $20 million in its first ever financing round. Founder and CEO Dimitris Vassos says the capital will help strengthen Omilia's go-to-market efforts as it eyes expansion in North America and Western Europe. Omilia's product portfolio spans a conversational platform and solutions targeting voice biometrics, speech recognition, and fraud prevention.

Logistics startup DispatchTrack announced it raised $144 million in the company's first-ever financing round. CEO Satish Natarajan says it will be used to support product research and development, as well as business, segment, and geographic expansion. DispatchTrack was founded in 2010 by Satish Natarajan and Shailu Satish, a husband-and-wife team who focused on the furniture industry before expanding into building materials, appliances, food and beverage distribution, restaurants, field and home services, and third-party logistics.




Spain’s CaixaBank Teams with IBM Services to Accelerate Cloud Transformation and Innovation in the Financial Services Industry – PR Newswire UK

Red Hat OpenShift and AI engaged to help the bank to roll out new digital offerings delivering enhanced customer experiences

ARMONK, New York and BARCELONA, Spain, June 4, 2020 /PRNewswire/ -- CaixaBank, a leading financial institution in Spain and Portugal, serving more than 15.5 million customers, has announced an agreement with IBM Services (NYSE: IBM) to help accelerate its hybrid cloud journey and continue their work to increase the bank's capability to develop innovative, digital-first solutions to enhance client experiences.

CaixaBank will leverage IBM Cloud Pak for Applications running on Red Hat OpenShift to manage workloads and applications across its overall cloud infrastructure. The bank also agreed to continue to work with IBM in their joint innovation center to apply advanced technologies like AI, and additionally explore quantum computing and blockchain solutions. The companies will continue to seek to co-create new solutions for the banking industry with a goal to help quickly process a large number of transactions in an open, secure and scalable environment while delivering improved customer experiences.

With a key focus on technological innovation for the industry, CaixaBank is Spain's leading digital financial services provider, serving more than 6.5 million digital clients. CaixaBank is also one of the pioneering banks in the application of artificial intelligence for financial services, developing one of the first virtual banking assistants created in Europe. Built with IBM Watson, the AI-based virtual assistant manages more than 1.5 million client conversations each month, handling a spectrum of tasks such as helping bank employees quickly obtain relevant detailed information about new client offerings and quickly assisting mobile customers via chat with day-to-day queries. This approach frees up employee time to focus on serving customers.

IBM has been a strategic technology provider for CaixaBank since 2011. Along with renewing their existing relationship, the recent agreements are also focused on accelerating innovation and digital transformation, while also strengthening the longtime collaboration between IBM and the bank, which is chaired by Jordi Gual and led by CEO Gonzalo Gortázar.

"Our company, the leader in digital customers in Spain, has renewed our relationship with IBM to allow us to continue innovating and transforming the way we interact with our customers," said Gonzalo Gortzar, CaixaBank's CEO. "By strengthening and expanding the collaboration with a company that is a global model in innovation for the finance industry, we will accelerate, even further, our digital capabilities to continue developing innovative projects and services."

IBM is bringing its deep financial services industry experience to help generate long term value to CaixaBank and its clients. By leveraging IBM Cloud Pak for Applications, CaixaBank can modernize and create applications with increased agility and security while addressing compliance requirements within a hybrid cloud environment.

"We are pleased to be on this digital transformation journey with CaixaBank, an innovation leader in the banking industry," said Juan Zufiria, Senior Vice President, Global Technology Services. "With this collaboration, we are laying the foundation to build a model, not just for CaixaBank and its millions of customers, but also for the future of the industry. The open cloud environment can allow the bank to accelerate its innovation and offer a more agile way to bring new digital services to its customers with added flexibility and security."

Increased processing capacity and data storage capability

The IBM Cloud Pak for Applications solution is designed to help reduce risk and improve operational resiliency with an estimated processing power and data storage capability of 105,000 terabytes, a capacity equal to 200 times the volume of a digital library with all the books listed in the world in all languages.

New projects for the joint Innovation Center

Researchers at the CaixaBank-IBM innovation center have previously been exploring technologies for the future of financial services, and the recent agreement expands that work to include blockchain and quantum computing. Recently, CaixaBank developed a prototype of a machine learning algorithm based on quantum computing to analyze customers based on credit risk.

This agreement was signed during IBM's Q1, 2020.

Contact Information:

Nuria Teres, Communications, CaixaBank, prensa@caixabank.com, +34 93 404 1398

Kaveri Camire, Communications, IBM, kcamire@us.ibm.com, (914) 625-6395


SOURCE IBM
