Daily Archives: June 4, 2021

It’s time to be realistic about the circular economy for textiles – just-style.com

Posted: June 4, 2021 at 3:53 pm

In 2018, the UK's Environmental Audit Committee (EAC) recommended, along with other measures, a producer responsibility charge to pay for better clothing collection and recycling.

The Government's response included a statement about its Resources and Waste Strategy: "We commit to take forward policy on EPR [extended producer responsibility], eco-design product standards which could include requirements on micro-fibre shedding, and consumer information such as labelling."


Government policy was to review and consult on measures for five new waste streams by the end of 2025. Textiles is one of the waste streams identified for consideration.

However, something has happened to revise this strategy. The new Environment Bill that is going through Parliament has a section on managing and separating waste.

For household collections, the identified recyclable waste streams are as follows: (a) glass; (b) metal; (c) plastic; (d) paper and card; (e) food waste; (f) garden waste. There is no mention of textiles, clothing or apparel.

It cannot be that the Government has forgotten that recycling textiles is a priority: only recently, the UK's Interdisciplinary Circular Economy Centres were launched, including the Textiles Circularity Centre. The other centres, addressing minerals, chemicals and metals, can interface with the household collection strategy, but this is not the case for textiles.

Arguably, there is already a textile recycling industry in the UK. Textile banks and charity shops have been sources of textile materials, and several retailers now offer takeback schemes.


However, the textile recyclers are almost entirely focused on finding markets for second-hand garments. They collect and sort textile products, and sell them on wherever they can find markets.

Mechanical processing takes some of these textiles, but quantities are small. Other markets exist for wipers, but the profit margin is very low. This is an industry in decline: barriers confronting exported wastes are increasing, and many of the higher-value products are donated directly to charity shops.

Extending the life of a garment can be done in many ways: by repairing faults, by designing with more durable materials and construction techniques, by selling as a second-hand product, by swapping or passing on to others or to a charity shop or textile bank. Some have the skills needed to embellish the garment, a form of upcycling. But is this the circular economy?

The key to understanding circularity is that materials are perceived as resources, not wastes. Extending the life of a product is primarily an action advancing sustainability. To recognise circularity, we should ask: what happens to the product when its second owner discards it?

At present, items deemed worthless are placed in a bin for general waste. Textile materials become soiled very easily, and they end up in landfill or in incineration. Consequently, consumers do not recognise unwanted textiles as resources.

There are numerous examples of mechanical processing to recover fibres, which can then be fabricated into new products such as felts to absorb sound and to provide insulation. There are examples of commercial garments that fully or partly incorporate recycled fibres.

However, these are all niche products for specialised markets, and they do not have the potential to address the mountain of textile wastes. They have a second life as alternative products, but are then disposed of as waste. How is it possible to convince consumers that textile materials can be a resource, worth disposing of separately from the general waste?

A growing number of projects have either set out to answer this question or intend to demonstrate the circular economy in textiles. There are several pilot plants for chemical processing in the UK, in Europe (Sweden, Finland, Slovenia), in the US and in Hong Kong.

In most of these plants, the goal is fibre-to-fibre recycling. As textile dyes are usually chemically active, progress has been slow. Often, the plant can handle undyed fibres but not consumer-waste textiles. Some work only with cotton; others limit the textile input to cotton and polyester fibres.

I was involved with the EU-funded Resyntex project, which took a much broader-brush approach to the challenges facing circularity in textiles. Our target was to find a way of processing cellulosic, protein, polyester and polyamide fibres (over 95% of the waste mountain).

We used enzyme chemistry to depolymerise the fibres but we did not attempt to make new fibres. Rather, the aim was to make commodity feedstock products. This meant that the outputs had a value based on global markets, and as the markets changed with time, so the feedstocks produced could also be changed in response.

This concept leads to industrial symbiosis. The waste materials from one sector can be resources for other industrial sectors, and industries can collaborate to maximise the benefits and achieve commercial viability.

The most successful route involved protein fibres, which could be turned into an adhesive for laminating wood and forming chipboard. The only major problem was the availability of materials, one that could be greatly helped by household collections of textiles and automated sorting.

During the course of the Resyntex project, it was apparent that apparel retailers and their suppliers were more interested in fibre-to-fibre recycling.

My response to this was to remind them that new fibres have a gestation time longer than the four-year Resyntex project!

But Resyntex would debut a system on which a fibre-to-fibre project could then be built. Recycled cellulosic fibres are of particular interest, as the environmental impact of growing cotton is substantial.

The EU plans to implement household collections of textiles by 2025. At present, there is no clear idea of what to do with these textiles after they are collected.

The Resyntex project has shown that technologies are in place to implement circularity. What we need from governments now is a strategy for launching national collections of the materials and encouragement to invest in the plant needed to mine this mountain.

About the author: David Tyler is Professor of Fashion Technologies at the Manchester Fashion Institute, Manchester Metropolitan University.

Read more:

It's time to be realistic about the circular economy for textiles - just-style.com

Posted in Resource Based Economy | Comments Off on It’s time to be realistic about the circular economy for textiles – just-style.com

The road to sustainability: the superhighway built from paper waste instead of cement – Euronews

Posted: at 3:53 pm

At first glance the new stretch of motorway being built in the municipality of La Font de la Figuera near the Spanish city of Valencia looks like any other. But hidden secrets lie beneath its surface.

Thanks to pioneering tech, Spanish contractor Acciona is using paper ash to replace the cement that would normally go into the road's construction to improve durability.

"In road construction, we need the strongest materials. And for that, we usually use cement. This paper ash doesnt just look like cement. It meets all the technical requirements of cement, but its also more environmentally friendly," explains Acciona's R&D Project Manager, Juan Jose Cepria Pamplona.

Acciona believes using paper ash will enable it to significantly cut its carbon footprint.

"The potential impact of the project is enormous. We have calculated that we can save 65-75% of the associated CO2 emissions. And by 'scaling up' we could save up to 18,000 tonnes of cement per year, says Juan Jose.

But the benefit is not only carbon reduction. By using paper ash (burnt waste paper and pulp that can no longer be recycled), the company is turning rubbish that would most likely end up in landfill into a resource.

The motorway in La Font de la Figuera is one of three pilot projects, but Juan Jose says Acciona has big plans for the future.

"Our intention is to scale up and to extend its [paper ash] use nationally and eventually replicate it internationally," he says.

Figures from 2014 show the paper and pulp sector (the world's second-biggest) had an annual production of 130 million tonnes; 11 million tonnes of that ended up as unrecycled waste.

Johan Elvnert is Secretary-General of the Forest-based Sector Technology Platform. He says Europe's paper and pulp industry is finding new ways to harness this by-product.

"New technologies make it possible to reuse and recycle more. One good example is the paperChain project, but we see these kinds of developments for everything: textiles, packaging, even non-materials and even fish food from the treatment water of pulp mills. So, the best is not to think of this as waste but as a resource."

Elvnert concludes by saying: "To get to a zero-waste circular economy we need to work together along the whole value chain. The EU's 2050 targets are very ambitious indeed. In the forest-based value chain, we have looked at how to get to a zero-waste circular society and we will work hard to make this agenda a reality, but support from the European Union is crucial."


Here is the original post:

The road to sustainability: the superhighway built from paper waste instead of cement - Euronews

Posted in Resource Based Economy | Comments Off on The road to sustainability: the superhighway built from paper waste instead of cement – Euronews

Innovation will drive the UAE’s evolution: Top official – Khaleej Times

Posted: at 3:53 pm

Human capital important to help drive innovation forward, Dr. Tariq Bin Hendi says at Global Investment Forum 2021

"The success of a knowledge-based economy lies in how well the entrepreneurial ecosystem allows for risk and experimentation," said Dr. Tariq Bin Hendi, director general of the Abu Dhabi Investment Office.

Speaking at the Global Investment Forum 2021, he briefed attendees on how the UAE had established itself as a leader in attracting highly successful entrepreneurs.

"I think when you look at the public sector and its role in helping to drive innovation and technology, you also have to look at how nascent the market is," he said.

"We have to de-risk the environment and the ecosystem to make sure that people want to experiment, and that the R&D that occurs as a result can drive innovation so you can ultimately create a knowledge-based economy."

He added: "It is important to have capital, but you also have to have the human capital to help drive innovation forward; and the only way to bring that human capital here and to nurture it is to make sure that you keep driving the narrative and the actual delivery on the policies that you have set."

The facts, he said, are very clear: "We receive the most FDI in the Arab world. We are ranked number one in the region when it comes to entrepreneurship. As many of you that live here know, the focus has been very much on the balance between lives and livelihoods; how we protect our economy but also make sure that we protect the number one resource that we have, which is the people that call the UAE their home. All of our initiatives revolve around funding, de-risking, how we design the right policies and regulations, how we enable the environment, and how entrepreneurs can use the UAE as a base to grow their ideas, nurture them, and expand into the wider region."

Bin Hendi also stressed that technology is a means to an end, but that it is innovation that will drive the UAE's evolution as a country. He also noted that several issues have to be addressed through partnerships; these include water scarcity, food security, and job creation for the youth.

"Our leadership has taken steps to ensure that as we drive the adoption of technology and innovation to create a knowledge-based economy, we are creating new jobs for the new generation of job seekers," he said. "A collective direction forward, with a focus on people, and diversifying the economy to make sure that it is robust and resilient is a recipe for success that the UAE has enjoyed. The recent 100 per cent foreign ownership law is a key milestone for our nation and a statement to the world that we are willing to break rules and laws and regulations that we believe will stimulate growth."

rohma@khaleejtimes.com

Read the rest here:

Innovation will drive the UAE's evolution: Top official - Khaleej Times

Posted in Resource Based Economy | Comments Off on Innovation will drive the UAE’s evolution: Top official – Khaleej Times

Farhad Manjoo: The wind and solar boom is here – Salt Lake Tribune

Posted: at 3:53 pm

Just one word, Benjamin: Solar.

Well, actually, one more: Wind.

The sun, the air and the chemistry to bottle their limitless power: it's looking more and more as if these constitute the world's next great technological advance, a leap as life-changing for many of us as was aviation, the internet or, of course, plastics.

Faster than many thought possible, and despite long doubt about renewable energys practicality, a momentous transformation is well underway. We are moving from a global economy fueled primarily by climate-warming fossil fuels to one in which we will cleanly pluck most of our energy out of water, wind and the fire in the sky.

People who study energy markets say that economics alone ensures our eventual transition to clean fuels, but that policy choices by governments can speed it up. In October, the International Energy Agency declared solar power to be the cheapest new form of electricity in many places around the world; in particularly favorable locations, solar is now "the cheapest source of electricity in history."

It can be difficult to muster much optimism about humanity's capacity to address climate change, and I have argued before that it is wisest to look to the future with a pessimistic eye, if only to encourage urgent action toward collective problem-solving. (We are more likely to do something to solve our problems if we're frank about how bad things might get.)

There are lots of reasons to cast doubt on the clean-energy future. Wind and solar still account for just a tiny fraction of the world's energy production. Even their most enthusiastic supporters concede that much will need to change to realize the full potential of renewable energy. Over the coming decades consumers and businesses will have to adapt to many novel technologies, while governments will need to build new infrastructure and overhaul energy regulations built around fossil fuels.

Still, amid the general gloom of climate change, the clean-energy boom offers the rare glimmer not just of hope but of something more: excitement. The industry's bold claims are bolstered by bolder trends. Over the last couple of decades, experts have consistently underestimated the declines in price, the improvements in performance and the subsequent speed of adoption of renewable power.

Unlike fossil fuels, which get more expensive as we pull more of them from the ground (extracting a dwindling resource requires more and more work), renewable energy is based on technologies that get cheaper as we make more. This creates a virtuous flywheel: because solar panels, wind turbines, batteries and related technologies to produce clean energy keep getting cheaper, we keep using more of them; as we use more of them, manufacturing scale increases, cutting prices further still, and on and on.
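
The flywheel can be made concrete with Wright's law, the standard learning-curve model behind such price declines: unit cost falls by a fixed fraction with every doubling of cumulative production. Here is a minimal sketch in Python, assuming a 20% learning rate (a commonly cited rough figure for solar modules, not a number from this column):

```python
def wrights_law_cost(initial_cost, doublings, learning_rate=0.20):
    """Unit cost after `doublings` doublings of cumulative production
    under Wright's law: cost falls by `learning_rate` per doubling."""
    return initial_cost * (1.0 - learning_rate) ** doublings

# Five doublings (a 32x increase in cumulative volume) at a 20% learning
# rate cut unit cost to roughly a third of its starting value.
for d in range(6):
    print(f"{2**d:>3}x volume -> cost {wrights_law_cost(100.0, d):5.1f}% of start")
```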

Jenny Chase, who analyzes the solar power sector at BloombergNEF, an energy research firm, told me that when she started her job in 2005, her most optimistic scenario was that sunlight would eventually generate as much as 1% of the world's electricity. At the time, solar power contributed essentially nothing to the global energy mix, so even a tiny fraction looked pretty good.

"I thought, well, it'll be a small thing, but focusing my career on something that's 1% of the world's electricity, that's all right," she told me.

She was way off, and so were many others, including governmental agencies. Solar power surpassed 1% of global electricity generation in the middle of the last decade. Chase estimates that solar now accounts for at least 3% of the world's electricity; that is, three times as much as she once thought possible.

In a forecast published late last year, Chase and her colleagues at BloombergNEF estimated that by 2050, 56% of the world's electricity would be produced by wind and solar power. But she says that forecast is already out of date: it's too low.

Others go further still. "The fossil fuel era is over," declares Carbon Tracker Initiative, a nonprofit think tank that studies the economics of clean energy, in a new report. Kingsmill Bond, its energy strategist, told me that the transition to renewable energy will alter geopolitics and global economics on a scale comparable to that of the Industrial Revolution.

He cites one telling example to illustrate how and why. The world's largest conventional oil field, Ghawar in Saudi Arabia, has the capacity to produce nearly 4 million barrels of oil per day. If you were to convert Ghawar's annual oil output into electricity, you'd get almost 1 petawatt-hour per year. (That's nearly enough to power Japan for a year; the world's annual electrical energy demand is 27 petawatt-hours.)

The Ghawar oil field takes up a lot of space: about 3,000 square miles, around the size of Rhode Island and Delaware combined. But it soon might sound crazy to use that much sunny land for drilling oil. Bond estimates that if you put up solar panels on an area the size of Ghawar, you could generate more than 1 petawatt-hour per year, more than you'd get from the oil buried under Ghawar.
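
Bond's comparison can be sanity-checked with back-of-envelope arithmetic. The sketch below reproduces both estimates; every physical constant in it (energy content per barrel, power-plant efficiency, desert insolation, panel efficiency, ground coverage) is an assumed round number for illustration, not a figure from the column:

```python
BARRELS_PER_DAY = 4e6          # Ghawar capacity, from the article
MWH_THERMAL_PER_BARREL = 1.7   # assumed energy content of crude oil
THERMAL_TO_ELECTRIC = 0.40     # assumed power-plant conversion efficiency

oil_pwh_per_year = (BARRELS_PER_DAY * 365 * MWH_THERMAL_PER_BARREL
                    * THERMAL_TO_ELECTRIC) / 1e9   # MWh -> PWh
print(f"Ghawar's oil as electricity: ~{oil_pwh_per_year:.2f} PWh/yr")  # ~1.0

AREA_M2 = 3_000 * 2.59e6       # ~3,000 sq mi, from the article, in m^2
INSOLATION_KWH_M2_YR = 2_200   # assumed desert solar resource
MODULE_EFFICIENCY = 0.20       # assumed panel efficiency
GROUND_COVERAGE = 0.35         # assumed fraction of land under panels

solar_pwh_per_year = (AREA_M2 * INSOLATION_KWH_M2_YR
                      * MODULE_EFFICIENCY * GROUND_COVERAGE) / 1e12  # kWh -> PWh
print(f"Same area as solar farm:     ~{solar_pwh_per_year:.2f} PWh/yr")  # ~1.2
```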

But the oil will one day run out, while the sun will keep shining over Ghawar, and not just there but everywhere else, too. This is the magic of the sun, as Bond explains: only Saudi Arabia has a Ghawar, but with solar power almost every country in the world with enough space can generate 1 petawatt-hour a year (and without endangering the planet to boot).

It's important to note that there remain hurdles in the way of a renewable-energy future. The most obvious one is the infrastructure required to take advantage of all this electric power: more robust power grids, for instance, and the conversion to electric power of everything from cars to container ships.

These problems are considerable but solvable. In his upcoming book, Electrify, Saul Griffith, an inventor (and MacArthur fellow) who is a co-founder of an organization called Rewiring America, argues that many of the barriers to a clean-energy future are systemic and bureaucratic, not technological.

Griffith says that the transformation will be an economic bonanza: many analysts predict huge job creation and savings in energy prices from a switch to renewables. But if we want it in time to avert some of the most catastrophic predictions about a warming climate, we need to push the changes along even faster. Among other things, Griffith calls for a complete overhaul of our energy policies in order to reduce some of the regulatory costs of expanding renewable power.

What kinds of costs? Many small, unforeseen things. For instance, in much of the U.S., installing rooftop solar panels requires an extensive and expensive permitting process that substantially increases the price. Through streamlined rules, other countries have managed to greatly reduce such costs.

This won't be easy; the fossil-fuel industry is actively battling the rise of renewables. But at most, it can only slow things down. A carbon-free energy economy is coming whether oil and coal companies like it or not.


Farhad Manjoo is a columnist for The New York Times.

Read this article:

Farhad Manjoo: The wind and solar boom is here - Salt Lake Tribune

Posted in Resource Based Economy | Comments Off on Farhad Manjoo: The wind and solar boom is here – Salt Lake Tribune

Looking to the future of quantum cloud computing – Siliconrepublic.com – Siliconrepublic.com

Posted: at 3:51 pm

Trinity College Dublin's Dan Kilper and the University of Arizona's Saikat Guha discuss the quantum cloud and how it could be achieved.

Quantum computing has been receiving a lot of attention in recent years as several web-scale providers race towards so-called quantum advantage: the point at which a quantum computer is able to exceed the computing abilities of classical computing.

Large public sector investments worldwide have fuelled research activity within the academic community. The first claim of quantum advantage emerged in 2019, when Google, NASA and Oak Ridge National Laboratory (ORNL) demonstrated a computation that the quantum computer completed in 200 seconds and that the ORNL supercomputer could verify only up to the point of quantum advantage; running the computation to the end classically was estimated to require 10,000 years.
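
For a sense of scale, that gap works out to a speedup factor of roughly a billion; a one-line check, taking the two reported figures at face value:

```python
# 200 seconds on the quantum processor versus an estimated 10,000 years
# for the classical computation to run to the end.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
print(f"speedup ~ {10_000 * SECONDS_PER_YEAR / 200:.1e}x")  # about 1.6e9
```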

Roadmaps that take quantum computers even further into this regime are advancing steadily. IBM has made quantum computers available for online access for many years now and recently Amazon and Microsoft started cloud services to provide access for users to several different quantum computing platforms. So, what comes next?

The step beyond access to a single quantum computer is access to a network of quantum computers. We are starting to see this emerge from the web- or cloud-based quantum computers offered by cloud providers: effectively quantum computing as a service, sometimes referred to as cloud-based quantum computing.

This consists of quantum computers connected by classical networks and exchanging classical information in the form of bits, or digital ones and zeros. When quantum computers are connected in this way, they each can perform separate quantum computations and return the classical results that the user is looking for.

It turns out that with quantum computers, there are other possibilities. Quantum computers perform operations on quantum bits, or qubits. It is possible for two quantum computers to exchange information in the form of qubits instead of classical bits. We refer to networks that transport qubits as quantum networks. If we can connect two or more quantum computers over a quantum network, then they will be able to combine their computations such that they might behave as a single larger quantum computer.

Quantum computing distributed over quantum networks thus has the potential to significantly enhance the computing power of quantum computers. In fact, if we had quantum networks today, many believe that we could immediately build large quantum computers far into the advantage regime simply by connecting many instances of todays quantum computers over a quantum network. With quantum networks built, and interconnected at various scales, we could build a quantum internet. And at the heart of this quantum internet, one would expect to find quantum computing clouds.
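
The reason networking quantum computers is more than additive comes down to state-space size: an n-qubit register is described by 2^n complex amplitudes, so entangling two machines over a quantum network multiplies their state spaces where classical networking merely adds them. A toy illustration of that scaling:

```python
# Used independently, each m-qubit machine is described by 2**m complex
# amplitudes, so two machines need 2 * 2**m numbers in total. Entangled
# into one 2m-qubit computer over a quantum network, the joint state
# needs 2**(2m) amplitudes: the space is squared, not doubled.
for m in (10, 20, 30):
    separate = 2 * 2**m
    combined = 2 ** (2 * m)
    print(f"{m} qubits each: separate ~{separate:.1e} vs combined ~{combined:.1e}")
```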

At present, scientists and engineers are still working on understanding how to construct such a quantum computing cloud. The key to quantum computing power is the number of qubits in the computer. These are typically micro-circuits or ions kept at cryogenic temperatures, near minus 273 degrees Celsius.

While these machines have been growing steadily in size, it is expected that they will eventually reach a practical size limit, and therefore further computing power is likely to come from network connections across quantum computers within the data centre, very much like today's classical computing data centres. Instead of racks of servers, one would expect rows of cryostats.


Once we start imagining a quantum internet, we quickly realise that there are many software structures that we use in the classical internet that might need some type of analogue in the quantum internet.

Starting with the computers, we will need quantum operating systems and computing languages. This is complicated by the fact that quantum computers are still limited in size and not engineered to run operating systems and programming the way that we do in classical computers. Nevertheless, based on our understanding of how a quantum computer works, researchers have developed operating systems and programming languages that might be used once a quantum computer of sufficient power and functionality is able to run them.

Cloud computing and networking rely on other software technologies such as hypervisors, which manage how a computer is divided up into several virtual machines, and routing protocols to send data over the network. In fact, research is underway to develop each of these for the quantum internet. With quantum computer operating systems still under development, it is difficult to develop a hypervisor to run multiple operating systems on the same quantum computer as a classical hypervisor would.

By understanding the physical architecture of quantum computers, however, one can start to imagine how it might be organised to support different subsets of qubits to effectively run as separate quantum computers, potentially using different physical qubit technologies and employing different sub-architectures, within a single machine.

One important difference between quantum and classical computers and networks is that quantum computers can make use of classical computers to perform many of their functions. In fact, a quantum computer in itself is a tremendous feat of classical system engineering with many complex controls to set up and operate the quantum computations. This is a very different starting point from classical computers.

The same can be said for quantum networks, which have the classical internet to provide control functions to manage the network operations. It is likely that we will rely on classical computers and networks to operate their quantum analogues for some time. Just as a computer motherboard has many other types of electronics other than the microprocessor chip, it is likely that quantum computers will continue to rely on classical processors to do much of the mundane work behind their operation.

With the advent of the quantum internet, it is plausible that a quantum-signalling-equipped control plane could support certain quantum network functions even more efficiently.

When talking about quantum computers and networks, scientists often refer to fault-tolerant operations. Fault tolerance is a particularly important step toward realising quantum cloud computing. Without fault tolerance, quantum operations are essentially single-shot computations that are initialised and then run to a stopping point that is limited by the accumulation of errors due to quantum memory lifetimes expiring as well as the noise that enters the system with each step in the computation.

Fault tolerance would allow for quantum operations to continue indefinitely with each result of a computation feeding the next. This is essential, for example, to run a computer operating system.

In the case of networks, loss and noise limit the distance that qubits can be transported: on the order of 100km today. Fault tolerance through operations such as quantum error correction would allow quantum networks to extend around the world. This is quite difficult for quantum networks because, unlike classical networks, quantum signals cannot be amplified.

We use amplifiers everywhere in classical networks to boost signals that are reduced due to losses, for example, from traveling down an optical fibre. If we boost a qubit signal with an optical amplifier, we would destroy its quantum properties. Instead, we need to build quantum repeaters to overcome signal losses and noise.
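
A rough sketch of why the ~100km figure appears, assuming a typical telecom-fibre loss of 0.2 dB per kilometre (an illustrative value, not one from the article): photon survival falls off exponentially with distance, and no amplifier can make up the loss, so the only remedy is shorter hops with repeaters between them.

```python
def photon_survival(distance_km, loss_db_per_km=0.2):
    """Probability that a photon survives a fibre span of the given length."""
    return 10 ** (-loss_db_per_km * distance_km / 10)

print(f"100 km direct hop: {photon_survival(100):.1%}")      # ~1% arrive
print(f"500 km direct hop: {photon_survival(500):.1e}")      # hopeless, ~1e-10
print(f"50 km repeater spacing: {photon_survival(50):.0%}")  # ~10% per hop
```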


If we can connect two fault-tolerant quantum computers at a distance that is less than the loss limits for the qubits, then the quantum error correction capabilities in the computers can in principle recover the quantum signal. If we build a chain of such quantum computers each passing quantum information to the next, then we can achieve the fault-tolerant quantum network that we need. This chain of computers linking together is reminiscent of the early classical internet when computers were used to route packets through the network. Today we use packet routers instead.

If you look under the hood of a packet router, it is composed of many powerful microprocessors that have replaced the computer routers and are much more efficient at the specific routing tasks involved. Thus, one might imagine a quantum analogue to the packet router, which would be a small purpose-built quantum computer designed for recovering and transmitting qubits through the network. These are what we refer to today as quantum repeaters, and with these quantum repeaters we could build a global quantum internet.

Currently there is much work underway to realise a fault-tolerant quantum repeater. Recently a team in the NSF Center for Quantum Networks (CQN) achieved an important milestone: they were able to use a quantum memory to transmit a qubit beyond its usual loss limit. This is a building block for a quantum repeater. The SFI Connect Centre in Ireland is also working on classical network control systems that can be used to operate a network of such repeaters.

Together we have our sights set on realising the networks that will make up the quantum internet.

By Dan Kilper and Saikat Guha

Dan Kilper is professor of future communication networks at Trinity College Dublin and director of the Science Foundation Ireland (SFI) Connect research centre.

Saikat Guha is director of the NSF-ERC Center for Quantum Networks and professor of optical sciences, electrical and computer engineering, and applied mathematics at the University of Arizona.

Read the rest here:

Looking to the future of quantum cloud computing - Siliconrepublic.com - Siliconrepublic.com

Posted in Quantum Computing | Comments Off on Looking to the future of quantum cloud computing – Siliconrepublic.com – Siliconrepublic.com

Swedish university is behind quantum computing breakthrough – ComputerWeekly.com

Posted: at 3:51 pm

Sweden's Chalmers University of Technology has achieved a quantum computing efficiency breakthrough with a novel type of thermometer that is capable of simply and rapidly measuring temperatures during quantum calculations.

The discovery adds a more advanced benchmarking tool that will accelerate Chalmers' work in quantum computing development.

The novel thermometer is the latest innovation to emerge from the university's research to develop an advanced quantum computer. The so-called OpenSuperQ project at Chalmers is coordinated with technology research organisation the Wallenberg Centre for Quantum Technology (WACQT), which is the OpenSuperQ project's main technology partner.

WACQT has set the goal of building a quantum computer capable of performing precise calculations by 2030. The technical requirements behind this ambitious target are based on superconducting circuits and developing a quantum computer with at least 100 well-functioning qubits. To realise this ambition, the OpenSuperQ project will require a processor working temperature close to absolute zero, ideally as low as 10 millikelvin (-273.14°C).

Headquartered at Chalmers University's research hub in Gothenburg, the OpenSuperQ project, launched in 2018, is intended to run until 2027. Working alongside the university in Gothenburg, WACQT is also operating support projects being run at the Royal Institute of Technology (Kungliga Tekniska Högskolan) in Stockholm and collaborating universities in Lund, Stockholm, Linköping and Gothenburg.

Pledged capital funding for the WACQT-managed OpenSuperQ project, committed by the Knut and Alice Wallenberg Foundation together with 20 other private corporations in Sweden, currently amounts to SEK1.3bn (€128m). In March, the foundation scaled up its funding commitment to WACQT, doubling its annual budget to SEK80m over the next four years.

The increased funding by the foundation will lead to the expansion of WACQT's quantum computing research team, and the organisation is looking to recruit a further 40 researchers for the OpenSuperQ project in 2021-2022. A new team is to be established to study nanophotonic devices, which can enable the interconnection of several smaller quantum processors into a large quantum computer.

The Wallenberg sphere incorporates 16 public and private foundations operated by various family members. Each year, these foundations allocate about SEK2.5bn to research projects in the fields of technology, natural sciences and medicine in Sweden.

"The OpenSuperQ project aims to take Sweden to the forefront of quantum technologies, including computing, sensing, communications and simulation," said Peter Wallenberg, chairman of the Knut and Alice Wallenberg Foundation.

"Quantum technology has enormous potential, so it is vital that Sweden has the necessary expertise in this area. WACQT has built up a qualified research environment and established collaborations with Swedish industry. It has succeeded in developing qubits with proven problem-solving ability. We can move ahead with great confidence in what WACQT will go on to achieve."

The novel thermometer breakthrough opens the door to experiments in the dynamic field of quantum thermodynamics, said Simone Gasparinetti, assistant professor at Chalmers' quantum technology laboratory.

"Our thermometer is a superconducting circuit and directly connected to the end of the waveguide being measured," said Gasparinetti. "It is relatively simple and probably the world's fastest and most sensitive thermometer for this particular purpose at the millikelvin scale."

Coaxial cables and waveguides, the structures that guide waveforms and serve as the critical connection to the quantum processor, remain key components in quantum computers. The microwave pulses that travel down the waveguides to the quantum processor are cooled to extremely low temperatures along the way.

For researchers, a fundamental goal is to ensure that these waveguides are not carrying noise, due to the thermal motion of electrons, on top of the pulses that they send. Precise temperature measurement readings of the electromagnetic fields are needed at the cold end of the microwave waveguides, the point where the controlling pulses are delivered to the computer's qubits.

Working at the lowest possible temperature minimises the risk of introducing errors in the qubits. Until now, researchers have only been able to measure this temperature indirectly, and with relatively long delays. Chalmers University's novel thermometer enables very low temperatures to be measured directly at the receiving end of the waveguide, with elevated accuracy and with extremely high time resolution.

The novel thermometer developed at the university provides researchers with a value-added tool to measure the efficiency of systems while identifying possible shortcomings, said Per Delsing, a professor at the department of microtechnology and nanoscience at Chalmers and director of WACQT.

"A certain temperature corresponds to a given number of thermal photons, and that number decreases exponentially with temperature," he said. "If we succeed in lowering the temperature at the end where the waveguide meets the qubit to 10 millikelvin, the risk of errors in our qubits is reduced drastically."
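
Delsing's point follows from the Bose-Einstein occupation of a microwave mode: the mean thermal photon number collapses once k_B T drops below the photon energy hf. A short sketch, assuming a typical 6 GHz qubit frequency (an illustrative value, not one quoted in the article):

```python
import math

H = 6.626e-34    # Planck constant, J*s
KB = 1.381e-23   # Boltzmann constant, J/K

def mean_thermal_photons(freq_hz, temp_k):
    """Bose-Einstein occupation of a single mode at the given temperature."""
    return 1.0 / (math.exp(H * freq_hz / (KB * temp_k)) - 1.0)

# At 6 GHz, cooling from 100 mK to 10 mK cuts the mean thermal photon
# number by more than ten orders of magnitude.
for t_mk in (100, 50, 10):
    print(f"T = {t_mk:>3} mK: n ~ {mean_thermal_photons(6e9, t_mk / 1000):.1e}")
```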

The university's primary role in the OpenSuperQ project is to lead the work on developing the application algorithms that will be executed on the OpenSuperQ quantum computer. It will also support the development of algorithms for quantum chemistry, optimisation and machine learning.

Also, Chalmers will head up efforts to improve quantum coherence in chips with multiple coupled qubits, including device design, process development, fabrication, packaging and testing. It will also conduct research to evaluate the performance of 2-qubit gates and develop advanced qubit control methods to mitigate systematic and incoherent errors to achieve targeted gate fidelities.

See the original post:

Swedish university is behind quantum computing breakthrough - ComputerWeekly.com

Posted in Quantum Computing | Comments Off on Swedish university is behind quantum computing breakthrough – ComputerWeekly.com

Global IT giant to partner with U of C on quantum computing centre – Calgary Herald

Posted: at 3:51 pm

A global IT giant has announced plans to partner with the University of Calgary to create a centre of excellence for quantum computing in the city.

Bangalore-based Mphasis Ltd., a provider of IT outsourcing services, announced Wednesday that it will set up a Canadian headquarters in Calgary. The move is expected to create 500 to 1,000 local jobs within the next two to three years, according to company CEO Nitin Rakesh.

The company will also establish what it dubs the Quantum City Centre of Excellence at the University of Calgary to serve as a hub for companies focused on the commercial development of quantum technologies. Mphasis will be the anchor tenant and will work to draw in other companies working in the field.

Quantum computing uses the principles of quantum physics to solve problems. It is considered to be a huge leap forward from traditional computer technology, and has futuristic applications in the fields of medicine, energy, fintech, logistics and more.


In a virtual news conference Wednesday, Premier Jason Kenney called quantum computing "one of the most promising emerging high-tech sectors." He said the partnership between Mphasis and the University of Calgary will help make Alberta "a destination of choice for investment capital and talent in this growing field."

"The goal is to make Alberta a force to be reckoned with in quantum computing, machine learning and AI, economically but also intellectually," Kenney said. "Post-secondary students will have incredible opportunities to master the most sought-after skills through this venture."

Mphasis also announced its plans to establish Sparkle Calgary, which will offer training in artificial intelligence and automation technology for Albertans seeking a career transition. Rakesh said through this platform, Mphasis hopes to help address the skills shortage that currently plagues Alberta's tech sector, while at the same time helping out-of-work Albertans find a place in the new economy.

"There's a ton of data expertise that sits at the heart of the oil and gas industry," Rakesh said. "So can we take that ability to apply data knowledge, data science, and really re-skill (those workers) toward cloud computing ... That's the vision we want to see."

The University of Calgary has been working for some time to help establish Alberta as a leader in quantum computing research through its Institute for Quantum Science and Technology, a multidisciplinary group of researchers from the areas of computer science, mathematics, chemistry and physics. The U of C is also a member of Quantum Alberta, which aims to accelerate quantum science research, development and commercialization in the province.


U of C president Ed McCauley said Wednesday he hopes that the partnership with Mphasis will lead to the birth of a new wave of startup companies in Calgary, ones that will use cutting-edge technology developed on campus.

"This (quantum) technology will not only create its own industry, but it will fuel advances in others," McCauley said. "Calgary will not only be an energy capital, it will be a quantum capital, too."

The federal government has identified quantum computing as critically important to the future economy. The most recent federal budget includes $360 million for a National Quantum Strategy encompassing funding for research, students and skills development.

Mphasis is the second major Indian IT company in recent months to announce it will set up shop in Calgary. In March, Infosys, a New York Stock Exchange-listed global consulting and IT services firm with more than 249,000 employees worldwide, said it will bring 500 jobs to the city over the next three years as part of the next phase of its Canadian expansion.

Like Mphasis, Infosys has formed partnerships with Calgary's post-secondary institutions to invest jointly in training programs that will help to develop a local technology talent pool.

astephenson@postmedia.com


Read the rest here:

Global IT giant to partner with U of C on quantum computing centre - Calgary Herald

Posted in Quantum Computing | Comments Off on Global IT giant to partner with U of C on quantum computing centre – Calgary Herald

What is Thermodynamic Computing and Could It Become Important? – HPCwire

Posted: at 3:51 pm

What, exactly, is thermodynamic computing? (Yes, we know everything obeys thermodynamic laws.) A trio of researchers from Microsoft, UC San Diego, and Georgia Tech have written an interesting Viewpoint in the June issue of Communications of the ACM, "A Vision to Compute like Nature: Thermodynamically."

Arguing that traditional computing is approaching hard limits for many familiar reasons, Todd Hylton (UCSD), Thomas Conte (Georgia Tech), and Mark Hill (Microsoft) sketch out this idea that it may be possible to harness thermodynamic computing to solve many currently difficult problem sets and to do so with lower power and better performance.

"Animals, plants, bacteria, and proteins solve problems by spontaneously finding energy-efficient configurations that enable them to thrive in complex, resource-constrained environments. For example, proteins fold naturally into a low-energy state in response to their environment," write the researchers in their paper. "In fact, all matter evolves toward low-energy configurations in accord with the Laws of Thermodynamics. For near-equilibrium systems these ideas are well known and have been used extensively in the analysis of computational efficiency and in machine learning techniques."

There's a nice summary description of the TC notion on a Computing Community Consortium (CCC) blog this week:

What if we designed computing systems to solve problems through a similar process? The writers envision a thermodynamic computing system (TCS) as a combination of a conventional computing system and novel TC hardware. The conventional computer is a host through which users can access the TC and define a problem for the TC to solve. The TC, on the other hand, is an open thermodynamic system directly connected to real-world input potentials (for example, voltages), which drive the adaptation of its internal organization via the transport of charge through it to relieve those potentials.

In the ACM Viewpoint, the researchers say, "[W]e advocate a new, physically grounded, computational paradigm centered on thermodynamics and an emerging understanding of using thermodynamics to solve problems that we call Thermodynamic Computing or TC. Like quantum computers, TCs are distinguished by their ability to employ the underlying physics of the computing substrate to accomplish a task."

The recent Viewpoint is actually the fruit of a 2019 thermodynamic computing workshop sponsored by CCC and organized by the ACM Viewpoint authors. In many ways, their idea sounds somewhat similar to adiabatic quantum computing (e.g. D-Wave Systems) but without the need to maintain quantum state coherence during computation.

"Among existing computing systems, TC is perhaps most similar to neuromorphic computing, except that it replaces rule-driven adaptation and neuro-biological emulation with thermo-physical evolution," is how the researchers describe TC.

The broad idea of letting a system seek thermodynamic equilibrium to compute isn't new, and it has been advancing steadily, as they note in their paper:

"The idea of using the physics of self-organizing electronic or ionic devices to solve computational problems has shown dramatic progress in recent years. For example, networks of oscillators built from devices exhibiting metal-insulator transitions have been shown to solve computational problems in the NP-hard class. Memristive devices have internal state dynamics driven by complex electronic, ionic, and thermodynamic considerations, which, when integrated into networks, result in large-scale complex dynamics that can be employed in applications such as reservoir computing. Other systems of memristive devices have been shown to implement computational models such as Hopfield networks and to build neural networks capable of unsupervised learning.

"Today we see opportunity to couple these recent experimental results with the new theories of non-equilibrium systems through both existing (for example, Boltzmann Machines) and newer (for example, Thermodynamic Neural Network) model systems."

The researchers say thermodynamic computing approaches are particularly well-suited to searching complex energy landscapes, leveraging both rapid device fluctuations and the ability to search a large space in parallel, and to addressing NP-complete combinatorial optimization problems or sampling many-variable probability distributions.
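
As a loose software caricature of that energy-landscape search (the researchers' proposal is physical hardware that relaxes natively, not a simulation), simulated annealing on a small random Ising system shows the flavour: the state drifts downhill in energy while thermal fluctuations let it escape poor local minima.

```python
import math, random

random.seed(0)
N = 16
# Random symmetric couplings between every pair of spins: a small Ising
# spin glass standing in for a "complex energy landscape".
J = {(i, j): random.uniform(-1, 1) for i in range(N) for j in range(i + 1, N)}
spins = [random.choice((-1, 1)) for _ in range(N)]

def energy(s):
    return -sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

def local_field(s, k):
    """Net coupling acting on spin k from all other spins."""
    return sum(Jij * s[j if i == k else i] for (i, j), Jij in J.items() if k in (i, j))

print(f"initial energy: {energy(spins):7.2f}")
temp = 2.0
for _ in range(20_000):
    k = random.randrange(N)
    delta_e = 2 * spins[k] * local_field(spins, k)  # energy change if spin k flips
    if delta_e < 0 or random.random() < math.exp(-delta_e / temp):
        spins[k] = -spins[k]                        # Metropolis acceptance rule
    temp = max(0.01, temp * 0.9995)                 # slowly cool the system
print(f"final energy:   {energy(spins):7.2f}")      # settles near a low-energy state
```

A physical TC would perform this relaxation natively and in parallel; the simulation only illustrates the energy-minimisation framing.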

They suggest a three-prong TC development roadmap.

"At least initially, we expect that TC will enable new computing opportunities rather than replace Classical Computing at what Classical Computing does well (enough), following the disruption path articulated by Christensen. These new opportunities will likely enable orders of magnitude more energy efficiency and the ability to self-organize across scales as an intrinsic part of their operation. These may include self-organizing neuromorphic systems and the simulation of complex physical or biological domains, but the history of technology shows that compelling new applications often emerge after the technology is available."

The viewpoint is fascinating and best read directly.

Link to ACM Thermodynamic Computing Viewpoint: https://cacm.acm.org/magazines/2021/6/252841-a-vision-to-compute-like-nature/fulltext

Link to CCC blog: https://us5.campaign-archive.com/?e=afe05237d1&u=3403318289e02657adfc0822d&id=7b8ae80cfa

Read more here:

What is Thermodynamic Computing and Could It Become Important? - HPCwire

Posted in Quantum Computing | Comments Off on What is Thermodynamic Computing and Could It Become Important? – HPCwire

IBM has partnered with IITs, others to advance training, research in quantum computing – Elets

Posted: at 3:51 pm


At the selected institutions, faculty and students will be able to access IBM quantum systems, quantum learning resources and quantum tools over the IBM Cloud for education and research purposes. This will allow these institutions to work on actual quantum computers and program them using the Qiskit open-source framework.
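
As a sketch of the kind of exercise this access enables (a generic example, not material from the IBM program), here is a minimal Bell-state circuit in Qiskit, using the API as it stood in 2021; newer Qiskit versions have since moved the simulator to the qiskit_aer package and backend.run():

```python
from qiskit import QuantumCircuit, Aer, execute

qc = QuantumCircuit(2, 2)
qc.h(0)                     # put qubit 0 into an equal superposition
qc.cx(0, 1)                 # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])  # read both qubits out to classical bits

backend = Aer.get_backend("qasm_simulator")
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)  # expect roughly equal counts of '00' and '11'
```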

The selected institutions are Indian Institute of Science Education & Research (IISER) Pune, IISER Thiruvananthapuram, Indian Institute of Science (IISc) Bangalore, Indian Institute of Technology (IIT) Jodhpur, IIT Kanpur, IIT Kharagpur, IIT Madras, Indian Statistical Institute (ISI) Kolkata, Indraprastha Institute of Information Technology (IIIT) Delhi, Tata Institute of Fundamental Research (TIFR) Mumbai and the University of Calcutta.

The collaboration with India's top institutions is part of the IBM Quantum Educators program, which helps faculty in the quantum field connect with others. "The program offers multiple benefits like additional access to systems beyond IBM's open systems, pulse access on the additional systems, priority considerations when in queue and private collaboration channels with other educators in the program," read an IBM notice.


See the article here:

IBM has partnered with IITs, others to advance training, research in quantum computing - Elets

Posted in Quantum Computing | Comments Off on IBM has partnered with IITs, others to advance training, research in quantum computing – Elets

Malta Becomes Newest Participant in the EuroHPC Joint Undertaking – HPCwire

Posted: at 3:51 pm

June 4, 2021 Malta joins the European High Performance Computing Joint Undertaking (EuroHPC JU), a joint initiative between the EU, European countries, and private partners that pools resources to develop a world-class supercomputing ecosystem in Europe.

Malta, formerly an Observer on the EuroHPC JU Governing Board, will now be a full Member, alongside the other 32 Participating States.

Anders Dam Jensen, Executive Director of the EuroHPC JU, said:

"We are delighted to welcome Malta to the EuroHPC Joint Undertaking family. Malta is joining the JU at an exciting moment for European digital autonomy, with the recent inauguration of the Vega supercomputer in Slovenia, and two more supercomputers will shortly reinforce Europe's supercomputer ecosystem: MeluXina in Luxembourg and Karolina in the Czech Republic. The coming years will see further acceleration and development of the EuroHPC JU project, as we strive towards Europe's ambition to become a world leader in high-performance computing, and we are thrilled that Malta is joining us on this journey."

Background information

The EuroHPC Joint Undertaking was established in 2018 and has been autonomous since September 2020.

The EuroHPC JU is currently equipping the EU with an infrastructure of petascale and precursor-to-exascale supercomputers, and developing the necessary technologies, applications and skills for reaching full exascale capabilities by 2023. One supercomputer is currently operational in Slovenia (Vega); another one (MeluXina) will be officially inaugurated in Luxembourg on 7 June 2021. Five more EuroHPC supercomputers have been procured and will be operational in 2021: Discoverer (Bulgaria), Karolina (Czech Republic), Deucalion (Portugal), Leonardo (Italy), and Lumi (Finland).

In addition, through its research and innovation agenda, the EuroHPC JU is also strengthening the European knowledge base in HPC technologies and bridging the digital skills gap, notably through the creation of a network of national HPC Competence Centres and other pan-European education initiatives.

The EuroHPC machines will be available to European researchers, industry, public administrations and SMEs. They will be a strategic resource for Europe, underpinning advances in sectors such as bio-engineering, weather forecasting, the fight against climate change, personalized medicine, as well as in the discovery of new materials and drugs that will benefit EU citizens.

A new regulation is currently being discussed at EU level and is expected to enter into force in the coming months, aiming to enable a further investment of EUR 7 billion in the next generation of supercomputers, such as exascale, post-exascale and quantum computers, and an ambitious R&I programme.

Source: EuroHPC JU

See the original post:

Malta Becomes Newest Participant in the EuroHPC Joint Undertaking - HPCwire

Posted in Quantum Computing | Comments Off on Malta Becomes Newest Participant in the EuroHPC Joint Undertaking – HPCwire