Australia urged to invest in quantum computing ahead of future pandemics – Sydney Morning Herald

Quantum approaches could play a significant role in developing treatments for future pandemics more efficiently, improving the calculations that have to be made in chemical research and helping to bring new treatments to market.

"Advances in computing tech have given enormous practical advances in medicine and we think quantum has a role in drug discoveries," Professor Biercuk said.

These types of technologies are incredibly fragile, however, and Q-CTRL is in the business of offering technology solutions to industry that help stabilise quantum computing processes.

The company, which was spun out of the University of Sydney, has been backed by leading local venture capital firms including Square Peg, which led a $22 million funding round in the business last year.

The CSIRO released a road map for quantum computing investment this year, though much of the research and development work being done in the sector is occurring offshore.

"There is not very much in the Australian market at this time... we are almost 100 per cent export-focused in our company," Professor Biercuk said.


Over recent months, industry experts right across the technology sector have been championing the importance of targeting investment in research and development projects that could help Australia better face future pandemics and export skills in a post-COVID economy. This includes expanding vaccine manufacturing facilities and ensuring research and development incentives support the biotech industry.

A report on COVID-19 from the Medical Technology Association of Australia said industry should be brought to the table to work with governments on any future pandemic planning.

"Any new or updated national pandemic preparedness plan must recognise the critical role of the medtech industry," the report said.

Federal and state governments have contributed to a range of quantum computing projects over the past five years as researchers race to make quantum computers a reality.

The path could take hundreds of millions of dollars and many years, Professor Biercuk said, much like the pharmaceutical sector needs long-term investors to survive.

"The investment needed is actually similar to that for drug development. It means the investors we have attracted have a very long-term approach to this."

Emma is the small business reporter for The Age and Sydney Morning Herald based in Melbourne.

Go here to see the original:
Australia urged to invest in quantum computing ahead of future pandemics - Sydney Morning Herald

New UC-led institute awarded $25M to explore potential of quantum computing and train a future workforce – University of California

In the curious world of quantum mechanics, a single atom or subatomic particle can exist simultaneously in multiple conditions. A new UC-led, multiuniversity institute will explore the realities of this emerging field as it focuses on advancing quantum science and engineering, with an additional goal of training a future workforce to build and use quantum computers.

The National Science Foundation (NSF) has awarded $25 million over five years to establish the NSF Quantum Leap Challenge Institute (QLCI) for Present and Future Quantum Computation as part of the federal government's effort to speed the development of quantum computers. The institute will work to overcome scientific challenges to achieving quantum computing and will design advanced, large-scale quantum computers that employ state-of-the-art scientific algorithms developed by the researchers.

"There is a sense that we are on the precipice of a really big move toward quantum computing," said Dan Stamper-Kurn, UC Berkeley professor of physics and director of the institute. "We think that the development of the quantum computer will be a real scientific revolution, the defining scientific challenge of the moment, especially if you think about the fact that the computer plays a central role in just about everything society does. If you have a chance to revolutionize what a computer is, then you revolutionize just about everything else."

Unlike conventional computers, quantum computers seek to harness the mysterious behavior of particles at the subatomic level to boost computing power. Once fully developed, they could be capable of solving large, extremely complex problems far beyond the capacity of today's most powerful supercomputers. Quantum systems are expected to have a wide variety of applications in many fields, including medicine, national security and science.

Theoretical work has shown that quantum computers are the best way to do some important tasks: factoring large numbers, encrypting or decrypting data, searching databases or finding optimal solutions for problems. Using quantum mechanical principles to process information offers an enormous speedup over the time it takes to solve many computational problems on current digital computers.
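To make the scaling gap concrete for one of those tasks, unstructured database search: a classical scan needs on the order of N lookups, while Grover's quantum algorithm needs roughly the square root of that many queries. A minimal Python sketch of the query counts (illustrative arithmetic only, not a quantum simulation):

```python
import math

def classical_vs_grover(n_items: int) -> tuple[int, int]:
    """Worst-case classical lookups vs. Grover's ~(pi/4)*sqrt(N) quantum queries."""
    classical = n_items                                   # linear scan checks every item
    grover = math.ceil(math.pi / 4 * math.sqrt(n_items))  # optimal Grover iteration count
    return classical, grover

for n in (10**6, 10**12):
    c, g = classical_vs_grover(n)
    print(f"N = {n:>15,}: classical ~{c:,} queries, Grover ~{g:,}")
```

At a trillion items a classical scan needs a trillion lookups while Grover's approach needs under a million queries, which is the kind of advantage the paragraph above alludes to (factoring via Shor's algorithm enjoys an even larger, superpolynomial speedup).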

"Scientific problems that would take the age of the universe to solve on a standard computer potentially could take only a few minutes on a quantum computer," said Eric Hudson, a UCLA professor of physics and co-director of the new institute. "We may get the ability to design new pharmaceuticals to fight diseases on a quantum computer, instead of in a laboratory. Learning the structure of molecules and designing effective drugs, each of which has thousands of atoms, are inherently quantum challenges. A quantum computer potentially could calculate the structure of molecules and how molecules react and behave."

The project came to fruition, in part, thanks to a UC-wide consortium, the California Institute for Quantum Entanglement, funded by UC's Multicampus Research Programs and Initiatives (MRPI). The MRPI funding opportunity incentivizes just this kind of multicampus collaboration in emerging fields that can position UC as a national leader.

"This new NSF institute is founded on the outstanding research contributions in theoretical and experimental quantum information science achieved by investigators from across the UC system through our initiative to foster multicampus collaborations," said Theresa Maldonado, Ph.D., vice president for Research and Innovation of the University of California. "The award recognizes the team's vision of how advances in computational quantum science can reveal new fundamental understanding of phenomena at the tiniest length-scale that can benefit innovations in artificial intelligence, medicine, engineering, and more. We are proud to lead the nation in engaging excellent students from diverse backgrounds into this field of study."

The QLCI for Present and Future Quantum Computation connects UC Berkeley, UCLA and UC Santa Barbara with five other universities around the nation and in California. The institute will draw on a wealth of knowledge from experimental and theoretical quantum scientists to improve and determine how best to use today's rudimentary quantum computers, most of them built by private industry or government labs. The goal, ultimately, is to make quantum computers as common as mobile phones, which are, after all, pocket-sized digital computers.

The institute will be multidisciplinary, spanning physics, chemistry, mathematics, computer science, and optical and electrical engineering, among other fields, and will include scientists and engineers with expertise in quantum algorithms, mechanics and chemistry. They will partner with outside institutions, including in the emerging quantum industry, and will host symposia, workshops and other programs. Research challenges will be addressed jointly through a process that incorporates both theory and experiment.

Situated near the heart of today's computer industry, Silicon Valley and Silicon Beach, and at major California universities and national labs, the institute will train a future workforce akin to the way computer science training at universities fueled Silicon Valley's rise to become a tech giant. UCLA will pilot a master's degree program in quantum science and technology to train a quantum-smart workforce, while massive online courses, or MOOCs, will help spread knowledge and understanding of quantum computers even to high school students.

"This center establishes California as a leader nationally and globally in quantum computing," Stamper-Kurn said.

The institute's initial members are all senior faculty from UC Berkeley, UCLA, UC Santa Barbara, the California Institute of Technology, the Massachusetts Institute of Technology, the University of Southern California, the University of Washington and the University of Texas at Austin.

"We still do not know fully what quantum computers do well," Stamper-Kurn said, "and we face deep challenges that arise in scaling up quantum devices. The mission of this institute is to address fundamental challenges in the development of the quantum computer."

More information on NSF-supported research on quantum information science and engineering is available at nsf.gov/quantum.

The rest is here:
New UC-led institute awarded $25M to explore potential of quantum computing and train a future workforce - University of California

Giant atoms enable quantum processing and communication in one – MIT News

MIT researchers have introduced a quantum computing architecture that can perform low-error quantum computations while also rapidly sharing quantum information between processors. The work represents a key advance toward a complete quantum computing platform.

Prior to this work, small-scale quantum processors had successfully performed tasks at a rate exponentially faster than that of classical computers. However, it has been difficult to controllably communicate quantum information between distant parts of a processor. In classical computers, wired interconnects are used to route information back and forth throughout a processor during the course of a computation. In a quantum computer, however, the information itself is quantum mechanical and fragile, requiring fundamentally new strategies to simultaneously process and communicate quantum information on a chip.

"One of the main challenges in scaling quantum computers is to enable quantum bits to interact with each other when they are not co-located," says William Oliver, an associate professor of electrical engineering and computer science, MIT Lincoln Laboratory fellow, and associate director of the Research Laboratory of Electronics. "For example, nearest-neighbor qubits can easily interact, but how do I make quantum interconnects that connect qubits at distant locations?"

The answer lies in going beyond conventional light-matter interactions.

Natural atoms are small and point-like relative to the wavelength of the light they interact with. In a paper published today in the journal Nature, the researchers show that this need not be the case for superconducting artificial atoms. Instead, they have constructed giant atoms from superconducting quantum bits, or qubits, connected in a tunable configuration to a microwave transmission line, or waveguide.

This allows the researchers to adjust the strength of the qubit-waveguide interactions so the fragile qubits can be protected from decoherence, or a kind of natural decay that would otherwise be hastened by the waveguide, while they perform high-fidelity operations. Once those computations are carried out, the strength of the qubit-waveguide couplings is readjusted, and the qubits are able to release quantum data into the waveguide in the form of photons, or light particles.

"Coupling a qubit to a waveguide is usually quite bad for qubit operations, since doing so can significantly reduce the lifetime of the qubit," says Bharath Kannan, MIT graduate fellow and first author of the paper. "However, the waveguide is necessary in order to release and route quantum information throughout the processor. Here, we've shown that it's possible to preserve the coherence of the qubit even though it's strongly coupled to a waveguide. We then have the ability to determine when we want to release the information stored in the qubit. We have shown how giant atoms can be used to turn the interaction with the waveguide on and off."

The system realized by the researchers represents a new regime of light-matter interactions, the researchers say. Unlike models that treat atoms as point-like objects smaller than the wavelength of the light they interact with, the superconducting qubits, or artificial atoms, are essentially large electrical circuits. When coupled with the waveguide, they create a structure as large as the wavelength of the microwave light with which they interact.

The giant atom emits its information as microwave photons at multiple locations along the waveguide, such that the photons interfere with each other. This process can be tuned to complete destructive interference, meaning the information in the qubit is protected. Furthermore, even when no photons are actually released from the giant atom, multiple qubits along the waveguide are still able to interact with each other to perform operations. Throughout, the qubits remain strongly coupled to the waveguide, but because of this type of quantum interference, they can remain unaffected by it and be protected from decoherence, while single- and two-qubit operations are performed with high fidelity.
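The on/off switching described above comes down to wave interference between emission paths from the giant atom's coupling points. A toy numpy sketch (an illustrative two-point model, not the paper's actual circuit analysis) of how the spacing between coupling points turns emission on or off:

```python
import numpy as np

def emission_amplitude(k: float, d: float) -> complex:
    """Net amplitude emitted into the waveguide from two coupling points at x=0 and x=d."""
    return np.exp(1j * k * 0.0) + np.exp(1j * k * d)

k = np.pi  # photon wavenumber in arbitrary units (wavelength = 2)

# Half-wavelength spacing (k*d = pi): the two emission paths cancel completely,
# so the qubit keeps its information -- the "protected" configuration.
protected = abs(emission_amplitude(k, d=1.0))

# Full-wavelength spacing (k*d = 2*pi): the paths add constructively and the
# qubit can release its photon into the waveguide.
emitting = abs(emission_amplitude(k, d=2.0))

print(f"destructive: |A| = {protected:.3f}, constructive: |A| = {emitting:.3f}")
```

In the actual device the effective coupling is tuned electronically rather than by moving the coupling points, per the article's description of adjustable qubit-waveguide coupling, but the interference logic is the same.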

"We use the quantum interference effects enabled by the giant atoms to prevent the qubits from emitting their quantum information to the waveguide until we need it," says Oliver.

"This allows us to experimentally probe a novel regime of physics that is difficult to access with natural atoms," says Kannan. "The effects of the giant atom are extremely clean and easy to observe and understand."

The work appears to have much potential for further research, Kannan adds.

"I think one of the surprises is actually the relative ease by which superconducting qubits are able to enter this giant atom regime," he says. "The tricks we employed are relatively simple and, as such, one can imagine using this for further applications without a great deal of additional overhead."

Andreas Wallraff, professor of solid-state physics at ETH Zurich, says the research "investigates a piece of quantum physics that is hard or even impossible to fathom for microscopic objects such as electrons or atoms, but that can be studied with macroscopic engineered superconducting quantum circuits. With these circuits, using a clever trick, they are able both to protect their giant atom from decay and simultaneously to allow for coupling two of them coherently. This is very nice work exploring waveguide quantum electrodynamics."

The coherence time of the qubits incorporated into the giant atoms, meaning the time they remained in a quantum state, was approximately 30 microseconds, comparable to that of qubits not coupled to a waveguide, which typically ranges from 10 to 100 microseconds, according to the researchers.

Additionally, the research demonstrates two-qubit entangling operations with 94 percent fidelity. This represents the first time researchers have quoted a two-qubit fidelity for qubits that were strongly coupled to a waveguide, because the fidelity of such operations using conventional small atoms is often low in such an architecture. With more calibration, operation tune-up procedures and optimized hardware design, Kannan says, the fidelity can be further improved.

See original here:
Giant atoms enable quantum processing and communication in one - MIT News

How Quantum Computers Work – ThoughtCo

A quantum computer is a computer design which uses the principles of quantum physics to increase the computational power beyond what is attainable by a traditional computer. Quantum computers have been built on a small scale and work continues to upgrade them to more practical models.

Computers function by storing data in a binary number format, which results in a series of 1s and 0s retained in electronic components such as transistors. Each component of computer memory is called a bit and can be manipulated through the steps of Boolean logic so that the bits change, based upon the algorithms applied by the computer program, between the 1 and 0 modes (sometimes referred to as "on" and "off").
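A minimal Python sketch of those Boolean bit manipulations (illustrative, using Python's built-in bitwise operators on an eight-bit value):

```python
# A byte: eight bits, each "on" (1) or "off" (0).
register = 0b00000000

register |= 0b00000100   # OR turns bit 2 "on"            -> 00000100
register ^= 0b00000001   # XOR toggles bit 0 "on"         -> 00000101
register &= 0b11111110   # AND with a mask clears bit 0   -> 00000100

print(format(register, "08b"))
```

Every classical program, however complex, ultimately reduces to sequences of such operations applied to bits.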

A quantum computer, on the other hand, would store information as either a 1, 0, or a quantum superposition of the two states. Such a "quantum bit" allows for far greater flexibility than the binary system.

Specifically, a quantum computer would be able to perform certain calculations orders of magnitude faster than traditional computers, a capability with serious implications in the realm of cryptography and encryption. Some fear that a successful and practical quantum computer would devastate the world's financial system by ripping through its computer security encryption, which is based on factoring large numbers that literally cannot be cracked by traditional computers within the lifespan of the universe. A quantum computer, on the other hand, could factor the numbers in a reasonable period of time.

To understand how this speeds things up, consider this example. If the qubit is in a superposition of the 1 state and the 0 state, and it performed a calculation with another qubit in the same superposition, then one calculation actually obtains 4 results: a 1/1 result, a 1/0 result, a 0/1 result, and a 0/0 result. This follows from the mathematics of a quantum system while it remains coherent, in a superposition of states, until it collapses down into one state. The ability of a quantum computer to perform multiple computations simultaneously (or in parallel, in computer terms) is called quantum parallelism.
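That four-outcomes-at-once picture can be written down directly as a state vector. A small numpy sketch (illustrative, not tied to any particular quantum hardware) that builds the equal superposition of two qubits and reads off the four amplitudes:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                           # the |0> basis state of one qubit
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # Hadamard gate

one_qubit = H @ ket0                        # (|0> + |1>) / sqrt(2)
two_qubits = np.kron(one_qubit, one_qubit)  # tensor product of two superposed qubits

# Four amplitudes, one per joint outcome, each occurring with probability 1/4.
for label, amp in zip(["0/0", "0/1", "1/0", "1/1"], two_qubits):
    print(f"{label}: amplitude {amp:.3f}, probability {abs(amp)**2:.2f}")
```

Each amplitude comes out to 0.5, so all four results (0/0, 0/1, 1/0, 1/1) are present in the single joint state, which is exactly the quantum parallelism described above.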

The exact physical mechanism at work within the quantum computer is somewhat theoretically complex and intuitively disturbing. Generally, it is explained in terms of the many-worlds interpretation of quantum physics, wherein the computer performs calculations not only in our universe but also in other universes simultaneously, while the various qubits remain in a coherent superposition. While this sounds far-fetched, the many-worlds interpretation has been shown to make predictions which match experimental results.

Quantum computing tends to trace its roots back to a 1959 speech by Richard P. Feynman in which he spoke about the effects of miniaturization, including the idea of exploiting quantum effects to create more powerful computers. This speech is also generally considered the starting point of nanotechnology.

Of course, before the quantum effects of computing could be realized, scientists and engineers had to more fully develop the technology of traditional computers. This is why, for many years, there was little direct progress, or even interest, in the idea of making Feynman's suggestions into reality.

In 1985, the idea of "quantum logic gates" was put forth by the University of Oxford's David Deutsch, as a means of harnessing the quantum realm inside a computer. In fact, Deutsch's paper on the subject showed that any physical process could be modeled by a quantum computer.

Nearly a decade later, in 1994, AT&T's Peter Shor devised an algorithm that could use only 6 qubits to perform some basic factorizations, with more qubits required as the numbers to be factored grew more complex.

A handful of quantum computers have been built. The first, a 2-qubit quantum computer in 1998, could perform trivial calculations before losing coherence after a few nanoseconds. In 2000, teams successfully built both a 4-qubit and a 7-qubit quantum computer. Research on the subject is still very active, although some physicists and engineers express concerns over the difficulties involved in upscaling these experiments to full-scale computing systems. Still, the success of these initial steps does show that the fundamental theory is sound.

The quantum computer's main drawback arises from the same physics as its strength: the fragile coherence of quantum states. The qubit calculations are performed while the quantum wave function is in a superposition of states, which is what allows it to perform calculations using both the 1 and 0 states simultaneously.

However, when a measurement of any type is made to a quantum system, coherence breaks down and the wave function collapses into a single state. Therefore, the computer has to somehow continue making these calculations without any measurements being made until the proper time, when it can then drop out of the quantum state and have a measurement taken to read its result, which then gets passed on to the rest of the system.
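The collapse bookkeeping can be mimicked classically with the Born rule: square each amplitude to get an outcome probability, sample once, and replace the state with the sampled basis state. A small numpy sketch (a simulation of the measurement step, not of real hardware):

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed so the sketch is reproducible

# An equal superposition over the four two-qubit outcomes (amplitudes 0.5 each).
amplitudes = np.full(4, 0.5)
probabilities = np.abs(amplitudes) ** 2   # Born rule: P(outcome) = |amplitude|^2

# "Measurement": sample one outcome, then collapse the state onto it.
outcome = rng.choice(4, p=probabilities)
collapsed = np.zeros(4)
collapsed[outcome] = 1.0

print(f"measured outcome {outcome:02b}, post-measurement state: {collapsed}")
```

After the sampled collapse, every amplitude except the measured outcome's is gone, which is why the computation must finish before any measurement is taken.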

The physical requirements of manipulating a system on this scale are considerable, touching on the realms of superconductors, nanotechnology, and quantum electronics, as well as others. Each of these is itself a sophisticated field which is still being fully developed, so trying to merge them all together into a functional quantum computer is a task which I don't particularly envy anyone ... except for the person who finally succeeds.

Excerpt from:
How Quantum Computers Work - ThoughtCo

What’s Needed to Deliver the Nationwide Quantum Internet Blueprint – HPCwire

While few details accompanied last week's official announcement of U.S. plans for a nationwide quantum internet, many of the priorities and milestones had been worked out during a February workshop and are now available in subsequent reports. The Department of Energy is leading the effort, which is part of the U.S. National Quantum Initiative enacted in 2018.

The race to harness quantum information science, whether through computing, communications, or sensing, has become a global competition. In many ways quantum communications is the furthest along in development, and its promise of near-absolute security is extremely alluring. DOE's 17 National Laboratories are intended to serve as the backbone of the U.S. quantum internet effort.

As noted in the official announcement, "Crucial steps toward building such an internet are already underway in the Chicago region, which has become one of the leading global hubs for quantum research. In February of this year, scientists from DOE's Argonne National Laboratory in Lemont, Illinois, and the University of Chicago entangled photons across a 52-mile quantum loop in the Chicago suburbs, successfully establishing one of the longest land-based quantum networks in the nation. That network will soon be connected to DOE's Fermilab in Batavia, Illinois, establishing a three-node, 80-mile testbed."

Turning early prototypes into a scaled-up nationwide effort involves tackling many technical challenges. One thorny problem, for example, is development of robust repeater technology, which among other things requires reliable quantum memory technology and prevention of signal loss. Interestingly, satellites may play a role as a bridge according to the report:

A quantum Internet will not exist in isolation apart from the current classical digital networks. Quantum information largely is encoded in photons and transmitted over optical fiber infrastructure that is used widely by today's classical networks. Thus, at a fundamental level, both are supported by optical fiber that implements lightwave channels. Unlike digital information encoded and transmitted over current fiber networks, quantum information cannot be amplified with traditional mechanisms as the states will be modified if measured.

While quantum networks are expected to use the optical fiber infrastructure, it could be that special fibers may enable broader deployment of this technology. At least in the near term, satellite-based entanglement bridges could be used to directly connect transcontinental and transatlantic Q-LANs. Preliminary estimates indicate that entangled pairs could be shared at rates exceeding 10^6 in a single pass of a Medium Earth Orbit (MEO) satellite. Such a capability may be a crucial intermediate step while efficient, robust repeaters are developed (as some estimates predict more than 100 repeaters would be needed to establish a transatlantic link).

The report from the workshop spells out four priorities along with five milestones. (The event was chaired by Kerstin Kleese van Dam, Brookhaven National Laboratory; Inder Monga, Energy Sciences Network; Nicholas Peters, Oak Ridge National Laboratory; and Thomas Schenkel, Lawrence Berkeley National Laboratory).

Here are the four priorities identified in the report:

Some of the test cases being discussed are fascinating, such as one across Long Island, N.Y.:

For example, there would be considerable value in expanding on the current results gleaned from the Brookhaven Lab-SBU-ESnet collaboration, which in April 2019 achieved the longest distance entanglement distribution experiment in the United States by covering approximately 20 km. Integral to the testbed are room-temperature quantum network prototypes, developed by SBU's Quantum Information Technology (QIT) laboratory, that connect several quantum memories and qubit sources. The combination of these important results allowed the Brookhaven-SBU-ESnet team to design and implement a quantum network prototype that connects several locations at Brookhaven Lab and SBU.

By using quantum memories to enhance the swapping of the polarization entanglement of flying photon pairs, the implementation aims to distribute entanglement over long distances without detrimental losses. The team has established a quantum network on Long Island, N.Y., using ESnet's and Crown Castle's fiber infrastructure, which encompasses approximately 120 km of fiber connecting Brookhaven Lab, SBU, and the Center of Excellence in Wireless and Information Technology (CEWIT) at SBU campus locations.

As a next step, the team plans to connect this existing quantum network with the Manhattan Landing (MAN-LAN) in New York City, a high-performance exchange point where several major networks converge. This work would set the stage for a nationwide quantum-protected information exchange network. Figure 3.3 depicts the planned network configuration.

Here are milestones called out in the report:

A fifth, broad milestone, the cross-cutting "Build a Multi-institutional Ecosystem," emphasizes the importance of federal agency cooperation and coordination and names DOE, NSF, NIST, DoD, NSA, and NASA as key players. While pursuing these alliances, the report says, critical opportunities for new directions and spin-off applications should be encouraged by robust cooperation with quantum communication startups and large optical communications companies. Early adopters can deliver valuable design metrics.

It's a clearly ambitious agenda. Stay tuned.

Link to announcement, https://www.hpcwire.com/off-the-wire/doe-unveils-blueprint-for-the-quantum-internet-in-event-at-university-of-chicago/

Link to slide deck, https://science.osti.gov/-/media/ascr/ascac/pdf/meetings/202004/Quantum_Internet_Blueprint_Update.pdf?la=en&hash=8C076C1BEB7CA49A3920B1A3C15AA531B48BDD72

Link to full report, https://www.energy.gov/sites/prod/files/2020/07/f76/QuantumWkshpRpt20FINAL_Nav_0.pdf

Read more from the original source:
What's Needed to Deliver the Nationwide Quantum Internet Blueprint - HPCwire

Asia Pacific Deep Learning Chip Market: Rising Significance of Quantum Computing is Propelling the Growth of the Market Science Market Reports -…

The deep learning chip market in Asia Pacific is expected to grow from US$ 372.0 Mn in 2018 to US$ 5,702.2 Mn by 2027, at a CAGR of 35.7% from 2019 to 2027.

Driving factors such as the rising significance of quantum computing are propelling the growth of the deep learning chip market. Further, the growing adoption of deep learning chips, mainly for edge devices, is anticipated to propel market growth in the near future. Quantum computing takes seconds to finish a calculation that would otherwise take far longer. Quantum computers represent an innovative transformation of artificial intelligence, machine learning, and big data. Therefore, the prominence of quantum computing is expected to drive the growth of the deep learning chip market.

The Asia Pacific Deep Learning Chip Market is growing along with the Technology, Media and Telecommunications industry, but the market is likely to slow down its growth due to the shortage of skilled professionals, suggests the Business Market Insights report.

The Business Market Insights subscription helps clients understand the ongoing market trends, identify opportunities, and make informed decisions through the reports in the Subscription Platform. The Industry reports available in the subscription provide an in-depth analysis on various market topics and enable clients to line up remunerative opportunities. The reports provide the market size & forecast, drivers, challenges, trends, and more.

Register for a free trial today and gain instant access to our market research reports @

https://www.businessmarketinsights.com/TIPRE00008602/request-trial

China dominated the deep learning chip market in 2018 and is expected to hold the highest share in the Asia Pacific region through the forecast period. The creation of China's first national laboratory for deep learning was initiated in Beijing, a move that could help the country surpass the US in developing AI. In 2017, the National Development and Reform Commission (NDRC) approved the plan to open a national engineering lab for researching and implementing deep learning technologies. China is at the forefront of new and emerging technologies such as AI, and the adoption and implementation rate of AI is high across all major industry verticals. The government is keen to maintain China's stronghold and competitiveness, especially in the adoption of advanced technologies. These factors are contributing to the growth of the deep learning chip market in the country.

These factors are expected to offer broad growth opportunities in the Technology, Media and Telecommunications industry and to drive demand in the Asia Pacific deep learning chip market.

Business Market Insights reports focus upon client objectives, use standard research methodologies and exclusive analytical models, combined with robust business acumen, which provides precise and insightful results.

Business Market Insights reports are useful not only for corporate and academic professionals but also for consulting, research firms, PEVC firms, and professional services firms.

ASIA PACIFIC DEEP LEARNING CHIP MARKET SEGMENTATION

ASIA PACIFIC DEEP LEARNING CHIP By Chip Type

ASIA PACIFIC DEEP LEARNING CHIP By Technology

ASIA PACIFIC DEEP LEARNING CHIP By Industry Vertical

ASIA PACIFIC DEEP LEARNING CHIP By Country

Deep Learning Chip Market Companies Mentioned

Business Market Insights provides affordable subscription with pay as per requirement @

https://www.businessmarketinsights.com/TIPRE00008602/checkout/basic/single/monthly

(30-day subscription plans prove to be very cost-effective with no compromise on the quality of reports)

Benefits with Business Market Insights

About Business Market Insights

Based in New York, Business Market Insights is a one-stop destination for in-depth market research reports from various industries including Technology, Media & Telecommunications, Semiconductor & Electronics, Aerospace & Defense, Automotive & Transportation, Biotechnology, Healthcare IT, Manufacturing & Construction, Medical Device, and Chemicals & Materials. The clients include corporate and academic professionals, consulting, research firms, PEVC firms, and professional services firms.

For Subscription contact

Business Market Insights

Phone: +442081254005 E-Mail: [email protected]

See original here:
Asia Pacific Deep Learning Chip Market: Rising Significance of Quantum Computing is Propelling the Growth of the Market Science Market Reports -...

This simple explainer tackles the complexity of quantum computing – Boing Boing

Many videos describing quantum computers try to distill and oversimplify everything. Thoughty2's takes its time and gives more historical and theoretical context than most.

Because it does take a while to get into the subject, here's a shorter explainer by MIT:

Today's computers use bits: a stream of electrical or optical pulses representing 1s or 0s. Everything from your tweets and e-mails to your iTunes songs and YouTube videos is essentially a long string of these binary digits.

Quantum computers, on the other hand, use qubits, which are typically subatomic particles such as electrons or photons. Generating and managing qubits is a scientific and engineering challenge. Some companies, such as IBM, Google, and Rigetti Computing, use superconducting circuits cooled to temperatures colder than deep space. Others, like IonQ, trap individual atoms in electromagnetic fields on a silicon chip in ultra-high-vacuum chambers. In both cases, the goal is to isolate the qubits in a controlled quantum state.
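The bit-versus-qubit contrast above can be made concrete with a tiny simulation. This is an illustrative NumPy sketch, not any vendor's hardware API: a single qubit is stored as two complex amplitudes, and a standard Hadamard gate puts it into superposition.

```python
import numpy as np

bit = 1  # a classical bit is definitely 0 or 1

# A qubit state |psi> = a|0> + b|1> with |a|^2 + |b|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate creates an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measurement probabilities come from the squared amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]: an equal chance of reading 0 or 1
```

A real quantum computer manipulates many such amplitudes at once; simulating n qubits classically requires tracking 2**n amplitudes, which is why isolation and control of physical qubits matter so much.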

The processing power possible through these controlled qubits will make today's fastest computers look positively archaic.

Image: YouTube / Thoughty2

Intelligence is a surprisingly difficult thing to define. Kurzgesagt jumps into the debate with an interesting overview of where intelligence begins. Is a slime mold intelligent? Are plants intelligent?

Wildfires are a natural part of many ecosystems, though more and more are human-caused. Wendover Productions takes a look at how firefighters work to minimize the spread of wildfires in grueling and dangerous conditions.

Because of its ubiquity, the landscape is littered with proposed etymologies of the term OK. This nice explainer clarifies the murky origins of one of the most widely spoken words in the world.


More here:
This simple explainer tackles the complexity of quantum computing - Boing Boing

6 Laws Of Zero Will Shape Our Future. For The Better Or Worse Is Up To Us. – Forbes

Six key drivers of humanity's progress are headed towards ZERO cost.

"There are decades where nothing happens; and, there are weeks where decades happen," observed Vladimir Lenin. The recent weeks grappling with the Covid-19 pandemic certainly fall into the weeks-where-decades-happen category. What's more, the trillions of dollars being spent on pandemic-fighting strategies might well make or break the decades to come, as I recently wrote.

Take telehealth, the adoption of which has seemingly been on the horizon for decades and suddenly, within a few weeks after Covid-19, achieved near universal embrace. McKinsey estimates that providers are seeing 50 to 175 times more patients via telehealth now than before Covid-19. What's more, 57% of providers view telehealth more favorably than before, and 64% report that they are more comfortable using it. These punctuated changes in perception, preference and practice could vault the telehealth market from $3 billion pre-Covid to $250 billion and, in the process, force the rewiring of the entire healthcare delivery system, according to McKinsey.

The technological drivers enabling telehealth are reshaping every other aspect of the decades to come, too. I previously introduced these drivers as six Laws of Zero that underpin a planning approach that Paul Carroll and I call the Future Perfect. In this article, I lay out the six laws in more detail.

The basic idea is that six key drivers of humanity's progress (computing, communications, information, energy, water and transportation) are headed toward zero cost. That means we can plan on being able to throw as many of these resources as we need at smartly addressing any problem. Success in doing so would bring us closer to the Future Perfect. Alternatively, like gluttons at an all-you-can-eat buffet, we could binge in ways that exacerbate societal issues such as health, equity, civility, privacy and human rights.

Our Future Perfect approach, which builds on an approach developed by Alan Kay for inventing the future, projects the Laws of Zero into the future to imagine how vast resources could address important needs in key pillars of society, such as electricity, food, manufacturing, transportation, shelter, climate, education and healthcare. We've chosen 2050 as a marker because 30 years is far enough in the future that one isn't immediately trapped by incremental thinking. Instead, we can explore how exponential resource improvements might radically alter the range of possible approaches. The question becomes: "Wouldn't it be ridiculous if we didn't have this?"

The 30-year visioning is intended as a mind-stretching exercise, not precise forecasting. This is where creativity and romance live, albeit underpinned by the deep science of the Laws of Zero rather than pure fantasy. We use a technique we call "future histories" to develop powerful narratives of compelling futures. We then pull back to today and chart possible paths for turning the 30-year visions into something more concrete. As Alan Kay says, the best way to predict the future is to invent it.

Now, let's explore the drivers and understand what zero cost really does, to lay the foundation for designing those remarkably different and (hopefully) better futures.

1.Computing.

That computing has followed the Law of Zero comes as no surprise to anyone familiar with Moore's Law, the observation by Intel co-founder Gordon Moore in 1965 that computing was doubling in power and halving in cost every two years. Ray Kurzweil has even observed that Moore's Law describes not just integrated circuits but more than a century of computing, as shown below.

Computing has followed Moore's Law for more than a century. (Log Scale)

One consequence is that the smartphone in your pocket has over 100,000 times more processing power and one million times more memory than the computer that guided Apollo 11 to the moon and back, at a fraction of the cost that effectively rounds to zero. While computing power obviously isn't free, as anyone buying a smartphone knows, it looks almost free from any historical distance.

Now consider computing in 2050. If Moore's Law remains a guide (on the roughly 18-month doubling period often quoted), computing power would double about 20 times in the next 30 years, and cost would be cut in half 20 times. In other words, we can look forward to computing power more than one million times that of today, at a per-unit cost of roughly one-millionth of today's.
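The projection above is simple compounding. A back-of-envelope sketch (our own arithmetic, assuming an 18-month doubling period):

```python
# 30 years at one doubling every 18 months gives 20 doublings.
years = 30
doubling_period = 1.5                 # years per doubling (assumption)
doublings = int(years / doubling_period)   # 20

speedup = 2 ** doublings              # 1,048,576: "more than one million times"
cost_fraction = 1 / speedup           # each halving mirrors each doubling
print(doublings, speedup)
```

With a two-year doubling period the numbers are smaller (15 doublings, a ~33,000-fold gain), which is why the assumed cadence matters so much to 30-year projections.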

Even if Moore's Law slows and engineers' progress in cramming more circuits onto a silicon chip finally diminishes, innovations in computing architecture, algorithms and (perhaps) quantum computing are emerging to pick up the slack and perhaps even quicken the pace. Rodney Brooks, former director of MIT's Computer Science and Artificial Intelligence Lab, argues that the end of Moore's Law, as we traditionally knew it, will unleash a new golden era of computing.

2.Communications

Circa 1850: Samuel Morse (1791-1872), American inventor of the electric telegraph and Morse code. (Photo by Hulton Archive/Getty Images)

The first message that inventor Samuel Morse sent on his experimental telegraph line between Baltimore and Washington, D.C., in 1844 was, "What hath God wrought?" Some form of that astonishment has been expressed time and again as communications have moved from the telegraph to the telephone to the ubiquitous, digital communications of today. Communications is becoming ever richer, too, as having bandwidth to burn means that video (and, in time, augmented and virtual reality) can be part of every connection.

Reach will keep expanding, too. Riding on the computing Law of Zero, communications will expand into every corner of the globe, as tens of billions of devices and trillions of sensors are incorporated into a tapestry of communication. In other words, we arent just talking about humans connecting with each other.

Now, history suggests that expectations about communication should be tempered. The telegraph, for instance, was predicted in 1858 to bring world peace:

It is impossible that old prejudices and hostilities should longer exist, while such an instrument has been created for an exchange of thought between all the nations of the earth.

But, the telegraph turned out to be a critical instrument of war as well. The Union armies relied on some 6.5 million messages during the Civil War, and the telegraph provided the North key tactical, operational and strategic advantages over the Confederacy.

The Internet created similar delusions. We'd all understand each other better and communicate freely, so rainbows and unicorns. Clearly, no one had imagined Twitter in those early days. No one realized how much Facebook, etc. could increase tribalism and exacerbate divides.

Still, whatever winds up flowing through the pipes in our future, whether butterflies or raw sewage, the pipes will be almost infinitely wide, and the cost will be a flat, low rate. Draw the graph of cost vs. performance from today's perspective, and the cost in 2050 will be so low that we might as well call it zero.

3.Information

In the late 1990s, during the Internet's coming-out party, it started to become clear just how much information could be collected on people's online behavior. But Nicholas Negroponte, the founder of the MIT Media Lab, defiantly told us that, while his credit card company or someone else might know that he went to see a certain movie, "They won't know whether I liked it!"

No longer. Just a few buildings away from Negroponte's old office at MIT, researchers can monitor motion, heart rate, breathing and even emotions simply by how people reflect ambient radio waves, such as Wi-Fi. So sensors in a movie theater can already get a good read on how Negroponte and the rest of an audience felt about a film. Mix in a little genomics, biology, financial data, social preference data, census data, etc., and, whoa, we're naked, whether we want to be or not.

Furthermore, if you think social media is revealing now, wait until you see what's coming. Already, a startup called Banjo has become a sort of network of networks and is monitoring massive amounts of private and public social feeds around the world. It divides the globe into 7 trillion sectors and monitors in real time what is happening in each of those small areas.
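For a sense of scale, a rough calculation (our own arithmetic, not a figure published by Banjo) shows how small those 7 trillion sectors are on average:

```python
# Earth's total surface area is roughly 510 million square kilometres.
earth_surface_km2 = 510e6
sectors = 7e12

# Convert to square metres and divide by the number of sectors.
m2_per_sector = earth_surface_km2 * 1e6 / sectors
print(round(m2_per_sector))  # ~73 square metres per sector
```

Seventy-odd square metres is about the footprint of a small apartment, which is why sector-level monitoring can localize events like gunfire so precisely.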

It doesn't take much imagination to conjure up the dystopian possibilities; but the ability to monitor everything, everywhere has plenty of potential benefits, too.

In a simulation described by the Attorney General of Utah, Banjo was able to solve a child abduction case in minutes. The same case had eluded a multi-agency task force that applied hundreds of person-hours using traditional methods. A real-life example: by matching a social media post about the sound of gunfire with closed-circuit camera footage showing gunfire, Banjo was the first to detect the 2017 shooting at the Mandalay Bay resort in Las Vegas.

How much will all that information cost? A good indicator is the sequencing cost per genome, which is falling at an exponential rate even faster than Moore's Law.

Sequencing cost per genome is falling faster than Moore's Law.

Still, we are far from taking full advantage of the Laws of Zero in information. Just consider the sad state of Covid-19 testing, where wait times of up to two weeks are rendering test results useless for guiding public health strategies, quarantines and contact tracing. Instead, imagine building a world where every bit of such information is available when you need it to enable and manage a Future Perfect.

4.Energy

A joke in the world of energy imagines Alexander Graham Bell and Thomas Edison coming back to life today. Bell would be amazed by wireless communications, by the small size of phones, by texting and all the other apps. What would he even do with an iPhone? Facetime? Fortnite? TikTok? Where would he start? By contrast, Edison would look at the electric grid and say, "Yeah, that's about where I left things."

Fortunately, big change is finally afoot.

When Bell Labs developed the first solar photovoltaic panel in 1954, the cost was $1,000 per watt produced. That meant it cost $75,000 to power a single reading lamp; maybe a little pricey. By 2018, solar was down to 40 cents a watt. A drop in price by a factor of 2,500 over six decades isn't Moore's Law, but it's certainly headed toward that magic number: zero.

Wind power is also on an aggressive move toward zero; prices are down nearly 50% just in the past year. Contracts were recently signed for wind power in Brazil at 1.75 cents per kilowatt-hour, about one-fourth of the worldwide average of 6.8 cents per kWh for coal, considered the cheapest of the conventional energy sources.

The key holdup for renewable energy has been batteries. There has to be some way to store solar and wind energy for when you need it, which means lots and lots of battery capacity. Fortunately, batteries are progressing on three key fronts: battery life, power and cost. CATL, the world's top producer, recently announced a car battery that can operate for 1.2 million miles, about eight times more than most car batteries on the market today. And, as the figure below by BloombergNEF shows, battery prices have plunged 87% in the past 10 years. Even Gordon Moore would be pleased.

Battery prices have plunged 87% in the past 10 years.
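The cost declines quoted above can be restated as compound annual rates. A back-of-envelope sketch (our own arithmetic, not BloombergNEF's or anyone else's published figures):

```python
def annual_decline(total_drop_fraction, years):
    """Constant yearly rate that compounds to the stated total drop."""
    remaining = 1 - total_drop_fraction
    return 1 - remaining ** (1 / years)

# Batteries: an 87% drop over 10 years.
battery = annual_decline(0.87, 10)

# Solar: a factor-of-2,500 drop between 1954 and 2018 (64 years).
solar = annual_decline(1 - 1 / 2500, 64)

print(f"batteries ~{battery:.0%}/yr, solar ~{solar:.0%}/yr")
```

By this measure batteries have recently been falling around 18% per year and solar around 12% per year over its whole history, both comfortably steep enough to justify the "headed toward zero" framing.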

So, we have at least three cost curves that look like they're headed toward zero: solar, wind and batteries. That's plenty, but others are worth mentioning as well, including nuclear fission, nuclear fusion, geothermal and radical improvements in efficiency. Together, they create a Law of Zero for energy that will create unfathomable benefits. Energy drives every living thing, and unlimited energy will drive unlimited opportunities.

Edison wouldn't know what hit him.

5.Water

Water is the new oil. A quarter of humanity faces looming water crises. Demand is growing with population, urbanization and wealth, taxing traditional freshwater supplies while also polluting them. But there's hope: limitless energy could allow for the almost magical availability of water.

By 2050, anyone near a body of saltwater could benefit from water technology breakthroughs. Desalination has always been possible but prohibitively expensive because of the energy costs, whether done by filtering out the salt through osmosis or by evaporating the water and leaving the salt behind. But cheap energy makes desalination more plausible, hopefully in time for the many cities around the world that are getting desperate for water.

Water won't be pulled right out of thin air in great quantities any time soon, but that technology is also under development. One group won a $1.5 million X Prize by developing a generator that can be used in any climate and can extract at least 2,000 liters of water a day from the air at a cost of less than two cents a liter, using entirely renewable energy.

Cody Friesen, founder of Zero Mass Water, a startup backed by Bill Gates' Breakthrough Energy Ventures, BlackRock and other high-profile investors, says decentralized production of water will lead to benefits akin to those that come from having abundant electricity while off the grid. He says 1% to 2% of the world's carbon footprint comes from mass-purifying today's water; that carbon dioxide goes away when water is drawn from the air and purified at your doorstep by the sorts of solar-powered Source Hydropanels that his company produces.

Like many potentially world-changing future solutions, Friesen's approach doesn't make economic sense today. But his early research, products and field experiments, such as his work with the Australian Renewable Energy Agency (ARENA), help develop the solution for when the approach, riding on the Law of Zero for energy, becomes viable on a massive scale by 2050, if not sooner.

Where there is abundant water, along with the energy that comes from that Law of Zero, there can be food. The basics of life will be available everywhere, even to the far corners of the Earth.

6.Transportation

Although the enthusiasm for autonomous vehicles has taken a hit over the past couple of years (they are a really hard problem), some early successes and the multitude of startups and brilliant scientists tackling the issues make us confident that 2050 will include an unlimited number of fully autonomous vehicles.

The implications are mind-boggling. Basically, terrestrial transportation heads toward zero marginal cost. Remember, electricity is heading toward zero cost, and all these cars and trucks will be powered by batteries, so fuel is no longer an expense. In addition, the cost of driving, in terms of the time you devote to it, will disappear once we reach full autonomy. With time no longer a factor, distance won't be, either. Even if you have to travel a couple of hundred miles, or spend two or three hours in a vehicle, you take your world into the vehicle with you and can act just as you would sitting on your couch at home. So much of the expense associated with fuel, time and distance goes away.

Now, a lot of metal will need to be shaped and maintained even in an autonomous future, so transportation won't be free. But it will be so much less expensive than it is today that we can be profligate in throwing transportation resources at anything we want to design for the Future Perfect. So, think in terms of a world where fuel is free and, thus, infinite, and where many considerations of time and distance no longer matter.

Yes, lots of people and businesses will have to adapt. Most notable are the 4.5 million professional drivers in the U.S., but they're just the start, and it won't all be on the negative side of the ledger. Autonomous vehicles will also change emergency rooms (which currently treat some 2.5 million people each year after auto accidents and, based on current estimates, might treat only 10% as many once AVs become ubiquitous). We know that the vast majority of the roughly 40,000 people who die on U.S. roads every year will miss that appointment with death (yay!) and keep living productive lives.

We also know that health, wealth, education, economic mobility and more will all improve, because access to transportation currently constrains so many people, and those limitations go away.

Not all the Laws of Zero will kick in right away. The ubiquity of water, in particular, will take time to play out, partly because getting to zero cost for energy will also take a bit. For all these Laws of Zero, supporting technologies need to continue to mature and be helped along by some as-yet-uninvented (but inevitable) scientific breakthroughs.

But the core question is fascinating and important: How will these Laws of Zero let us design and build as grand a world as possible for our children and their kids by 2050?

What can removing today's actual and cognitive restraints let us realistically project for them in terms of electricity, food, manufacturing, transportation, shelter, climate, education, healthcare and (dare we say it?) the political and social environment that will define equity, civility, privacy and human rights in the world in which they live? How much can we increase the odds of success if we more clearly envision the Future Perfect enabled by the Laws of Zero, and begin now to focus both inspiration and perspiration toward inventing it?

The short answer is: a lot, if we step up to the challenge. By doing so, we can make the next few decades ones where centuries happen.

Read this article:
6 Laws Of Zero Will Shape Our Future. For The Better Or Worse Is Up To Us. - Forbes

Research Opens New Neural Network Model Pathway to Understanding the Brain – Business Wire

PALO ALTO, Calif.--(BUSINESS WIRE)--NTT Research, Inc., a division of NTT (TYO:9432), today announced that a research scientist in its Physics & Informatics (PHI) Lab, Dr. Hidenori Tanaka, was the lead author on a technical paper that advances basic understanding of biological neural networks in the brain through artificial neural networks. Titled "From deep learning to mechanistic understanding in neuroscience: the structure of retinal prediction," the paper was presented at NeurIPS 2019, a leading machine-learning, artificial intelligence (AI) and computational neuroscience conference, and published in Advances in Neural Information Processing Systems 32 (NeurIPS 2019). Work on the paper originated at Stanford University, academic home of the paper's six authors when the research was performed. Dr. Tanaka, at the time a post-doctoral fellow and visiting scholar at Stanford University, joined NTT Research in December 2019. The underlying research aligns with the PHI Lab's mission to rethink the computer by drawing inspiration from computational principles of neural networks in the brain.

Research on the paper began through collaboration between the labs of Stanford University Professors Surya Ganguli and Stephen Baccus, two of the paper's co-authors. Dr. Ganguli, one of four Stanford professors who are lead investigators on collaborative projects with the NTT Research PHI Lab, is an associate professor in the Department of Applied Physics. Dr. Baccus is a professor in the Department of Neurobiology. Co-authors Niru Maheswaranathan, Lane McIntosh and Aran Nayebi were in the Stanford Neurosciences Ph.D. program when the work was performed. Drawing upon the co-authors' previous work on deep learning models of retinal responses to natural scenes, this NeurIPS paper addressed a fundamental question in modern computational neuroscience: whether successful deep learning models simply replace one complex system (a biological circuit) with another (a deep network), without understanding either. By combining ideas from theoretical physics and interpretable machine learning, the authors developed a new way to perform model reduction on artificial neural networks that are trained to mimic the experimentally recorded neural response of the retina to natural scenes. The underlying computational mechanisms were consistent with prior scientific literature, thus placing these neuroscientific models on firmer theoretical foundations.

"Because we are working on such a long-range, cross-disciplinary frontier, the work last year by Dr. Tanaka and his colleagues at Stanford is still fresh; moreover, it is particularly relevant to our continued exploration of the space between neuroscience and quantum information science, as the framework presents a new way to extract computational principles from the brain," said PHI Lab Director Dr. Yoshihisa Yamamoto. "Establishing a solid foundation for neural network models is an important breakthrough, and we look forward to seeing how the research community, our university research partners, Dr. Tanaka and our PHI Lab build upon these insights and advance this work further."

To better ground the framework of deep networks as neuroscientific models, the authors of this paper combine modern attribution methods and dimensionality reduction for determining the relative importance of interneurons for specific visual computations. This work analyzes the deep-learning models that were previously shown to reproduce four types of cell responses in the salamander retina: omitted stimulus response (OSR), latency coding, motion reversal response and motion anticipation. The application of the developed model reduction scheme results in simplified, subnetwork models that are consistent with prior mechanistic models, with experimental support in three of the four response types. In the case of OSR, the analysis yields a new mechanistic model and hypothesis that redresses previous inadequacies. In all, the research shows that in the case of the retina, complex models derived from machine learning can not only replicate sensory responses but also generate valid hypotheses about computational mechanisms in the brain.

"Unlike natural systems that physicists usually deal with, our brain is notoriously complicated and sometimes rejects simple mathematical models," said Dr. Tanaka. "Our paper suggests that we can model the complex brain with complex artificial neural networks, perform model reduction on those networks and gain intuition and understanding of how the brain operates."

Dr. Tanaka's theoretical pursuit of reducing the complexity of artificial neural networks not only advances our scientific understanding of the brain but also provides engineering solutions that save time, memory and energy in training and deploying deep neural networks. His current research proposes a new pruning algorithm, SynFlow (Iterative Synaptic Flow Pruning), which challenges the existing paradigm that data must be used to quantify which synapses are important. Whereas last year's paper sought to understand the brain by performing model reduction on artificial models of biological neural networks, the new work from this year aims to make deep learning more powerful and efficient by removing parameters from artificial neural networks.
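The data-free idea behind SynFlow can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: it scores each weight of a two-layer linear network by the absolute-weight "flow" passing through it, whereas the real algorithm iterates prune-and-rescore through a deep network.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # layer 1: 3 inputs -> 4 hidden units
W2 = rng.normal(size=(2, 4))   # layer 2: 4 hidden units -> 2 outputs

# Flow objective R = 1^T |W2| |W1| 1 (all-ones input, absolute weights):
# no training data is involved. For this linear case, dR/d|W1[i, j]| is
# the column-sum of |W2| at hidden unit i, so the SynFlow score of a
# synapse is its absolute weight times the flow passing through it.
upstream = np.abs(W2).sum(axis=0)          # flow leaving each hidden unit
score_W1 = np.abs(W1) * upstream[:, None]  # elementwise score per synapse

# Prune the lowest-scoring half of layer-1 weights.
threshold = np.quantile(score_W1, 0.5)
mask = score_W1 > threshold
print(mask.mean())  # about half of the synapses survive
```

The point of the example is that the scores are computed from weight magnitudes and network structure alone, which is exactly the "no data needed" claim the paragraph describes.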

This research plays a role in the PHI Lab's broader mission to apply fundamental principles of intelligent systems, including our brain, to radically redesigning artificial computers, both classical and quantum. To advance that goal, the PHI Lab has established joint research agreements not only with Stanford but also with five additional universities, one government agency and a quantum computing software company. The other universities are California Institute of Technology (Caltech), Cornell University, Massachusetts Institute of Technology (MIT), Swinburne University of Technology and the University of Michigan. The government entity is NASA Ames Research Center in Silicon Valley, and the private company is 1QBit. Taken together, these agreements span research in the fields of quantum physics, brain science and optical technology.

About NTT Research

NTT Research opened its Palo Alto offices in July 2019 as a new Silicon Valley startup to conduct basic research and advance technologies that promote positive change for humankind. Currently, three labs are housed at NTT Research: the Physics and Informatics (PHI) Lab, the Cryptography and Information Security (CIS) Lab, and the Medical and Health Informatics (MEI) Lab. The organization aims to upgrade reality in three areas: 1) quantum information, neuroscience and photonics; 2) cryptography and information security; and 3) medical and health informatics. NTT Research is part of NTT, a global technology and business solutions provider with an annual R&D budget of $3.6 billion.

NTT and the NTT logo are registered trademarks or trademarks of NIPPON TELEGRAPH AND TELEPHONE CORPORATION and/or its affiliates. All other referenced product names are trademarks of their respective owners. © 2020 NIPPON TELEGRAPH AND TELEPHONE CORPORATION

More:
Research Opens New Neural Network Model Pathway to Understanding the Brain - Business Wire

The 6 Biggest Technology Trends In Accounting And Finance – Forbes

The explosion in data that has launched the Fourth Industrial Revolution, an era when business will be transformed by cyber-physical systems, has enabled several technology trends to develop. Every business can leverage these important trends and should pay attention to how best to use them, but accountants in particular should evaluate how these six technologies can be used strategically to advance the company's business strategy.

The 6 Biggest Technology Trends In Accounting and Finance

1.Big Data

Data is crucial to making business financial decisions. Today, data isn't just the numbers and spreadsheets that accountants have been familiar with for years; it also includes unstructured data that can be analyzed through natural language processing. This can allow for real-time status monitoring of financial matters. Data is the fuel that powers other technology trends that are transforming finance and accounting in the Fourth Industrial Revolution. Even the audit process has been digitalized. In the financial realm, data produces valuable insights, drives results and creates a better experience for clients. Since everything leaves a digital footprint, the unprecedented digitalization of our world is creating opportunities to glean new insights from data that weren't possible before. These insights help improve internal operations and build revenue.
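As a toy illustration of what "analyzing unstructured data" means in practice, here is a deliberately minimal stand-in for a real NLP pipeline: pulling structured dollar figures out of free-form text with pattern matching (the sample sentence is invented for the example).

```python
import re

# Unstructured text as it might appear in a management note or filing.
note = "Q3 revenue rose to $4.2M while operating costs fell to $1.7M."

# Extract every dollar amount quoted in millions.
amounts = [float(x) for x in re.findall(r"\$([\d.]+)M", note)]
print(amounts)  # [4.2, 1.7]
```

Production systems use trained language models rather than hand-written patterns, but the goal is the same: converting prose into numbers that accounting software can monitor in real time.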

2.Increased Computing Power

Just as it is for other companies, all the data created by our digitalized world would be useless, or at least less powerful, if it weren't for the advances in computing power. These advances allow accounting and finance departments and firms to store and use data effectively. First, there are the cloud services from providers such as Amazon, Google, and Microsoft, which provide scalable systems and software that can be accessed wherever and whenever they're needed. Edge computing has also grown; this is where the computing happens not in the cloud, but right where the data is collected. The adoption of 5G (fifth-generation) cellular network technology will be the backbone of a smarter world. When quantum computing is fully adopted, it will be transformative in a way that cannot even be predicted at this point, since it will catapult our computing power exponentially. Quantum computers will be able to provide services and solve problems that weren't possible with traditional computers. There will be tremendous value in the financial world for this capability.

3.Artificial Intelligence (AI)

Artificial intelligence can help accounting and finance professionals be more productive. AI algorithms allow machines to take over time-consuming, repetitive, and redundant tasks. Rather than just crunch numbers, with the support of AI, financial professionals will be able to spend more time delivering actionable insight. Machines can help reduce costs and errors by streamlining operations. The more finance professionals rely on AI to do what it does best (analyze and process tremendous amounts of data and take care of monotonous tasks), the more time humans recover to do what they do best. New technology has changed the expectations clients have when working with companies, and it's the same for accounting. AI helps accountants be more efficient.

4.Intelligence of Things

When the internet of things, the system of interconnected devices and machines, combines with artificial intelligence, the result is the intelligence of things. These items can communicate and operate without human intervention and offer many advantages for accounting systems and finance professionals. The intelligence of things helps finance professionals track ledgers, transactions, and other records in real-time. With the support of artificial intelligence, patterns can be identified, or issues can be resolved quickly. This continuous monitoring makes accounting activities such as audits much more streamlined and stress-free. In addition, the intelligence of things improves inventory tracking and management.

5.Autonomous Robots

Robots don't have to be physical entities. In accounting and finance, robotic process automation (RPA) can handle repetitive and time-consuming tasks such as document analysis and processing, which abound in any accounting department. Freed from these mundane tasks, accountants are able to spend time on strategy and advisory work. Intelligent automation (IA) is capable of mimicking human interaction and can even understand inferred meaning in client communication and adapt to an activity based on historical data. In addition, drones and unmanned aerial vehicles can even be deployed for appraisals and the like.

6. Blockchain

The final tech trend that has significant implications for accounting and finance professionals that I wish to cover is blockchain. A distributed ledger or blockchain is a highly secure database. It's a way to securely store and accurately record data, which has broad applications in accounting and financial records. Blockchain enables smart contracts, protecting and transferring ownership of assets, verifying people's identities and credentials, and more. Once blockchain is widely adopted, and challenges around industry regulation are overcome, it will benefit businesses by reducing costs, increasing traceability, and enhancing security.
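The "securely store and accurately record" property described above comes from hash-linking: each record stores the hash of the record before it, so altering any entry invalidates every later link. A minimal sketch (our own illustration, not any production blockchain):

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Create a record that commits to its data and its predecessor."""
    payload = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {
        "data": data,
        "prev": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    }

def valid(chain):
    """Recompute every hash and check each link back to the genesis block."""
    prev = "0" * 64
    for block in chain:
        payload = json.dumps({"data": block["data"], "prev": prev}, sort_keys=True)
        expected = hashlib.sha256(payload.encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

genesis = make_block("open ledger", "0" * 64)
entry = make_block("Alice pays Bob 10", genesis["hash"])
chain = [genesis, entry]

print(valid(chain))          # the untouched chain verifies
chain[0]["data"] = "tampered"
print(valid(chain))          # any edit breaks verification
```

Real blockchains add distributed consensus and digital signatures on top of this linking, but the tamper-evidence that makes the ledger trustworthy for financial records is exactly this mechanism.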

To learn more about these technology trends, as well as other key trends shaping the Fourth Industrial Revolution, take a look at my new book, Tech Trends in Practice: The 25 Technologies That Are Driving The 4th Industrial Revolution.

Continue reading here:
The 6 Biggest Technology Trends In Accounting And Finance - Forbes