
Category Archives: Quantum Computing

U of T’s Peter Wittek, who will be remembered at Feb. 3 event, on why the future is quantum – News@UofT

Posted: January 18, 2020 at 9:46 am

In September of 2019, Peter Wittek, an assistant professor at the University of Toronto, went missing during a mountaineering expedition in the Himalayas after reportedly being caught in an avalanche. A search and rescue mission was launched but the conditions were very difficult and Wittek was not found.

“Peter’s loss is keenly felt,” said Professor Ken Corts, acting dean of the Rotman School of Management. “He was the Founding Academic Director of the CDL Quantum Stream, a valued instructor in the MMA program, data scientist in residence with the TD Management Data and Analytics Lab, an exceptional contributor to Rotman and U of T, and a wonderful colleague.”

A ceremony to remember Wittek will take place on Feb. 3 from 3 to 4:30 pm in Desautels Hall at the Rotman School of Management.

Quantum computing and quantum machine learning, an emerging field that counted Wittek as one of its few experts, was the topic of his final interview in Rotman Management Magazine. It is reprinted below:

You oversee the Creative Destruction Lab’s Quantum stream, which seeks entrepreneurs pursuing commercial opportunities at the intersection of quantum computing and machine learning. What do those opportunities look like?

We’ve been running this stream for three years now, and we were definitely the first to do this in an organized way. However, the focus has shifted slightly: we are now interested in looking at any application of quantum computing.

These are still very early days for quantum computing. To give you a sense of where we are at, some people say it’s like the state of digital computing in the 1950s, but I’d say it’s more like the 1930s. We don’t even agree yet on what the architecture should look like and, as a result, we are very limited with respect to the kind of applications we can build.

As a result, focusing on quantum is still quite risky. Nevertheless, so far we have had 45 companies complete our program. Not all of them survived, but a good dozen of them have raised funding. If you look at the general survival rate for AI start-ups, our record is roughly the same, and given how new this technology is, that is pretty amazing.

What are the successful start-ups doing? Can you give an example of the type of problems they’re looking to solve?

At the moment I would say the main application areas are logistics and supply chain. Another promising area is life sciences, where all sorts of things can be optimized with this technology. For instance, one of our companies, ProteinQure, is folding proteins with quantum computers.

Finance is another attractive area for these applications. In the last cohort we had a company that figured out a small niche problem where they had both the data and the expertise to provide something new and innovative; they are in the process of raising money right now. The other area where quantum makes a lot of sense is in material discovery. The reason we ever even thought of building these computers was to understand quantum materials, back in the 1980s. Today, one of our companies is figuring out how to discover new materials using quantum processing units instead of traditional supercomputers.

We have a company called Agnostic, which is doing encryption and obfuscation for quantum computers. Right now, IBM, Rigetti Computing and D-Wave Systems are building quantum computers for individual users. They have access to everything that you do on the computer and can see all the data that you’re sending. But if you’re building a commercial application, obviously you will want to hide that. Agnostic addresses this problem by obfuscating the code you are running. One application we’ve seen in the life sciences is a company called EigenMed, which addresses primary care. They provide novel machine learning algorithms for primary care by using quantum-enhanced sampling algorithms.

We also seed companies that don’t end up using quantum computing. They might try out a bunch of things and discover that it doesn’t work for the application they have in mind, and they end up being 100 per cent classical. StratumAI is an example of this. It uses machine learning to map out the distribution of ore bodies under the ground. The mining industry is completely underserved by technology, and this company figured out that to beat the state of the art by a significant margin, it didn’t even need quantum. They just used classical machine learning and they already have million-dollar contracts.

Which industries will be most affected by this technology?

Life sciences will be huge because, as indicated, it often has complex networks and probability distributions, and these are very difficult to analyze with classical computers. The way quantum computers work, this seems to be a very good fit, so that is where I expect the first killer app to come from. One company, Entropica Labs, is looking at various interactions of several genomes to identify how the combined effects cause certain types of disease. This is exactly the sort of problem that is a great fit for a quantum computer.

You touched on quantum applications in primary care. If I walked into a doctor’s office, how would that affect me?

It’s tricky because, like mining, primary care is vastly underserved by technology. So, if you were to use any machine learning, you would only do better. But EigenMed was actually founded by an MD. He realized that there are certain machine learning methods that we don’t use simply because their computational requirements are too high, but that they happen to be a very good fit for primary care, because the questions you can ask the computer are similar to what a GP would ask.

For instance, if a patient walks in with a bunch of symptoms, you can ask, “What is the most likely disease?” and “What are the most likely other symptoms that I should verify to make sure it is what I suspect?” These are the kinds of probabilistic questions that are hard to ask on current neural network architectures, but they are exactly the kind of questions that probabilistic graphical models handle well.
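A minimal sketch of that first kind of query, using Bayes’ rule over a toy model. The diseases, symptoms and probabilities below are entirely hypothetical, chosen only to show the shape of the computation; a real graphical model would be far richer:

```python
# Toy illustration of a probabilistic query a graphical model answers.
# All diseases, symptoms and numbers here are hypothetical.

priors = {"flu": 0.08, "cold": 0.20}  # P(disease), illustrative values

# P(symptom | disease), illustrative values
likelihood = {
    "flu":  {"fever": 0.90, "cough": 0.60},
    "cold": {"fever": 0.20, "cough": 0.80},
}

def posterior(symptom):
    """P(disease | symptom) via Bayes' rule, normalized over the diseases."""
    unnorm = {d: priors[d] * likelihood[d][symptom] for d in priors}
    z = sum(unnorm.values())
    return {d: p / z for d, p in unnorm.items()}

# "What is the most likely disease?" given an observed fever.
post = posterior("fever")
most_likely = max(post, key=post.get)
print(most_likely, round(post[most_likely], 3))  # flu 0.643
```

The second question in the quote, which other symptom is most worth checking, would extend this by comparing how much each unobserved symptom would shift the posterior.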

Are physicians and other health-care providers open to embracing this technology, or do they feel threatened by it?

First of all, health care is a heavily regulated market, so you need approval for everything. That’s not always easy to get and, as a result, it can be very difficult to obtain data. This is the same problem that any machine learning company faces. Fine, they have this excellent piece of technology and they’ve mastered it, but if you don’t have any good data, you don’t have a company. I see that as the biggest obstacle to machine learning-based progress in health care and life sciences.

You have said that QML has the potential to bring about the next wave of technology shock. Any predictions as to what that might look like?

I think it’s going to be similar to what happened with deep learning. The academic breakthrough happened about nine years ago, but it took a long time to get into the public discussion. This is currently happening with AI which, at its core, is actually just very simple pattern recognition. It’s almost embarrassing how simplistic AI is, and yet it is already changing entire industries.

Quantum is next: not just quantum machine learning but quantum computing in general. Breakthroughs are happening every day, both on the hardware side and in the kind of algorithms you can build with quantum computers. But it’s going to take another 10 years until it gets into public discussions and starts to disrupt industries. The companies we are seeding today are going to be the ones that eventually disrupt industries.

Alibaba is one of the companies at the forefront of embracing quantum, having already committed $15 billion to it. What is Alibaba after?

First of all, I want to say a huge thank you to Alibaba, because the moment it made that commitment, everyone woke up and said, “Hey, look: the Chinese are getting into quantum computing!” Almost immediately, the U.S. government allocated $1.3 billion to invest in and develop quantum computers, and a new initiative is also coming together in Canada.

The world’s oldest commercial quantum computing company is actually from Canada: D-Wave Systems started in 1999 in British Columbia. Over its 20-year history, it managed to raise over $200 million. Then Alibaba came along and announced it was committing $15 billion to quantum, and this completely changed the mindset. People suddenly recognized that there’s a lot of potential in this area.

What does Alibaba want from quantum? You could ask the same question of Google, which is also building a quantum computer. For them, it’s because they want to make their search and advertisement placement even better than it already is. Eventually, this will be integrated into their core business. I think Alibaba is looking to do something similar. As indicated, one of the main application areas for quantum is logistics and supply chain. Alibaba has a lot more traffic than Amazon. Its orders are smaller, but the volume of goods going through its warehouses is actually much larger. Any kind of improved optimization it can achieve will translate into millions of dollars in savings. My bet is that Alibaba’s use of quantum will be applied to something that is critical to its core operation.

The mission of CDL’s Quantum stream is that, by 2022, it will have produced more revenue-generating quantum software companies than the rest of the world combined. What is the biggest challenge you face in making that a reality?

People are really waking up to all of this. There is already a venture capital firm that focuses exclusively on quantum technologies. So, the competition is steep, but we are definitely leading in terms of the number of companies created. In Canada, the investment community is a bit slow to put money into these ventures. But every year we are recruiting better and better people and the cohorts are more and more focused and, as a result, I think we are going to see more and more success stories.

It seems like everyone is interested in quantum and they are thinking about investing in it, but they are all waiting for somebody else to make the first move. I’m waiting for that barrier to break and, in the meantime, we are making progress. Xanadu just raised $32 million in Series A financing, which indicates that it has shown progress in building its business model and demonstrated the potential to grow and generate revenue. ProteinQure raised a seed round of around $4 million. And another company, BlackBrane, raised $2 million. So, already, there are some very decent financing rounds happening around quantum. It will take lots of hard work, but I believe we will reach our goal.

Peter Wittek was an Assistant Professor at the Rotman School of Management and Founding Academic Director of the Creative Destruction Lab’s Quantum stream. The author of Quantum Machine Learning: What Quantum Computing Means to Data Mining (Academic Press, 2016), he was also a Faculty Affiliate at the Vector Institute for Artificial Intelligence and the Perimeter Institute for Theoretical Physics.

This article appeared in the Winter 2020 issue of Rotman Management Magazine. Published by the University of Toronto’s Rotman School of Management, Rotman Management explores themes of interest to leaders, innovators and entrepreneurs.


2020s — the Decade of AI and Quantum – Inside Higher Ed

Posted: at 9:46 am

Too often, we look ahead assuming that the technologies and structures of today will be in place for years to come. Yet a look back confirms that change has moved at a dramatic pace in higher education.

Reviewing the incredible progress each decade brings makes me wonder: if I knew at the beginning of the decade what was coming, how might I have better prepared?

Make no mistake, we have crossed the threshold into the fourth industrial revolution, one that will advance most markedly this decade through maturing artificial intelligence, ultimately driven by quantum computing. The changes will come at an ever-increasing rate as the technologies and societal demands accelerate. Digital computers advanced over the past half century at approximately the rate described by Moore’s Law, with processing power doubling every two years. Now we are entering the era of Neven’s Law, which predicts that quantum computing will progress at a doubly exponential rate. This means change at a dizzyingly rapid pace that will leave many of us unable to comprehend the why and barely able to digest the daily advances that will define our new reality. New platforms, products and processes will proliferate in this new decade.
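The gap between those two growth laws can be made concrete with a toy calculation. The “periods” below are abstract doubling intervals, not calendar years, and the numbers illustrate only the shape of the curves:

```python
# Moore's Law: capability doubles each period (singly exponential).
# Neven's Law, as described above: doubly exponential growth.
periods = range(6)
moore = [2 ** n for n in periods]         # 1, 2, 4, 8, 16, 32
neven = [2 ** (2 ** n) for n in periods]  # 2, 4, 16, 256, 65536, 4294967296

# After only five periods the doubly exponential curve is over
# a hundred million times larger than the doubling curve.
print(neven[5] // moore[5])  # 134217728
```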

That includes higher education. The centuries-old model of the faculty member at a podium addressing a class of students who are inconsistently and inaccurately taking notes on paper or laptop will seem so quaint, inefficient and impractical that it will be laughable. Observers in 2030 will wonder how any significant learning even took place in that environment.

Semesters and seat time will not survive the coming decade. Based in 19th- and 20th-century societal needs, these are long overdue to pass away. The logical and efficient structure of outcomes-based adaptive learning will quickly overtake the older methods, doing away with redundancy for advanced students and providing developmental learning for those in need. Each student will be at the center of their learning experience, with AI algorithms, fed by rich data about each student, mapping progress and adjusting the pathway for each learner. This will lead to personalized learning, where courses and curricula are custom-made to meet the needs of the individual learner. Yet it will also serve to enhance the social experience for learners meeting face-to-face. In a report from Brookings on the topic, researchers stated that technology “can help education leapfrog in a number of ways. It can provide individualized learning by tracking progress and personalizing activities to serve heterogeneous classrooms.”

Early implementations of adaptive learning in the college setting have shown that this AI-driven process can result in greater equity and success for students. In addition, faculty members see that their role has become even more important as they directly interact with individual students to enable and facilitate their learning.

Increasingly we are gathering data about our students as they enter and progress through learning at our institutions. That big data is the "food" upon which artificial intelligence thrives. Sorting through volumes and varieties of data that in prior decades we could not efficiently process, AI can now uncover cause-and-effect pairs and webs. It can lead us to enhancements and solutions that previously were beyond our reach. As the pool of data grows and becomes more and more diverse -- not just numbers, but also videos and anecdotes -- the role of quantum computing comes into play.

While it is unlikely we will see quantum computers physically on the desks of university faculty and staff in the coming decade, we certainly will see cloud use of quantum computers to solve increasingly complex problems and opportunities. Quantum computers will interact with digital computers to apply deep learning at an as yet unseen scale. We will be able to pose challenges such as "what learning will researchers need to best prepare for the next generation of genetic advancement?" Faster than a blink of an eye, the quantum computers will respond.

It turns out that major developments are occurring every day in the advancement of quantum computing. Johns Hopkins University researchers recently discovered a superconducting material that may more effectively host qubits in the future. And Oxford University researchers just uncovered ways in which strontium ions can be much more efficiently entangled for scaling quantum computers. Advancements such as these will pave the path to ever more powerful computers that will enable ever more effective adaptive, individualized and personalized learning.

We know that change is coming. We know the direction of that change. We know some of the actual tools that will be instrumental in that change. Armed with that knowledge, what can we do today to prepare for the decade of the 2020s? Rather than merely reacting to changes after the fact, can we take steps to anticipate and prepare for that change? Can our institutions be better configured to adapt to the changes that are on the horizon? And who will lead that preparation at your institution?


Googles Quantum Supremacy will mark the End of the Bitcoin in 2020 – The Coin Republic

Posted: at 9:46 am

Ritika Sharma Monday, 13 January 2020, 03:49 EST Modified date: Monday, 13 January 2020, 05:00 EST

Whenever quantum computing hits the headlines, it leaves not just Bitcoin holders but every cryptocurrency holder worried about the uncertainty around their holdings.

It is widely believed that Bitcoin’s underlying technology, blockchain, is immutable, meaning it cannot be changed without authority over the encryption keys.

However, with quantum computers it may become possible to break a blockchain’s cryptographic codes. Quantum computing could undermine blockchain’s most significant features, such as unchangeable data and security, making it vulnerable.

Google claimed to have achieved quantum supremacy in late 2019, which some view as a threat to Bitcoin and to blockchain in general, since quantum computing could eventually compromise key blockchain features like inalterability and security, leaving the technology highly vulnerable.

Later, China joined Google in the quantum supremacy race and announced its own work on quantum technology. With this, some fear the year 2020 might witness the end of the crypto era.

How could quantum computing break the blockchain?

The reason behind this fear is quite genuine and straightforward: Bitcoin, like any cryptocurrency, depends on cryptography, and the security of its hash functions and asymmetric cryptography relies mainly on limits to the computing power of computers. The hash function calculates an effectively random number for each block.

The results obtained by this process are effortless to verify but challenging to find. Quantum computing, however, brings powerful algorithmic capabilities that are precisely the enemy of this kind of one-way puzzle.
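That asymmetry, cheap to verify but expensive to find, can be demonstrated with an ordinary hash function. This is a generic proof-of-work sketch, not Bitcoin’s actual mining code; the block string and difficulty below are made up:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Brute-force search for a nonce whose SHA-256 digest starts with
    `difficulty` zero hex digits -- the hard-to-find direction."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = mine("hypothetical-block", 2)   # small difficulty, so the search is quick

# The easy-to-verify direction: a single hash call confirms the answer.
digest = hashlib.sha256(f"hypothetical-block{nonce}".encode()).hexdigest()
print(nonce, digest.startswith("00"))
```

Raising `difficulty` multiplies the expected search time by 16 per extra digit, while verification stays a single hash; that widening gap is what proof-of-work security rests on.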

Quantum computing uses subatomic particles, which can exist in more than one state at a time. This feature makes quantum computing faster than the technology we use today.
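What “more than one state at a time” means can be sketched in a few lines of plain Python: a qubit is a pair of amplitudes over the states 0 and 1, and a Hadamard gate places it in an equal superposition. This is a pure-math sketch, not a real quantum device or framework:

```python
import math

def hadamard(state):
    """Apply a Hadamard gate to a single-qubit state (amp0, amp1)."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

state = hadamard((1.0, 0.0))                        # start in definite state 0
probs = [round(abs(amp) ** 2, 3) for amp in state]  # measurement probabilities
print(probs)  # [0.5, 0.5]: equal chance of reading 0 or 1
```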

Quantum computers have been reported to work as much as 100 million times faster than current systems on specific tasks; Google claimed its processor solved, in a matter of seconds, a problem that it estimated would take current systems 10,000 years.

With such computational power, quantum computers could in principle invert the one-way functions that encryption relies on, making one-way encryption obsolete.

The risk to blockchain is greater if this power gets into the wrong hands: hackers with a quantum computer could attack the cryptocurrency ledger and take complete control of the blockchain.

Will Google’s quantum computing wipe out your Bitcoin?

Google’s quantum supremacy applies only against traditional computers on one contrived problem; it isn’t general-purpose quantum technology. It was bluntly presented as “quantum supremacy,” though it is just a step in the quantum computing space.

Even if Google’s quantum computer demonstrates that its computing power on specific problems far exceeds the best-performing supercomputers, the results of this research do not mean much for Bitcoin. This isn’t anywhere near what we could call breaking Bitcoin or blockchain.

Although Google’s quantum supremacy does not pose any immediate threat to Bitcoin, many people in the space are still stressed about the quantum threat. Many analysts claim that Shor’s quantum algorithm could crack private keys, but again, there is a long way to go before it could break Bitcoin’s blockchain.

According to researchers, a quantum computer with 4,000 qubits would be able to break the blockchain. Google’s quantum computer has only 53 qubits, which cannot cause any harm to the blockchain, and it is worth mentioning that the higher the qubit count, the more difficult the machine becomes to build.

Satoshi Nakamoto’s proposed solution to beat quantum supremacy

Satoshi was a true visionary: the things we are concerned about today had already been answered by him. In 2010, Satoshi Nakamoto responded to a question about quantum computers posed by the username llama on Bitcointalk.

He replied that if Bitcoin were suddenly cracked, the signature scheme would be destroyed; but if it were broken slowly, the system would still have time to convert to a stronger function and re-sign all assets. Another, cruder answer to this question was suggested by the author of Mastering Bitcoin, Andreas Antonopoulos: “If the quantum computer comes, we will upgrade.”

The quantum supremacy threat isn’t new to the crypto world, and many cryptocurrency projects, such as Ethereum and various quantum-focused chains, are working on making blockchains quantum-resistant. Experts in the cryptocurrency space are also advocating the development of quantum encryption technology to ensure the security of funds.

Unless a far more powerful actual quantum processor emerges, Bitcoin and its developers still have time to secure it. Even so, with the continuous development of quantum technology and ever-higher-qubit chips, a sword of Damocles will keep hanging over the head of cryptocurrency.


HPC In 2020: Acquisitions And Mergers As The New Normal – The Next Platform

Posted: at 9:46 am

After a decade of vendor consolidation that saw some of the world’s biggest IT firms acquire first-class HPC providers such as SGI, Cray, and Sun Microsystems, as well as smaller players like Penguin Computing, WhamCloud, Appro, and Isilon, it is natural to wonder who is next. Or maybe, more to the point, who is left?

As it turns out, there are still plenty of companies, large and small, that can fill critical holes in the product portfolios of HPC providers, or those who want to be HPC players. These niche acquisitions will be especially important to these same providers as they expand into HPC-adjacent markets such as artificial intelligence, data analytics and edge computing.

One company that can play into all of these markets is FPGA-maker Xilinx. Since Intel acquired Altera in 2015, Xilinx is the only standalone company of any size that makes reconfigurable logic devices. Given that, the natural buyer for Xilinx would be AMD, Intel’s arch-nemesis. AMD, of course, already has a highly competitive lineup of CPUs and GPUs to challenge its much larger rival, and the addition of an FPGA portfolio would open a third front. It would also provide AMD entry into a whole array of new application markets where FPGAs operate: ASIC prototyping, IoT, embedded aerospace/automotive, 5G communications, AI inference, database acceleration, and computational storage, to name a few.

The only problem is that Xilinx’s current market cap is around $25 billion, or about half the current market cap of AMD. And if you’re wondering about AMD’s piggy bank, the chipmaker had $1.2 billion cash on hand as of September 2019. That means any deal would probably take the form of a merger rather than a straight acquisition. There’s nothing wrong with that, but a merger is a more complex decision and has greater ramifications for both parties. That’s why the rumors of a Xilinx acquisition have tended to center on larger semiconductor manufacturers that might be looking to diversify their offerings, like Broadcom or Qualcomm. Those acquisitions wouldn’t offer the HPC and AI technology synergies that AMD could provide, but they would likely be easier to execute.

Another area that continues to be ripe for acquisitions is the storage market. In HPC, Panasas and DataDirect Networks stand alone (well, stand together) as the two HPC specialists left in the market. And of those two, the more modest-sized Panasas would be easier to swallow. But most HPC OEMs, including the biggies like Hewlett Packard Enterprise, Dell Technologies, and Lenovo, already have their own HPC storage and file system offerings of one sort or another, although Lenovo is probably most deficient in this regard. For what it’s worth, though, Panasas, which has been around since 1999, has never attracted the kind of suitor willing to fold the company’s rather specialized parallel file system technologies into its own product portfolio. In all honesty, we don’t expect that to change.

The real storage action in the coming years in HPC, as well as in the enterprise and the cloud, is going to be in the software-defined space, where companies like WekaIO, VAST Data, Excelero, and DataCore Software have built products that can virtualize all sorts of hardware. That’s because the way storage is being used and deployed in the datacenter these days is being transformed by cheaper capacity (disks) and cheaper IOPS (NVM-Express and other SSD devices), the availability of cloud storage, and the inverse trends of disaggregation and hyperconvergence.

As we noted last July: “While there are plenty of NAS and SAN appliances being sold into the enterprise to support legacy applications, modern storage tends to be either disaggregated, with compute and storage broken free of each other at the hardware level but glued together on the fly with software to look local, or hyperconverged, with the compute and block storage virtualized and running on the same physical server clusters and atop the same server virtualization hypervisors.”

Any of the aforementioned SDS companies, along with others, may find themselves courted by OEMs and storage-makers, and even cloud providers. DDN has been busy in that regard, having acquired software-defined storage maker Nexenta in May 2019. We expect to see more such deals in the coming years. Besides DDN, other storage companies like NetApp should be looking hard at bringing more SDS in-house. The big cloud providers, Amazon, Microsoft, Google, and so on, will also be making some big investments in SDS technologies, even if they’re not buying such companies outright.

One market that is nowhere near the consolidation stage is quantum computing. However, that doesn’t mean companies won’t be looking to acquire some promising startups in this area, even at this early stage. While major tech firms such as IBM, Google, Intel, Fujitsu, Microsoft, and Baidu have already invested a lot in in-house development and are busy selecting technology partners, other companies have taken a more wait-and-see approach.

In the latter category, one that particularly stands out is HPE. In this case, the company is more focused on near-term R&D, like memristors or other memory-centric technologies. While there may be some logic in letting other companies spend their money figuring out the most promising approaches to quantum computing, then swooping in to copy (or buy) whatever technology is most viable, there is also the risk of being left behind. That’s something HPE cannot afford.

That said, HPE has recently invested in IonQ, a promising quantum computing startup that has built a workable prototype using ion trap technology. The investment was provided via Pathfinder, HPE’s investment arm. In an internal blog post on the subject penned by Abhishek Shukla, managing director of global venture investments, and Ray Beausoleil, Senior Fellow of large-scale integrated photonics, the authors extol the virtues of IonQ’s technical approach:

IonQ’s technology has already surpassed all other quantum computers now available, demonstrating the largest number of usable qubits in the market. Its gate fidelity, which measures the accuracy of logical operations, is greater than 98 percent for both one-qubit and two-qubit operations, meaning it can handle longer calculations than other commercial quantum computers. We believe IonQ’s qubits and methodology are of such high quality, they will be able to scale to 100 qubits (and 10,000 gate operations) without needing any error correction.

As far as we can tell, HPE has no plans to acquire the company (and it shares investment in the startup with other companies, including Amazon, Google, and Samsung, among others). But if HPE is truly convinced IonQ is the path forward, it would make sense to pull the acquisition trigger sooner rather than later.

We have no illusions that any of this will come to pass in 2020, or ever. As logical as the deals we have suggested seem to us, the world of acquisitions and mergers is a lot more mysterious and counterintuitive than we’d like to admit (cases in point: Intel buying Whamcloud, or essentially buying Cloudera through heavy investment). More certain is the fact that these deals will continue to reshape the HPC vendor landscape in the coming decade as companies go after new markets and consolidate their hold on old ones. If anything, the number of businesses bought and sold will increase as high performance computing, driven by AI and analytics, extends into more application domains. Or, as the Greeks put it more succinctly, the only constant is change.


Alibaba’s 10 Tech Trends to Watch in… – Alizila

Posted: at 9:46 am

The Alibaba DAMO Academy, Alibaba Group’s global program for tackling ambitious, high-impact technology research, has made some predictions about the trends that will shape the industry in the year ahead. From more-advanced artificial intelligence to large-scale blockchain applications, here’s what you can expect in 2020.

1. Artificial Intelligence Gets More Human

2020 is set to be a breakthrough year for AI, according to DAMO. Researchers will take inspiration from a host of new areas to upgrade the technology, namely cognitive psychology and neuroscience combined with insights into human behavior and history. They’ll also adopt new machine-learning techniques, such as continual learning, which allows machines to remember what they’ve learned in order to more quickly learn new things, something humans take for granted. With these advances in cognitive intelligence, machines will be able to better understand and make use of knowledge rather than merely perceive and express information.

2. The Next Generation of Computation

Computers these days send information back and forth between the processor and the memory in order to complete tasks. The problem? Computing demands have grown to such an extent in the digital age that our computers can’t keep up. Enter processing-in-memory architecture, which integrates the processor and memory into a single chip for faster processing speed. PIM innovations will play a critical role in spurring next-generation AI, DAMO said.

3. Hyper-Connected Manufacturing

The rapid deployment of 5G, Internet of Things and cloud- and edge-computing applications will help manufacturers go digital, including everything from automating equipment, logistics and production scheduling to integrating their factory, IT and communications systems. In turn, DAMO predicts they’ll be faster to react to changes in demand and able to coordinate with suppliers in real time to help productivity and profitability.


4. Machines Talking to Machines at Scale

More-advanced IoT and 5G will enable more large-scale deployments of connected devices, which bring with them a range of benefits for governments, companies and consumers. For example, traffic-signal systems could be optimized in real time to keep drivers moving (and happy), while driverless cars could access roadside sensors to better navigate their surroundings. These technologies would also allow warehouse robots to maneuver around obstacles and sort parcels, and fleets of drones to efficiently and securely make last-mile deliveries.

5. Chip Design Gets Easier

Have you heard? Moore's Law is dying. It is now becoming too expensive to build faster and smaller semiconductors. In its place, chipmakers are now piecing together smaller chiplets into single wafers to handle more-demanding tasks. Think Legos. Another advantage of chiplets is that they often use already-inspected silicon, speeding up time to market. Barriers to entry in chipmaking are dropping, too, as open-source communities provide alternatives to traditional, proprietary design. And as more companies design their own custom chips, they are increasingly contributing to a growing ecosystem of development tools, product information and related software that will enable still easier and faster chip design in the future.

6. Blockchain Moves Toward Mainstream

The nascent blockchain industry is about to see some changes of its own. For one, expect the rise of the blockchain-as-a-service model to make these applications more accessible to businesses. Also, there will be a rise in specialized hardware chips for cloud and edge computing, powered by core algorithms used in blockchain technologies. Scientists at DAMO forecast that the number of new blockchain applications will grow significantly this year, as well, while blockchain-related collaborations across industries will become more common. Lastly, the academy expects large-scale blockchain applications to see wide-scale adoption.

7. A Turning Point for Quantum Computing

Recent advancements in this field have stirred up hopes for making large-scale quantum computers a reality, which will prompt more investments into quantum R&D, according to DAMO. That will result in increased competition and ecosystem growth around quantum technologies, as well as more attempts to commercialize the technology. DAMO predicts that after a difficult but critical period of intensive research in the coming years, quantum information science will deliver breakthroughs such as computers that can correct computation errors in real time.

8. More Revolution in Semiconductors

Demand is surging for computing power and storage, but major chipmakers still haven't developed a better solution than 3-nanometer node silicon-based transistors. Experiments in design have led to the discovery of other materials that might boost performance. Topological insulators and two-dimensional superconducting materials, for example, may become connective materials as their properties allow electrical currents to flow without resistance. New magnetic and resistive switching materials might also be used to create next-generation magnetic memory technology, which can run on less power than its predecessors.

9. Data Protection Powered by AI

As businesses face a growing number of data-protection regulations, and the rising compliance costs that come with them, interest is growing in new solutions that support data security. AI algorithms can do that. They help organizations manage and filter through information, protect user information shared across multiple parties and make regulatory compliance easier, or even automatic. These technologies can help companies promote trust in the reuse and sharing of analytics, as well as overcome problems such as data silos, where certain information is not accessible to an entire organization and causes inefficiencies as a result.

10. Innovation Starts on the Cloud

Cloud computing has evolved far beyond its intended purpose as technological infrastructure to take on a defining role in IT innovation. Today, the cloud's computing power is the backbone of the digital economy as it transforms the newest, most-advanced innovations into accessible services. From semiconductor chips, databases and blockchain to IoT and quantum computing, nearly all technologies are now tied to cloud computing. It has also given rise to new technologies, such as serverless computing architecture and cloud-powered robotic automation.


Technology Trends to Keep an Eye on in 2020 – Built In Austin

Posted: at 9:46 am

Consumers have a lot of tech news to keep up with in 2020, with anticipated advances in autonomous vehicles, folding touchscreen phones and new video game consoles. But what are tech professionals gearing up for this year?

The answer depends on who you ask. For example, Executive VP of Product at Arrive Logistics Michael Senftleber is watching how business processes particularly in the world of freight will be affected by the increasing popularity of artificial intelligence and machine learning. Meanwhile, Vikram Phatak, founder of cybersecurity firm NSS Labs, is monitoring how 5G, IoT devices and other infrastructure will affect the future of digital protection strategies.

These are just a few of the upcoming tech evolutions the following Austin professionals are paying attention to. While each tech leader is harnessing different technological developments for different reasons, all paths lead to improved customer satisfaction and a continual evolution of their businesses.

Arrive Logistics EVP of Product Michael Senftleber said his team is planning to use AI and machine learning to enhance the overall capabilities of their freight prediction tech, while also improving the usability of their platform.

What are the top tech trends you're watching in 2020?

I often hear people talk about data science, AI and machine learning like they're magic silver bullets for every problem; they are not. AI and ML are best applied when there's significant data, a computationally complex problem and repeated samples or transactions. In 2020, the AI and ML hype will continue to grow. But so will tangible business applications that leverage those technologies and data science to solve real challenges and provide unique insights.


How are you applying these trends in your work in the year ahead?

The opportunities for data science, AI and ML in the logistics industry are enormous. We are leveraging these technologies with market and proprietary data to predict the future cost to move a load of freight, match the right load to the right truck, alert on opportunities or deviations and make business decisions in real time.

However, powerful yet complex technologies often end up in complex, hard-to-use platforms. It's critical that we continue to build technology to harness opportunities and enable business workflows, while building an interface that delivers a simple, seamless experience for our users.

At the 2019 Consumer Electronics Show, IBM unveiled the world's first quantum computer designed for commercial and scientific use. NSS Labs Founder Vikram Phatak said further developments in the world of quantum computing will play a significant role in moving cybersecurity forward, and his company is gearing up for that change.

What are the top tech trends you're watching in 2020?

The adoption of cloud computing, ubiquitous high-speed internet like 5G wireless, internet of things, artificial intelligence and quantum computing will drive major shifts in the way the world works, including how we protect people. IoT devices will operate more efficiently and autonomously as AI evolves. And quantum computing is a transformative leap forward that makes possible new technologies that we haven't even imagined yet.

Modern encryption that takes thousands of years to break with current computing technology can be broken in seconds using quantum computing. Our research indicates that as the virtual and physical worlds merge, cybersecurity will naturally evolve to focus on protecting people regardless of whether they are using mobile devices, computers or IoT devices connected to the cloud. This new paradigm will spur a scalable, zero-trust alternative to current cybersecurity architectures.


How are you applying these trends in your work in the year ahead?

We select test topics based on enterprise customer demand. The world is rapidly changing and our plans for 2020 reflect that. This year we have a lot planned, including testing cloud security offerings like secure access service edge (SASE), and security for cloud computing, like cloud network firewall. We are also initiating research on how 5G will change cloud computing, practical uses of AI and getting prepared for a post-quantum world.

Kuldeep Chowhan has his head in clouds...of cloud computing. The engineer at vacation property rental site Vrbo said his team is watching public cloud computing.

What are the top tech trends youre watching in 2020?

Innovation and scale at the major public cloud computing providers continues to accelerate. Many platforms that were difficult to operationalize are becoming managed services that are easy and cost-effective to consume. Many of these services lower the bar for entry for data science and machine learning, which are blossoming in sophistication and applicability.


How are you applying these trends in your work in the year ahead?

Vrbo is accelerating its migration to the cloud so we can leverage the power of the Expedia Group travel platform. We want to provide travelers with a rich product offering that is personalized and relevant to them. That strategy includes a hybrid cloud data platform for all of Expedia Group that will power new AI capabilities to help our travelers find their perfect vacations.

Janani Mohanakrishnan said the water sector will experience a significant number of evolutions in the coming year. And as a result, the VP of product innovation and delivery said her team at water conservation technology provider Banyan Water is adjusting their approach to automation and user data.

What are the top tech trends you're watching in 2020?

In the water sector, I'm keeping an eye on more utilities and commercial customers leveraging digital solutions to improve operations. These operations include the following: keeping privacy in mind, detecting issues faster and reducing costs associated with troubleshooting and resolution. We're also watching the identification of improved uses of water, more businesses considering migrating to a circular economy, reduced truck rolls, and improved customer communication and engagement.


How are you applying these trends in your work in the year ahead?

Our products help commercial customers save money and water through the optimization of irrigation and indoor water management. We will be updating our take on user research, data analytics and automation to maximize value for customers.

We will be smart about how we make research an inherent part of the process, knowing we are resource-constrained. We've trained select team members to collect feedback from users whenever they can. We accept that some research is better than no research, and that it's OK for us to be brave with predictions.

Well also improve the intelligence of our existing models, leveraging tailored interventions as applicable to help buildings dramatically reduce indoor water consumption when there are inefficiencies.


Podcast: The Overhype and Underestimation of Quantum Computing – insideHPC

Posted: January 12, 2020 at 11:50 pm

https://radiofreehpc.com/audio/RF-HPC_Episodes/Episode260/RFHPC260_QuantumQuantum.mp3

In this podcast, the Radio Free HPC team looks at how Quantum Computing is overhyped and underestimated at the same time.

The episode starts out with Henry being cranky. It also ends with Henry being cranky. But between those two events, we discuss quantum computing and Shahin's trip to the Q2B quantum computing conference in San Jose.

Not surprisingly, there is a lot of activity in quantum, with nearly every country pushing the envelope outward. One of the big concerns is that existing cryptography is now vulnerable to quantum cracking. Shahin assures us that this isn't the case today and is probably a decade away, which is another way of saying nobody knows, so it could be next week, but probably not.

We also learn the term NISQ, which is a descriptive acronym for the current state of quantum systems. NISQ stands for Noisy Intermediate Scale Quantum computing. The conversation touches on various ways quantum computing is used now and where it's heading, plus the main reason why everyone seems to be kicking the tires on quantum: the fear of missing out. It's a very exciting area, but to Shahin, it seems like how AI was maybe 8-10 years ago, so still early days.

Other highlights:



Bleeding edge information technology developments – IT World Canada

Posted: at 11:50 pm

What are some bleeding-edge information technology developments that a forward-thinking CIO should keep an eye on?

Here are a few emerging technologies that have caught my attention. These are likely to have an increasing impact on the world of business in the future. Consider which ones you should follow a little more closely.

A recent advance in quantum computing that a Google team achieved indicates that quantum computing technology is making progress out of the lab and closing in on practical business applications. Quantum computing is not likely to change routine business transaction processing or data analytics applications. However, quantum computing is likely to dramatically change computationally intense applications required for:

Since most businesses can benefit from at least a few of these applications, quantum computing is worth evaluating. For a more detailed discussion of specific applications in various topic areas, please read: Applying Paradigm-Shifting Quantum Computers to Real-World Issues.

Machine learning is the science of computers acting without software developers writing detailed code to handle every case in the data that the software will encounter. Machine learning software develops its own algorithms that discover knowledge from specific data and the software's prior experience. Machine learning is based on statistical concepts and computational principles.

The leading cloud computing infrastructure providers offer machine learning routines that are quite easy to integrate into machine learning applications. These routines greatly reduce expertise barriers that have slowed machine learning adoption at many businesses.
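As a toy illustration of that statistical core (hypothetical data, plain Python rather than any particular cloud provider's routines), ordinary least squares "learns" the rule behind a handful of examples instead of having a developer hand-code it:

```python
# Samples generated by the hidden rule y = 2x + 1; the program is never
# told that rule, it recovers it from the data.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [3.0, 5.0, 7.0, 9.0, 11.0]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form least-squares estimates -- the statistical machinery at the
# heart of many machine-learning methods
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

print(slope, intercept)  # 2.0 1.0
```

Production systems fit far richer models, but the principle is the same: parameters come from data, not from hand-written case logic.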

Selected business applications of machine learning include:

For summary descriptions of specific applications, please read: 10 Companies Using Machine Learning in Cool Ways.

Distributed ledger technology is often called blockchain. It enables new business and trust models. A distributed ledger enables all parties in a business community to see agreed information about all transactions, not just their own. That visibility builds trust within the community.

Bitcoin, a cryptocurrency, is the most widely known application of blockchain.
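The trust mechanism can be sketched in a few lines of Python (a toy ledger, not Bitcoin's actual data structures): each block commits to the hash of its predecessor, so tampering with any past entry invalidates every block after it.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

# Each block records its predecessor's hash, chaining history together
genesis = {"index": 0, "tx": "genesis", "prev": "0" * 64}
block1 = {"index": 1, "tx": "Alice pays Bob 5", "prev": block_hash(genesis)}
block2 = {"index": 2, "tx": "Bob pays Carol 2", "prev": block_hash(block1)}
chain = [genesis, block1, block2]

# Verify the chain: every block must reference its predecessor's current hash
valid = all(chain[i]["prev"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))
print(valid)  # True

# Tamper with history and the chain no longer validates
genesis["tx"] = "genesis (forged)"
print(all(chain[i]["prev"] == block_hash(chain[i - 1]) for i in range(1, len(chain))))  # False
```

Real distributed ledgers add consensus among many parties on top of this hash chaining, which is what lets a whole business community trust the shared record.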

Distributed ledger technology has great potential to revolutionize the way governments, institutions, and corporations interact with each other and with their clients or customers. Selected business applications of distributed ledger technology include:

For descriptions of industry-specific distributed ledger applications, please read: 17 Blockchain Applications That Are Transforming Society.

The Industrial Internet of Things (IIoT) is a major advance on Supervisory Control and Data Acquisition (SCADA). SCADA, in many forms, has been used for decades to safely operate major industrial facilities including oil refineries, petrochemical plants, electrical power generation stations, and assembly lines of all kinds.

IIoT is a major advance over relatively expensive SCADA. IIoT relies on dramatically cheaper components including sensors, network bandwidth, storage and computing resources. As a result, IIoT is feasible in many smaller facilities and offers a huge increase in data points for larger facilities. Business examples where IIoT delivers considerable value include production plants, trucks, cars, jet engines, elevators, and weather buoys.

The aggressive implementation of IIoT can:

For summary descriptions of specific IIoT applications, please read: The Top 20 Industrial IoT Applications.

RISC-V is an open-source hardware instruction set architecture (ISA) for CPU microprocessors that is growing in importance. It's based on established reduced instruction set computer (RISC) principles. The open-source aspect of the RISC-V ISA is a significant change compared to the proprietary ISA designs of the dominant computer chip manufacturers Intel and Arm.

RISC-V offers a way around paying ISA royalties for CPU microprocessors to either of the monopolists. The royalties may not be significant for chips used in expensive servers or smartphones, but they are significant for the cheap chips required in large numbers to implement the IIoT applications listed above.

For an expanded discussion of RISC-V, please read: A new blueprint for microprocessors challenges the industrys giants.

What bleeding edge information technology developments would you add to this list? Let us know in the comments below.


Bitcoin Remains Secure Regardless of IBM’s Quantum Computing Boost – Coin Idol

Posted: at 11:50 pm

Jan 12, 2020 at 10:20 // News

International Business Machines (IBM), an American multinational information technology firm, has managed to double the power of its quantum computer (QC) but this effort didn't break the encryption of Bitcoin (BTC), the original blockchain-based cryptocurrency.

During the CES 2020 conference on January 8, the company revealed that it had successfully achieved a Quantum Volume (QV) of 32 with its 28-qubit quantum computer called Raleigh.

In a nutshell, Quantum Volume is a metric used to measure the complexity of problems that a quantum computer can solve. The QV can be used to compare the performance of different quantum computers, and the company has successfully doubled this value every year since 2016.
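The doubling trend can be projected with a one-line model. Only the 2020 figure of 32 comes from the article; the 2017 starting value of 4 is an assumption chosen so that annual doubling lands on the reported number:

```python
def quantum_volume(start_year: int, start_qv: int, year: int) -> int:
    """Project Quantum Volume under IBM's reported 'doubles every year' trend."""
    return start_qv * 2 ** (year - start_year)

# Assumed QV of 4 in 2017; annual doubling reaches the reported 32 by 2020
for y in range(2017, 2021):
    print(y, quantum_volume(2017, 4, y))
```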

Quantum computers have long been hailed as one of the most powerful innovations of this century, with budding applications in nearly every sector, including healthcare, the internet of things, artificial intelligence (AI), blockchain and financial modeling.

Even though IBM's latest advancement represents momentous progress, quantum computers can currently be applied only to very specific tasks, and they remain far from matching the general-purpose classical computers we use every day. As such, fears that these machines could one day be used to break the cryptography that safeguards digital assets such as Bitcoin remain speculative, at least for the time being.

Because Bitcoin is designed entirely around cryptographically secured transactions, breaking the encryption used to generate its private keys would require a far more powerful quantum computer. In fact, according to a paper released in June 2017 by several authors including Martin Roetteler, a quantum computer would need processing power of about 2,500 qubits in order to break the 256-bit encryption used by Bitcoin.

Remember, the most powerful quantum computer available today has only about a 72-qubit processor, which implies that it will take years to reach encryption-threatening levels. Still, IBM and Google are doubling computing power year after year, making them potential long-term threats to Bitcoin and the entire cryptocurrency community.
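Taking the article's own figures at face value, a back-of-the-envelope timeline follows. All three inputs are rough, and the assumption that qubit counts double annually is far from guaranteed:

```python
import math

current_qubits = 72     # article's figure for today's largest processor
required_qubits = 2500  # Roetteler et al. estimate for Bitcoin's 256-bit keys

# Years of annual doubling needed: the smallest n with 72 * 2**n >= 2500
years = math.ceil(math.log2(required_qubits / current_qubits))
print(years)  # 6
```

Six doublings is an optimistic floor: these would need to be error-corrected qubits, which today's noisy devices are not.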


Google and IBM square off in Schrodinger's catfight over quantum supremacy – The Register

Posted: at 11:50 pm

Column Just before Christmas, Google claimed quantum supremacy. The company had configured a quantum computer to produce results that would take conventional computers some 10,000 years to replicate - a landmark event.

Bollocks, said IBM - which also has big investments both in quantum computing and not letting Google get away with stuff. Using Summit, the world's largest conventional supercomputer at the Oak Ridge National Laboratories in Tennessee, IBM claimed it could do the same calculation in a smidge over two days.

As befits all things quantum, the truth is a bit of both. IBM's claim is fair enough - but it's right at the edge of Summit's capability and frankly a massive waste of its time. Google could, if it wished, tweak the quantum calculation to move it out of that range. And it might: the calculation was chosen precisely not because it was easy, but because it was hard. Harder is better.

Google's quantum CPU has 54 qubits, quantum bits that can stay in a state of being simultaneously one and zero. The active device itself is remarkably tiny, a silicon chip around a centimetre square, or four times the size of the Z80 die in your childhood ZX Spectrum. On top of the silicon, a nest of aluminium tickled by microwaves hosts the actual qubits. The aluminium becomes superconducting below around 100K, but the very coldest part of the circuit is just 15 millikelvins. At this temperature the qubits have low enough noise to survive long enough to be useful.

By configuring the qubits in a circuit, setting up data and analysing the patterns that emerge when the superpositions are observed and thus collapse to either one or zero, Google can determine the probable correct outcome for the problem the circuit represents. 54 qubits, if represented in conventional computer terms, would need 2^54 bits of RAM to represent each step of the calculation, or two petabytes' worth. Manipulating this much data many times over gives the 10 millennia figure Google claims.
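The storage arithmetic checks out under the column's simplification of one bit per basis state (a real simulation stores a complex amplitude per state, which needs considerably more):

```python
# 54 qubits -> 2**54 basis states; at one bit per state, convert to petabytes
n_qubits = 54
bits = 2 ** n_qubits
petabytes = bits / 8 / 1e15  # bits -> bytes -> decimal petabytes
print(round(petabytes, 2))  # 2.25
```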

IBM, on the other hand, says that it has just enough disk space on Summit to store the complete calculation. However you do it, though, it's not very useful; the only application is in random number generation. That's a fun, important and curiously nuanced field, but you don't really need a refrigerator stuffed full of qubits to get there. You certainly don't need the 27,648 NVidia Tesla GPUs in Summit chewing through 16 megawatts of power.

What Google is actually doing is known in the trade as "pulling a Steve", from the marketing antics of the late Steve Jobs. In particular, his tour at NeXT Inc, the company he started in the late 1980s to annoy Apple and produce idiosyncratic workstations. Hugely expensive to make and even more so to buy, the NeXT systems were never in danger of achieving dominance - but you wouldn't know that from Jobs' pronouncements. He declared market supremacy at every opportunity, although in carefully crafted phrases that critics joked defined the market as "black cubic workstations running NeXTOS."

Much the same is true of Google's claim. The calculation is carefully crafted to do precisely the things that Google's quantum computer can do - the important thing isn't the result, but the journey. Perhaps the best analogy is with the Wright Brothers' first flight: of no practical use, but tremendous significance.

What happened to NeXT? It got out of hardware and concentrated on software, then Jobs sold it - and himself - to Apple, and folded some of that software into MacOS development. Oh, and some cat called Berners-Lee built something called the World Wide Web on a NeXT Cube.

Nothing like this will happen with Google's technology. There's no new web waiting to be borne on the wings of supercooled qubits. Even some of the more plausible things, like quantum decryption of internet traffic, are a very long way from reality - and, once that happens, it's going to be relatively trivial to tweak conventional encryption to defeat it. But the raw demonstration, that a frozen lunchbox consuming virtually no power in its core can outperform a computer chewing through enough wattage to keep a small town going, is a powerful inducement for more work.

That's Google's big achievement. So many new and promising technologies have failed not because they could never live up to expectations but because they can't survive infancy. Existing, established technology has all the advantages: it generates money, it has distribution channels, it has an army of experts behind it, and it can adjust to close down challengers before they get going. To take just one company - Intel has tried for decades to break out of the x86 CPU prison. New wireless standards, new memory technologies, new chip architectures, new display systems, new storage and security ideas - year after year, the company casts about for something new that'll make money. It never gets there.

Google's "quantum supremacy" isn't there either, but it has done enough to protect its infant prince in its superconducting crib. That's worth a bit of hype.

