HPC In 2020: Acquisitions And Mergers As The New Normal – The Next Platform

After a decade of vendor consolidation that saw some of the world's biggest IT firms acquire first-class HPC providers such as SGI, Cray, and Sun Microsystems, as well as smaller players like Penguin Computing, Whamcloud, Appro, and Isilon, it is natural to wonder who is next. Or maybe, more to the point, who is left?

As it turns out, there are still plenty of companies, large and small, that can fill critical holes in the product portfolios of HPC providers, or those who want to be HPC players. These niche acquisitions will be especially important to these same providers as they expand into HPC-adjacent markets such as artificial intelligence, data analytics and edge computing.

One company that can play into all of these markets is FPGA-maker Xilinx. Since Intel acquired Altera in 2015, Xilinx is the only standalone company of any size that makes reconfigurable logic devices. Given that, the natural buyer for Xilinx would be AMD, Intel's arch-nemesis. AMD, of course, already has a highly competitive lineup of CPUs and GPUs to challenge its much larger rival, and the addition of an FPGA portfolio would open a third front. It would also provide AMD entry into a whole array of new application markets where FPGAs operate: ASIC prototyping, IoT, embedded aerospace/automotive, 5G communications, AI inference, database acceleration, and computational storage, to name a few.

The only problem is that Xilinx's current market cap is around $25 billion, or about half the current market cap of AMD. And if you're wondering about AMD's piggy bank, the chipmaker had $1.2 billion of cash on hand as of September 2019. Which means any deal would probably take the form of a merger rather than a straight acquisition. There's nothing wrong with that, but a merger is a more complex decision and has greater ramifications for both parties. That's why the rumors of a Xilinx acquisition have tended to center on larger semiconductor manufacturers that might be looking to diversify their offerings, like Broadcom or Qualcomm. Those acquisitions wouldn't offer the HPC and AI technology synergies that AMD could provide, but they would likely be easier to execute.

Another area that continues to be ripe for acquisitions is the storage market. In HPC, Panasas and DataDirect Networks stand alone (well, stand together) as the two HPC specialists left in the market. And of those two, the more modest-sized Panasas would be easier to swallow. But most HPC OEMs, including the biggies like Hewlett Packard Enterprise, Dell Technologies, and Lenovo, already have their own HPC storage and file system offerings of one sort or another, although Lenovo is probably the most deficient in this regard. For what it's worth, though, Panasas, which has been around since 1999, has never attracted the kind of suitor willing to fold the company's rather specialized parallel file system technologies into its own product portfolio. In all honesty, we don't expect that to change.

The real storage action in the coming years in HPC, as well as in the enterprise and the cloud, is going to be in the software-defined space, where companies like WekaIO, VAST Data, Excelero, and DataCore Software have built products that can virtualize all sorts of hardware. That's because the way storage is being used and deployed in the datacenter these days is being transformed by cheaper capacity (disks) and cheaper IOPS (NVM-Express and other SSD devices), the availability of cloud storage, and the inverse trends of disaggregation and hyperconvergence.

As we noted last July: While there are plenty of NAS and SAN appliances being sold into the enterprise to support legacy applications, modern storage tends to be either disaggregated, with compute and storage broken free of each other at the hardware level but glued together on the fly with software to look local, or hyperconverged, with the compute and block storage virtualized and running on the same physical server clusters and atop the same server virtualization hypervisors.

Any of the aforementioned SDS companies, along with others, may find themselves courted by OEMs and storage-makers, and even cloud providers. DDN has been busy in that regard, having acquired software-defined storage maker Nexenta in May 2019. We expect to see more such deals in the coming years. Besides DDN, other storage companies like NetApp should be looking hard at bringing more SDS in-house. The big cloud providers (Amazon, Microsoft, Google, and so on) will also be making some big investments in SDS technologies, even if they're not buying such companies outright.

One market that is nowhere near the consolidation stage is quantum computing. However, that doesn't mean companies won't be looking to acquire some promising startups in this area, even at this early stage. While major tech firms such as IBM, Google, Intel, Fujitsu, Microsoft, and Baidu have already invested heavily in in-house development and are busy selecting technology partners, other companies have taken a more wait-and-see approach.

In the latter category, one company that particularly stands out is HPE. For now, the company is more focused on near-term R&D, like memristors and other memory-centric technologies. While there may be some logic in letting other companies spend their money figuring out the most promising approaches for quantum computing, and then swooping in to copy (or buy) whatever technology proves most viable, there is also the risk of being left behind. That's something HPE cannot afford.

That said, HPE recently invested in IonQ, a promising quantum computing startup that has built a workable prototype using ion trap technology. The investment was provided via Pathfinder, HPE's investment arm. In an internal blog post on the subject penned by Abhishek Shukla, managing director of global venture investments, and Ray Beausoleil, Senior Fellow of large-scale integrated photonics, the authors extol the virtues of IonQ's technical approach:

"IonQ's technology has already surpassed all other quantum computers now available, demonstrating the largest number of usable qubits in the market. Its gate fidelity, which measures the accuracy of logical operations, is greater than 98 percent for both one-qubit and two-qubit operations, meaning it can handle longer calculations than other commercial quantum computers. We believe IonQ's qubits and methodology are of such high quality that they will be able to scale to 100 qubits (and 10,000 gate operations) without needing any error correction."

As far as we can tell, HPE has no plans to acquire the company (and it shares its investment in the startup with Amazon, Google, and Samsung, among others). But if HPE is truly convinced IonQ is the path forward, it would make sense to pull the acquisition trigger sooner rather than later.

We have no illusions that any of this will come to pass in 2020, or ever. As logical as the deals we have suggested seem to us, the world of acquisitions and mergers is a lot more mysterious and counterintuitive than we'd like to admit (cases in point: Intel buying Whamcloud, or essentially buying Cloudera through heavy investment). More certain is that these deals will continue to reshape the HPC vendor landscape in the coming decade as companies go after new markets and consolidate their hold on old ones. If anything, the number of businesses bought and sold will increase as high performance computing, driven by AI and analytics, extends into more application domains. Or as the Greeks put it more succinctly, the only constant is change.

How to verify that quantum chips are computing correctly – MIT News

In a step toward practical quantum computing, researchers from MIT, Google, and elsewhere have designed a system that can verify when quantum chips have accurately performed complex computations that classical computers can't.

Quantum chips perform computations using quantum bits, called "qubits," that can represent the two states corresponding to classical binary bits (a 0 or 1) or a quantum superposition of both states simultaneously. The unique superposition state can enable quantum computers to solve problems that are practically impossible for classical computers, potentially spurring breakthroughs in material design, drug discovery, and machine learning, among other applications.

Full-scale quantum computers will require millions of qubits, which isn't yet feasible. In the past few years, researchers have started developing Noisy Intermediate Scale Quantum (NISQ) chips, which contain around 50 to 100 qubits. That's just enough to demonstrate "quantum advantage," meaning the NISQ chip can solve certain problems that are intractable for classical computers. Verifying that the chips performed operations as expected, however, can be very inefficient. The chips' outputs can look entirely random, so it takes a long time to simulate steps to determine if everything went according to plan.

In a paper published today in Nature Physics, the researchers describe a novel protocol to efficiently verify that an NISQ chip has performed all the right quantum operations. They validated their protocol on a notoriously difficult quantum problem running on a custom quantum photonic chip.

"As rapid advances in industry and academia bring us to the cusp of quantum machines that can outperform classical machines, the task of quantum verification becomes time critical," says first author Jacques Carolan, a postdoc in the Department of Electrical Engineering and Computer Science (EECS) and the Research Laboratory of Electronics (RLE). "Our technique provides an important tool for verifying a broad class of quantum systems. Because if I invest billions of dollars to build a quantum chip, it sure better do something interesting."

Joining Carolan on the paper are researchers from EECS and RLE at MIT, as well as from the Google Quantum AI Laboratory, Elenion Technologies, Lightmatter, and Zapata Computing.

Divide and conquer

The researchers' work essentially traces an output quantum state generated by the quantum circuit back to a known input state. Doing so reveals which circuit operations were performed on the input to produce the output. Those operations should always match what researchers programmed. If not, the researchers can use the information to pinpoint where things went wrong on the chip.

At the core of the new protocol, called "Variational Quantum Unsampling," lies a divide-and-conquer approach, Carolan says, that breaks the output quantum state into chunks. "Instead of doing the whole thing in one shot, which takes a very long time, we do this unscrambling layer by layer. This allows us to break the problem up to tackle it in a more efficient way," Carolan says.

For this, the researchers took inspiration from neural networks, which solve problems through many layers of computation, to build a novel quantum neural network (QNN), where each layer represents a set of quantum operations.

To run the QNN, they used traditional silicon fabrication techniques to build a 2-by-5-millimeter NISQ chip with more than 170 control parameters (tunable circuit components that make manipulating the photon path easier). Pairs of photons are generated at specific wavelengths from an external component and injected into the chip. The photons travel through the chip's phase shifters, which change the path of the photons, interfering with each other. This produces a random quantum output state, which represents what would happen during computation. The output is measured by an array of external photodetector sensors.

That output is sent to the QNN. The first layer uses complex optimization techniques to dig through the noisy output to pinpoint the signature of a single photon among all those scrambled together. Then, it unscrambles that single photon from the group to identify what circuit operations return it to its known input state. Those operations should match exactly the circuit's specific design for the task. All subsequent layers do the same computation, removing from the equation any previously unscrambled photons, until all photons are unscrambled.

As an example, say the input state of qubits fed into the processor was all zeroes. The NISQ chip executes a bunch of operations on the qubits to generate a massive, seemingly randomly changing number as output. (An output number will constantly be changing, as it's in a quantum superposition.) The QNN selects chunks of that massive number. Then, layer by layer, it determines which operations revert each qubit back down to its input state of zero. If any operations differ from the original planned operations, then something has gone awry. Researchers can inspect any mismatches between the expected and actual output-to-input states, and use that information to tweak the circuit design.

Boson unsampling

In experiments, the team successfully ran a popular computational task used to demonstrate quantum advantage, called boson sampling, which is usually performed on photonic chips. In this exercise, phase shifters and other optical components will manipulate and convert a set of input photons into a different quantum superposition of output photons. Ultimately, the task is to calculate the probability that a certain input state will match a certain output state. That will essentially be a sample from some probability distribution.

But it's nearly impossible for classical computers to compute those samples, due to the unpredictable behavior of photons. It's been theorized that NISQ chips can compute them fairly quickly. Until now, however, there's been no way to verify that quickly and easily, because of the complexity involved with the NISQ operations and the task itself.
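One way to see why these samples are classically hard (a fact from the boson sampling literature rather than from this article): each output probability is proportional to the squared permanent of a submatrix of the chip's transfer matrix, and no efficient classical algorithm for the permanent is known. A minimal brute-force sketch, purely for illustration:

```python
from itertools import permutations
from math import prod

def permanent(m):
    """Brute-force matrix permanent: like the determinant, but with no
    sign alternation. The cost grows factorially with the matrix size,
    which is the root of boson sampling's classical hardness."""
    n = len(m)
    return sum(prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10
```

For the complex-valued transfer matrix of a real photonic circuit, the probability of observing a given output photon pattern is (up to normalization) the squared magnitude of such a permanent, and matrix sizes beyond a few dozen photons put this computation out of classical reach.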

"The very same properties which give these chips quantum computational power make them nearly impossible to verify," Carolan says.

In experiments, the researchers were able to unsample two photons that had run through the boson sampling problem on their custom NISQ chip, and in a fraction of the time it would take traditional verification approaches.

"This is an excellent paper that employs a nonlinear quantum neural network to learn the unknown unitary operation performed by a black box," says Stefano Pirandola, a professor of computer science who specializes in quantum technologies at the University of York. "It is clear that this scheme could be very useful to verify the actual gates that are performed by a quantum circuit [for example] by a NISQ processor. From this point of view, the scheme serves as an important benchmarking tool for future quantum engineers. The idea was remarkably implemented on a photonic quantum chip."

While the method was designed for quantum verification purposes, it could also help capture useful physical properties, Carolan says. For instance, certain molecules, when excited, will vibrate, then emit photons based on those vibrations. By injecting these photons into a photonic chip, Carolan says, the unscrambling technique could be used to discover information about the quantum dynamics of those molecules, to aid in bioengineering molecular design. It could also be used to unscramble photons carrying quantum information that have accumulated noise by passing through turbulent spaces or materials.

"The dream is to apply this to interesting problems in the physical world," Carolan says.

The dark side of IoT, AI and quantum computing: Hacking, data breaches and existential threat – ZDNet

Emerging technologies like the Internet of Things, artificial intelligence and quantum computing have the potential to transform human lives, but could also bring unintended consequences in the form of making society more vulnerable to cyberattacks, the World Economic Forum (WEF) has warned.

Now in its 15th year, the WEF Global Risks Report 2020, produced in collaboration with insurance broking and risk management firm Marsh, details the biggest threats facing the world over the course of the next year and beyond.

Data breaches and cyberattacks featured in the top five most likely global risks in both 2018 and 2019, but while both still pose significant risks, they're now ranked at sixth and seventh respectively.

"I wouldn't underestimate the importance of technology risk, even though this year's report has a centrepiece on climate," said John Drzik, chairman of Marsh & McLennan Insights.

SEE: A winning strategy for cybersecurity (ZDNet special report) | Download the report as a PDF (TechRepublic)

The 2020 edition of the Global Risks Report puts the technological risks behind five different environmental challenges: extreme weather, climate change action failure, natural disasters, biodiversity loss, and human-made environmental disasters.

But that isn't to say cybersecurity threats don't pose risks; cyberattacks and data breaches are still in the top ten and have the potential to cause big problems for individuals, businesses and society as a whole, with threats ranging from data breaches and ransomware to hackers tampering with industrial and cyber-physical systems.

"The digital nature of 4IR [fourth industrial revolution] technologies makes them intrinsically vulnerable to cyberattacks that can take a multitude of forms, from data theft and ransomware to the overtaking of systems with potentially large-scale harmful consequences," warns the report.

"Operational technologies are at increased risk because cyberattacks could cause more traditional, kinetic impacts as technology is being extended into the physical world, creating a cyber-physical system."

The report warns that, for many technology vendors, "security-by-design" is still a secondary concern compared with getting products out to the market.

Large numbers of Internet of Things product manufacturers have long had a reputation for putting sales ahead of ensuring their products are secure, and the WEF warns that the IoT is "amplifying the potential cyberattack surface," as demonstrated by the rise in IoT-based attacks.

In many cases, IoT devices collect and share private data that's highly sensitive, like medical records, information about the insides of homes and workplaces, or data on day-to-day journeys.

Not only could this data be dangerous if it isn't collected and stored appropriately and falls into the hands of cyber criminals; the WEF also warns about the potential for IoT data to be abused by data brokers. In both cases, the report warns, the misuse of this data could create physical and psychological harm.

Artificial intelligence is also detailed as a technology that could bring benefits as well as cause problems, with the report describing AI as "the most impactful invention" and our "biggest existential threat." The WEF even goes so far as to claim we're still not able to comprehend AI's full potential or full risk.

The report notes that risks around issues such as generating disinformation and deepfakes are well known, but suggests that more investigation is needed into the risks AI poses in areas including brain-computer interfaces.

A warning is also issued about the unintended consequences of quantum computing, should it arrive at some point over the course of the next decade, as some suggest. While, like other innovations, it will bring benefits to society, it also creates a problem for encryption in its current state.

SEE: Cybersecurity in an IoT and Mobile World (ZDNet special report)

By dramatically reducing the time required to solve the mathematical problems that today's encryption relies on (potentially to just seconds), it will render cybersecurity as we know it obsolete. That could have grave consequences for re-securing almost every aspect of 21st century life, the report warns, especially if cyber criminals or other malicious hackers gain access to quantum technology that they could use to commit attacks against personal data, critical infrastructure and power grids.

"These technologies are really reshaping industry, technology and society at large, but we don't have the protocols around these to make sure of a positive impact on society," said Mirek Dusek, deputy head of the centre for geopolitical and regional affairs and member of the executive committee at the World Economic Forum.

However, it isn't all doom and gloom, because despite the challenges posed by cyberattacks, the World Economic Forum notes that efforts to address the security challenges of new technologies are "maturing," even if they're still sometimes fragmented.

"Numerous initiatives bring together businesses and governments to build trust, promote security in cyberspace, assess the impact of cyberattacks and assist victims," the report says.

Quantum Computing and Israel’s Growing Tech Role | Sam Bocetta – The Times of Israel

It's time to adjust to a world that is changing from the digital landscape we have grown accustomed to. Traditional computing is evolving as quantum computing takes center stage.

Traditional computing uses the binary system, a digital language made up of strings of 1s and 0s. Quantum computing is a nonbinary system that uses the qubit, which can exist as both 1 and 0 simultaneously, giving it a near-infinite number of positions and combinations. This computational ability far exceeds that of any other similar technology on the market today.

This new technology threatens to outpace our efforts in cyber defense and poses an interesting challenge to VPN companies, web hosts, and other similar industries that rely on traditional methods of standard encryption.

While leading tech giants all over the globe continue to pour hundreds of billions of dollars into their R&D programs for quantum computing, Israel has been quick to recognize the importance of the emerging industry. The Startup Nation's engineers can be found toiling away in the fight to be at the frontier of the world's next big technological innovations.

Quantum computing provides unmatched efficiency at analyzing data. To understand the scope of it, consider the aforementioned classical computing style that encodes information in binary. Picture a string of 1s and 0s about 30 digits long. This string alone has roughly one billion different combinations. A classical computer can only analyze each possibility one at a time. However, a quantum computer, thanks to a phenomenon known as superposition, can exist in each one of those billion states simultaneously. To match this unparalleled computing power, our classical computer would need one billion processors.
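The arithmetic behind the billion-combination claim is easy to verify; a quick sketch (variable names are ours, purely illustrative):

```python
# A 30-digit binary string has 2**30 possible values.
n_bits = 30
combinations = 2 ** n_bits
print(combinations)  # 1073741824 -- just over one billion

# A classical machine checking one candidate per step needs about a
# billion steps; a 30-qubit register in superposition carries
# amplitudes over all 2**30 basis states at once.
assert combinations == 1_073_741_824
```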

Consider how much time we spend using applications on the internet. Our data is constantly being stored, usually in large data centers far from us, thanks to cloud computing, which allows information to be stored at data centers and analyzed at a great distance from the user.

Tech ventures such as Microsoft Azure and Amazon AWS compete for the newest developments in this technology, knowing the positive effects it has on the web user's experience, such as access to the fastest response times, speedy data transfer, and the most powerful processing capabilities for AI.

Quantum computing has future applications in almost every facet of civilian life imaginable, including pharmaceuticals, energy, space, and more. Quantum computers could offer scientists the ability to work up close with virtual models unlike any they've had before, with the ability to analyze anything from complex chemical reactions to quantum systems. AI, the technology claiming to rival electricity in importance and implementation, is the ideal candidate for quantum computing because it often requires complex software too challenging for current systems.

Really, the world is quantum computing's oyster.

The next Silicon Valley happens to be on the other side of the world from California. Israel has gained the attention of major players in the tech sector, including giants such as Intel, Amazon, Google, and Nvidia. The Startup Nation got its nickname from its large number of startups relative to its population: approximately one startup for every 1,400 residents. In a list of the top 50 global cities for the growing tech industry, Tel Aviv, Israel comes in at #15. Israel is wrapping up 2019 with an astonishing 102% jump in the number of tech mergers and acquisitions compared to the previous year, with no signs of slowing down.

Habana Labs and Annapurna Labs, both created by entrepreneur Avigdor Willenz, were recently acquired by Intel and Amazon, respectively, to further their development in the realm of quantum computing and more powerful processors. Google, Nvidia, Marvell, Huawei, Broadcom, and Cisco have also invested billions of dollars in Israeli prospects.

One of Google's R&D centers, located in Tel Aviv, is actively heading the company's research on quantum computing. Just this year Google announced a major breakthrough that made other tech giants pick up the pace. They hinted at a computer chip that, with the power of quantum computing, was able to manage and analyze in one second an amount of data that would take any supercomputer a full day.

While Israel is reaping the benefits of its current exposure thanks to big tech firms, an anonymous source is skeptical about the long-term success of Israel's foray into the tech world without increased education and government support to keep up with demand. Like other parts of the world, Israel has a shortage of the engineers necessary to drive development.

Recognizing the need to act fast, in 2017 Professor Uri Sivan of the Technion Israel Institute of Technology led a committee dedicated to documenting the strengths and weaknesses of the current state of Israel's investment in quantum technology research and development. What the committee found was a lag in educational efforts and a need for more funding to keep pace with the fast growth of the industry.

In response to this need for funding, in 2018 Israel's Defense Ministry and the Israel Science Foundation announced a multi-year fund that would dedicate a total of $100 million to the research of quantum technologies, in hopes that this secures Israel's global position as a top contributor to new technologies.

Classic cryptography relies on the symbiotic relationship between a public key, a private key, and a classical computer's inability to reverse-engineer the private key to decrypt sensitive data. While the algorithms used thus far have proved too complex for classical computing, they are no match for the quantum computer.
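That relationship can be made concrete with a toy RSA-style example (deliberately tiny primes, chosen by us purely for illustration; real keys use primes hundreds of digits long). The public key exposes the product n = p*q, and anyone who can factor n can rebuild the private key, which is exactly what a large quantum computer running Shor's algorithm threatens to do:

```python
# Toy RSA with deliberately tiny primes -- illustration only.
p, q = 61, 53
n = p * q              # public modulus: 3233
e = 17                 # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)    # private exponent (Python 3.8+ modular inverse)

msg = 42
cipher = pow(msg, e, n)          # encrypt with the public key
assert pow(cipher, d, n) == msg  # decrypt with the private key

# An attacker who factors n (trivial here, infeasible classically for
# large n, but tractable for a quantum computer running Shor's
# algorithm) recovers phi and hence the private key:
for cand in range(2, n):
    if n % cand == 0:
        p2, q2 = cand, n // cand
        break
d_recovered = pow(e, -1, (p2 - 1) * (q2 - 1))
assert d_recovered == d
```

With an RSA-2048 modulus the trial-division loop at the end is hopeless for classical machines; it is the factoring step, not the modular arithmetic, that quantum computing would make cheap.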

Organizations are recognizing this potential crisis and jumping to find a solution. The National Institute of Standards and Technology (NIST) requested potential post-quantum algorithms in 2016. IBM recently announced its own system of quantum-safe encryption methods, known as CRYSTALS.

Current encryption methods are the walls in place that guard our personal information, from bank records and personal documents stored online to any data sent via the web, such as emails.

Just about any user with regular access to the web can benefit from the security that a VPN offers. A VPN not only protects the identity of your IP address but also secures the sensitive data we are wont to throw into the world wide web. To understand how this works, consider the concept of a tunnel. Your data is shifted through this virtual VPN tunnel, which acts as a barrier to unwanted attacks and hackers. Now, this tunnel relies on standard encryption to hide your data. Quantum computing abilities, as they become more accessible and widespread, are going to essentially destroy any effectiveness provided by industries that rely on standard encryption.

Outside of the usual surfing and data-exposing that we do on the web, lots of us are also taking advantage of opportunities to create our own websites. However, even the best web hosts leave us high and dry in the new age of quantum computing abilities and the influx of spyware and malware. WordPress, one of the more popular website platforms, can easily fall vulnerable to SQL injections, cross-site scripting attacks, and cookie hijacking. The encryptions that can be used to prevent such attacks are, you guessed it, hopeless in the face of quantum technologies.

The current state of modern technology is unsurprisingly complex and requires cybersecurity professionals with strong problem-solving skills and creativity to abate the potential threats we'll be facing within the next decade. In order to stay ahead of the game and guarantee an effective solution for web users, top VPN companies and web hosts need to invest in the research necessary to find alternatives to standard encryption. ExpressVPN has taken it a step further with a kill switch that activates if the VPN disconnects unexpectedly, and also offers VPN tunneling.

The ability for constant advancements in any field related to science and technology is what makes our world interesting. Decades ago, the abilities afforded by quantum computing would have sounded like an idea found only in an Isaac Asimov novel.

The reality of it is that quantum computing has arrived and science waits for no one. Professionals across digital industries need to shift their paradigms in order to account for this young technology that promises to remap the world as we know it.

Israel is full to the brim with potential and now is the time to invest resources and encourage education to bridge the gap and continue to be a major player in the global economy of quantum computing.

Why India is falling behind in the Y2Q race – Livemint

NEW DELHI: Two decades ago, the world faced its first big computing scare. It was dubbed Y2K, a programming bug which raised widespread concerns that digital infrastructure would crumble at the turn of the new millennium. That moment passed without any major incident, thanks in large measure to work done by India's software coders.

Now, the world faces a new scare that some scientists are calling the Y2Q ("years to quantum") moment. Y2Q, say experts, could be the next major cyber disruption. When this moment will come is not certain; most predictive estimates range from 10 to 20 years. But one thing is certain: as things stand, India has not woken up to the implications (both positive and negative) of quantum computing.

What is quantum computing? Simply put, it is a future technology that will exponentially speed up the processing power of classical computers, and solve problems in a few seconds that today's fastest supercomputers can't.

Most importantly, a quantum computer would be able to factor the product of two big prime numbers. And that means the underlying assumptions powering modern encryption won't hold when a practical quantum computer becomes a reality. Encryption forms the backbone of a secure cyberspace. It helps to protect the data we send, receive or store.

So, a quantum computer could translate into a complete breakdown of current encryption infrastructure. Cybersecurity experts have been warning about this nightmarish scenario since the late 1990s.

In October, Google announced a major breakthrough, claiming its quantum computer could solve in 200 seconds a problem that would take even the fastest classical computer 10,000 years. That means their computer had achieved "quantum supremacy," claimed the company's scientists. IBM, its chief rival in the field, responded that the claims should be taken with "a large dose of skepticism." Clearly, Google's news suggests a quantum future is a question not of if, but when.

India lags behind

As the US and China lead the global race in quantum technology, and other developed nations follow by investing significant intellectual and fiscal resources (see Future Danger), India lags far behind. "Indian government is late, but efforts have begun in the last two years," said Debajyoti Bera, a professor at Indraprastha Institute of Information Technology (IIIT) Delhi, who researches quantum computing.

Mint's interviews with academic researchers, private sector executives and government officials paint a bleak picture of India's ability to be a competent participant. For one, the ecosystem is ill-equipped: just a few hundred researchers in the country work in this domain, and they do so in discrete silos.

There are legacy reasons: India's weakness in building hardware and manufacturing technology impedes efforts to turn theoretical ideas into real products. Whatever little is moving is primarily through the government: private sector participation, and investment, remains lacklustre. And, of course, there's a funding crunch.

All this has left India's top security officials concerned. Lieutenant General (retd) Rajesh Pant, national cybersecurity coordinator, who reports to the Prime Minister's Office, identified many gaps in the Indian quantum ecosystem. "There is an absence of a quantum road map. There is no visibility in the quantum efforts and successes, and there is a lack of required skill power," Pant said at an event in December, while highlighting the advances China has made in the field. "As the national cybersecurity coordinator, this is a cause of concern for me."

The task at hand

In a traditional computer (your phone or laptop, for instance), every piece of information, be it text or video, is ultimately a long string of "bits": each bit can be either zero or one. No other value is possible. In a quantum computer, "bits" are replaced by "qubits", where each unit can exist in both states, zero and one, at the same time. That is what makes the processing superfast: qubits can encode and process more information than bits.
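
The "both states at the same time" idea can be made concrete with a few lines of classical simulation (a minimal sketch: a register of qubits is described by one amplitude per basis state, and measurement probabilities are the squared amplitudes):

```python
import itertools
import math

# A classical n-bit register holds exactly ONE of its 2**n possible values.
# An n-qubit register carries an amplitude for EVERY one of the 2**n basis
# states at once: that is why qubits can encode more information than bits.
n = 3
dim = 2 ** n  # 8 basis states: |000>, |001>, ..., |111>

# Equal superposition: every basis state carries amplitude 1/sqrt(8)
amp = [1 / math.sqrt(dim)] * dim

# On measurement, each state appears with probability |amplitude|**2,
# and those probabilities must sum to 1.
probs = [a * a for a in amp]
assert abs(sum(probs) - 1) < 1e-9

for bits, pr in zip(itertools.product("01", repeat=n), probs):
    print("".join(bits), round(pr, 3))  # each of the 8 outcomes is equally likely
```

Note the cost of simulating this classically: the amplitude list doubles with every added qubit, which is one intuition for where a quantum speedup could come from.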

What's most vulnerable is information generated today that has long-term value: diplomatic and military secrets or sensitive financial and healthcare data. "The information circulating on the internet that is protected with classical encryption can be harvested by an adversary. Whenever the decryption technology becomes available with the advent of quantum computers, today's secrets will break apart," explains Vadim Makarov, the chief scientist running Russia's quantum hacking lab.

From a national security perspective, there are two threads in global efforts. One is to build a quantum computer: whoever gets there first will have the capability to decrypt the secrets of the rest. Two, every country is trying to make its own communications hack-proof and secure.

The Indian game plan

There are individual programmes operating across government departments in India. "The ministry of electronics and information technology is interested in computing aspects; DRDO in encryption products and Isro in satellite communication," said a senior official at the department of science and technology (DST) who is directly involved in formulating India's quantum policy initiatives, on condition of anonymity. DRDO is the Defence Research and Development Organisation, and Isro is the Indian Space Research Organisation. DST, which works under the aegis of the central ministry of science and technology, has a mandate that revolves around making advances in scientific research.

To that end, in 2019, DST launched Quantum Information Science and Technology (QuEST), a programme wherein the government will invest ₹80 crore over the next three years to fund research directed at building quantum computers, channels for quantum communication and cryptography, among other things. Some 51 projects were selected for funding under QuEST. A quarter of the money has been released, said the DST official.

K. VijayRaghavan, principal scientific adviser, declined to be interviewed for this story. However, in a recent interview to The Print, he said: "It [QuEST] will ensure that the nation reaches, within a span of 10 years, the goal of achieving the technical capacity to build quantum computers and communications systems comparable with the best in the world, and hence earn a leadership role."

Not everyone agrees. "While QuEST is a good initiative and has helped build some momentum in academia, it is too small to make any meaningful difference to the country," said Sunil Gupta, co-founder and chief executive of QNu Labs, a Bengaluru-based startup building quantum-safe encryption products. "India needs to show their confidence and trust in startups." He added that the country needs to up the ante by committing at least $1 billion in this field for the next three years "if India wants to make any impact on the global level".

More recently, DRDO announced a new initiative: five DRDO Young Scientists Laboratories, launched by Prime Minister Narendra Modi in January with the aim of researching and developing futuristic defence technologies. One of them, set up at the Indian Institute of Technology Bombay, is dedicated to quantum technology.

The DST official said that the government is planning to launch a national mission on quantum technology. "It will be a multi-departmental initiative to enable different agencies to work together and focus on the adoption of research into technology," the official said, adding that the mission will have "clearly defined deliverables for the next 5 to 10 years." While the details are still in the works, the official said equipping India for building quantum-secure systems is on the cards.

The flaws in the plan

Why is India lagging behind? First, India doesn't have enough people working on quantum technology: the estimates differ, but they fall in the range of 100-200 researchers. "That is not enough to compete with IBM," said Anirban Pathak, a professor at Jaypee Institute of Information Technology, and a recipient of DST's QuEST funding.

Contrast that with China. "One of my former students is now a faculty member in a Chinese university. She joined a group that started just two years ago and they already have 50 faculty members on the staff," added Pathak. "In India, at no place will you find more than three faculty members working in quantum."

IIIT Delhi's Bera noted: "A lot of Indians in quantum are working abroad. Many are working in IBM to build a quantum computer. India needs to figure out a way to get those people back here."

Second, there's the lack of a coordinated effort. "There are many isolated communities in India working on various aspects: quantum hardware, quantum key distribution, information theory and other fields," said Bera. "But there is not much communication across various groups. We cross each other mostly at conferences."

Jaypee's Pathak added: "In Delhi, there are eight researchers working in six different institutes. Quantum requires many kinds of expertise, and that is needed under one roof. We need an equivalent of Isro (for space) and Barc (for atomic research) for quantum."

Third is India's legacy problem: strong on theory, but weak in hardware. That has a direct impact on the country's ability to advance in building quantum technology. The lack of research is not the impediment to preparing for a quantum future, say experts. Implementation is the challenge, the real bottleneck. The DST official quoted earlier acknowledged that some Indian researchers he works with are frustrated.

"They need infrastructure to implement their research. For that, we need to procure equipment, install it and then set it up. That requires money and time," said the official. "Indian government has recognized the gap and is working towards it."

Bera said that India should start building a quantum computer. "But the problem is that the country doesn't even have good fabrication labs. If we want to design chips, Indians have to outsource," he said. "Hardware has never been India's strong point." QNu Labs is trying to fill that gap. The technology it is developing is based on research done over a decade ago: the effort is to build hardware and make it usable.

Finally, India's private sector and investors have not stepped up their game. "If India wants something bigger, Indian tech giants like Wipro and Infosys need to step in. They have many engineers on the bench who can be involved. Academia alone or DST-funded projects can't compete with IBM," said Pathak.

The DST official agreed. "R&D is good for building prototypes. But industry partnership is crucial for implementing it in the real world," he said. One aim of the national quantum mission in the works would be to spin off startup companies and feed innovation into the ecosystem. "We plan to bring venture capitalists (VCs) under one umbrella."

In conclusion

Pant, the national cybersecurity chief, minced no words at the event in December 2019 on quantum technology.

"In 1993, there was an earthquake in Latur and we created the National Disaster Management Authority, which now has a presence across the country." He added: "Are we waiting for a cybersecurity earthquake to strike before we get our act together?"

Samarth Bansal is a freelance journalist based in Delhi. He writes about technology, politics and policy.

‘How can we compete with Google?’: the battle to train quantum coders – The Guardian

There is a laboratory deep within University College London (UCL) that looks like a cross between a rebel base in Star Wars and a scene imagined by Jules Verne. Hidden within the miles of cables, blinking electronic equipment and screens is a gold-coloured contraption known as a dilution refrigerator. Its job is to chill the highly sensitive equipment needed to build a quantum computer to close to absolute zero, the coldest temperature in the known universe.

Standing around the refrigerator are students from Germany, Spain and China, who are studying to become members of an elite profession that has never existed before: quantum engineering. These scientists take the developments in quantum mechanics over the past century and turn them into revolutionary real-world applications in, for example, artificial intelligence, self-driving vehicles, cryptography and medicine.

The problem is that there is now what analysts call a "quantum bottleneck". Owing to the fast growth of the industry, not enough quantum engineers are being trained in the UK or globally to meet expected demand. This skills shortage has been identified as a crucial challenge and will, if unaddressed, threaten Britain's position as one of the world's top centres for quantum technologies.

"The lack of access to a pipeline of talent will pose an existential threat to our company, and others like it," says James Palles-Dimmock, commercial director of London- and Oxford-based startup Quantum Motion. "You are not going to make a quantum computer with 1,000 average people; you need 10 to 100 incredibly good people, and that'll be the case for everybody worldwide, so access to the best talent is going to define which companies succeed and which fail."

This doesn't just matter to niche companies; it affects everyone. "If the UK is to remain at the leading edge of the world economy then it has to compete with the leading technological and scientific developments," warns Professor Paul Warburton, director of the CDT in Delivering Quantum Technologies. "This is the only way we can maintain our standard of living."

This quantum bottleneck is only going to grow more acute. Data is scarce, but according to research by the Quantum Computing Report and the University of Wisconsin-Madison, on one day in June 2016 there were just 35 advertised vacancies worldwide at commercial quantum companies. By December, that figure had leapt to 283.

In the UK, Quantum Motion estimates that the industry will need another 150-200 quantum engineers over the next 18 months. In contrast, Bristol University's centre for doctoral training produces about 10 qualified engineers each year.

In the recent past, quantum engineers would have studied for their PhDs in small groups inside much larger physics departments. Now there are interdisciplinary centres for doctoral training at UCL and Bristol University, where graduates in such subjects as maths, engineering and computer science, as well as physics, work together. As many of the students come with limited experience of quantum technologies, the first year of their four-year course is a compulsory introduction to the subject.

"Rather than work with three or four people inside a large physics department, it's really great to be working with lots of people all on quantum, whether they are computer scientists or engineers. They have a high level of knowledge of the same problems, but a different way of thinking about them because of their different backgrounds," says Bristol student Naomi Solomons.

While Solomons is fortunate to study on an interdisciplinary course, these are few and far between in the UK. "We are still overwhelmingly recruiting physicists," says Paul Warburton. "We really need to massively increase the number of PhD students from outside the physics domain to really transform this sector."

The second problem, according to Warburton, is competition with the US. "Anyone who graduates with a PhD in quantum technologies in this country is well sought after in the USA." The risk of lucrative US companies poaching UK talent is considerable. "How can we compete with Google or D-Wave if it does get into an arms race?" says Palles-Dimmock. "They can chuck $300,000-$400,000 at people to make sure they have the engineers they want."

There are parallels with the fast growth of AI. In 2015, Uber's move to gut Carnegie Mellon University's world-leading robotics lab of nearly all its staff (about 50 in total) to help it build autonomous cars showed what can happen when a shortage of engineers causes a bottleneck.

Worryingly, Doug Finke, managing editor at the Quantum Computing Report, has spotted a similar pattern emerging in the quantum industry today. "The large expansion of quantum computing in the commercial space has encouraged a number of academics to leave academia and join a company, and this may create some shortages of professors to teach the next generation of students," he says.

More needs to be done to significantly increase the flow of engineers. One way is through diversity: Bristol has just held its first women in quantum event with a view to increasing its number of female students above the current 20%.

Another option is to create different levels of quantum engineers. "A master's degree or a four-year dedicated undergraduate degree could be the way to mass-produce engineers, because industry players often don't need a PhD-trained individual," says Turner. "But I think you would be training more a kind of foot soldier than an industry leader."

One potential roadblock could be growing threats to the free movement of ideas and people. "Nations seem to be starting to get a bit protective about what they're doing," says Prof John Morton, founding director of Quantum Motion. "[They] are often using concocted reasons of national security to justify retaining a commercial advantage for their own companies."

Warburton says he has especially seen this in the US. This reinforces the need for the UK to train its own quantum engineers. "We can't rely on getting our technology from other nations. We need to have our own quantum technology capability."

2020s — the Decade of AI and Quantum – Inside Higher Ed

Too often, we look ahead assuming that the technologies and structures of today will be in place for years to come. Yet a look back confirms that change has moved at a dramatic pace in higher education.

Reviewing the incredible progress each decade brings makes me wonder, if I knew at the beginning of the decade what was coming, how might I have better prepared?

Make no mistake, we have crossed the threshold into the fourth industrial revolution, which will advance most markedly this decade through maturing artificial intelligence, ultimately driven by quantum computing. The changes will come at an ever-increasing rate as the technologies and societal demands accelerate. Digital computers advanced over the past half century at approximately the rate described by Moore's Law, with processing power doubling every two years. Now we are entering the era of Neven's Law, which predicts that quantum computing will progress at a doubly exponential rate. This means change at a dizzyingly rapid rate that will leave many of us unable to comprehend the why and barely able to digest the daily advances that will describe reality. New platforms, products and processes will proliferate in this new decade.
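
The gap between the two growth rates is easy to underestimate. A toy comparison (assuming one "doubling period" per step for Moore's Law and the doubly exponential form 2^(2^k) commonly used to describe Neven's Law):

```python
# Moore's Law:  capability doubles each period      -> 2**k after k periods.
# Neven's Law:  doubly exponential growth (claimed) -> 2**(2**k) after k periods.
for k in range(1, 7):
    moore = 2 ** k
    neven = 2 ** (2 ** k)
    print(f"period {k}: Moore x{moore}, Neven x{neven}")
```

After just six periods, Moore's Law yields a 64-fold gain while the doubly exponential curve has already passed 10^19, which is why even rough versions of Neven's Law imply change arriving faster than intuition suggests.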

That includes higher education. The centuries-old model of the faculty member at a podium addressing a class of students who are inconsistently and inaccurately taking notes on paper or laptop will seem so quaint, inefficient and impractical that it will be laughable. Observers in 2030 will wonder how any significant learning even took place in that environment.

Semesters and seat time will not survive the coming decade. Based in 19th- and 20th-century societal needs, these are long overdue to pass away. The logical and efficient structure of outcomes-based adaptive learning will quickly overtake the older methods, doing away with redundancy for the advanced students and providing developmental learning for those in need. Each student will be at the center of their learning experience, with AI algorithms, fed by rich data about each student, mapping progress and adjusting the pathway for each learner. This will lead to personalized learning, where the courses and curriculum will be custom-made to meet the needs of the individual learner. Yet it will also serve to enhance the social experience for learners meeting face-to-face. In a report from Brookings on the topic, researchers stated that technology "can help education leapfrog in a number of ways. It can provide individualized learning by tracking progress and personalizing activities to serve heterogeneous classrooms."

Early implementations of adaptive learning in the college setting have shown that this AI-driven process can result in greater equity and success for students. In addition, faculty members see that their role has become even more important as they directly interact with individual students to enable and facilitate their learning.

Increasingly, we are gathering data about our students as they enter and progress through learning at our institutions. That big data is the "food" upon which artificial intelligence thrives. Sorting through volumes and varieties of data that in prior decades we could not efficiently process, AI can now uncover cause-and-effect pairs and webs. It can lead us to enhancements and solutions that previously were beyond our reach. As the pool of data grows and becomes more and more diverse (not just numbers, but also videos and anecdotes), the role of quantum computing comes into play.

While it is unlikely we will see quantum computers physically on the desks of university faculty and staff in the coming decade, we certainly will see cloud use of quantum computers to solve increasingly complex problems and opportunities. Quantum computers will interact with digital computers to apply deep learning at an as yet unseen scale. We will be able to pose challenges such as "what learning will researchers need to best prepare for the next generation of genetic advancement?" Faster than a blink of an eye, the quantum computers will respond.

It turns out that major developments are occurring every day in the advancement of quantum computing. Johns Hopkins University researchers recently discovered a superconducting material that may more effectively host qubits in the future. And Oxford University researchers just uncovered ways in which strontium ions can be much more efficiently entangled for scaling quantum computers. Advancements such as these will pave the path to ever more powerful computers that will enable ever more effective adaptive, individualized and personalized learning.

We know that change is coming. We know the direction of that change. We know some of the actual tools that will be instrumental in that change. Armed with that knowledge, what can we do today to prepare for the decade of the 2020s? Rather than merely reacting to changes after the fact, can we take steps to anticipate and prepare for that change? Can our institutions be better configured to adapt to the changes that are on the horizon? And who will lead that preparation at your institution?

Original post:
2020s -- the Decade of AI and Quantum - Inside Higher Ed

Five Ways Business Directors Can Prepare For The Future Of Cybersecurity – Forbes

By Stefan Deutscher, Partner and Associate Director for Cybersecurity and IT Infrastructure, Boston Consulting Group, and Daniel Dobrygowski, Head of Governance and Policy, World Economic Forum Centre for Cybersecurity

Is your company prepared?

In a business environment where a company's reputation increasingly depends on how well it acts as a steward of customer, client and partner information, boards of directors must be able to make informed decisions about cybersecurity.

Boards exist, among many other important tasks, to set risk appetite, hold managers accountable, and create appropriate boundary conditions for employees to live up to the expectations placed on them. In an increasingly digital world, cybersecurity must be a key component of these responsibilities and business leaders need to set the example that cybersecurity is important for long-term resilience.

Here are five things that board members could do to enhance their company's cybersecurity.

1. Learn about cyber risks

Board members don't need to be experts in cybersecurity, but they do need to become more knowledgeable about cyber risk. Today, when only one-third of board meetings regularly cover cyber issues, this knowledge can be brought into the board in a number of ways. Tabletop exercises, wargaming cyber crises and other ongoing training need to be part of every board's common practice. Some companies, including those as varied as Hewlett Packard Enterprise, Goldman Sachs and Spirit Airlines, are adding an experienced board member responsible for cyber risk.

Boards also need to hear from internal and external cyber experts. Every major company would be wise to have an executive responsible for assessing and managing their cyber risk. They should, on a defined regular basis, report to the board and be able to do so frankly, with integrity.

2. Dont assume your industry is safe

Financial services firms have long known that ensuring maximum cybersecurity is a vital corporate goal, and critical infrastructure companies, like electricity utilities, have quickly adapted. Industries such as automotive, aviation and healthcare are also recognizing that their reliance on connected devices and the internet of things has vastly increased their likelihood of being targeted by cyberattacks, and as such has changed their risk profile.

But even across industries aware of the importance of cybersecurity, cyber resilience capability and maturity vary widely. And industries that have so far been less targeted for cyberattack, like the extractive industries, will need to improve their cybersecurity posture to protect IP and other private or confidential information.

3. Include cybersecurity from the start

It is no longer possible for companies to innovate first and provide for security and privacy second. When a company is considering adapting or, even more importantly, creating new technologies, boards must demand that these technologies conform to their cyber-risk determinations and that cybersecurity be included by design from the outset.

Artificial intelligence (AI), which can change and act in ways that even its creator cannot anticipate, will particularly challenge the risk assessments of even the most cyber-savvy board. For example, while many executives look to AI as a tool to strengthen cyber defence, which it certainly can be, they often don't realize that AI is already being used by malicious actors as a tool for attack, and worse, that AI itself can become a target of attack. Board members need to understand the degree of risk their companies face with regard to AI.

Similarly, quantum computing is moving from science fiction to reality faster than mobile telephony did. Quantum computing not only has the potential for enormous value creation in certain use cases, but it also has the potential to obliterate many of the established forms of practical cryptography currently used in business environments to secure data and transactions. Companies concerned with data security must start preparing for what is called post-quantum cryptography: encryption methods that do not rely on the popular public-key algorithms that can be efficiently broken by quantum computers. Boards must ensure that their managers have their backing to experiment with these new methods to ensure future security.
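
To make the stakes concrete, here is a back-of-envelope comparison of how the best-known quantum algorithms are expected to affect common schemes. The figures are widely cited estimates (Shor's algorithm breaks factoring- and discrete-log-based public-key schemes outright; Grover's search roughly halves the effective strength of symmetric keys) and are an illustration, not a risk assessment:

```python
# Rough effect of known quantum algorithms on today's cryptography.
# Figures are commonly cited estimates, assumed here for illustration.
schemes = {
    # name: (classical security bits, post-quantum security bits, reason)
    "RSA-2048":  (112, 0,   "Shor's algorithm factors the modulus"),
    "ECC P-256": (128, 0,   "Shor's algorithm solves the discrete log"),
    "AES-128":   (128, 64,  "Grover's search halves effective key bits"),
    "AES-256":   (256, 128, "Grover's search halves effective key bits"),
}

for name, (classical, quantum, why) in schemes.items():
    verdict = "broken" if quantum < 80 else "still strong"
    print(f"{name:10} {classical:>3} -> {quantum:>3} bits ({verdict}: {why})")
```

The practical reading: public-key schemes need wholesale replacement with post-quantum algorithms, while symmetric encryption can largely be rescued by doubling key sizes.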

4. Familiarize yourself with cyber ratings and assessments

For years, many corporate leaders believed that by adding yet another cybersecurity tool or service, their company would automatically become more secure. Today, with greater experience and sophistication, analysts can move from inputs (which tools a company uses) to outcomes (what those tools achieve) to effectively and accurately assess how well a company is ensuring its cyber resilience.

Boards will need to become familiar with cybersecurity and cyber-resilience ratings quickly. In a context where people want transparency about how well a company is protecting their data, cyber reputation is company reputation. Equally important, insurers, procurement departments and credit-rating agencies are understanding the significance of such ratings, using them and making them their own. In the very near future, ensuring effective cybersecurity will become a prerequisite for obtaining a reasonable insurance rate, a contract or a good credit score.

5. Embrace cooperation

Cyberattacks used to be largely the work of isolated individuals, such as criminals or hacktivists, but today they are increasingly mounted by networked adversaries, such as organized crime groups and nation-state-backed actors, making individual defence consistently more challenging. As Stanley McChrystal, former United States Army general and senior fellow of Yale's Jackson Institute for Global Affairs, has said in reference to modern warfare, to defeat a networked enemy "we have to become a network ourselves".

To succeed in managing the cultural shift that boards and their companies need to make if they are to thrive in the hyperconnected world of the Fourth Industrial Revolution, they can't simply act alone. Boards of directors set company culture, and they need to demonstrate from the top how to partner. This means taking an active role in working with peers at other companies and across their ecosystem to develop and share best governance practices.

It also means working with government leaders and ensuring that company management does too. For some companies, it may even mean becoming part of the new global architecture of cybersecurity cooperation, as evidenced by new alliances, such as the Charter of Trust or Cybersecurity Tech Accord, which are attracting hundreds of companies around the world. Moving forward, cooperation will be the key to success in a time of increased cyber risk.

This article is related to the World Economic Forum's Annual Meeting in Davos-Klosters, Switzerland, 21-24 January 2020.

IBM Gears Up to Report Q4 Earnings: What’s in the Cards? – Yahoo Finance

International Business Machines IBM is set to report fourth-quarter 2019 results on Jan 21.

The Zacks Consensus Estimate for fourth-quarter earnings is pegged at $4.69, unchanged for the past seven days. The estimate indicates a fall of about 3.7% from the year-ago quarter's reported figure. For quarterly sales, the consensus mark stands at $21.7 billion, which suggests a year-over-year decline of 0.3%.

Notably, the company has a four-quarter positive earnings surprise of 2%, on average. In the last reported quarter, IBM delivered a positive earnings surprise of 1.5%.

In the last reported quarter, the company delivered non-GAAP earnings of $2.68 per share, which surpassed the Zacks Consensus Estimate by 1.5%. However, the bottom line fell 22% from the year-ago quarter's tally.

Revenues of $18.03 billion missed the Zacks Consensus Estimate by 1.2% and declined 3.9% on a year-over-year basis. At constant currency (cc), the metric dropped 0.6%.

Things to Watch Out

IBM is likely to have benefited from robust adoption of its cloud computing, mobile, security, analytics, cognitive technologies and AI-related solutions in the to-be-reported quarter.

Markedly, deal wins and acquisitions are expected to have played an important role in boosting the company's portfolio and expanding its clientele in the cloud market. The buyout of Red Hat is likely to have paved the way for growth in the hybrid cloud business. In fact, Red Hat's expanding foothold across Asia Pacific is anticipated to have bolstered IBM's revenues in the cloud segment.

Further, IBM is striving to enhance the efficiency of its quantum computing systems and services. In this respect, the growing clientele of the IBM Q Network is a positive. With its quantum computing initiatives, the company aims to help enterprises tackle difficult financial and technical problems in real time.

Additionally, IBM's growth in industry verticals like health, and in key areas of analytics and security, is likely to have boosted fourth-quarter performance. Notably, Watson Health has been witnessing broad-based growth in the Payer, Provider, Imaging and Life Sciences domains.

However, we note that pricing pressures related to the company's legacy hardware business and ballooning debt levels have been headwinds.

Moreover, the company has been facing declines in its IBM Z product cycle and storage business.

These downsides, along with adverse impacts from currency rates, might have exerted pressure on the to-be-reported quarter's results.

What Our Model Says

Our proven model doesn't conclusively predict an earnings beat for IBM this time around. The combination of a positive Earnings ESP and a Zacks Rank #1 (Strong Buy), 2 (Buy) or 3 (Hold) increases the odds of an earnings beat. But that's not the case here. You can uncover the best stocks to buy or sell before they're reported with our Earnings ESP Filter.

IBM has a Zacks Rank #3 and an Earnings ESP of -0.11%.

Stocks to Consider

Here are some stocks you may consider as our proven model shows that these have the right mix of elements to beat estimates this time:

Apple AAPL has an Earnings ESP of +4.08% and a Zacks Rank of 2. You can see the complete list of today's Zacks #1 Rank stocks here.

Adobe Systems ADBE has an Earnings ESP of +1.08% and a Zacks Rank of 2.

Broadcom AVGO has an Earnings ESP of +5.37% and a Zacks Rank of 3.


Source: IBM Gears Up to Report Q4 Earnings: What's in the Cards? - Yahoo Finance

Bleeding edge information technology developments – IT World Canada

What are some bleeding-edge information technology developments that a forward-thinking CIO should keep an eye on?

Here are a few emerging technologies that have caught my attention. These are likely to have an increasing impact on the world of business in the future. Consider which ones you should follow a little more closely.

A recent quantum computing advance achieved by a Google team indicates that the technology is moving out of the lab and closing in on practical business applications. Quantum computing is not likely to change routine business transaction processing or data analytics applications. However, it is likely to dramatically change the computationally intense applications required for:

Since most businesses can benefit from at least a few of these applications, quantum computing is worth evaluating. For a more detailed discussion of specific applications in various topic areas, please read: Applying Paradigm-Shifting Quantum Computers to Real-World Issues.
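To give a flavour of what makes these workloads different, a quantum state can be modelled as a vector of amplitudes. The toy single-qubit simulator below is illustrative only (real work would use a framework such as Qiskit); it applies a Hadamard gate to put a qubit that starts in a definite state into an equal superposition:

```python
import math

def apply_gate(gate, state):
    """Apply a 2x2 single-qubit gate matrix to a statevector [a, b]."""
    a, b = state
    return [gate[0][0] * a + gate[0][1] * b,
            gate[1][0] * a + gate[1][1] * b]

# Hadamard gate: turns a definite |0> state into an equal superposition.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

state = [1.0, 0.0]            # qubit starts in |0>
state = apply_gate(H, state)  # now an equal superposition of |0> and |1>

# Measurement probabilities are the squared amplitudes: 50/50.
probs = [abs(amp) ** 2 for amp in state]
```

Simulating n qubits this way needs a vector of 2^n amplitudes, which is exactly why classical machines struggle with the applications listed above.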

Machine learning is the science of computers acting without software developers writing detailed code to handle every case in the data that the software will encounter. Machine learning software develops its own algorithms that discover knowledge from specific data and the software's prior experience. Machine learning is based on statistical concepts and computational principles.

The leading cloud computing infrastructure providers offer machine learning routines that are quite easy to integrate into applications. These routines greatly reduce the expertise barriers that have slowed machine learning adoption at many businesses.

Selected business applications of machine learning include:

For summary descriptions of specific applications, please read: 10 Companies Using Machine Learning in Cool Ways.
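The core idea, deriving a decision rule from data rather than hand-coding it, can be sketched in a few lines. The training data and threshold rule below are hypothetical, and a production system would use a library such as scikit-learn rather than this toy:

```python
def fit_threshold(samples):
    """Learn a 1-D decision threshold from labeled (value, label) pairs,
    instead of a developer hard-coding the cutoff."""
    positives = [v for v, label in samples if label == 1]
    negatives = [v for v, label in samples if label == 0]
    # Place the boundary midway between the two class means.
    mean_pos = sum(positives) / len(positives)
    mean_neg = sum(negatives) / len(negatives)
    return (mean_pos + mean_neg) / 2

def predict(threshold, value):
    return 1 if value >= threshold else 0

# Hypothetical training data: readings labeled "faulty" (1) or "ok" (0).
data = [(2.0, 0), (2.5, 0), (3.0, 0), (7.0, 1), (7.5, 1), (8.0, 1)]
t = fit_threshold(data)  # the decision rule is derived from the data
```

The point is that `fit_threshold` never enumerates cases; feed it different data and it produces a different rule, which is what distinguishes learning from conventional programming.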

Distributed ledger technology is often called blockchain. It enables new business and trust models. A distributed ledger enables all parties in a business community to see agreed information about all transactions, not just their own. That visibility builds trust within the community.

Bitcoin, a cryptocurrency, is the most widely known application of blockchain.

Distributed ledger technology has great potential to revolutionize the way governments, institutions, and corporations interact with each other and with their clients or customers. Selected business applications of distributed ledger technology include:

For descriptions of industry-specific distributed ledger applications, please read: 17 Blockchain Applications That Are Transforming Society.
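The trust property described above comes from cryptographic hash chaining: each block commits to the hash of its predecessor, so any party in the community can independently recompute and verify the whole ledger. A minimal in-memory sketch (no consensus protocol or networking, illustration only):

```python
import hashlib

GENESIS = "0" * 64  # placeholder hash for the first block's predecessor

def block_hash(index, prev_hash, data):
    """Hash a block's contents together with the previous block's hash."""
    payload = f"{index}|{prev_hash}|{data}".encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    prev = chain[-1]["hash"] if chain else GENESIS
    index = len(chain)
    chain.append({"index": index, "prev": prev, "data": data,
                  "hash": block_hash(index, prev, data)})

def verify(chain):
    """Any party can recompute the chain; tampering breaks the links."""
    for i, blk in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else GENESIS
        if blk["prev"] != prev or blk["hash"] != block_hash(i, prev, blk["data"]):
            return False
    return True

ledger = []
append_block(ledger, "Alice pays Bob 10")
append_block(ledger, "Bob pays Carol 4")
```

Altering any earlier block changes its hash, which no longer matches the `prev` recorded in the next block, so every participant's `verify` check fails. That shared verifiability is what builds trust across the community.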

The Industrial Internet of Things (IIoT) is a major advance on Supervisory Control and Data Acquisition (SCADA). SCADA, in many forms, has been used for decades to safely operate major industrial facilities including oil refineries, petrochemical plants, electrical power generation stations, and assembly lines of all kinds.

IIoT is a major advance over relatively expensive SCADA because it relies on dramatically cheaper components, including sensors, network bandwidth, storage and computing resources. As a result, IIoT is feasible in many smaller facilities and offers a huge increase in data points for larger facilities. Business examples where IIoT delivers considerable value include production plants, trucks, cars, jet engines, elevators, and weather buoys.

The aggressive implementation of IIoT can:

For summary descriptions of specific IIoT applications, please read: The Top 20 Industrial IoT Applications.
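To make the cheap-sensor point concrete, the sketch below rolls up a stream of raw readings into per-sensor summaries and alerts, the kind of aggregation an IIoT gateway might perform before uploading to the cloud. The sensor names, readings, and alert limit are invented for illustration:

```python
from statistics import mean

def summarize(readings, limit):
    """Aggregate raw (sensor_id, value) data points into per-sensor
    summaries, flagging any sensor whose peak exceeds the limit."""
    by_sensor = {}
    for sensor_id, value in readings:
        by_sensor.setdefault(sensor_id, []).append(value)
    return {
        sensor_id: {"avg": mean(values),
                    "max": max(values),
                    "alert": max(values) > limit}
        for sensor_id, values in by_sensor.items()
    }

# Hypothetical temperature readings (sensor id, degrees C) from a plant floor.
stream = [("pump-1", 41.0), ("pump-2", 39.5), ("pump-1", 43.2), ("pump-2", 55.1)]
report = summarize(stream, limit=50.0)
```

Because the sensors themselves are cheap, the interesting engineering moves to exactly this kind of software: deciding what to summarize at the edge and what to ship upstream.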

RISC-V is an open-source hardware instruction set architecture (ISA) for CPU microprocessors that is growing in importance. It's based on established reduced instruction set computer (RISC) principles. The open-source aspect of the RISC-V ISA is a significant change compared to the proprietary ISA designs of the dominant computer chip companies, Intel and Arm.

RISC-V offers a way around paying ISA royalties for CPU microprocessors to either of the monopolists. The royalties may not be significant for chips used in expensive servers or smartphones, but they are significant for the cheap chips required in large numbers to implement the IIoT applications listed above.

For an expanded discussion of RISC-V, please read: A new blueprint for microprocessors challenges the industry's giants.
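Because the RISC-V spec is openly published, anyone can build tooling against it without a license. As a small illustration, the sketch below encodes an RV32I ADDI instruction from its documented I-type format:

```python
def encode_addi(rd, rs1, imm):
    """Encode an RV32I ADDI instruction (I-type) as a 32-bit word.
    I-type layout, high bits to low: imm[11:0] | rs1 | funct3 | rd | opcode."""
    assert -2048 <= imm <= 2047, "I-type immediate is 12 bits, sign-extended"
    opcode = 0b0010011  # OP-IMM
    funct3 = 0b000      # ADDI
    return ((imm & 0xFFF) << 20) | (rs1 << 15) | (funct3 << 12) | (rd << 7) | opcode

# The assembler shorthand "li x1, 5" expands to "addi x1, x0, 5".
word = encode_addi(rd=1, rs1=0, imm=5)
```

An assembler, disassembler, or simulator built this way owes royalties to no one, which is precisely the economic appeal for the high-volume, low-cost chips mentioned above.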

What bleeding edge information technology developments would you add to this list? Let us know in the comments below.
