Quantum Computing and Israel’s Growing Tech Role | Sam Bocetta – The Times of Israel

It's time to adjust to a world that is changing from the digital landscape we have grown accustomed to. Traditional computing is evolving as quantum computing takes center stage.

Traditional computing uses the binary system, a digital language made up of strings of 1s and 0s. Quantum computing instead uses the qubit, which can exist in a superposition of 1 and 0 simultaneously, so a register of qubits can represent an exponentially large number of states at once. This computational ability far exceeds that of any comparable technology on the market today.

This new technology threatens to outpace our efforts in cyber defense and poses an interesting challenge to VPN companies, web hosts, and other similar industries that rely on traditional methods of standard encryption.

While leading tech giants all over the globe continue to pour billions of dollars into their R&D programs for quantum computing, Israel has been quick to recognize the importance of the emerging industry. The Startup Nation's engineers can be found toiling away in the fight to be at the frontier of the world's next big technological innovations.

Quantum computing provides unmatched efficiency at analyzing data. To understand the scope of it, consider the aforementioned classical computing style that encodes information in binary. Picture a string of 1s and 0s 30 digits long. This string alone has just over one billion possible combinations (2^30). A classical computer can only examine each possibility one at a time. A quantum computer, however, thanks to a phenomenon known as superposition, can exist in each one of those billion states simultaneously. To match this computing power, loosely speaking, our classical computer would need a billion processors.
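
To see where that billion comes from, here is a minimal Python sketch (my illustration, not from the original article):

    # State counts for a 30-bit classical register (illustrative only).
    n = 30
    num_states = 2 ** n          # 1,073,741,824 -- just over one billion
    print(f"A {n}-bit string has {num_states:,} possible values.")

    # A classical machine checking one candidate per nanosecond:
    seconds = num_states / 1e9
    print(f"Checking them one at a time at 1 GHz takes ~{seconds:.1f} s.")

    # A 30-qubit register is described by 2**30 complex amplitudes, one per
    # classical state -- that is the superposition the article refers to,
    # not a literal billion parallel processors.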

Consider how much time we spend using applications on the internet. Our data is constantly being stored, usually in large data centers far from us; cloud computing allows information to be stored and analyzed at a great distance from the user.

Tech ventures such as Microsoft Azure and Amazon AWS compete to deliver the newest developments in this technology, knowing the positive effects it has on the web user's experience: the fastest response times, speedy data transfer, and the most powerful processing capabilities for AI.

Quantum computing has future applications in almost every facet of civilian life imaginable, including pharmaceuticals, energy, space, and more. Quantum computers could offer scientists the ability to work up close with virtual models unlike any they've had before, with the ability to analyze anything from complex chemical reactions to quantum systems. AI, the technology claimed to rival electricity in importance and implementation, is an ideal candidate for quantum computing because it often requires computations too demanding for current systems.

Really, the world is quantum computing's oyster.

The next Silicon Valley happens to be on the other side of the world from California. Israel has gained the attention of major players in the tech sector, including giants such as Intel, Amazon, Google, and Nvidia. The Startup Nation got its nickname from its large number of startups relative to its population, approximately one startup for every 1,400 residents. In a list of the top 50 global cities for the growing tech industry, Tel Aviv comes in at #15. Israel is wrapping up 2019 with an astonishing 102% jump in the number of tech mergers and acquisitions compared to the previous year, with no signs of slowing down.

Habana Labs and Annapurna Labs, both created by entrepreneur Avigdor Willenz, were recently acquired by Intel and Amazon, respectively, to further their development of more powerful processors. Google, Nvidia, Marvell, Huawei, Broadcom, and Cisco have also invested billions in Israeli prospects.

One of Google's R&D centers, located in Tel Aviv, is actively heading the company's research on quantum computing. Just this year Google announced a major breakthrough that made other tech giants pick up the pace: a chip that, with the power of quantum computing, performed in about 200 seconds a computation Google claimed would take the fastest classical supercomputer some 10,000 years.

While Israel is reaping the benefits of its current exposure thanks to big tech firms, an anonymous source is skeptical about the long-term success of Israel's foray into the tech world without increased education and government support to keep up with demand. Similar to other parts of the world, Israel has a shortage of the engineers needed to drive development.

Recognizing the need to act fast, in 2017 Professor Uri Sivan of the Technion Israel Institute of Technology led a committee dedicated to documenting the strengths and weaknesses of the current state of Israel's investment in quantum technology research and development. What the committee found was a lag in educational efforts and a need for more funding to keep pace with the fast growth of the industry.

In response to this need for funding, in 2018 Israel's Defense Ministry and the Israel Science Foundation announced a multi-year fund that would dedicate in total $100 million to the research of quantum technologies, in hopes that this secures Israel's global position as a top contributor to new technologies.

Classic public-key cryptography relies on the relationship between a public key, a private key, and a classical computer's inability to derive the private key from public information to decrypt sensitive data. While the underlying mathematical problems, such as factoring very large numbers, have proved intractable for classical computers, they are no match for the quantum computer.
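
As a toy illustration of that relationship (deliberately tiny numbers; a sketch of the RSA idea, never real cryptography), the private key is computable only from the two secret primes, and recovering those primes from the public modulus is exactly the factoring problem a quantum computer running Shor's algorithm would solve efficiently:

    # Toy RSA-style key pair (illustrative only; requires Python 3.8+).
    p, q = 61, 53                # the secret primes
    n = p * q                    # 3233: the public modulus
    e = 17                       # public exponent
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)          # private exponent; needs p and q to compute

    message = 42
    ciphertext = pow(message, e, n)      # anyone can encrypt with (n, e)
    plaintext = pow(ciphertext, d, n)    # only the key holder can decrypt
    assert plaintext == message

    # Security rests on factoring n being infeasible for classical machines.
    # Shor's algorithm factors n in polynomial time on a quantum computer --
    # exactly the break described above.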

Organizations are recognizing this potential crisis and jumping to find a solution. The National Institute of Standards and Technology (NIST) requested candidate post-quantum algorithms in 2016. IBM recently announced its own suite of quantum-safe cryptographic algorithms, known as CRYSTALS.

Current encryption methods are the walls in place that guard our personal information, from bank records and personal documents stored online to any data sent via the web, such as emails.

Just about any user with regular access to the web can benefit from the security that a VPN offers. A VPN not only hides the identity of your IP address but also secures the sensitive data that we are wont to throw into the world wide web. To understand how this works, consider the concept of a tunnel. Your data is routed through this virtual VPN tunnel, which acts as a barrier to unwanted attacks and hackers. Now, this tunnel relies on standard encryption to hide your data. Quantum computing abilities, as they become more accessible and widespread, are going to essentially destroy the effectiveness of any industry that relies on standard encryption.
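
Here is a minimal sketch of the tunnel idea, using the Fernet recipe from the Python cryptography package as a stand-in for the symmetric cipher (an assumption for illustration; no real VPN protocol is this simple):

    # Tunnel-style encryption sketch (pip install cryptography).
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()    # shared secret agreed at tunnel setup
    tunnel = Fernet(key)

    packet = b"GET /bank/statement HTTP/1.1"
    sealed = tunnel.encrypt(packet)       # an eavesdropper sees opaque bytes
    assert tunnel.decrypt(sealed) == packet

    # In practice the shared key is negotiated with a public-key handshake,
    # and that handshake is the piece quantum computers threaten.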

Outside of the usual surfing and data-exposing that we do on the web, lots of us are also taking advantage of opportunities to create our own websites. However, even the best web hosts leave us high and dry in the new age of quantum computing abilities and the influx of spyware and malware. WordPress, one of the most popular content management systems, can easily fall vulnerable to SQL injections, cross-site scripting attacks, and cookie hijacking. The encryption schemes that can be used to prevent such attacks are, you guessed it, hopeless in the face of quantum technologies.

The current state of modern technology is unsurprisingly complex and requires cybersecurity professionals with strong problem-solving skills and creativity to abate the potential threats we'll be facing within the next decade. In order to stay ahead of the game and guarantee an effective solution for web users, top VPN companies and web hosts need to invest in the research necessary to find alternatives to standard encryption. ExpressVPN, for example, offers a kill switch that cuts traffic if the VPN disconnects unexpectedly, alongside its VPN tunneling features.

The capacity for constant advancement in any field related to science and technology is what makes our world interesting. Decades ago, the abilities afforded by quantum computing would have sounded like an idea at home only in an Isaac Asimov novel.

The reality of it is that quantum computing has arrived and science waits for no one. Professionals across digital industries need to shift their paradigms in order to account for this young technology that promises to remap the world as we know it.

Israel is full to the brim with potential and now is the time to invest resources and encourage education to bridge the gap and continue to be a major player in the global economy of quantum computing.

Continue reading here:
Quantum Computing and Israel's Growing Tech Role | Sam Bocetta - The Times of Israel

‘How can we compete with Google?’: the battle to train quantum coders – The Guardian

There is a laboratory deep within University College London (UCL) that looks like a cross between a rebel base in Star Wars and a scene imagined by Jules Verne. Hidden within the miles of cables, blinking electronic equipment and screens is a gold-coloured contraption known as a dilution refrigerator. Its job is to chill the highly sensitive equipment needed to build a quantum computer to close to absolute zero, the coldest possible temperature.

Standing around the refrigerator are students from Germany, Spain and China, who are studying to become members of an elite profession that has never existed before: quantum engineering. These scientists take the developments in quantum mechanics over the past century and turn them into revolutionary real-world applications in, for example, artificial intelligence, self-driving vehicles, cryptography and medicine.

The problem is that there is now what analysts call a quantum bottleneck. Owing to the fast growth of the industry, not enough quantum engineers are being trained in the UK or globally to meet expected demand. This skills shortage has been identified as a crucial challenge and will, if unaddressed, threaten Britain's position as one of the world's top centres for quantum technologies.

"The lack of access to a pipeline of talent will pose an existential threat to our company, and others like it," says James Palles-Dimmock, commercial director of London- and Oxford-based startup Quantum Motion. "You are not going to make a quantum computer with 1,000 average people; you need 10 to 100 incredibly good people, and that'll be the case for everybody worldwide, so access to the best talent is going to define which companies succeed and which fail."

This doesn't just matter to niche companies; it affects everyone. "If the UK is to remain at the leading edge of the world economy then it has to compete with the leading technological and scientific developments," warns Professor Paul Warburton, director of the CDT in Delivering Quantum Technologies. "This is the only way we can maintain our standard of living."

This quantum bottleneck is only going to grow more acute. Data is scarce, but according to research by the Quantum Computing Report and the University of Wisconsin-Madison, on one day in June 2016 there were just 35 advertised vacancies worldwide at commercial quantum companies. By December, that figure had leapt to 283.

In the UK, Quantum Motion estimates that the industry will need another 150-200 quantum engineers over the next 18 months. In contrast, Bristol University's centre for doctoral training produces about 10 qualified engineers each year.

In the recent past, quantum engineers would have studied for their PhDs in small groups inside much larger physics departments. Now there are interdisciplinary centres for doctoral training at UCL and Bristol University, where graduates in such subjects as maths, engineering and computer science, as well as physics, work together. As many of the students come with limited experience of quantum technologies, the first year of their four-year course is a compulsory introduction to the subject.

"Rather than work with three or four people inside a large physics department, it's really great to be working with lots of people all on quantum, whether they are computer scientists or engineers. They have a high level of knowledge of the same problems, but a different way of thinking about them because of their different backgrounds," says Bristol student Naomi Solomons.

While Solomons is fortunate to study on an interdisciplinary course, these are few and far between in the UK. "We are still overwhelmingly recruiting physicists," says Paul Warburton. "We really need to massively increase the number of PhD students from outside the physics domain to really transform this sector."

The second problem, according to Warburton, is competition with the US. "Anyone who graduates with a PhD in quantum technologies in this country is well sought after in the USA." The risk of lucrative US companies poaching UK talent is considerable. "How can we compete with Google or D-Wave if it does get into an arms race?" says Palles-Dimmock. "They can chuck $300,000-$400,000 at people to make sure they have the engineers they want."

There are parallels with the fast growth of AI. In 2015, Uber's move to gut Carnegie Mellon University's world-leading robotics lab of nearly all its staff (about 50 in total) to help it build autonomous cars showed what can happen when a shortage of engineers causes a bottleneck.

Worryingly, Doug Finke, managing editor at Quantum Computing Report, has spotted a similar pattern emerging in the quantum industry today. "The large expansion of quantum computing in the commercial space has encouraged a number of academics to leave academia and join a company, and this may create some shortages of professors to teach the next generation of students," he says.

More needs to be done to significantly increase the flow of engineers. One way is through diversity: Bristol has just held its first women in quantum event with a view to increasing its number of female students above the current 20%.

Another option is to create different levels of quantum engineers. "A master's degree or a four-year dedicated undergraduate degree could be the way to mass-produce engineers because industry players often don't need a PhD-trained individual," says Turner. "But I think you would be training more a kind of foot soldier than an industry leader."

One potential roadblock could be growing threats to the free movement of ideas and people. "Nations seem to be starting to get a bit protective about what they're doing," says Prof John Morton, founding director of Quantum Motion. "[They] are often using concocted reasons of national security to justify retaining a commercial advantage for their own companies."

Warburton says he has especially seen this in the US. This reinforces the need for the UK to train its own quantum engineers. "We can't rely on getting our technology from other nations. We need to have our own quantum technology capability."

Originally posted here:
'How can we compete with Google?': the battle to train quantum coders - The Guardian

Why India is falling behind in the Y2Q race – Livemint

NEW DELHI: Two decades ago, the world faced its first big computing scare. It was dubbed Y2K, a programming bug which raised widespread concerns that digital infrastructure would crumble at the turn of the new millennium. That moment passed without any major incident, thanks in large measure to work done by India's software coders.

Now, the world faces a new scare that some scientists are calling the Y2Q ("years to quantum") moment. Y2Q, say experts, could be the next major cyber disruption. When this moment will come is not certain; most predictive estimates range from 10 to 20 years. But one thing is certain: as things stand, India has not woken up to the implications (both positive and negative) of quantum computing.

What is quantum computing? Simply put, it is a future technology that will exponentially speed up the processing power of classical computers, and solve problems in a few seconds that today's fastest supercomputers can't.

Most importantly, a quantum computer would be able to factor the product of two big prime numbers. And that means the underlying assumptions powering modern encryption won't hold when a practical quantum computer becomes a reality. Encryption forms the backbone of a secure cyberspace. It helps to protect the data we send, receive or store.

So, a quantum computer could translate into a complete breakdown of current encryption infrastructure. Cybersecurity experts have been warning about this nightmarish scenario since the late 1990s.

In October, Google announced a major breakthrough, claiming its quantum computer can solve a problem in 200 seconds that would take even the fastest classical computer 10,000 years. That means their computer had achieved "quantum supremacy", claimed the company's scientists. IBM, its chief rival in the field, responded that the claims should be taken with "a large dose of skepticism". Clearly, Google's news suggests a quantum future is not a question of if, but when.

India lags behind

As the US and China lead the global race in quantum technology, and other developed nations follow by investing significant intellectual and fiscal resources, India lags far behind. "Indian government is late, but efforts have begun in the last two years," said Debajyoti Bera, a professor at Indraprastha Institute of Information Technology (IIIT) Delhi, who researches quantum computing.

Mint's interviews with academic researchers, private sector executives and government officials paint a bleak picture of India's ability to be a competent participant. For one, the ecosystem is ill-equipped: just a few hundred researchers living in the country work in this domain, that too in discrete silos.

There are legacy reasons: India's weakness in building hardware and manufacturing technology impedes efforts to turn theoretical ideas into real products. Whatever little is moving is moving primarily through the government: private sector participation, and investment, remains lacklustre. And, of course, there's a funding crunch.

All this has left India's top security officials concerned. Lieutenant General (retd) Rajesh Pant, national cybersecurity coordinator, who reports to the Prime Minister's Office, identified many gaps in the Indian quantum ecosystem. "There is an absence of a quantum road map. There is no visibility in the quantum efforts and successes, and there is a lack of required skill power," Pant said at an event in December, while highlighting the advances China has made in the field. "As the national cybersecurity coordinator, this is a cause of concern for me."

The task at hand

In a traditional computer (for instance, your phone or laptop), every piece of information, be it text or video, is ultimately a long string of "bits": each bit can be either zero or one. No other value is possible. In a quantum computer, "bits" are replaced by "qubits", where each unit can exist in both states, zero and one, at the same time. That makes the processing superfast: qubits can encode and process more information than bits.
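
A small NumPy sketch (my illustration, not from the article) of what "both states at the same time" means: a qubit is a pair of complex amplitudes, and measurement probabilities are their squared magnitudes:

    # Illustrative single-qubit simulation.
    import numpy as np

    zero = np.array([1, 0], dtype=complex)     # the |0> state
    hadamard = np.array([[1, 1],
                         [1, -1]]) / np.sqrt(2)

    qubit = hadamard @ zero                    # equal superposition of 0 and 1
    probabilities = np.abs(qubit) ** 2
    print(probabilities)                       # [0.5 0.5]

    # n qubits require 2**n amplitudes to describe -- the exponential state
    # space behind the "superfast" processing described above.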

What's most vulnerable is information generated today that has long-term value: diplomatic and military secrets or sensitive financial and healthcare data. "The information circulating on the internet that is protected with classical encryption can be harvested by an adversary. Whenever the decryption technology becomes available with the advent of quantum computers, today's secrets will break apart," explains Vadim Makarov, the chief scientist running Russia's quantum hacking lab.

From a national security perspective, there are two threads in global efforts. One is to build a quantum computer: whoever gets there first will have the capability to decrypt the secrets of the rest. Two, every country is trying to make its own communications hack-proof and secure.

The Indian game plan

"There are individual programmes operating across government departments in India. The ministry of electronics and information technology is interested in computing aspects; DRDO in encryption products and Isro in satellite communication," said a senior official at the department of science and technology (DST) who is directly involved in formulating India's quantum policy initiatives, on condition of anonymity. DRDO is the Defence Research and Development Organisation, and Isro is the Indian Space Research Organisation. DST, which works under the aegis of the central ministry of science and technology, has a mandate that revolves around advancing scientific research.

To that end, in 2019, DST launched Quantum Information Science and Technology (QuEST), a programme wherein the government will invest ₹80 crore over the next three years to fund research directed at building quantum computers, channels for quantum communication and cryptography, among other things. Some 51 projects were selected for funding under QuEST. A quarter of the money has been released, said the DST official.

K. VijayRaghavan, principal scientific adviser, declined to be interviewed for this story. However, in a recent interview to The Print, he said: "It [QuEST] will ensure that the nation reaches, within a span of 10 years, the goal of achieving the technical capacity to build quantum computers and communications systems comparable with the best in the world, and hence earn a leadership role."

Not everyone agrees. "While QuEST is a good initiative and has helped build some momentum in academia, it is too small to make any meaningful difference to the country," said Sunil Gupta, co-founder and chief executive of QNu Labs, a Bengaluru-based startup building quantum-safe encryption products. "India needs to show their confidence and trust in startups." He added that the country needs to up the ante by committing at least $1 billion to this field over the next three years if India wants "to make any impact on the global level".

More recently, DRDO announced a new initiative: five DRDO Young Scientists Laboratories, launched by Prime Minister Narendra Modi in January with the aim of researching and developing futuristic defence technologies. One lab, set up at the Indian Institute of Technology Bombay, is dedicated to quantum technology.

The DST official said that the government is planning to launch a national mission on quantum technology. "It will be a multi-departmental initiative to enable different agencies to work together and focus on the adoption of research into technology," the official said, adding that the mission will have "clearly defined deliverables for the next 5 to 10 years". While the details are still in the works, the official said equipping India for building quantum-secure systems is on the cards.

The flaws in the plan

Why is India lagging behind? First, India doesn't have enough people working on quantum technology: the estimates differ, but they fall in the range of 100-200 researchers. "That is not enough to compete with IBM," said Anirban Pathak, a professor at Jaypee Institute of Information Technology, and a recipient of DST's QuEST funding.

Contrast that with China. "One of my former students is now a faculty member in a Chinese university. She joined a group that started just two years ago and they are already 50 faculty members in the staff," added Pathak. "In India, at no place will you find more than three faculty members working in quantum."

IIIT Delhi's Bera noted: "A lot of Indians in quantum are working abroad. Many are working in IBM to build a quantum computer. India needs to figure out a way to get those people back here."

Secondly, there's the lack of a coordinated effort. "There are many isolated communities in India working on various aspects: quantum hardware, quantum key distribution, information theory and other fields," said Bera. "But there is not much communication across various groups. We cross each other mostly at conferences."

Jaypee's Pathak added: "In Delhi, there are eight researchers working in six different institutes. Quantum requires many kinds of expertise, and that is needed under one roof. We need an equivalent of Isro (for space) and Barc (for atomic research) for quantum."

Third is India's legacy problem: strong on theory, but weak in hardware. That has a direct impact on the country's ability to advance in building quantum technology. The lack of research is not the impediment to preparing for a quantum future, say experts. Implementation is the challenge, the real bottleneck. The DST official quoted earlier acknowledged that some Indian researchers he works with are frustrated.

"They need infrastructure to implement their research. For that, we need to procure equipment, install it and then set it up. That requires money and time," said the official. "Indian government has recognized the gap and is working towards it."

Bera said that India should start building a quantum computer. "But the problem is that the country doesn't even have good fabrication labs. If we want to design chips, Indians have to outsource," he said. "Hardware has never been India's strong point." QNu Labs is trying to fill that gap. The technology it is developing is based on research done over a decade ago: the effort is to build hardware and make it usable.

Finally, India's private sector and investors have not stepped up to the game. "If India wants something bigger, Indian tech giants like Wipro and Infosys need to step in. They have many engineers on the bench who can be involved. Academia alone or DST-funded projects can't compete with IBM," said Pathak.

The DST official agreed. "R&D is good for building prototypes. But industry partnership is crucial for implementing it in the real world," he said. One aim of the national quantum mission that is in the works would be to spin off startup companies and feed innovation into the ecosystem. "We plan to bring venture capitalists (VCs) under one umbrella."

In conclusion

Pant, the national cybersecurity chief, minced no words at the event in December 2019 on quantum technology.

"In 1993, there was an earthquake in Latur and we created the National Disaster Management Authority which now has a presence across the country." He added: "Are we waiting for a cybersecurity earthquake to strike before we get our act together?"

Samarth Bansal is a freelance journalist based in Delhi. He writes about technology, politics and policy.

Visit link:
Why India is falling behind in the Y2Q race - Livemint

2020s — the Decade of AI and Quantum – Inside Higher Ed

Too often, we look ahead assuming that the technologies and structures of today will be in place for years to come. Yet a look back confirms that change has moved at a dramatic pace in higher education.

Reviewing the incredible progress each decade brings makes me wonder: had I known at the beginning of the decade what was coming, how might I have better prepared?

Make no mistake, we have crossed the threshold into the fourth industrial revolution that will most markedly advance this decade through maturing artificial intelligence, ultimately driven by quantum computing. The changes will come at an ever-increasing rate as the technologies and societal demands accelerate. Digital computers advanced over the past half century at approximately the rate described by Moore's Law, with processing power doubling every two years. Now we are entering the era of Neven's Law, which predicts the speed of progress of quantum computing at a doubly exponential rate. This means change at a dizzyingly rapid rate that will leave many of us unable to comprehend the why and barely able to digest the daily advances that will describe reality. New platforms, products and processes will proliferate in this new decade.
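
To make the contrast concrete, a short sketch (arbitrary illustrative units, my example rather than Neven's own formulation) compares exponential doubling with doubly exponential growth, where the exponent itself doubles:

    # Moore's Law vs. Neven's Law growth, in arbitrary units.
    for t in range(1, 8):
        moore = 2 ** t            # exponential: power doubles each period
        neven = 2 ** (2 ** t)     # doubly exponential: the exponent doubles
        print(f"period {t}: Moore {moore:>4}  Neven {neven:,}")

    # By period 7, Moore's curve has grown 128-fold; Neven's has reached
    # 2**128 -- roughly 3.4e38.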

That includes higher education. The centuries-old model of the faculty member at a podium addressing a class of students who are inconsistently and inaccurately taking notes on paper or laptop will seem so quaint, inefficient and impractical that it will be laughable. Observers in 2030 will wonder how any significant learning even took place in that environment.

Semesters and seat time will not survive the coming decade. Based on 19th- and 20th-century societal needs, these are long overdue to pass away. The logical and efficient structure of outcomes-based adaptive learning will quickly overtake the older methods, doing away with redundancy for the advanced students and providing developmental learning for those in need. Each student will be at the center of their learning experience, with AI algorithms fed by rich data about each student mapping progress and adjusting the pathway for each learner. This will lead to personalized learning where the courses and curriculum will be custom-made to meet the needs of the individual learner. Yet it will also serve to enhance the social experience for learners meeting face-to-face. In a report from Brookings on the topic, researchers stated that technology "can help education leapfrog in a number of ways. It can provide individualized learning by tracking progress and personalizing activities to serve heterogeneous classrooms."
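
A minimal sketch of the adaptive idea (my simplification, not any vendor's actual algorithm): keep a running mastery estimate per learner per skill, and route the next activity accordingly.

    # Minimal adaptive-learning loop (illustrative, not a production system).
    def update_mastery(mastery, correct, learn_rate=0.2):
        """Nudge a 0..1 mastery estimate toward the latest observed outcome."""
        target = 1.0 if correct else 0.0
        return mastery + learn_rate * (target - mastery)

    def next_activity(mastery):
        """Route the learner based on the current estimate."""
        if mastery < 0.4:
            return "developmental review"
        if mastery < 0.8:
            return "standard practice"
        return "advanced extension"   # skip redundant material

    mastery = 0.5
    for outcome in [True, True, False, True, True]:
        mastery = update_mastery(mastery, outcome)
        print(f"mastery={mastery:.2f} -> {next_activity(mastery)}")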

Early implementations of adaptive learning in the college setting have shown that this AI-driven process can result in greater equity and success for students. In addition, faculty members see that their role has become even more important as they directly interact with individual students to enable and facilitate their learning.

Increasingly we are gathering data about our students as they enter and progress through learning at our institutions. That big data is the "food" upon which artificial intelligence thrives. Sorting through volumes and varieties of data that in prior decades we could not efficiently process, AI can now uncover cause and effect pairs and webs. It can lead us to enhancements and solutions that previously were beyond our reach. As the pool of data grows and becomes more and more diverse -- not just numbers, but also videos and anecdotes -- the role of quantum computing comes into play.

While it is unlikely we will see quantum computers physically on the desks of university faculty and staff in the coming decade, we certainly will see cloud use of quantum computers to solve increasingly complex problems and opportunities. Quantum computers will interact with digital computers to apply deep learning at an as yet unseen scale. We will be able to pose challenges such as "what learning will researchers need to best prepare for the next generation of genetic advancement?" Faster than a blink of an eye, the quantum computers will respond.

It turns out that major developments are occurring every day in the advancement of quantum computing. Johns Hopkins University researchers recently discovered a superconducting material that may more effectively host qubits in the future. And Oxford University researchers just uncovered ways in which strontium ions can be much more efficiently entangled for scaling quantum computers. Advancements such as these will pave the path to ever more powerful computers that will enable ever more effective adaptive, individualized and personalized learning.

We know that change is coming. We know the direction of that change. We know some of the actual tools that will be instrumental in that change. Armed with that knowledge, what can we do today to prepare for the decade of the 2020s? Rather than merely reacting to changes after the fact, can we take steps to anticipate and prepare for that change? Can our institutions be better configured to adapt to the changes that are on the horizon? And who will lead that preparation at your institution?

Original post:
2020s -- the Decade of AI and Quantum - Inside Higher Ed

Five Ways Business Directors Can Prepare For The Future Of Cybersecurity – Forbes

By Stefan Deutscher, Partner and Associate Director for Cybersecurity and IT Infrastructure, Boston Consulting Group, and Daniel Dobrygowski, Head of Governance and Policy, World Economic Forum Centre for Cybersecurity

Is your company prepared?

In a business environment where a company's reputation increasingly depends on how well it acts as a steward of customer, client and partner information, boards of directors must be able to make informed decisions about cybersecurity.

Boards exist, among many other important tasks, to set risk appetite, hold managers accountable, and create appropriate boundary conditions for employees to live up to the expectations placed on them. In an increasingly digital world, cybersecurity must be a key component of these responsibilities and business leaders need to set the example that cybersecurity is important for long-term resilience.

Here are five things that board members could do to enhance their company's cybersecurity.

1. Learn about cyber risks

Board members don't need to be experts in cybersecurity, but they do need to become more knowledgeable about cyber risk. Today, when only one-third of board meetings regularly cover cyber issues, this knowledge can be brought into the board in a number of ways. Tabletop exercises, wargaming cyber crises and other ongoing training need to be part of every board's common practice. Some companies, including those as varied as Hewlett Packard Enterprise, Goldman Sachs and Spirit Airlines, are adding an experienced board member responsible for cyber risk.

Boards also need to hear from internal and external cyber experts. Every major company would be wise to have an executive responsible for assessing and managing their cyber risk. They should, on a defined regular basis, report to the board and be able to do so frankly, with integrity.

2. Don't assume your industry is safe

Financial services firms have long known that ensuring maximum cybersecurity is a vital corporate goal and critical infrastructure companies, like electricity utilities, have quickly adapted. Industries such as automotive, aviation and healthcare are also recognizing that their reliance on devices and the internet of things has vastly increased their likelihood of being the target of cyberattacks and as such have changed their risk profile.

But even across industries aware of the importance of cybersecurity, cyber resilience capability and maturity vary widely. And industries that have so far been less targeted for cyberattack, like the extractive industries, will need to improve their cybersecurity posture to protect IP and other private or confidential information.

3. Include cybersecurity from the start

It is no longer possible for companies to innovate first and provide for security and privacy second. When a company is considering adapting or, even more importantly, creating new technologies, boards must demand that these technologies conform to their cyber-risk determinations and that cybersecurity be included by design from the outset.

Artificial intelligence (AI), which can change and act in ways that even its creator cannot anticipate, will particularly challenge the risk assessments of even the most cyber-savvy board. For example, while many executives look to AI as a tool to strengthen cyber defence, which it certainly can be, they often don't realize that AI is already being used by malicious actors as a tool for attack, and worse, AI itself can become a target of attack. Board members need to understand the degree of risk their companies can face with regard to AI.

Similarly, quantum computing is moving from science fiction to reality faster than mobile telephony did. Quantum computing not only has the potential for enormous value creation in certain use cases, but it also has the potential to obliterate many of the established forms of practical cryptography currently used in business environments to secure data and transactions. Companies concerned with data security must start preparing for what is called post-quantum cryptography: encryption methods that do not rely on the popular and common public-key algorithms that can be efficiently broken by quantum computers. Boards must ensure that their managers have their backing to experiment with these new methods to ensure future security.

4. Familiarize yourself with cyber ratings and assessments

For years, many corporate leaders believed that by adding yet another cybersecurity tool or service, their company would automatically become more secure. Today, with greater experience and sophistication, analysts can move from inputs (what tools do they use) to outcomes (what do the tools achieve) to effectively and accurately assess how well a company is ensuring its cyber resilience.

Boards will need to become familiar with cybersecurity and cyber-resilience ratings quickly. In a context where people want transparency about how well a company is protecting their data, cyber reputation is company reputation. Equally important, insurers, procurement departments and credit-rating agencies are understanding the significance of such ratings, using them and making them their own. In the very near future, ensuring effective cybersecurity will become a prerequisite for obtaining a reasonable insurance rate, a contract or a good credit score.

5. Embrace cooperation

Cyberattacks used to largely be the work of isolated individuals, such as criminals or hacktivists, but today they are increasingly caused by networked adversaries, such as organized crime groups and nation-state-backed actors, making individual defence consistently more challenging. As Stanley McChrystal, former United States Army General and Senior Fellow of Yale's Jackson Institute for Global Affairs, has said in reference to modern warfare, to defeat a networked enemy we have to become a network ourselves.

To succeed in managing the cultural shift that boards and their companies need to make if they are to thrive in the hyperconnected world of the Fourth Industrial Revolution, they can't simply act alone. Boards of directors set company culture and they need to demonstrate from the top how to partner. This means taking an active role in working with peers at other companies and across their ecosystem to develop and share best governance practices.

It also means working with government leaders and ensuring that company management does too. For some companies, it may even mean becoming part of the new global architecture of cybersecurity cooperation, as evidenced by new alliances, such as the Charter of Trust or Cybersecurity Tech Accord, which are attracting hundreds of companies around the world. Moving forward, cooperation will be the key to success in a time of increased cyber risk.

This article is related to the World Economic Forum's Annual Meeting in Davos-Klosters, Switzerland, 21-24 January 2020.

See more here:
Five Ways Business Directors Can Prepare For The Future Of Cybersecurity - Forbes

IBM Gears Up to Report Q4 Earnings: What’s in the Cards? – Yahoo Finance

International Business Machines IBM is set to report fourth-quarter 2019 results on Jan 21.

The Zacks Consensus Estimate for fourth-quarter earnings is pegged at $4.69, unchanged for the past seven days. The estimate indicates a fall of about 3.7% from the year-ago quarter's reported figure. For quarterly sales, the consensus mark stands at $21.7 billion, which suggests a year-over-year decline of 0.3%.

Notably, the company has a four-quarter average positive earnings surprise of 2%. In the last reported quarter, IBM delivered a positive earnings surprise of 1.5%.

In the last reported quarter, the company delivered non-GAAP earnings of $2.68 per share, which surpassed the Zacks Consensus Estimate by 1.5%. However, the bottom line fell 22% from the year-ago quarter's tally.

Revenues of $18.03 billion missed the Zacks Consensus Estimate by 1.2% and declined 3.9% on a year-over-year basis. At constant currency (cc), the metric dropped 0.6%.

International Business Machines Corporation Price, Consensus and EPS Surprise


Things to Watch Out For

IBM is likely to have benefited from robust adoption of its cloud computing, mobile, security, analytics, cognitive technologies and AI-related solutions in the to-be-reported quarter.

Markedly, deal wins and acquisitions are expected to have played an important role in boosting the company's portfolio and expanding its clientele in the cloud market. The buyout of Red Hat is likely to have paved the way for growth in the hybrid cloud business. In fact, Red Hat's expanding foothold across Asia Pacific is anticipated to have bolstered IBM's revenues in the cloud segment.

Further, IBM is striving to enhance the efficiency of its quantum computing systems and services. In this respect, the growing clientele of the IBM Q Network is a positive. With its quantum computing initiatives, the company attempts to help enterprises tackle difficult financial and technical problems in real time.

Additionally, IBM's growth in industry verticals like health, key areas of analytics and security is likely to have boosted fourth-quarter performance. Notably, Watson Health has been witnessing broad-based growth in the Payer, Provider, Imaging and Life Sciences domains.

However, we note that pricing pressures related to the company's legacy hardware business and ballooning debt levels have been headwinds.

Moreover, the company has been facing declines in its IBM Z product cycle and storage business.

These downsides along with adverse impacts from currency rates might have exerted pressure on the to-be-reported quarter's results.

What Our Model Says

Our proven model doesn't conclusively predict an earnings beat for IBM this time around. The combination of a positive Earnings ESP and a Zacks Rank #1 (Strong Buy), 2 (Buy) or 3 (Hold) increases the odds of an earnings beat. But that's not the case here. You can uncover the best stocks to buy or sell before they're reported with our Earnings ESP Filter.

IBM has a Zacks Rank #3 and an Earnings ESP of -0.11%.

Stocks to Consider

Here are some stocks you may consider as our proven model shows that these have the right mix of elements to beat estimates this time:

Apple AAPL has an Earnings ESP of +4.08% and a Zacks Rank of 2. You can see the complete list of today's Zacks #1 Rank stocks here.

Adobe Systems ADBE has an Earnings ESP of +1.08% and a Zacks Rank of 2.

Broadcom AVGO has an Earnings ESP of +5.37% and a Zacks Rank of 3.


Follow this link:
IBM Gears Up to Report Q4 Earnings: What's in the Cards? - Yahoo Finance

2020: The year of seeing clearly on AI and machine learning – ZDNet

Tom Foremski

Late last year, I complained to Richard Socher, chief scientist at Salesforce and head of its AI projects, about the term "artificial intelligence" and that we should use more accurate terms such as machine learning or smart machine systems, because "AI" creates unreasonably high expectations when the vast majority of applications are essentially extremely specialized machine learning systems that do specific tasks -- such as image analysis -- very well but do nothing else.

Socher said that when he was a post-graduate it rankled him also, and he preferred other descriptions such as statistical machine learning. He agrees that the "AI" systems that we talk about today are very limited in scope and misidentified, but these days he thinks of AI as being "Aspirational Intelligence." He likes the potential for the technology even if it isn't true today.

I like Socher's designation of AI as Aspirational Intelligence but I'd prefer not to further confuse the public, politicians and even philosophers about what AI is today: It is nothing more than software in a box -- a smart machine system that has no human qualities or understanding of what it does. It's a specialized machine that has nothing to do with the systems that these days are called Artificial General Intelligence (AGI).

Before ML systems co-opted it, the term AI was used to describe what AGI is used to describe today: computer systems that try to mimic humans, their rational and logical thinking, and their understanding of language and cultural meanings to eventually become some sort of digital superhuman, which is incredibly wise and always able to make the right decisions.

There has been a lot of progress in developing ML systems but very little progress on AGI. Yet the advances in ML are being attributed to advances in AGI. And that leads to confusion and misunderstanding of these technologies.

Machine learning systems, unlike AGI, do not try to mimic human thinking -- they use very different methods to train themselves on large amounts of specialist data and then apply their training to the task at hand. In many cases, ML systems make decisions without any explanation and it's difficult to determine the value of their black box decisions. But if those results are presented as artificial intelligence then they get far higher respect from people than they likely deserve.

For example, when ML systems are being used in applications such as recommending prison sentences but are described as artificial intelligence systems -- they gain higher regard from the people using them. It implies that the system is smarter than any judge. But if the term machine learning is used it would underline that these are fallible machines and allow people to treat the results with some skepticism in key applications.

Even if we do develop future advanced AGI systems we should continue to encourage skepticism and we should lower our expectations for their abilities to augment human decision making. It is difficult enough to find and apply human intelligence effectively -- how will artificial intelligence be any easier to identify and apply? Dumb and dumber do not add up to a genius. You cannot aggregate IQ.

As things stand today, the mislabeled AI systems are being discussed as if they were well on their way of jumping from highly specialized non-human tasks to becoming full AGI systems that can mimic human thinking and logic. This has resulted in warnings from billionaires and philosophers that those future AI systems will likely kill us all -- as if a sentient AI would conclude that genocide is rational and logical. It certainly might appear to be a winning strategy if the AI system was trained on human behavior across recorded history but that would never happen.

There is no rational logic for genocide. Future AI systems would be designed to love humanity and be programmed to protect and avoid human injury. They would likely operate very much in the vein of Richard Brautigan's 1967 poem All Watched Over By Machines Of Loving Grace -- the last stanza:

I like to think
(it has to be!)
of a cybernetic ecology
where we are free of our labors
and joined back to nature,
returned to our mammal
brothers and sisters,
and all watched over
by machines of loving grace.

Let us not fear AI systems and in 2020, let's be clear and call them machine learning systems -- because words matter.

Link:
2020: The year of seeing clearly on AI and machine learning - ZDNet

Raley's Drive To Be Different Gets an Assist From Machine Learning – Winsight Grocery Business

Raley's has brought artificial intelligence to pricing, not necessarily to go toe-to-toe with competitors but to differentiate from them, President and CEO Keith Knopf said.

Speaking in a presentation at the National Retail Federation show in New York, Knopf described how the West Sacramento, Calif.-based food retailer is using machine learning algorithms from partner Eversight to help manage its price perception amid larger, and often cheaper, competitors, while optimizing revenue by driving unit share growth and margin dollars. That benefit is going toward what he described as a differentiated positioning behind health and wellness.

"This is not just about pricing for the sake of pricing. This is pricing within a business strategy to differentiate, and afford the investment in price in a way that is both financially sustainable and also relevant to the customer," Knopf said.

Raley's has been working with Eversight for about four years, and has since invested in the Palo Alto, Calif.-based provider of AI-led pricing and promotion management. Knopf described using insights and recommendations derived from Eversight's data crunching to support its merchants, helping to strategically manage the Rubik's Cube of pricing and promoting 40,000 items, each with varying elasticity, in stores with differing customer bases, price zones and competitive characteristics.

Raley's, Knopf said, is high-priced relative to its competitors, a reflection of its size, and its ambitions. "We're a $3 billion to $4 billion retailer competing against companies much larger than us, with much greater purchasing power, and so for us, [AI pricing] is about optimization within our brand framework. We aspire to be a differentiated operator with a differentiated customer experience and a differentiated product assortment, which is guided more toward health and wellness. We have a strong position in fresh that is evolving through innovation. But we also understand that we are a high-priced, high-cost retailer."

David Moran, Eversight's co-founder, was careful to put his company's influence in perspective. Algorithms don't replace merchants or set a strategy, he said, but can support them by bringing new computing power that exceeds the work a merchant could do alone and has allowed for experimentation with pricing strategies across categories. In an example he shared, a mix of price changes, some going up, others down, helped to drive overall unit growth and profits in the olive oil category.
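
A hedged sketch of the economics underneath such experiments (a toy constant-elasticity model of my own, not Eversight's algorithm): the same price move can push units and margin in different directions, which is why some prices go up while others go down.

    # Toy constant-elasticity pricing model (illustrative only).
    def units_sold(price, base_price=5.0, base_units=1000, elasticity=-1.8):
        """Demand curve: % change in units per % change in price."""
        return base_units * (price / base_price) ** elasticity

    def margin(price, unit_cost=3.0):
        return (price - unit_cost) * units_sold(price)

    for price in [4.50, 5.00, 5.50, 6.00]:
        print(f"price ${price:.2f}: {units_sold(price):7.0f} units, "
              f"margin ${margin(price):8.2f}")

    # With elasticity -1.8, cutting price grows units yet shrinks margin;
    # the algorithm's job is to search this surface per item and per store.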

"The merchants still own the art: They are still the connection between the brand positioning, the price value perception, and they also own the execution," Knopf said. "This technology gets us down that road much faster and with greater confidence."

Knopf said he believes that pricing science, in combination with customer relationship management, will eventually trigger big changes in the nature of promotional spending by vendors, with a shift toward so-called "below the line" programs, such as everyday pricing and personalized pricing, and fewer "above the line" mass promotions, which he believes are ultimately ineffective at driving long-term growth.

"Every time we promote above the line, and everybody sees what everybody else does, no more units are sold in totality in the marketplace; it's just a matter of who's going to sell this week at what price," Knopf said. "I believe that it's in the manufacturer's best interest, and the retailer's best interest, to make pricing personalized and relevant, and the dollars that are available today will shift from promotions into a more personalized, one-on-one, curated relationship that a vendor, the retailer and the customer will share."

More:
Raleys Drive To Be Different Gets an Assist From Machine Learning - Winsight Grocery Business

Going Beyond Machine Learning To Machine Reasoning – Forbes

From Machine Learning to Machine Reasoning

The conversation around Artificial Intelligence usually revolves around technology-focused topics: machine learning, conversational interfaces, autonomous agents, and other aspects of data science, math, and implementation. However, the history and evolution of AI is more than just a technology story. The story of AI is also inextricably linked with waves of innovation and research breakthroughs that run headfirst into economic and technology roadblocks. There seems to be a continuous pattern of discovery, innovation, interest, investment, cautious optimism, boundless enthusiasm, realization of limitations, technological roadblocks, withdrawal of interest, and retreat of AI research back to academic settings. These waves of advance and retreat seem to be as consistent as the back and forth of sea waves on the shore.

This pattern of interest, investment, hype, then decline, and rinse-and-repeat is particularly vexing to technologists and investors because it doesn't follow the usual technology adoption lifecycle. Popularized by Geoffrey Moore in his book "Crossing the Chasm", technology adoption usually follows a well-defined path. Technology is developed and finds early interest by innovators, and then early adopters, and if the technology can make the leap across the "chasm", it gets adopted by the early majority market and then it's off to the races with demand by the late majority and finally technology laggards. If the technology can't cross the chasm, then it ends up in the dustbin of history. However, what makes AI distinct is that it doesn't fit the technology adoption lifecycle pattern.

But AI isn't a discrete technology. Rather it's a series of technologies, concepts, and approaches all aligning towards the quest for the intelligent machine. This quest inspires academicians and researchers to come up with theories of how the brain and intelligence works, and their concepts of how to mimic these aspects with technology. AI is a generator of technologies, which individually go through the technology lifecycle. Investors aren't investing in "AI", but rather they're investing in the output of AI research and technologies that can help achieve the goals of AI. As researchers discover new insights that help them surmount previous challenges, or as technology infrastructure finally catches up with concepts that were previously infeasible, then new technology implementations are spawned and the cycle of investment renews.

The Need for Understanding

It's clear that intelligence is like an onion (or a parfait): many layers. Once we understand one layer, we find that it only explains a limited amount of what intelligence is about. We discover there's another layer that's not quite understood, and back to our research institutions we go to figure out how it works. In Cognilytica's exploration of the intelligence of voice assistants, the benchmark aims to tease at one of those next layers: understanding. That is, knowing what something is (recognizing an image among a category of trained concepts, converting audio waveforms into words, identifying patterns among a collection of data, or even playing games at advanced levels) is different from actually understanding what those things are. This lack of understanding is why users get hilarious responses from voice assistant questions, and is also why we can't truly get autonomous machine capabilities in a wide range of situations. Without understanding, there's no common sense. Without common sense and understanding, machine learning is just a bunch of learned patterns that can't adapt to the constantly evolving changes of the real world.

One of the visual concepts that's helpful to understand these layers of increasing value is the "DIKUW Pyramid":

DIKUW Pyramid

While the Wikipedia entry above conveniently skips the Understanding step in their entry, we believe that understanding is the next logical threshold of AI capability. And like all previous layers of this AI onion, tackling this layer will require new research breakthroughs, dramatic increases in compute capabilities, and volumes of data. What? Don't we have almost limitless data and boundless computing power? Not quite. Read on.

The Quest for Common Sense: Machine Reasoning

Early in the development of artificial intelligence, researchers realized that for machines to successfully navigate the real world, they would have to gain an understanding of how the world works and how various different things are related to each other. In 1984, the world's longest-lived AI project started. The Cyc project is focused on generating a comprehensive "ontology" and knowledge base of common sense, basic concepts and "rules of thumb" about how the world works. The Cyc ontology uses a knowledge graph to structure how different concepts are related to each other, and an inference engine that allows systems to reason about facts.
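
A minimal sketch of those two pieces, a knowledge graph of triples plus one tiny inference rule (my simplification; Cyc's actual ontology and inference engine are vastly richer):

    # Knowledge-graph-with-inference sketch (not Cyc's actual engine).
    facts = {
        ("rain", "causes", "wet ground"),
        ("wet ground", "causes", "slippery sidewalk"),
        ("umbrella", "protects from", "rain"),
    }

    def infer_causes(graph):
        """Toy transitivity rule: if A causes B and B causes C, A causes C."""
        inferred = set(graph)
        changed = True
        while changed:
            changed = False
            for a, r1, b in list(inferred):
                for b2, r2, c in list(inferred):
                    if r1 == r2 == "causes" and b == b2:
                        new = (a, "causes", c)
                        if new not in inferred:
                            inferred.add(new)
                            changed = True
        return inferred

    for triple in sorted(infer_causes(facts) - facts):
        print("inferred:", triple)   # ('rain', 'causes', 'slippery sidewalk')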

The main idea behind Cyc and other understanding-oriented knowledge encodings is the realization that systems can't be truly intelligent if they don't understand what the things they recognize or classify actually are. This means we have to dig deeper than machine learning for intelligence. We need to peel the onion one level deeper, scoop out another tasty parfait layer. We need more than machine learning - we need machine reasoning.

Machine reasoning is the concept of giving machines the power to make connections between facts, observations, and all the things we can train machines to do with machine learning. Machine learning has enabled a wide range of capabilities and opened up possibilities that simply didn't exist before we could train machines to identify and recognize patterns in data. However, this power is crippled by the fact that these systems can't actually use that information for higher ends, or apply learning from one domain to another without human involvement. Even transfer learning is limited in application.
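A hedged sketch of the division of labor this implies: a pattern recognizer (stubbed out below, where a real trained model would sit) produces a label, and a thin reasoning layer connects that label to background facts the recognizer alone cannot supply. All names and relations are invented.

    def image_classifier(image):
        # Stand-in for a trained model; real ML would go here
        return "umbrella" if "umbrella" in image else "unknown"

    background = {
        "umbrella": {"used_for": "staying_dry", "suggests": "rain"},
    }

    def reason_about(label):
        # The "reasoning" step: relate the label to things the classifier never learned
        info = background.get(label)
        if info is None:
            return f"recognized '{label}' but know nothing more about it"
        return f"'{label}' is used for {info['used_for']}; seeing one suggests {info['suggests']}"

    print(reason_about(image_classifier("photo_of_umbrella")))
    # 'umbrella' is used for staying_dry; seeing one suggests rain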

Indeed, we're rapidly facing the reality that we're going to soon hit the wall on the current edge of capabilities with machine learning-focused AI. To get to that next level we need to break through this wall and shift from machine learning-centric AI to machine reasoning-centric AI. However, that's going to require some breakthroughs in research that we haven't realized yet.

The fact that the Cyc project has the distinction of being the longest-lived AI project is a bit of a back-handed compliment. The project is long-lived because, after all these decades, the quest for common sense knowledge is proving elusive. Codifying common sense into a machine-processable form is a tremendous challenge. Not only do you need to encode the entities themselves in a way that a machine knows what you're talking about, but also all the interrelationships between those entities. There are millions, if not billions, of "things" a machine needs to know. Some of these things are tangible, like "rain," but others are intangible, such as "thirst." The work of encoding these relationships is being partially automated, but it still requires humans to verify the accuracy of the connections... because, after all, if machines could do this we would have solved the machine understanding challenge already. It's a bit of a chicken-and-egg problem: you can't solve machine understanding without some way to codify the relationships between information, but you can't scalably codify all the relationships machines would need to know without some form of automation.

Are we still limited by data and compute power?

Machine learning has proven to be very data-hungry and compute-intensive. Over the past decade, many iterative enhancements have lessened the compute load and made data use more efficient. GPUs, TPUs, and emerging FPGAs help provide the raw compute horsepower needed. Yet despite these advancements, complicated machine learning models with many dimensions and parameters still require intense amounts of compute and data. Machine reasoning is easily an order of magnitude or more beyond machine learning in complexity. Accomplishing the task of reasoning out the complicated relationships between things, and truly understanding them, might be beyond today's compute and data resources.
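To put that compute appetite in rough numbers, here is a back-of-envelope sketch. The "6 FLOPs per parameter per training example" figure is a commonly cited heuristic for one forward-plus-backward pass, not an exact law, and the model size, dataset size, and accelerator throughput below are all invented for illustration.

    params = 1_000_000_000        # a 1B-parameter model (hypothetical)
    examples = 100_000_000        # training examples (hypothetical)
    epochs = 10

    total_flops = 6 * params * examples * epochs
    accelerator_flops_per_sec = 15e12   # ~15 TFLOPS, a rough GPU ballpark

    seconds = total_flops / accelerator_flops_per_sec
    print(f"~{total_flops:.1e} FLOPs, about {seconds / 86400:.1f} GPU-days")
    # ~6.0e+18 FLOPs, about 4.6 GPU-days -- for one training run of one model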

The current wave of interest and investment in AI doesn't show any signs of slowing or stopping any time soon, but it's inevitable it will slow at some point for one simple reason: we still don't understand intelligence and how it works. Despite the amazing work of researchers and technologists, we're still guessing in the dark about the mysterious nature of cognition, intelligence, and consciousness. At some point we will be faced with the limitations of our assumptions and implementations and we'll work to peel the onion one more layer and tackle the next set of challenges. Machine reasoning is quickly approaching as the next challenge we must surmount on the quest for artificial intelligence. If we can apply our research and investment talent to tackling this next layer, we can keep the momentum going with AI research and investment. If not, the pattern of AI will repeat itself, and the current wave will crest. It might not be now or even within the next few years, but the ebb and flow of AI is as inevitable as the waves upon the shore.

Read more here:
Going Beyond Machine Learning To Machine Reasoning - Forbes

Christiana Care offers tips to ‘personalize the black box’ of machine learning – Healthcare IT News

For all the potential benefits of artificial intelligence and machine learning, one of the biggest and, increasingly most publicized challenges with the technology is the potential for algorithmic bias.

But an even more basic challenge for hospitals and health systems looking to deploy AI and ML can be skepticism from frontline staff: a hesitance to use predictive models that, even if they aren't inherently biased, are certainly hard to understand.

At Delaware-based Christiana Care Health System, the past few years have seen efforts to "simplify the model without sacrificing precision," says Dr. Terri Steinberg, its chief health information officer and VP of population health informatics.

"The simpler the model, the more human beings will accept it," said Steinberg, who will talk more about this notion in a March 12 presentation at HIMSS20.

When it comes to pop health programs, the data sets used to drive the analytics matter, she explains. Whether it's EHR data, social determinants of health, claims data or even wearables information, it's key to select the most relevant data sources, use machine learning to segment the population and then, crucially, present those findings to care managers in a way that's understandable and fits their workflow.
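As a hedged illustration of the segmentation step Steinberg describes (her actual pipeline, features, and tooling are not detailed in the article), here is a minimal sketch that clusters made-up patient records into segments with k-means:

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # Hypothetical features per patient: age, ER visits last year, chronic conditions
    patients = np.array([
        [34, 0, 0],
        [71, 3, 2],
        [58, 1, 1],
        [80, 5, 4],
        [29, 0, 1],
        [65, 2, 3],
    ])

    X = StandardScaler().fit_transform(patients)   # put features on one scale
    segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

    for row, seg in zip(patients, segments):
        print(f"age={row[0]}, er_visits={row[1]}, conditions={row[2]} -> segment {seg}")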

At HIMSS20, Steinberg, alongside Health Catalyst Chief Data Scientist Jason Jones, will show how Christiana Care has been working to streamline its machine learning processes, to ensure they're more approachable and thus more likely to be embraced by its care teams.


They'll explain how to assign relative value to pop health data and discuss some of the challenges associated with integrating them; they'll show how ML can segment populations and spotlight strategies for using new data sources that will boost the value and utility of predictive models.

"We've been doing this since 2012," said Steinberg. And now we have significant time under our belts, so we wanted to come back to HIMSS and talk about what we were doing in terms of programming for care management and, more important, how we're segmenting our population with machine learning."

"There are a couple of patterns that we've seen repeated across engagements that are a little bit counter to how people typically go about building these models today, which is to sort of throw everything at them and hope for the best," said Jones, of Health Catalyst, Christiana Care's vendor partner.

At Christiana Care, he said, the goal instead has been to "help people understand as much as they would like about how the models are working, so that they will trust and actually use them."

"We've found repeatedly that we can build technically fantastic models that people just don't trust and won't use," he added. "In that case, we might as well not bother in the first place. So we're going to go through and show how it is that we can build models in such a way that they're technically excellent but also well-trusted by the people who are going to use them."

In years past, "when we built the model and put it in front of our care managers and said, 'Here you go, now customize your treatment plans based on the risk score,' what we discovered is that they basically ignored the score and did what they wanted," Steinberg explained.

But simplifying a given model to the "smallest number of participants and data elements that can be" enables the development of something "small enough for people to understand the list of components, so that they think that they know why the model has made a specific prediction," she said.
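To make that "small enough to understand" idea concrete, here is an illustrative sketch, not Christiana Care's actual model: a logistic regression over a handful of named, invented features, where each coefficient can be read off as a reason behind a patient's risk score.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    feature_names = ["er_visits", "chronic_conditions", "missed_appointments"]
    X = np.array([[0, 0, 0], [3, 2, 1], [1, 1, 0], [5, 4, 3], [0, 1, 1], [2, 3, 2]])
    y = np.array([0, 1, 0, 1, 0, 1])   # 1 = flagged high-risk (made-up labels)

    model = LogisticRegression().fit(X, y)

    # The short, human-readable list of components described above
    for name, weight in zip(feature_names, model.coef_[0]):
        print(f"{name}: weight {weight:+.2f}")
    print(f"risk for patient [2, 1, 0]: {model.predict_proba([[2, 1, 0]])[0, 1]:.2f}")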

That has more value than many population health professionals realize.

"The goal is to simplify the model as much as you can, so human beings understand the components," said Steinberg.

"People like understanding why a particular individual falls into a risk category," she said. "And then they sometimes would even like to know what the feature is that has resulted in the risk. The take home message is that the more human beings understand what the machine is doing, the more likely they are to trust the machine. We want to personalize the black box."

Steinberg and Jones will talk more about making machine learning meaningful at a HIMSS20 session titled "Machine Learning and Data Selection for Population Health." It's scheduled for Thursday, March 12, from 10-11 a.m. in room W414A.

Here is the original post:
Christiana Care offers tips to 'personalize the black box' of machine learning - Healthcare IT News