The Evolution Of Data Protection – Forbes

Data protection solutions are finally evolving to match the current state of data: distributed, cloud-centric and always-on. Data used to exist only within the corporate network, on devices that never left the physical protection of the company.

Data loss prevention (DLP) has been the default solution for protecting data. It's literally in the name. Yet countless organizations have found that DLP doesn't stop breaches, while it does generate extremely high operational overhead. The same is true of other legacy solutions such as Pretty Good Privacy (PGP) and information rights management (IRM).

DLP is only as good as the classification regime the organization enforces, and classification is invariably too rigid to keep up with fluid data movement. For DLP to prevent data egress, data must be classified correctly, but classification is complicated and fragile: what is sensitive today is not sensitive tomorrow, and vice versa. It becomes an endless battle of users trying to manage the classification of data, and ultimately both classification and DLP deteriorate over time. DLP adds extremely high operational overhead because it requires users to be classification superstars, and even then, mistakes will happen. Desjardins Group, a Canadian bank, recently made news for a malicious insider who obtained information on 2.7 million customers and over 170,000 businesses. The exact details of the breach haven't been made public yet, but DLP solutions are standard in all financial institutions.
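To see why rule-based classification is brittle, consider a minimal sketch of the kind of regex-driven egress check a DLP engine relies on (the rules and labels here are hypothetical, not any vendor's actual policy format):

```python
import re

# Hypothetical DLP egress rules: each label maps to a pattern.
# Real products ship thousands of such rules, tuned per organization.
RULES = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify(text: str) -> set:
    """Return the sensitivity labels whose pattern matches the text."""
    return {label for label, rx in RULES.items() if rx.search(text)}

def allow_egress(text: str) -> bool:
    return not classify(text)

# A match blocks egress...
assert not allow_egress("SSN: 123-45-6789")
# ...but trivial reformatting slips straight past the same rule.
assert allow_egress("SSN: 123 45 6789")
```

Every gap of this kind has to be closed by hand, which is exactly the operational overhead described above.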

PGP's encryption is a privacy tool. Users can encrypt their data so others can't access it, but PGP fails once users try to share data. Once a user distributes the encryption key, that user has completely lost control of the data: anyone with the key can decrypt it and pass the unprotected data along as they wish. PGP was never intended to secure an organization's data set. Wired went as far as claiming that PGP is dead.
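The loss of control is easy to demonstrate. The sketch below uses a symmetric Fernet key from Python's `cryptography` package as a stand-in for any shared decryption secret; PGP itself is a hybrid public-key system, but the failure mode described above is the same: whoever holds the key recovers the plaintext and can re-share it unprotected.

```python
from cryptography.fernet import Fernet

# The data owner encrypts a document under a secret key.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"Q3 acquisition target list")

# To let a collaborator read it, the owner must hand over the key...
recipient = Fernet(key)
plaintext = recipient.decrypt(ciphertext)

# ...and from this point the owner has no say: the recipient holds
# unprotected bytes and can forward them anywhere, in any format.
print(plaintext)  # b'Q3 acquisition target list'
```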

IRM is limited to a small set of applications. Typically focused on Office documents, IRM offers significant depth of protection, such as blocking copy and paste, save-as, and printing, though blocking copy and paste adds overhead for users. For organizations that only work with Word and Excel files, IRM may be an acceptable solution; organizations that need to protect non-Office files will need to find another one. IRM only works with a limited list of applications and versions, and even Microsoft Azure Information Protection has significant restrictions on file types and sizes.

A New Approach to Data Protection

A new wave of solutions has appeared in the market to significantly shift the focus of data protection. Here are four criteria against which to measure the solutions you're currently considering:

Data Protection Vs. File Protection

Protecting files is no longer the focus. Data should be protected, and should continue to be protected, as it moves from file to file and format to format; a file is simply the container that stores data. Ensure solutions are capable of automatically protecting derivative work, including copy and paste and save-as.

Identity Authentication Vs. Device Or Location Authentication

Access control should be associated with a user identity, not with devices, locations, or networks. A unified, centralized identity and access management solution allows all security permissions to be applied across multiple data protection solutions.
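As a rough illustration, identity-centric authorization keys every decision off who the authenticated user is, treating device and network as incidental metadata. This is a hypothetical sketch, not any particular IAM product's API:

```python
from dataclasses import dataclass

@dataclass
class Request:
    user_id: str    # asserted by the central identity provider
    device_id: str  # informational only; never the basis for access
    source_ip: str  # likewise

# Hypothetical central policy: permissions hang off identities, so the
# same rules apply on any device, from any location or network.
PERMISSIONS = {"alice@example.com": {"finance-reports:read"}}

def authorize(req: Request, permission: str) -> bool:
    return permission in PERMISSIONS.get(req.user_id, set())

req = Request("alice@example.com", "laptop-7", "203.0.113.9")
assert authorize(req, "finance-reports:read")        # follows the user
assert not authorize(req, "finance-reports:delete")  # regardless of device
```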

Data DNA Vs. Classification

Protection criteria should be based not on file classification but on the actual data DNA. As sensitive data moves, protection needs to follow it. Classification is too manual and adds too much operational overhead for users.
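The article doesn't define "data DNA," but one plausible reading is content fingerprinting: hashing overlapping windows of sensitive content so the data itself can be recognized wherever it reappears, even pasted into a new file or format. A minimal sketch of that idea:

```python
import hashlib

def shingles(text: str, k: int = 8) -> set:
    """Fingerprint text as the set of hashes of its k-word windows."""
    words = text.lower().split()
    return {
        hashlib.sha256(" ".join(words[i:i + k]).encode()).hexdigest()
        for i in range(max(1, len(words) - k + 1))
    }

SENSITIVE = shingles(
    "the merger will close on march 3 at a price of 42 dollars per share"
)

def looks_sensitive(candidate: str, threshold: float = 0.3) -> bool:
    fp = shingles(candidate)
    return len(fp & SENSITIVE) / max(1, len(fp)) >= threshold

# The fingerprint follows the data into a new container, no labels needed:
assert looks_sensitive(
    "fyi the merger will close on march 3 at a price of 42 dollars per share today"
)
assert not looks_sensitive("the lunch menu for tuesday is on the intranet")
```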

Transparency Vs. Usability

Legacy solutions added operational overhead for end users. The best data security solutions are the ones that are invisible to end users; don't ask users to change their behavior in the name of data protection. Only unauthorized users should notice security is in place. Data protection solutions also have to cover a broad range of applications, file types, sizes and so on: the more limitations a solution has, the less practical it will be.

With the rise of new data protection solutions, organizations need to review them and replace legacy tools that can't protect data in today's workflows or stand up to increased scrutiny of data security and privacy.


Extending the Circle of Trust with Confidential Computing – Infosecurity Magazine

The benefits of operational efficiency and flexibility delivered by public cloud resources have encouraged today's organizations to migrate applications and data to external computing platforms located outside the perceived security of on-premises infrastructures. Many businesses are now adopting a cloud-first design approach that emphasizes elastic scalability and cost reduction above ownership and management, and, in some cases, security.

Analyzing global trends in public cloud services, Gartner has predicted that spending on these resources will increase from $182.4B in 2018 to $331.2B in 2022, with 30 percent of all new software investments being cloud native by the end of 2019.

Trusting Someone Else to Guard Your Secrets

The benefits of third-party infrastructure and applications, however, come with risks. Deploying sensitive applications and data on computing platforms that are outside of an organization's owned and managed infrastructure requires trust in the service provider's hardware and software used to process, and ultimately protect, that data.

Trusting a cloud provider can prove disastrous for an organization, financially and reputationally, if the provider is the subject of a successful cyber-attack. In its Ninth Annual Cost of Cybercrime Study, Accenture reported that in 2018 the average cost of cyber-attacks involving either a malicious insider or the execution of malicious code was $3M per year, according to participants.

Confidential Computing

One response to the problem of the trustworthiness of the cloud when it comes to data protection has been the emergence of the Trusted Execution Environment (TEE), which has led to the concept of confidential computing. Industry leaders joined together to form the Confidential Computing Consortium (CCC) in October 2019.

The Confidential Computing Consortium looks to address the security issues around data in use, enabling encrypted data to be processed in memory without exposing it to the rest of the system. This is the first industry-wide initiative to address data in use, since today's encryption security approaches mostly focus on data at rest or data in transit. The work of the Confidential Computing Consortium is especially important as companies move more workloads to multiple environments, including on-premises, public cloud, hybrid, and edge environments.

Secure Enclaves

One of the most important technologies for protecting data in use is the secure enclave, such as the protected memory regions established by Intel Software Guard Extensions (SGX). Secure enclaves allow applications to execute in isolation that is enforced at the hardware level by the CPU itself. All data is encrypted in memory and decrypted only while being used inside the CPU, so the data remains protected even if the operating system, hypervisor or root user is compromised. With secure enclaves, data can, for the first time, be fully protected across its entire lifecycle: at rest, in motion and in use.

Secure enclaves can offer further security benefits using a process called attestation to verify that the CPU is genuine, and that the deployed application is the correct one and hasn't been altered.
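In outline, attestation means the hardware signs a measurement (a cryptographic hash) of the code it loaded, and a remote verifier checks both the signature and the expected measurement before trusting the enclave. The sketch below compresses that flow into a few lines, with an HMAC key standing in for the CPU's attestation key; real SGX attestation involves Intel-rooted certificate chains rather than a shared secret.

```python
import hashlib
import hmac

CPU_KEY = b"stand-in for the hardware attestation key"

def measure(code: bytes) -> bytes:
    """The enclave's 'measurement' is a hash of the loaded code."""
    return hashlib.sha256(code).digest()

def quote(code: bytes):
    """The 'CPU' reports a signed measurement of what it actually loaded."""
    m = measure(code)
    return m, hmac.new(CPU_KEY, m, hashlib.sha256).digest()

def verify(m: bytes, sig: bytes, expected_code: bytes) -> bool:
    genuine = hmac.compare_digest(
        sig, hmac.new(CPU_KEY, m, hashlib.sha256).digest()
    )
    untampered = hmac.compare_digest(m, measure(expected_code))
    return genuine and untampered  # only then release secrets to the enclave

deployed = b"enclave binary v1.2"
m, sig = quote(deployed)
assert verify(m, sig, expected_code=deployed)
assert not verify(measure(b"tampered binary"), sig, expected_code=deployed)
```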

Operating in secure enclaves with attestation gives users complete confidence that code is running as intended and that data is completely protected during processing. This approach is gaining traction; for example, it enables sensitive applications, including data analytics, machine learning and artificial intelligence, to run safely in the cloud with regulatory compliance.

Runtime Encryption

Encryption is a proven approach for effective data security, particularly when protecting data at rest and data in motion. However, as discussed above, a key requirement for confidential computing, and the focus of the Confidential Computing Consortium, is protecting data in use. When an application starts to run, its data is vulnerable to a variety of attacks, including malicious insiders, root users, credential compromise, OS zero-day, and network intruders.

Runtime encryption provides deterministic security with hardware-aided memory encryption for applications to protect data in use. Through optimization of the Trusted Computing Base (TCB), it enables encrypted data to be processed in memory without exposing it to the rest of the system.

This reduces the risks to sensitive data and provides greater control and transparency for users. Runtime encryption provides complete cryptographic protection for applications by running them securely inside a TEE and defending them even from root users and physical access to the server.

Expanding the Circle of Trust

The number one concern cited by enterprises in their move to the cloud continues to be security. Confidential computing and protecting data in use give sensitive applications a safe place that protects them from today's infrastructure attacks.

Confidential computing is critical for protecting cloud data, and it is fundamentally helping establish and expand the circle of trust in cloud computing. It creates isolated runtime environments that allow execution of sensitive applications in a protected state, keeping cloud apps and data completely secure when in use.

With secure enclaves and runtime encryption supporting confidential computing, customers know that, no matter what happens, their data remains cryptographically protected. No zero-day attack, infrastructure compromise, or even government subpoena can expose the data. Confidential computing provides the deterministic security needed for the most sensitive cloud applications, at the performance level demanded by modern Internet-scale applications.

A Secure Cloud Future

As Gartner has reported, businesses are migrating their sensitive data and applications to public cloud services, a practice that saves them from ownership and maintenance of infrastructure that will inevitably be obsolete in the future.

Leading technology providers have recognized that confidential computing provides a security model ready to address the problems of untrusted hardware and software that have hampered this transition to the cloud.

With a growing number of use cases, and with interest and deployments surging, confidential computing environments will be relied on to protect data in growing areas such as Industry 4.0, digital health, the Internet of Things (IoT), and federated machine learning systems.

As the Confidential Computing Consortium continues its work, individuals and businesses may at some point come to expect a confidential computing architecture as a prerequisite for the exchange and processing of their private data.


Encryption Software Market Size 2019, Trends, Share, Outlook and forecast to 2026 – CupMint

The report is an all-inclusive research study of the global Encryption Software Market taking into account the growth factors, recent trends, developments, opportunities, and competitive landscape. The market analysts and researchers have done extensive analysis of the global Encryption Software Market with the help of research methodologies such as PESTLE and Porter's Five Forces analysis. They have provided accurate and reliable market data and useful recommendations with an aim to help the players gain an insight into the overall present and future market scenario. The report comprises an in-depth study of the potential segments, including product type, application, and end user, and their contribution to the overall market size.

In addition, market revenues based on region and country are provided in the report. The authors of the report have also shed light on the common business tactics adopted by players. The leading players of the global Encryption Software Market and their complete profiles are included in the report. Besides that, investment opportunities, recommendations, and current trends in the global Encryption Software Market are mapped by the report. With the help of this report, the key players of the global Encryption Software Market will be able to make sound decisions and plan their strategies accordingly to stay ahead of the curve.

Global Encryption Software Market was valued at USD 3.32 billion in 2016 and is projected to reach USD 30.54 billion by 2025, growing at a CAGR of 27.96% from 2017 to 2025.
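Those figures are internally consistent: over the nine compounding years from 2016 to 2025,

```latex
\mathrm{CAGR} = \left(\frac{30.54}{3.32}\right)^{1/9} - 1 \approx 0.2796 = 27.96\%.
```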


Leading key players covered in this report:

Dell, Thales E-Security, ESET, Symantec, IBM Corporation, Sophos, CipherCloud, PKWARE, McAfee, Gemalto, Trend Micro, Microsoft Corporation

As part of primary research, our analysts interviewed a number of primary sources from the demand and supply sides of the global Encryption Software Market. This helped them to obtain both quantitative and qualitative data and information. On the demand side of the global Encryption Software Market are end users, whereas on the supply side are distributors, vendors, and manufacturers.

During our secondary research, we collected information from different sources such as databases, regulatory bodies, gold- and silver-standard websites, articles by recognized authors, certified publications, white papers, investor presentations and press releases of companies, and annual reports.

The research report includes segmentation of the global Encryption Software Market on the basis of application, technology, end users, and region. Each segment gives a microscopic view of the market. It delves deeper into the changing political scenario and the environmental concerns that are likely to shape the future of the market. Furthermore, the segment includes graphs to give readers a bird's-eye view.

Last but not least, the research report on the global Encryption Software Market profiles some of the leading companies. It mentions their strategic initiatives and gives a brief overview of their structure. Analysts have also covered the research and development status of these companies and provided complete information about their existing products and the ones in the pipeline.

Based on regions, the market is classified into North America, Europe, Asia Pacific, Middle East & Africa and Latin America. The study will provide detailed qualitative and quantitative information on the above-mentioned segments for every region and country covered under the scope of the study.

Finally, the Encryption Software Market report details the research findings and conclusions, which will help you develop profitable market strategies to gain competitive advantage. Supported by comprehensive primary as well as secondary research, the report is verified through expert advice, quality checks and a final review. The market data was analyzed and forecasted using market dynamics and consistent models.

Verified Market Research has been providing research reports, with up-to-date information and in-depth analysis, for several years now, to individuals and companies alike that are looking for accurate research data. Our aim is to save your time and resources by providing you with the required research data, so you can concentrate on progress and growth. Our data includes research from various industries, along with all necessary statistics like market trends and forecasts from reliable sources.

Mr. Edwyne Fernandes

Call: +1 (650) 781 4080



Quantum Computing Breakthrough: Silicon Qubits Interact at Long-Distance – SciTechDaily

Researchers at Princeton University have made an important step forward in the quest to build a quantum computer using silicon components, which are prized for their low cost and versatility compared to the hardware in today's quantum computers. The team showed that a silicon-spin quantum bit can communicate with another quantum bit located a significant distance away on a computer chip. The feat could enable connections between multiple quantum bits to perform complex calculations. Credit: Felix Borjans, Princeton University

Princeton scientists demonstrate that two silicon quantum bits can communicate across relatively long distances in a turning point for the technology.

Imagine a world where people could only talk to their next-door neighbor, and messages had to be passed house to house to reach distant destinations.

Until now, this has been the situation for the bits of hardware that make up a silicon quantum computer, a type of quantum computer with the potential to be cheaper and more versatile than today's versions.

Now a team based at Princeton University has overcome this limitation and demonstrated that two quantum-computing components, known as silicon spin qubits, can interact even when spaced relatively far apart on a computer chip. The study was published today (December 25, 2019) in the journal Nature.

"The ability to transmit messages across this distance on a silicon chip unlocks new capabilities for our quantum hardware," said Jason Petta, the Eugene Higgins Professor of Physics at Princeton and leader of the study. "The eventual goal is to have multiple quantum bits arranged in a two-dimensional grid that can perform even more complex calculations. The study should help in the long term to improve communication of qubits on a chip as well as from one chip to another."

Quantum computers have the potential to tackle challenges beyond the capabilities of everyday computers, such as factoring large numbers. A quantum bit, or qubit, can process far more information than an everyday computer bit because, whereas each classical computer bit can have a value of 0 or 1, a quantum bit can represent a range of values between 0 and 1 simultaneously.
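"Represent a range of values simultaneously" has a precise form: a qubit's state is a superposition of the two basis states, conventionally written as

```latex
|\psi\rangle = \alpha|0\rangle + \beta|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1,
```

where measurement yields 0 with probability |α|² and 1 with probability |β|². A register of n qubits occupies a 2^n-dimensional state space, which is where the claimed processing advantage comes from.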

To realize quantum computing's promise, these futuristic computers will require tens of thousands of qubits that can communicate with each other. Today's prototype quantum computers from Google, IBM and other companies contain tens of qubits made from a technology involving superconducting circuits, but many technologists view silicon-based qubits as more promising in the long run.

Silicon spin qubits have several advantages over superconducting qubits. The silicon spin qubits retain their quantum state longer than competing qubit technologies. The widespread use of silicon for everyday computers means that silicon-based qubits could be manufactured at low cost.

The challenge stems in part from the fact that silicon spin qubits are made from single electrons and are extremely small.

"The wiring or interconnects between multiple qubits is the biggest challenge towards a large-scale quantum computer," said James Clarke, director of quantum hardware at Intel, whose team is building silicon qubits using Intel's advanced manufacturing line, and who was not involved in the study. "Jason Petta's team has done great work toward proving that spin qubits can be coupled at long distances."

To accomplish this, the Princeton team connected the qubits via a wire that carries light in a manner analogous to the fiber optic wires that deliver internet signals to homes. In this case, however, the wire is actually a narrow cavity containing a single particle of light, or photon, that picks up the message from one qubit and transmits it to the next qubit.

The two qubits were located about half a centimeter, or about the length of a grain of rice, apart. To put that in perspective, if each qubit were the size of a house, the qubit would be able to send a message to another qubit located 750 miles away.
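That comparison holds up as a back-of-envelope calculation, assuming a roughly 10-meter house and a quantum-dot device a few tens of nanometers across:

```latex
\frac{750\ \text{miles}}{0.5\ \text{cm}}
\approx \frac{1.2\times10^{6}\ \text{m}}{5\times10^{-3}\ \text{m}}
= 2.4\times10^{8},
\qquad
\frac{10\ \text{m}}{2.4\times10^{8}} \approx 40\ \text{nm}.
```

Scaling a ~40 nm device up to the size of a house stretches half a centimeter to roughly 750 miles.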

The key step forward was finding a way to get the qubits and the photon to speak the same language by tuning all three to vibrate at the same frequency. The team succeeded in tuning both qubits independently of each other while still coupling them to the photon. Previously the device's architecture permitted coupling of only one qubit to the photon at a time.

"You have to balance the qubit energies on both sides of the chip with the photon energy to make all three elements talk to each other," said Felix Borjans, a graduate student and first author on the study. "This was the really challenging part of the work."

Each qubit is composed of a single electron trapped in a tiny chamber called a double quantum dot. Electrons possess a property known as spin, which can point up or down in a manner analogous to a compass needle that points north or south. By zapping the electron with a microwave field, the researchers can flip the spin up or down to assign the qubit a quantum state of 1 or 0.

"This is the first demonstration of entangling electron spins in silicon separated by distances much larger than the devices housing those spins," said Thaddeus Ladd, senior scientist at HRL Laboratories and a collaborator on the project. "Not too long ago, there was doubt as to whether this was possible, due to the conflicting requirements of coupling spins to microwaves and avoiding the effects of noisy charges moving in silicon-based devices. This is an important proof-of-possibility for silicon qubits because it adds substantial flexibility in how to wire those qubits and how to lay them out geometrically in future silicon-based quantum microchips."

The communication between two distant silicon-based qubit devices builds on previous work by the Petta research team. In a 2010 paper in the journal Science, the team showed it is possible to trap single electrons in quantum wells. In the journal Nature in 2012, the team reported the transfer of quantum information from electron spins in nanowires to microwave-frequency photons, and in 2016 in Science they demonstrated the ability to transmit information from a silicon-based charge qubit to a photon. They demonstrated nearest-neighbor trading of information in qubits in 2017 in Science. And the team showed in 2018 in Nature that a silicon spin qubit could exchange information with a photon.

Jelena Vuckovic, professor of electrical engineering and the Jensen Huang Professor in Global Leadership at Stanford University, who was not involved in the study, commented: "Demonstration of long-range interactions between qubits is crucial for further development of quantum technologies such as modular quantum computers and quantum networks. This exciting result from Jason Petta's team is an important milestone towards this goal, as it demonstrates non-local interaction between two electron spins separated by more than 4 millimeters, mediated by a microwave photon. Moreover, to build this quantum circuit, the team employed silicon and germanium materials heavily used in the semiconductor industry."


Reference: "Resonant microwave-mediated interactions between distant electron spins" by F. Borjans, X. G. Croot, X. Mi, M. J. Gullans and J. R. Petta, 25 December 2019, Nature. DOI: 10.1038/s41586-019-1867-y

In addition to Borjans and Petta, the following contributed to the study: Xanthe Croot, a Dicke postdoctoral fellow; associate research scholar Michael Gullans; and Xiao Mi, who earned his Ph.D. at Princeton in Petta's group and is now a research scientist at Google.

The study was funded by the Army Research Office (grant W911NF-15-1-0149) and the Gordon and Betty Moore Foundation's EPiQS Initiative (grant GBMF4535).


How This Breakthrough Makes Silicon-Based Qubit Chips The Future of Quantum Computing – Analytics India Magazine

Quantum computing has come a long way since its introduction in the 1980s. Researchers have always been on the lookout for better ways to enhance quantum computing systems, whether by making them cheaper or by making present quantum computers last longer. Alongside the latest advancements in superconducting qubits, a new way of improving silicon quantum computing has come to light, making use of silicon spin qubits for better communication.

Until now, communication between different qubits was relatively slow: messages had to be passed from one neighboring qubit to the next to cover any significant distance.

Now, researchers at Princeton University have demonstrated that two quantum-computing silicon components, known as silicon spin qubits, can interact across a relatively large distance on a chip. The study was presented in the journal Nature on December 25, 2019.

Silicon spin qubits give quantum hardware the ability to interact and transmit messages across a certain distance, which provides the hardware with new capabilities. With signals transmitted over a distance, multiple quantum bits can be arranged in two-dimensional grids that can perform more complex calculations than existing quantum hardware can. This study will help enable better communication between qubits, both on a chip and from one chip to another, with a massive impact on speed.

To take full advantage of quantum computing's capabilities, computers need as many qubits as possible communicating effectively with each other. The prototype quantum computers used by Google and IBM contain around 50 qubits and make use of superconducting circuits. Many researchers believe that silicon-based qubit chips are the future of quantum computing in the long run.

The quantum state of silicon spin qubits lasts longer than that of superconducting qubits, which is one of the superconducting approach's significant disadvantages. In addition to lasting longer, silicon, which is used throughout everyday computing, is cheap, another advantage over superconducting qubits, which cost a great deal of money: a single qubit can cost around $10,000, and that's before you consider research and development costs. With those costs in mind, the hardware alone for a universal quantum computer would come to at least $10bn.

But silicon spin qubits have their own challenges, stemming in part from the fact that they are incredibly small; each is made from a single electron. That makes interconnecting multiple qubits a huge obstacle to building a large-scale computer.

To counter the problem of interconnecting these extremely small silicon spin qubits, the Princeton team connected them with a light-carrying wire, similar to the fibre-optic cables that deliver internet to homes. The wire contains a single photon that picks up a message from one qubit and transmits it to the next. The qubits sit about half a centimetre apart; to put that in perspective, if each qubit were the size of a house, it would be like sending a message to another qubit around 750 miles away.

The next step forward for the study was to establish a way of getting the qubits and the photon to speak the same language by tuning all of them to the same frequency. Where previously the device's architecture allowed tuning of only one qubit to the photon at a time, the team now succeeded in tuning both qubits independently of each other while still coupling them to the photon.

"You have to balance the qubit energies on both sides of the chip with the photon energy to make all three elements talk to each other," said Felix Borjans, a graduate student and first author on the study, describing what he called the most challenging part of the work.

The researchers demonstrated entanglement of electron spins in silicon separated by distances much larger than the devices housing them, a significant development when it comes to wiring these qubits and laying them out in silicon-based quantum microchips.

The communication between distant silicon-based qubit devices builds on the Petta research team's earlier work: a 2010 paper in Science showing how to trap a single electron in quantum wells; a 2012 paper in Nature on the transfer of quantum information from electron spins; a 2016 paper in Science demonstrating the ability to transmit information from a silicon-based charge qubit to a photon; a 2017 paper in Science on nearest-neighbour trading of information in qubits; and a 2018 paper in Nature showing that a silicon spin qubit can exchange information with a photon.

This demonstration of interactions between two silicon spin qubits is essential for the further development of quantum technology and will help technologies like modular quantum computers and quantum networks. The team employed silicon and germanium, which are widely available in the market.



Information teleported between two computer chips for the first time – New Atlas

Scientists at the University of Bristol and the Technical University of Denmark have achieved quantum teleportation between two computer chips for the first time. The team managed to send information from one chip to another instantly without them being physically or electronically connected, in a feat that opens the door for quantum computers and quantum internet.

This kind of teleportation is made possible by a phenomenon called quantum entanglement, where two particles become so entwined with each other that they can communicate over long distances. Changing the properties of one particle will cause the other to instantly change too, no matter how much space separates the two of them. In essence, information is being teleported between them.

Hypothetically, there's no limit to the distance over which quantum teleportation can operate, and that raises some strange implications that puzzled even Einstein himself. Our current understanding of physics says that nothing can travel faster than the speed of light, and yet, with quantum teleportation, information appears to break that speed limit. Einstein dubbed it "spooky action at a distance."

Harnessing this phenomenon could clearly be beneficial, and the new study helps bring that closer to reality. The team generated pairs of entangled photons on the chips, and then made a quantum measurement of one. This observation changes the state of the photon, and those changes are then instantly applied to the partner photon in the other chip.

"We were able to demonstrate a high-quality entanglement link across two chips in the lab, where photons on either chip share a single quantum state," says Dan Llewellyn, co-author of the study. "Each chip was then fully programmed to perform a range of demonstrations which utilize the entanglement. The flagship demonstration was a two-chip teleportation experiment, whereby the individual quantum state of a particle is transmitted across the two chips after a quantum measurement is performed. This measurement utilizes the strange behavior of quantum physics, which simultaneously collapses the entanglement link and transfers the particle state to another particle already on the receiver chip."
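The protocol Llewellyn describes (entangle, measure, then apply the result of the collapse on the receiver) can be reproduced in a small state-vector simulation. Below is a sketch in NumPy, offered purely for illustration (the actual experiment is photonic hardware, not software), with qubit 0 holding the state to teleport and qubit 2 playing the role of the receiver chip:

```python
import numpy as np

# Single-qubit gates.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def kron(*ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Qubit 0 holds the state to teleport; qubits 1 and 2 (the "receiver
# chip") start in the entangled Bell pair (|00> + |11>)/sqrt(2).
alpha, beta = 0.6, 0.8
psi = np.array([alpha, beta], dtype=complex)
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(psi, bell)  # 8 amplitudes, qubit order 0,1,2

# Sender's operations: CNOT (qubit 0 controls qubit 1), then H on qubit 0.
CNOT01 = np.zeros((8, 8), dtype=complex)
for b in range(8):
    q0, q1, q2 = (b >> 2) & 1, (b >> 1) & 1, b & 1
    CNOT01[(q0 << 2) | ((q1 ^ q0) << 1) | q2, b] = 1
state = kron(H, I, I) @ (CNOT01 @ state)

# Measure qubits 0 and 1 (we fix one of the four outcomes, here 1 and 0),
# which collapses the entanglement link; keep and renormalize qubit 2.
m0, m1 = 1, 0
keep = [b for b in range(8)
        if ((b >> 2) & 1) == m0 and ((b >> 1) & 1) == m1]
receiver = state[keep]
receiver /= np.linalg.norm(receiver)

# The two classical measurement bits tell the receiver which correction
# to apply: X if m1 is 1, then Z if m0 is 1.
receiver = (np.linalg.matrix_power(Z, m0)
            @ np.linalg.matrix_power(X, m1) @ receiver)

# The receiver now holds the original state, never sent directly.
assert np.allclose(receiver, psi)
```

For outcome (1, 0) the receiver's amplitudes before correction are (α, −β); the Z correction restores (α, β), mirroring the collapse-and-transfer behavior described in the quote.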

The team reported a teleportation success rate of 91 percent, and managed to perform some other functions that will be important for quantum computing. That includes entanglement swapping (where states can be passed between particles that have never directly interacted via a mediator), and entangling as many as four photons together.

Information has been teleported over much longer distances before: first across a room, then 25 km (15.5 mi), then 100 km (62 mi), and eventually over 1,200 km (746 mi) via satellite. It's also been done between different parts of a single computer chip before, but teleporting between two different chips is a major breakthrough for quantum computing.

The research was published in the journal Nature Physics.

Source: University of Bristol


Same Plastic That Make Legos Could Also Be The Best Thermal Insulators Used in Quantum Computers – KTLA Los Angeles

If you thought that Legos were the coolest toys on the planet while you were growing up, it turns out that you were right.

Scientists at Lancaster University in England conducted an experiment in which they froze several Lego blocks to the lowest possible temperature, and what they discovered could be useful in the development of quantum computing.

Led by Dr. Dmitry Zmeev, the scientists used a custom-made dilution refrigerator, which the university says is the most effective refrigerator in the world. The dilution refrigerator at Lancaster University can reach 1.6 millikelvin above absolute zero (minus 273.15 degrees Celsius, or minus 459.67 degrees Fahrenheit). That is 200,000 times colder than room temperature and 2,000 times colder than deep space, according to the university.
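Both comparisons are simple ratios of absolute temperatures (deep space sits at the 2.7 K cosmic microwave background):

```latex
\frac{T_{\text{room}}}{T_{\text{fridge}}} \approx \frac{293\ \text{K}}{0.0016\ \text{K}} \approx 1.8\times10^{5},
\qquad
\frac{T_{\text{space}}}{T_{\text{fridge}}} \approx \frac{2.7\ \text{K}}{0.0016\ \text{K}} \approx 1.7\times10^{3},
```

in line with the university's rounded figures of 200,000 and 2,000.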

The team of scientists placed a Lego figure along with four Lego blocks inside the dilution refrigerator to see if Legos could be a good thermal insulator.

"We were trying to find a material that would be a thermal insulator at extremely low temperatures, yet would be relatively strong," Zmeev told CNN.

"The Lego blocks looked like good candidates: the contact area between two blocks clamped together is very small, which prompts poor thermal conduction, yet the resulting structure is very robust. And indeed, our measurements confirmed this."

Legos are made from ABS plastic, or acrylonitrile butadiene styrene. The plastic is known for its strength and durability. Among its other common uses are keys for computer keyboards.

Thermal insulation is critical to cryogenic engineering and low-temperature physics, but the materials for these applications are extremely expensive and are difficult to mold.

The very instrument the experiment was conducted with could benefit from its results. By allowing for a potentially more cost-effective solution to producing dilution refrigerators, using ABS as a thermal insulator in those refrigerators could aid in the development of quantum computing.

"Very low temperatures provided by the dilution refrigerator are necessary for the operation of existing quantum computers, such as Google's, to cool down their qubits," Zmeev said.

A qubit is the basic unit of quantum information in quantum computing.

"While it's unlikely that Lego blocks per se will be used as a part of a quantum computer, we've found the right direction for creating cheap thermal insulators: 3D printing," Zmeev said. "Lego is made from ABS plastic and one can also create ABS structures simply by 3D printing them. We are currently studying the properties of such 3D-printed structures at ultralow temperatures close to absolute zero."



2020 will be the beginning of the tech industry’s radical revisioning of the physical world – TechCrunch

These days it's easy to bemoan the state of innovation and the dynamism coming from America's cradle of technological development in Silicon Valley.

The same companies that were praised for reimagining how people organized and accessed knowledge, interacted publicly, shopped for goods and services, conducted business, and even the devices on which all of these things are done, now find themselves criticized for the ways in which they've abused the tools they've created to become some of the most profitable and wealthiest ventures in human history.

Before the decade was even half over, the concern over the poverty of purpose inherent in Silicon Valley's inventions was given voice by Peter Thiel, a man who has made billions financing the creation of the technologies whose paucity he then bemoaned.

"We are no longer living in a technologically accelerating world," Thiel told an audience at Yale University in 2013. "There is an incredible sense of deceleration."

In the six years since Thiel spoke to that audience, the only acceleration has been the pace of technology's contribution to the world's decline.

However, there are some investors who think that the next wave of big technological breakthroughs are just around the corner and that 2020 will be the year that they enter the public consciousness in a real way.

These are the venture capitalists who invest in companies that develop so-called frontier technologies (or deep tech) things like computational biology, artificial intelligence or machine learning, robotics, the space industry, advanced manufacturing using 3D printing, and quantum computing.

Continued advancements in computational power, data management, imaging and sensing technologies, and materials science are bridging researchers' ability to observe and understand phenomena with the potential to manipulate them in commercially viable ways.

As a result, increasing numbers of technology investors are seeing less risk and more reward in the formerly arcane areas of innovation investing.

"Established funds will spin up deep tech teams and more funds will be founded to address this market, especially where deep tech meets sustainability," according to Fifty Years investor Seth Bannon. "This shift will be driven from the bottom up (it's where the best founder talent is heading) and also from the top down (as more and more institutional LPs want to allocate capital to this space)."

In some ways, these investments are going to be driven by political necessity as much as technological advancement, according to Matt Ocko, a managing partner at the venture firm DCVC.

Earlier this year, DCVC closed on $725 million for two investment funds focused on deep technology investing. For Ocko, the geopolitical reality of continuing tensions with China will drive adoption of new technologies that will remake the American industrial economy.

"Whether we like it or not, US-government-driven scrutiny of China-based technology will continue in 2020. Less of it will be allowed to be deployed in the US, especially in areas of security, networking, autonomous transportation and space intelligence," writes Ocko in an email. "At the same time, US DoD efforts to streamline procurement processes will result in increasingly tighter partnerships between the DoD and tech sector. The need to bring complex manufacturing, comms, and semiconductor technology home to the US will support a renaissance in distributed manufacturing/advanced manufacturing tech and a strong wave of semiconductor and robotic innovation."


From space tourism to robo-surgeries: Investors are betting on the future like there’s no tomorrow – Financial Post

It may be difficult to envision, but there is a potential future, be it 10, 20 or even 30 years down the line, where humans are able to plan a cozy vacation into space, blast by a series of satellites that now provide them with Internet access, and have their most serious illnesses treated by allowing bioengineers to alter their DNA.

It's one possible future that proactive investors, even those in typically reactive institutional settings, have begun to place large and risky bets on becoming a reality.

In April, the Ontario Teachers' Pension Plan created a new department called the Teachers' Innovation Platform with a mandate to invest in disruptive tech, and made its first big splash in June by backing Elon Musk's SpaceX. The pension plan has particular interest in the company's Starlink project, which aims to fire more than 11,000 satellites into low orbit, interlink them all and have them act as a new provider of Internet connectivity.


The Canada Pension Plan Investment Board has put a similar emphasis on investing in disruptive technology, announcing in late 2018 that it had made a private investment in Zoox, a California-based company that aims to operate a fleet of robo-taxis. Only months ago, the pension plan bought US$162 million worth of Skyworks Solutions Inc., a semiconductor firm creating chips that will allow the next wave of phones to work in 5G networks.

As for retail investors, they've likely never had as many options to hedge their portfolios toward the future. The 2019 IPO market offered them even more, bringing a basket of futuristic options to the market, including Beyond Meat Inc., a producer of plant-based meat, and Virgin Galactic Holdings Inc., the latest brainchild of Richard Branson, which is developing spacecraft that may allow for the development of a space tourism sector.

But the investors buying these stocks aren't buying them with the hope that they'll hit their peak in 2020.

"You have to recognize the world is changing," said Hans Albrecht, the portfolio manager for Horizons ETFs' Industry 4.0 fund. "There's nothing wrong in investing in Pokemon cards if they're hot now or whatever the latest trend may be, but that's a trade. For investing you want to look 15 to 20 years down the line and say: 'Is this still going to be impacting people's lives?'"

It won't be long, Albrecht suspects, before his coffee maker is able to receive signals from his mug that tell it to begin brewing a new serving once he's three-quarters of the way through his first cup in the morning.

If that scenario sounds too futuristic, it's one that only scratches the surface, he said. When he's running low on espresso packages, a chip in his pantry keeping track of stock may be able to automatically order more from Amazon, which, at that point, may have implemented one-hour shipping to ensure he'll never run out.

Thousands of consumers already have access to smart home technology through Google Home or Amazon.com Inc.'s Alexa, which allow for the linking of devices such as thermostats, lights and televisions. It's advancements in artificial intelligence and edge computing, which will effectively replace the cloud and allow individual items in a home to process data, that will bring this technology into the future.

Figuring out how to play technology like edge computing, which may very well become mainstream in a decade, isn't exactly simple.

Investors will have two options: they can bet on the endpoint user of the technology (in Albrecht's coffee scenario, that would mean investing in the company that produces the coffee maker), or they can look to the firms that are developing the components that power it.

Albrecht leans towards the latter, suggesting that there would be far too much competition among the endpoint companies while there would only be a handful of leaders on the components side. A company like Analog Devices Inc. may play a central role in the implementation of that technology because it's building everything from the sensors and their networks to processors.

Investors may be able to apply similar logic with 5G, according to CIBC World Markets tech strategist Todd Coupland.

Consumers will likely only begin to see the wide rollout of 5G, which would enable devices to operate at speeds as much as 100 times faster than the current 4G tech, in 2020. That means it might be a bit early to invest in device producers such as Apple Inc. or Samsung Electronics Co Ltd. for that exposure. Instead, Coupland suggested investors eye a company like Keysight Technologies Inc., which builds the equipment that carriers have been using to test out their services ahead of launch.

Goldman Sachs expects 50 million to 120 million 5G devices to be active in 2020, and should that be the case, component manufacturers such as Qualcomm Inc. and Marvell Technology Group Ltd. may warrant attention, as would providers such as Nokia Oyj, which already has 50 deals in place to install its radio access equipment, AirScale, around the world. The equipment supports multiple frequencies and allows for a quick transition over to 5G.

That list doesn't include the Canadian telcos, and for good reason.

"In Canada, Rogers and Bell, their attitude is: 'See how it goes in the U.S. and we'll be at least one year behind,'" Coupland said.

5G's full potential likely won't be reached for a decade, he said, and the futuristic possibilities it opens up will likely only be reached in the second half. When combined with the power of quantum computing, managing a fleet of self-driving cars and, who knows, removing traffic lights from the streets becomes a possibility, according to Christian Weedbrook, the CEO of Toronto-based quantum computing company Xanadu.

Weedbrook's company has gained the attention of Georgian Partners, a private-sector venture capital firm that has invested hundreds of millions of dollars in upstart Canadian tech companies.

What makes quantum computing a draw for Jason Brenier, Georgian's vice-president of strategy, is its ability to solve previously unsolvable problems.

Weedbrook imagines a future where quantum computers control hundreds of autonomous vehicles for Uber Inc. or Lyft Inc., providing each individual car with the fastest route to its destination, analyzing traffic, timing a trip perfectly so that red lights can be mostly or completely avoided, and, in a car-pool scenario, figuring out how to do all that with multiple stops.

Investing in early stage technology comes with its challenges. Because Georgian focuses on private investments, there is no stock performance to point to and not much in the way of fundamentals to rely on.

Many of these tech companies that are seeking funding from the firm may show promise but won't pan out in the future. Brenier knows this and says that's one of the reasons why Georgian has its own scientists on staff.

Instead of making blind bets on the future, Georgian turns to its applied research and development team to identify new opportunities based on new academic research and to even conduct their own in order to determine whether a new idea is actually viable.

"That gives us some unique insight into how some of these things are taking off, how practical they are from an investment perspective and determining the timing of some of them," Brenier said.

The Georgian team is futurist, but there's still a limit on how far in advance they want to support a new wave of tech. "We don't want to work on things that take 20 years to make a breakthrough," Brenier said.

Where breakthroughs may be even rarer for futurist investors, but the potential returns all the sweeter, is in health care. The possibilities here, especially when tech plays a part in the equation, appear to be boundless.

Albrecht sees the potential in robots being able to perform surgery on humans. The portfolio manager highlighted Intuitive Surgical Inc. and its da Vinci Surgical System as an example of how this is already occurring. Through a console that offers them a 3D view of the surface area they'll be operating on, surgeons can use controllers to perform procedures with four robotic arms that offer a greater range of motion than human limbs.

Intuitive doesn't just sell the machines; it sells accessories, like scalpels, which are replaceable and need to be repeatedly ordered. So the more da Vinci units it sells, the more it opens itself up for further gains to its bottom line through accessory sales.

The next step, Albrecht said, is for this technology to allow surgeons to perform surgeries around the world remotely. After thats accomplished, humans may be removed from the equation altogether with AI.

"You take the smartest doctors in the world and they might just have the slightest tremor in their hand and might not get it perfect, but a machine will come as close to that as possible," he said.

Healthcare now makes up about a quarter of the CIBC Global Technology Fund, which is co-managed by Michel Marszal, who has a particular interest in gene therapy.

The technology may still be in development, but Marszal said scientists will soon be able to treat certain conditions, specifically those that plague humans as a result of mutated genes, by biologically engineering new sequences to replace them.

Take haemophilia, a condition that reduces the ability of a person's blood to clot. Treating haemophilia A, which is caused by a deficiency of a clotting protein called factor VIII, may soon be possible by removing cells from the patient, biologically engineering gene sequences with the protein in them and reinserting them.

Gilead Sciences Inc., a company in Marszal's mutual fund, is working on gene therapy that might even be able to fight cancer. According to Marszal, the process involves removing immune cells from a human body and genetically modifying them so that they become supercharged and are better positioned to fight cancer.

"The returns on investment in successful therapies are extremely high," Marszal said. "That's really the next decade or 25 years in medicine."

Thinking that far ahead may be difficult for the average investor, who is often concerned with year-end returns. But it might be worth stopping, as some futurists do, even during a quiet moment like a morning coffee, to consider just how different the world will look in a decade, and, perhaps selfishly, how there's profit to be made from it.



What’s Not Likely To Happen In 2020 – RTInsights

The hype surrounding technology trends in 2020 is palpable, but not every concept will make it to market. Here are a few guesses on the failures.

Next year is anticipated to be a pivotal one for a lot of technologies, such as 5G, autonomous vehicles, and IoT, as they move forward into the consumer marketplace, in some form.

However, it's unlikely that all the trends mentioned in the past few years will come to fruition in 2020. In a new whitepaper, ABI Research has compiled a list of some that are unlikely to make the cut.


IoT is already a huge market for consumers (in the form of smart home devices) and enterprise (in the form of tiny modules that take all sorts of measurements and track item efficiency), but it is rather unconsolidated.

Dan Shey, VP of enabling platforms at ABI, thinks this will remain the same:

"For many years, there have been predictions that the IoT platform supplier market will begin to consolidate, and it just won't happen. The simple reason is that there are more than 100 companies that offer device-to-cloud IoT platform services, and for every one that is acquired, there are always new ones that come to market."

Another misconception from the industry is that edge computing may overtake or cannibalize cloud growth. Kateryna Dubrova, IoT analyst at ABI, doesn't see it that way: "In fact, in the future, we will see a rapid development of the edge-cloud-fog continuum, where the technologies will complement each other, rather than cross-cannibalize."

Throughout 2019, we saw headlines confirming the arrivals of self-driving cars on our streets, but ABI analyst Susan Beardslee doubts there will be any commercially available units next year.

Quantum computing is also not coming next year, says AI and ML analyst Lian Jye Su: "Despite claims from Google in achieving quantum supremacy, the tech industry is still far away from the democratization of quantum computing technology. Quantum computing is definitely not even remotely close to the large-scale commercial deployment stage."
