The rest is here:
Both sides of the Bitcoin: Optimism and caution as the cryptocurrency touches $35,000 - Moneycontrol
Edward Joseph Snowden (born June 21, 1983) is an American-Russian former contractor who worked for the National Security Agency (NSA) of the United States. He released top-secret NSA documents proving that the United States government was monitoring the phone calls, emails, and webcams of its own citizens. His job at the NSA gave him access to those documents. He has said, "I do not want to live in a world where anything I do or say is recorded."
Snowden traveled to Moscow after giving the documents to American journalists in Hong Kong. He was granted asylum in Russia for one year and was then given permission to stay in the country for three more years.[2] This has increased tensions between Russia and the United States. In October 2020 he was granted permanent residency in Russia.[3]
In 2013 Snowden was voted Person of the Year by The Guardian.[4]
After Snowden published information that pointed to mass surveillance by the US government, NPR reported sales of the novel Nineteen Eighty-Four by George Orwell had gone up.[5]
View post:
Edward Snowden - Simple English Wikipedia, the free encyclopedia
Quantum computing is a new generation of technology built around computers that, by some estimates, are 158 million times faster than the most sophisticated supercomputer in the world today. Such a device could, in principle, do in four minutes what would take a traditional supercomputer 10,000 years to accomplish.
For decades, our computers have all been built around the same design. Whether it is a huge machine at NASA or the laptop in your home, they are all essentially just glorified calculators and, crucially, they can only do one thing at a time.
The key to the way all computers work is that they process and store information made of binary digits called bits. These bits only have two possible values, a one or a zero. It is these numbers that create binary code, which a computer needs to read in order to carry out a specific task, according to the book Fundamentals of Computers.
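To make the idea of bits concrete, here is a minimal Python sketch (not from the article) showing how a number and a short piece of text reduce to strings of ones and zeros:

```python
# Minimal sketch (not from the article): any data a classical computer handles
# is ultimately stored as bits, each holding a 0 or a 1.
number = 42
text = "Hi"

print(bin(number))  # 0b101010 -- the integer 42 as binary digits

# Text is first encoded as numbers (ASCII here), then those numbers as bits.
print([format(ord(ch), "08b") for ch in text])  # ['01001000', '01101001']
```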
Quantum theory is a branch of physics which deals in the tiny world of atoms and the smaller (subatomic) particles inside them, according to the journal Documenta Mathematica. When you delve into this minuscule world, the laws of physics are very different to what we see around us. For instance, quantum particles can exist in multiple states at the same time. This is known as superposition.
Instead of bits, quantum computers use something called quantum bits, or 'qubits' for short. While a traditional bit can only be a one or a zero, a qubit can be a one, a zero, or both at the same time, according to a paper presented at the IEEE International Conference on Big Data.
This means that a quantum computer does not have to wait for one process to end before it can begin another; it can work on many at the same time.
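As a rough illustration of what a superposed qubit looks like mathematically, the sketch below (an illustrative toy, not from the article) represents a qubit as a two-entry vector of amplitudes and builds an equal superposition of 0 and 1:

```python
import numpy as np

# A single qubit is described by two complex amplitudes (a, b) over the basis
# states |0> and |1>, with |a|^2 + |b|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)   # definite "0"
ket1 = np.array([0, 1], dtype=complex)   # definite "1"

# An equal superposition: the qubit is "both at once" until measured.
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities come from the squared amplitudes.
print(np.abs(psi) ** 2)   # [0.5 0.5] -> 50% chance of reading 0, 50% of reading 1
```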
Imagine you had lots of doors which were all locked except for one, and you needed to find out which one was open. A traditional computer would keep trying each door, one after the other, until it found the one which was unlocked. It might take five minutes, it might take a million years, depending on how many doors there were. But a quantum computer could try all the doors at once. This is what makes them so much faster.
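The door analogy is a simplification; for an unstructured search, the realistic quantum speedup is quadratic, via Grover's algorithm, rather than unlimited. The hypothetical sketch below just compares the rough number of checks each approach needs:

```python
import math

# Illustrative sketch (not from the article): for an unstructured search over N
# "doors", a classical computer needs on the order of N checks in the worst case,
# while Grover's quantum search needs on the order of sqrt(N) steps.
def classical_worst_case(n_doors: int) -> int:
    return n_doors               # check every door, one by one

def grover_steps(n_doors: int) -> int:
    return math.isqrt(n_doors)   # roughly sqrt(N) iterations

for n in (100, 10_000, 1_000_000):
    print(n, classical_worst_case(n), grover_steps(n))
```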
As well as superposition, quantum particles exhibit another strange behaviour called entanglement, which also makes this tech so potentially ground-breaking. When two quantum particles are entangled, they form a connection to each other no matter how far apart they are. Measure one, and the outcome for the other is instantly determined, even if they're thousands of miles apart. Einstein called this particle property "spooky action at a distance", according to the journal Nature.
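A hedged toy example of entanglement is the two-qubit Bell state, sketched below with plain NumPy; only the correlated outcomes 00 and 11 ever appear:

```python
import numpy as np

# The joint state of two qubits lives in a length-4 vector over |00>, |01>, |10>, |11>.
# This is the textbook Bell state, the simplest example of entanglement.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

print(np.abs(bell) ** 2)   # [0.5 0.  0.  0.5]
# Only |00> and |11> ever occur: measuring one qubit instantly tells you the
# other's outcome, however far apart the two qubits are.
```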
As well as speed, another advantage quantum computers have over traditional computers is size. Moore's Law holds that computing power doubles roughly every two years, according to the journal IEEE Annals of the History of Computing. But to enable this, engineers have to fit more and more transistors onto a circuit board. A transistor is like a microscopic light switch which can be either off or on. This is how a computer processes the zeros and ones that make up binary code.
To solve more complex problems, you need more of those transistors. But no matter how small you make them, there are only so many you can fit onto a circuit board. So what does that mean? It means that sooner or later, traditional computers are going to be as smart as we can possibly make them, according to the Young Scientists Journal. That is where quantum machines can change things.
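As a back-of-the-envelope check on the doubling trend described above, the short sketch below starts from the roughly 2,300 transistors of the 1971 Intel 4004 (a commonly cited figure, not taken from this article) and doubles every two years:

```python
# Rough arithmetic on Moore's Law: transistor counts doubling every two years.
count = 2_300    # transistors in the Intel 4004 (1971), a commonly cited figure
year = 1971
while year < 2021:
    year += 2
    count *= 2
print(f"~{count:,} transistors by {year}")   # on the order of tens of billions
```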
The quest to build quantum computers has turned into something of a global race, with some of the biggest companies and indeed governments on the planet vying to push the technology ever further, prompting a rise in interest in quantum computing stocks on the money markets.
One example is the device created by D-Wave. It has built the Advantage system, which it says is the first and only quantum computer designed for business use, according to a press release from the company.
D-Wave said the system is built around a new processor architecture with over 5,000 qubits and 15-way qubit connectivity, which it said enables companies to solve their largest and most complex business problems.
The firm claims the machine is the first and only quantum computer that enables customers to develop and run real-world, in-production quantum applications at scale in the cloud. The firm said the Advantage is 30 times faster and delivers equal or better solutions 94% of the time compared with its previous-generation system.
But despite the huge, theoretical computational power of quantum computers, there is no need to consign your old laptop to the wheelie bin just yet. Conventional computers will still have a role to play in any new era, and are far more suited to everyday tasks such as spreadsheets, emailing and word processing, according to Quantum Computing Inc. (QCI).
Where quantum computing could really bring about radical change, though, is in predictive analytics. Because a quantum computer can make analyses and predictions at breakneck speeds, it would be able to predict weather patterns and perform traffic modelling, tasks where there are millions if not billions of variables that are constantly changing.
Standard computers can do what they are told well enough if they are fed the right computer program by a human. But when it comes to predicting things, they are not so smart. This is why the weather forecast is not always accurate: there are too many variables, too many things changing too quickly, for any conventional computer to keep up.
Because of these limitations, there are some computations an ordinary computer may never be able to solve, or that might take it literally a billion years, which is not much good if you need a quick prediction or piece of analysis.
A quantum computer, by contrast, could respond to changing information quickly and examine an enormous number of outcomes and permutations simultaneously, according to research by Rigetti Computing.
Quantum computers are also relatively small because they do not rely on transistors like traditional machines. They also consume comparatively little power, meaning they could, in theory, be better for the environment.
You can read about how to get started in quantum computing in this article by Nature. To learn more about the future of quantum computing, you can watch this TED Talk by PhD student Jason Ball.
See more here:
Quantum computing: Definition, facts & uses | Live Science
No one has yet shown the best way to build a fault-tolerant quantum computer, and multiple companies and research groups are investigating different types of qubits. We give a brief overview of some of these qubit technologies below.
A gate-based quantum computer is a device that takes input data and transforms it according to a predefined unitary operation. The operation is typically represented by a quantum circuit and is analogous to gate operations in traditional electronics. However, quantum gates are fundamentally different from electronic gates: they are reversible, unitary transformations rather than the typically irreversible logic operations of classical circuits.
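To make "predefined unitary operation" concrete, here is a small NumPy sketch (illustrative only) of the Hadamard gate acting on a qubit, plus a check that the gate is unitary:

```python
import numpy as np

# A quantum gate is a unitary matrix. The Hadamard gate H turns a definite |0>
# into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

ket0 = np.array([1, 0], dtype=complex)
print(H @ ket0)   # [0.707+0.j 0.707+0.j] -- equal amplitudes on |0> and |1>

# Unitarity (H† H = I) is what makes the operation reversible.
print(np.allclose(H.conj().T @ H, np.eye(2)))   # True
```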
Trapped-ion quantum computers implement qubits using electronic states of charged atoms called ions. The ions are confined and suspended above a microfabricated trap using electromagnetic fields. Trapped-ion systems apply quantum gates using lasers to manipulate the electronic state of the ions. Trapped-ion qubits use atoms that come from nature, rather than qubits manufactured synthetically.
Superconductivity is a set of physical properties that you can observe in certain materials, such as mercury and lead, at very low temperatures. In these materials, there is a characteristic critical temperature below which electrical resistance drops to zero and magnetic flux fields are expelled. An electric current through a loop of superconducting wire can persist indefinitely with no power source.
Superconducting quantum computing is an implementation of a quantum computer in superconducting electronic circuits. Superconducting qubits are built with superconducting electric circuits that operate at cryogenic temperatures.
Neutral-atom qubit technology is similar to trapped-ion technology. However, it uses light instead of electromagnetic forces to trap the qubit and hold it in position. The atoms are not charged, and the circuits can operate at room temperature.
A Rydberg atom is an excited atom with one or more electrons that are, on average, further away from the nucleus. Rydberg atoms have a number of peculiar properties, including an exaggerated response to electric and magnetic fields and long lifetimes. When used as qubits, they offer strong and controllable atomic interactions that you can tune by selecting different states.
Quantum annealing uses a physical process to place a quantum system's qubits in an absolute energy minimum. From there, the hardware gently alters the system's configuration so that its energy landscape reflects the problem that needs to be solved. The advantage of quantum annealers is that the number of qubits can be much larger than in gate-based systems. However, their use is limited to specific classes of problems, chiefly optimization.
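As a rough idea of what "reflecting the problem in an energy landscape" means, the sketch below (a classical brute-force toy, not D-Wave's API) writes a tiny optimization problem as a QUBO and finds its minimum-energy bit assignment:

```python
import itertools
import numpy as np

# Annealers are typically fed a QUBO: minimize x^T Q x over binary vectors x.
# Here we brute-force a tiny, made-up instance classically just to show what
# "finding the energy minimum" means.
Q = np.array([[-1.0,  2.0,  0.0],
              [ 0.0, -1.0,  2.0],
              [ 0.0,  0.0, -1.0]])

best_x, best_e = None, float("inf")
for bits in itertools.product([0, 1], repeat=3):
    x = np.array(bits)
    energy = x @ Q @ x
    if energy < best_e:
        best_x, best_e = bits, energy

print(best_x, best_e)   # lowest-energy assignment: (1, 0, 1) with energy -2.0
```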
See the article here:
What is Quantum Computing? - Quantum Computing Explained - AWS
Harnessing the quantum realm for NASA's future complex computing needs
NASA's Ames Research Center in California's Silicon Valley is the heart of the agency's advanced computing efforts, including its exploration and research of quantum computing. Ames leverages its location in the heart of Silicon Valley to forge partnerships with private industry as well. Using these collaborations, the NASA Advanced Supercomputing facility's resources, and expertise in quantum computing, Ames works to evaluate the potential of quantum computing for NASA missions.
The properties that govern physics at the extremely small scales and low temperatures of the quantum realm are puzzling and unique. Quantum computing is the practice of harnessing those properties to enable revolutionary algorithms that traditional computers wouldn't be able to run. Algorithms are sets of instructions to solve a problem or accomplish a task in computing. Quantum algorithms describe the operations a quantum computer should perform during a computation, often in the form of a software program called a quantum circuit.
NASA's computing needs are escalating as the agency aims for more complex missions across the solar system, as well as continued research in the Earth sciences and aeronautics. Quantum computing, as it matures in the coming years, could provide powerful solutions.
Quantum mechanics describes effects such as superposition, where a particle can be in many different states at once. Quantum entanglement allows particles to be correlated with each other in unique ways that can be utilized by quantum computing. Though why these and other properties occur is still a mystery of science, the way in which they function has been well characterized and researched, allowing quantum computing experts to design hardware and algorithms that use these properties to their advantage.
Ames' Role
Since 1972, when Ames center director Hans Mark brought in the first massively parallel computer (a kind of computer that uses multiple processors at the same time, or in parallel), the center has been at the forefront of developing advances in computing.
Today, the Quantum Artificial Intelligence Laboratory (QuAIL) is where NASA conducts research to determine the capabilities of quantum computers and their potential to support the agency's goals in the decades to come. Located at Ames, the lab conducts research on quantum applications and algorithms, develops tools for quantum computing, and investigates the fundamental physics behind quantum computing. The lab also partners with other quantum labs across the country, such as those at Google, Oak Ridge National Laboratory (ORNL), and Rigetti, and is part of two of the Department of Energy's centers under the National Quantum Initiative: the Co-design Center for Quantum Advantage and the Superconducting Quantum Materials and Systems Center.
Applications and Algorithms
What future missions could quantum computing help realize?
Quantum computing is a field of study in its infancy. So far, it is too early to incorporate quantum computing into NASA missions. The role of QuAIL is to investigate quantum computing's potential to serve the agency's future needs, for missions yet to be proposed or even imagined.
The key to quantum computing is quantum algorithms: special algorithms uniquely constructed to take advantage of quantum properties, like quantum superposition and quantum entanglement. The properties of the quantum world allow for computations that would take billions of years on classical machines. By experimenting with designing quantum algorithms, QuAIL hopes to use quantum computers to tackle calculations that otherwise would be impossible.
Current research looks into applying quantum algorithms to optimize the planning and scheduling of mission operations, machine learning for Earth science data, and simulations for the design of new materials for use in aeronautics and space exploration. In the future, quantum algorithms could impact NASA's missions broadly. QuAIL's role is to help define that future.
Quantum Computing Tools
How can software support quantum algorithms and their applications?
There are a variety of tools QuAIL is developing to support quantum computing. Those tools can help characterize noise in quantum devices, assist in error mitigation, compile algorithms for specific hardware, and simulate quantum algorithms.
Because quantum computers need extremely precise and stable conditions to operate, seemingly small issues such as impurities on a superconducting chip or accumulated charged particles can impact a computation. Thus, error mitigation will play a critical role in realizing mature quantum computers.
By modeling what kind of errors occur and the effect they have on calculations, a process called noise characterization, quantum researchers can design error mitigation techniques that can run alongside quantum algorithms to keep them on track.
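As a hedged illustration of the idea behind noise characterization, the toy model below treats errors as a depolarizing channel that mixes the ideal outcome probabilities with uniform noise; real device noise models are far richer, and this is not QuAIL's actual tooling:

```python
import numpy as np

# Toy noise model: with probability p, the ideal outcome distribution is
# replaced by uniform noise. Watching how measurement statistics shift with p
# is the flavor of what noise characterization quantifies.
ideal = np.array([0.5, 0.0, 0.0, 0.5])   # ideal Bell-state outcome probabilities

def depolarize(probs, p):
    uniform = np.full_like(probs, 1 / len(probs))
    return (1 - p) * probs + p * uniform

for p in (0.0, 0.05, 0.2):
    print(p, depolarize(ideal, p))
```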
All algorithms need to be compiled for use on specific hardware. Because quantum hardware is so distinct from traditional computers, researchers must make special efforts to compile quantum algorithms for quantum hardware. In the same way that software needs to be coded for a particular operating system, quantum algorithms need to be coded to function on a quantum computer's specific operating system, which also takes hardware into account.
Tools that allow researchers to simulate quantum circuits using non-quantum hardware are key to QuAIL's objective to evaluate the potential of quantum hardware. By testing the same algorithm both on a traditional supercomputer using a quantum circuit simulator and on real quantum hardware, researchers can find the limits of the supercomputer.
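A quantum circuit simulator on classical hardware essentially stores the full state vector, which grows as 2^n with the number of qubits. The minimal sketch below (an illustration, not QuAIL's actual software) applies a Hadamard gate to each of three qubits and prints the resulting outcome probabilities:

```python
import numpy as np

# The full n-qubit state is a 2^n complex vector; each gate is a matrix applied
# to it. Memory grows as 2^n, which is why simulators eventually hit a wall.
def apply_single_qubit_gate(state, gate, target, n_qubits):
    # Build the full operator as I (x) ... (x) gate (x) ... (x) I (fine for tiny n).
    op = np.array([[1]], dtype=complex)
    for q in range(n_qubits):
        op = np.kron(op, gate if q == target else np.eye(2))
    return op @ state

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                       # start in |000>
for q in range(n):
    state = apply_single_qubit_gate(state, H, q, n)

print(np.abs(state) ** 2)            # uniform: every 3-bit outcome has probability 1/8
```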
NASA can also use these simulated quantum circuits to check the work of quantum hardware, ensuring that algorithms are being properly executed up to the limit the simulator can reach. This was an essential part of confirming a recent milestone achieved by Google in collaboration with NASA and ORNL: demonstrating the ability to compute in seconds what would take even the largest and most advanced supercomputers days or weeks.
Visit link:
What is Quantum Computing? - NASA
Since the 1940s, classical computers have improved at breakneck speed. Today you can buy a wristwatch with more computing power than the state-of-the-art, room-sized computer from half a century ago. These advances have typically come through electrical engineers' ability to fashion ever smaller transistors and circuits, and to pack them ever closer together.
But that downsizing will eventually hit a physical limit: as computer electronics approach the atomic level, it will become impossible to control individual components without impacting neighboring ones. Classical computers cannot keep improving indefinitely using conventional scaling.
Quantum computing, an idea spawned in the 1980s, could one day carry the baton into a new era of powerful high-speed computing. The method uses quantum mechanical phenomena to run complex calculations not feasible for classical computers. In theory, quantum computing could solve problems in minutes that would take classical computers millennia. Already, Google has demonstrated quantum computing's ability to outperform the world's best supercomputer for certain tasks.
But it's still early days: quantum computing must clear a number of science and engineering hurdles before it can reliably solve practical problems. More than 100 researchers across MIT are helping develop the fundamental technologies necessary to scale up quantum computing and turn its potential into reality.
What is quantum computing?
It helps to first understand the basics of classical computers, like the one you're using to read this story. Classical computers store and process information in binary bits, each of which holds a value of 0 or 1. A typical laptop could contain billions of transistors that use different levels of electrical voltage to represent either of these two values. While the shape, size, and power of classical computers vary widely, they all operate on the same basic system of binary logic.
Quantum computers are fundamentally different. Their quantum bits, called qubits, can each hold a value of 0, 1, or a simultaneous combination of the two states. That's thanks to a quantum mechanical phenomenon called superposition. "A quantum particle can act as if it's in two places at once," explains John Chiaverini, a researcher at the MIT Lincoln Laboratory's Quantum Information and Integrated Nanosystems Group.
Particles can also be entangled with each other, as their quantum states become inextricably linked. "Superposition and entanglement allow quantum computers to solve some kinds of problems exponentially faster than classical computers," Chiaverini says.
Chiaverini points to particular applications where quantum computers can shine. For example, they're great at factoring large numbers, a vital tool in cryptography and digital security. They could also simulate complex molecular systems, which could aid drug discovery. In principle, quantum computers could turbocharge many areas of research and industry, if only we could build reliable ones.
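To see why factoring is the canonical example, the sketch below (illustrative, using a small made-up number) factors by classical trial division, whose cost grows far too quickly for the 2048-bit numbers used in RSA; Shor's quantum algorithm would, in principle, factor such numbers efficiently:

```python
# Classical trial division: cost grows roughly with sqrt(N), which becomes
# astronomically slow for cryptographically sized numbers.
def trial_division(n: int) -> int:
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d           # found a nontrivial factor
        d += 1
    return n                   # n is prime

print(trial_division(3_599))   # 59, since 3599 = 59 * 61
```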
How do you build a quantum computer?
Quantum systems are not easy to manage, thanks to two related challenges. The first is that a qubit's superposition state is highly sensitive. Minor environmental disturbances or material defects can cause qubits to err and lose their quantum information. This process, called decoherence, limits the useful lifetime of a qubit.
The second challenge lies in controlling the qubit to perform logical functions, often achieved through a finely tuned pulse of electromagnetic radiation. This manipulation process itself can generate enough incidental electromagnetic noise to cause decoherence. To scale up quantum computers, engineers will have to strike a balance between protecting qubits from potential disturbance and still allowing them to be manipulated for calculations. This balance could theoretically be attained by a range of physical systems, though two technologies currently show the most promise: superconductors and trapped ions.
A superconducting quantum computer uses the flow of paired electrons, called Cooper pairs, through a resistance-free circuit as the qubit. "A superconductor is quite special, because below a certain temperature, its resistance goes away," says William Oliver, who is an associate professor in MIT's Department of Electrical Engineering and Computer Science, a Lincoln Laboratory Fellow, and the director of the MIT Center for Quantum Engineering.
The computers Oliver engineers use qubits composed of superconducting aluminum circuits chilled close to absolute zero. The system acts as an anharmonic oscillator with two energy states, corresponding to 0 and 1, as current flows through the circuit one way or the other. These superconducting qubits are relatively large, about one tenth of a millimeter along each edge, which is hundreds of thousands of times larger than a classical transistor. A superconducting qubit's bulk makes it easy to manipulate for calculations.
But it also means Oliver is constantly fighting decoherence, seeking new ways to protect the qubits from environmental noise. His research mission is to iron out these technological kinks, which could enable the fabrication of reliable superconducting quantum computers. "I like to do fundamental research, but I like to do it in a way that's practical and scalable," Oliver says. "Quantum engineering bridges quantum science and conventional engineering. Both science and engineering will be required to make quantum computing a reality."
Another solution to the challenge of manipulating qubits while protecting them against decoherence is a trapped-ion quantum computer, which uses individual atoms and their natural quantum mechanical behavior as qubits. Atoms make for simpler qubits than supercooled circuits, according to Chiaverini. "Luckily, I don't have to engineer the qubits themselves," he says. "Nature gives me these really nice qubits. But the key is engineering the system and getting ahold of those things."
Chiaverini's qubits are charged ions, rather than neutral atoms, because they're easier to contain and localize. He uses lasers to control the ions' quantum behavior. "We're manipulating the state of an electron. We're promoting one of the electrons in the atom to a higher energy level or a lower energy level," he says.
The ions themselves are held in place by applying voltage to an array of electrodes on a chip. "If I do that correctly, then I can create an electromagnetic field that can hold on to a trapped ion just above the surface of the chip." By changing the voltages applied to the electrodes, Chiaverini can move the ions across the surface of the chip, allowing for multiqubit operations between separately trapped ions.
So, while the qubits themselves are simple, fine-tuning the system that surrounds them is an immense challenge. "You need to engineer the control systems: things like lasers, voltages, and radio frequency signals. Getting them all into a chip that also traps the ions is what we think is a key enabler."
Chiaverini notes that the engineering challenges facing trapped ion quantum computers generally relate to qubit control rather than preventing decoherence; the reverse is true for superconducting-based quantum computers. And of course, there are myriad other physical systems under investigation for their feasibility as quantum computers.
Where do we go from here?
If you're saving up to buy a quantum computer, don't hold your breath. Oliver and Chiaverini agree that quantum information processing will hit the commercial market only gradually in the coming years and decades as the science and engineering advance.
In the meantime, Chiaverini notes another application of the trapped ion technology hes developing: highly precise optical clocks, which could aid navigation and GPS. For his part, Oliver envisions a linked classical-quantum system, where a classical machine could run most of an algorithm, sending select calculations for the quantum machine to run before its qubits decohere. In the longer term, quantum computers could operate with more independence as improved error-correcting codes allow them to function indefinitely.
"Quantum computing has been the future for several years," Chiaverini says. But now the technology appears to be reaching an inflection point, shifting from solely a scientific problem to a joint science and engineering one (quantum engineering), a shift aided in part by Chiaverini, Oliver, and dozens of other researchers at MIT's Center for Quantum Engineering (CQE) and elsewhere.
The rest is here:
Explained: Quantum engineering - MIT News
Overall, the most notable advancements in AI are the development and release of GPT-3.5 and GPT-4. But there have been many other revolutionary achievements in artificial intelligence -- too many, in fact, to include all of them here.
Here are some of the most notable:
ChatGPT is an AI chatbot capable of natural language generation, translation, and answering questions. Though ChatGPT is arguably the most popular AI tool, thanks to its widespread accessibility, OpenAI had already made significant waves in the world of artificial intelligence with the creation of GPTs 1, 2, and 3.
Also: 5 ways to use chatbots to make your life easier
GPT stands for Generative Pre-trained Transformer, and GPT-3 was the largest language model in existence at the time of its 2020 launch, with 175 billion parameters. The latest version, GPT-4, accessible through ChatGPT Plus or Bing Chat, is reported to have around one trillion parameters.
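As rough arithmetic on those parameter counts (assuming 2 bytes per parameter, as with 16-bit floats; an assumption, since actual deployments vary), the weights alone occupy hundreds of gigabytes to terabytes:

```python
# Back-of-the-envelope memory footprint for the parameter counts quoted above.
def model_size_gb(n_params: float, bytes_per_param: int = 2) -> float:
    return n_params * bytes_per_param / 1e9

print(model_size_gb(175e9))   # GPT-3: ~350 GB of weights
print(model_size_gb(1e12))    # a 1-trillion-parameter model: ~2,000 GB
```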
Though the safety of self-driving cars is a top concern of potential users, the technology continues to advance and improve with breakthroughs in AI. These vehicles use machine-learning algorithms to combine data from sensors and cameras to perceive their surroundings and determine the best course of action.
Also: An autonomous car that wakes up and greets you could be in your future
Tesla's Autopilot feature in its electric vehicles is probably what most people think of when considering self-driving cars, but Waymo, from Google's parent company, Alphabet, offers autonomous rides, like a taxi without a taxi driver, in San Francisco, CA, and Phoenix, AZ.
Cruise is another robotaxi service, and companies like Apple, Audi, GM, and Ford are also presumably working on self-driving vehicle technology.
The achievements of Boston Dynamics stand out in the area of AI and robotics. Though we're still a long way away from creating AI at the level of technology seen in the movie Terminator, watching Boston Dynamics' robots use AI to navigate and respond to different terrains is impressive.
Google sister company DeepMind is an AI pioneer making strides toward the ultimate goal of artificial general intelligence (AGI). Though not there yet, the company initially made headlines in 2016 with AlphaGo, a system that beat a human professional Go player.
Since then, DeepMind has created a protein-folding prediction system, which can predict the complex 3D shapes of proteins, and it's developed programs that can diagnose eye diseases as effectively as the top doctors around the world.
Go here to see the original:
What is AI? Everything to know about artificial intelligence